Victor Agostinelli


Victor Agostinelli is a PhD candidate in the School of Electrical Engineering and Computer Science at Oregon State University, advised by Lizhong Chen. His research interests lie at the intersection of efficient deep learning architectures and the design of custom hardware to accelerate those architectures, in the form of software/hardware (SW/HW) co-design.

Previously, Victor earned his Bachelor of Science degree from Oregon State University in 2020 and his Master of Science degree from Oregon State University in 2023. Currently, he collaborates with Antonino Tumeo at Pacific Northwest National Laboratory as part of the Distinguished Graduate Research Program, working to improve the optimization of specialized hardware for artificial intelligence.

news

Apr 18, 2025 Thrilled to be taking an internship with Meta this summer as part of their AI Efficiency Insights team in Bellevue, Washington!
Mar 26, 2025 UltraFormer, an in-development, highly efficient transformer architecture featuring hybrid linear-sparse attention and ternary linear projections, has been accepted for presentation as a poster at FCCM ‘25!
Jan 10, 2025 Proud to be a co-organizer of IWSLT ‘25, specifically the simultaneous track! IWSLT ‘25 will be co-hosted at ACL ‘25 in Vienna, Austria.
Sep 20, 2024 SimulMask (official implementation here), which is built on Simul-LLM, has been accepted at EMNLP ‘24 and will be presented as a poster at the conference!

selected publications

  1. EMNLP ’24
    Simultaneous Masking, Not Prompting Optimization: A Paradigm Shift in Fine-tuning LLMs for Simultaneous Translation
    Matthew Raffel, Victor Agostinelli, and Lizhong Chen
    In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, Nov 2024
  2. ACL ’24
    Simul-LLM: A Framework for Exploring High-Quality Simultaneous Translation with Large Language Models
    Victor Agostinelli, Max Wild, Matthew Raffel, and 2 more authors
    In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Aug 2024
  3. ICML ’24
    LeaPformer: Enabling Linear Transformers for Autoregressive and Simultaneous Tasks via Learned Proportions
    Victor Agostinelli, Sanghyun Hong, and Lizhong Chen
    In Forty-first International Conference on Machine Learning, Aug 2024