Department of Computer Science, Northwestern University.

Address: Mudd Hall, 2233 Tech Drive, Third Floor, Evanston, IL 60208

Email: [email protected].

Hi, I'm a first-year PhD student at Northwestern University in the MAGICS lab advised by Prof. Han Liu. I focus on advancing the understanding and application of foundation models. My research explores the theoretical foundations of these models, including their universal approximation capabilities and potential to perform complex algorithms. I also work on extending their applications beyond language and vision, particularly in time series analysis and scientific domains.



Publications

  1. Outlier-Efficient Hopfield Layers for Large Transformer-Based Models. Jerry Yao-Chieh Hu, Pei-Hsuan Chang, Haozheng Luo, Hong-Yu Chen, Weijian Li, Wei-Po Wang, Han Liu. In Forty-first International Conference on Machine Learning (ICML), 2024. Paper | [Code](https://github.com/MAGICS-LAB/OutEffHop)

  2. Learning Spectral Methods by Transformers. Yihan He, Yuan Cao, Hong-Yu Chen, Dennis Wu, Jianqing Fan, Han Liu. arXiv preprint arXiv:2501.01312 (2025). [Paper](https://arxiv.org/abs/2501.01312)
    
  3. Transformers Simulate MLE for Sequence Generation in Bayesian Networks. Yuan Cao, Yihan He, Dennis Wu, Hong-Yu Chen, Jianqing Fan, Han Liu. arXiv preprint arXiv:2501.02547 (2025). [Paper](https://arxiv.org/abs/2501.02547)