Hi!😄 I’m Xinyi Liu, a third-year Ph.D. student at the School of Computer Science, Peking University, and a member of the DAIR Lab led by Professor Bin Cui. I received my B.Sc. degree from the School of Computer Science and Technology, Harbin Institute of Technology, Weihai, in June 2023. I am the main developer of Hetu-Galvatron, an open-source automatic distributed training system for efficient Transformer training.

My research interests lie in deep learning systems, especially automatic parallelism and MoE training acceleration. Recently, I have been working on optimizing multi-modal MoE models and on reinforcement learning for MoE architectures.

I am actively seeking collaborations and open to discussions on related topics. Feel free to reach out!

📖 Education

🔥 News

📝 Publications

2026

  • [ASPLOS] Xinyi Liu, Yujie Wang, Fangcheng Fu, Xuefeng Xiao, Huixia Li, Jiashi Li, Bin Cui. “LAER-MoE: Load-Adaptive Expert Re-layout for Efficient Mixture-of-Experts Training.”

2025

🚀 Systems

Hetu-Galvatron

Hetu-Galvatron - An Automatic Distributed Training System for Transformer Models

Main Developer | GitHub | Documentation

A high-performance automatic distributed training system for Transformer models and LLMs, independently developed and open-sourced by PKU-DAIR Lab.

  • Automatic Parallelism Optimization: Efficiently search for optimal strategies via cost modeling
  • Fine-grained Hybrid Parallelism: Layer-wise flexible configuration (DP, SDP/ZeRO, PP, TP, SP, CKPT)
  • Workload Versatility: BERT, GPT, T5, LLaMA, Vision Transformers, multi-modality models
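The automatic parallelism search above can be illustrated with a toy sketch. This is hypothetical code, not Galvatron's actual API: `estimate_cost`, its communication factor, and the enumeration over (DP, TP, PP) degrees are all invented for illustration.

```python
# Hypothetical sketch of cost-model-based parallel strategy search
# (illustrative only; not Galvatron's real cost model or interface).
from itertools import product

def estimate_cost(dp, tp, pp, layer_flops, comm_factor=0.1):
    """Toy cost model: compute time shrinks with total parallel width,
    while communication overhead grows with the TP and PP degrees."""
    compute = layer_flops / (dp * tp * pp)
    comm = comm_factor * layer_flops * ((tp - 1) + 0.5 * (pp - 1))
    return compute + comm

def search_strategy(layer_flops, world_size=8):
    """Enumerate (dp, tp, pp) factorizations of world_size and
    return the lowest-cost configuration under the toy model."""
    best, best_cost = None, float("inf")
    for dp, tp, pp in product([1, 2, 4, 8], repeat=3):
        if dp * tp * pp != world_size:
            continue
        cost = estimate_cost(dp, tp, pp, layer_flops)
        if cost < best_cost:
            best, best_cost = (dp, tp, pp), cost
    return best, best_cost

strategy, cost = search_strategy(layer_flops=1.0)
```

A real system replaces the toy model with profiled compute and communication costs and searches a layer-wise configuration rather than one global strategy.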

🎖 Honors and Awards

  • Kunpeng Ascend Outstanding Young Contributor in Scientific Research and Innovation (2025)

  • National Scholarship (2022)

  • Candidate for the 13th China Youth Science and Technology Innovation Award (2022)

  • 7th Place (top 0.2%), The 26th CCF Certified Software Professional (2022)

  • Runner-Up, Gold Medal, The 12th Shandong Provincial ICPC Collegiate Programming Contest (2022)

  • Outstanding Students of Higher Education Institutions (2021)

  • Gold Medal, 2021 China Collegiate Programming Contest, Harbin Site (2021)

  • Silver Medal, The 46th ICPC Asia Regional Contest Jinan Site (2021)

💬 Invited Talks

  • Galvatron: An Automatic Distributed System for Efficient Large-Scale Transformer Training. PyTorch Day China, Beijing, China, June 2025

  • Galvatron: An Automatic Distributed System for Efficient Large-Scale Transformer Training. Kunpeng Ascend Developer Conference (KADC), Beijing, China, May 2025

💻 Internships

  • [2025.02 - present] ByteDance, Seed Group.

📚 Teaching Assistant

  • Introduction to Computing A (For undergraduate students, Fall, 2023)