Xiaoyuan Zhang
Department of Computer Science, City University of Hong Kong, Kowloon Tong, Hong Kong SAR, China
I am a Ph.D. candidate at the Department of Computer Science, CityUHK. I will graduate in April 2025 under the supervision of Chair Prof. Zhang Qingfu (IEEE Fellow). Before joining CityUHK, I received my B.E. (2017) and M.E. (2020) degrees from Shanghai Jiao Tong University, where I was advised by Prof. Qi Chenkun. I have also been fortunate to maintain long-term collaborations with Prof. Yang Yaodong at Peking University and Prof. Han Zhao at the University of Illinois at Urbana-Champaign.
I focus on the theory, software implementation, and engineering applications of multi-objective optimization. My work includes theoretical advances in Pareto set learning, and I developed LibMOON, the first open-source, gradient-based library for large-scale multi-objective optimization. In collaboration with a team at Peking University, I applied these techniques to align large language models (LLMs) with tens of billions of parameters; the results were published at NeurIPS 2024.
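The core idea behind preference-conditioned, gradient-based multi-objective optimization can be illustrated with a minimal toy sketch (this is an illustrative example under simple assumptions, not LibMOON's actual API): a preference vector scalarizes two objectives, and gradient descent on the weighted sum lands on a different Pareto-optimal point for each preference.

```python
import numpy as np

def grads(x):
    # Gradients of two conflicting toy objectives:
    # f1(x) = (x - 1)^2 and f2(x) = (x + 1)^2.
    # Their Pareto set is the interval [-1, 1].
    return np.array([2 * (x - 1), 2 * (x + 1)])

def scalarized_descent(w, lr=0.1, steps=200):
    """Minimize the preference-weighted sum w[0]*f1 + w[1]*f2
    by plain gradient descent (linear scalarization)."""
    x = 0.0
    for _ in range(steps):
        x -= lr * (w @ grads(x))
    return x

# Each preference vector converges to a different Pareto-optimal point.
for w in ([1.0, 0.0], [0.5, 0.5], [0.0, 1.0]):
    print(round(scalarized_descent(np.array(w)), 3))  # 1.0, 0.0, -1.0
```

Pareto set learning generalizes this: instead of solving one scalarized problem per preference, a single model is trained to map any preference vector directly to its Pareto-optimal solution.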
I will graduate in the spring of 2025 and am currently on the job market (CV, Chinese CV).
Academic Service
- Conference Reviews:
    - ICLR 2025, AISTATS 2025, AAAI 2025
    - NeurIPS 2024, ICLR 2024, ICML 2024
    - NeurIPS 2023, ICLR 2023, ICML 2023
    - NeurIPS 2022, ICLR 2022, ICML 2022
    - ICML 2021
 
- Journal Reviews:
    - Swarm and Evolutionary Computation
 
Professional Talks
- MOO: from a Single Solution, to a Set of Solutions, and to Infinite Solutions. Institute for Artificial Intelligence, Peking University, Beijing. Host: Prof. Yang Yaodong. Mar. 8, 2024.
- Pareto Machine Learning: Theories, Systems, and Applications. OIST, Japan. Hosts: Prof. Han Zhao and Prof. Makoto Yamada. July 18, 2024.
News
| Date | News |
|---|---|
| Sep 26, 2024 | “Gliding over the Pareto Front with Uniform Designs” is accepted to NeurIPS 2024, first author. |
| Sep 26, 2024 | “Panacea: Pareto Alignment via Preference Adaptation for LLMs” is accepted to NeurIPS 2024, co-first author. |
| Sep 26, 2024 | “LibMOON: A Gradient-based MultiObjective OptimizatioN Library in PyTorch” is accepted to NeurIPS 2024, first author. |
| Jul 16, 2024 | “PMGDA: A Preference-based Multiple Gradient Descent Algorithm” is accepted to IEEE Transactions on Emerging Topics in Computational Intelligence (TETCI), first author, IF=5.3. |
| Feb 8, 2024 | Married Yingying Yu. |
| Sep 22, 2023 | “Hypervolume Maximization: A Geometric View of Pareto Set Learning” is accepted to NeurIPS 2023, first author. |
Selected Publications
- [NeurIPS] Hypervolume Maximization: A Geometric View of Pareto Set Learning. Advances in Neural Information Processing Systems, 2023.
- [NeurIPS] LibMOON: A Gradient-based MultiObjective OptimizatioN Library in PyTorch. Advances in Neural Information Processing Systems, 2024.
- [TETCI] PMGDA: A Preference-based Multiple Gradient Descent Algorithm. IEEE Transactions on Emerging Topics in Computational Intelligence, 2024.
- [NeurIPS] Panacea: Pareto Alignment via Preference Adaptation for LLMs. Advances in Neural Information Processing Systems, 2024.
- [NeurIPS] Gliding over the Pareto Front with Uniform Designs. Advances in Neural Information Processing Systems, 2024.