I am a third-year PhD student at Tsinghua University, co-advised by Prof. Chun Yuan and Prof. Li Shen. My research focuses on efficient machine learning, particularly data-free learning. Training high-performing models typically demands extensive data and computational resources, while the open-source trend has made numerous pretrained models readily available in the community. Data-free learning (i.e., learning from models) offers a promising alternative: it addresses challenges of data accessibility and privacy while circumventing the computational burden of pretraining. Given the limited computational resources typically available in academia, this direction is especially practical.

My preferred research paradigm is to observe empirical phenomena, form explanations, construct theoretical frameworks, and ultimately derive effective methods. As LLMs and MLLMs continue to gain traction, I am committed to scaling models as far as my available resources allow, as exemplified by my recent work. I welcome opportunities for collaboration.

πŸ“ Selected Publications

My work explores data-free learning from three angles:

  • Model merging directly in parameter space
  • Synthetic data based on generative models
  • Model inversion based on discriminative models

Model merging directly in parameter space:

  • OptMerge: Unifying Multimodal LLM Capabilities and Modalities via Model Merging. ICLR 2026
    Yongxian Wei, Runxi Cheng, Weike Jin, Enneng Yang, Li Shen, Lu Hou, Sinan Du, Chun Yuan, Xiaochun Cao, Dacheng Tao
    [Paper] [Code]
  • Modeling Multi-Task Model Merging as Adaptive Projective Gradient Descent. ICML 2025
    Yongxian Wei, Anke Tang, Li Shen, Zixuan Hu, Chun Yuan, Xiaochun Cao
    [Paper] [Code]
  • Whoever Started the Interference Should End It: Guiding Data-Free Model Merging via Task Vectors. ICML 2025
    Runxi Cheng*, Feng Xiong*, Yongxian Wei, Wanyun Zhu, Chun Yuan
    [Paper] [Code]

Synthetic data based on generative models:

  • Learning to Pose Problems: Reasoning-Driven and Solver-Adaptive Data Synthesis for Large Reasoning Models. Preprint
    Yongxian Wei, Yilin Zhao, Li Shen, Xinrui Chen, Runxi Cheng, Sinan Du, Hao Yu, Gang Liu, Jiahong Yan, Chun Yuan, Dian Li
    [Paper] [Code]
  • Synthetic Data Beyond Generative Models: A Comprehensive Survey of How, Why, and Where. Preprint
    Zixuan Hu*, Yongxian Wei*, Guozheng Ma, Jiahe Wang, Sinan Du, Runxi Cheng, Qianpu Sun, Haotian Luo, Chun Yuan, Li Shen, Dacheng Tao
    [Paper] [Code]
  • Meta-Learning without Data via Unconditional Diffusion Models. IEEE TCSVT
    Yongxian Wei, Zixuan Hu, Li Shen, Zhenyi Wang, Chun Yuan
    [Paper] [Code]
  • Adaptive Defense against Harmful Fine-Tuning via Bayesian Data Scheduler. NeurIPS 2025 (Spotlight)
    Zixuan Hu, Li Shen, Zhenyi Wang, Yongxian Wei, Dacheng Tao
    [Paper] [Code]

Model inversion based on discriminative models:

  • Open-Vocabulary Customization from CLIP via Data-Free Knowledge Distillation. ICLR 2025 (Oral)
    Yongxian Wei, Zixuan Hu, Li Shen, Zhenyi Wang, Chun Yuan, Dacheng Tao
    [Paper] [Code]
  • Task-Distributionally Robust Data-Free Meta-Learning. IEEE TPAMI
    Zixuan Hu, Yongxian Wei, Li Shen, Zhenyi Wang, Baoyuan Wu, Chun Yuan, Dacheng Tao
    [Paper] [Code]
  • Unlocking Tuning-Free Few-Shot Adaptability in Visual Foundation Models by Recycling Pre-Tuned LoRAs. CVPR 2025
    Zixuan Hu, Yongxian Wei, Li Shen, Chun Yuan, Dacheng Tao
    [Paper] [Code]
  • Task Groupings Regularization: Data-Free Meta-Learning with Heterogeneous Pre-trained Models. ICML 2024
    Yongxian Wei, Zixuan Hu, Li Shen, Zhenyi Wang, Chun Yuan, Dacheng Tao
    [Paper] [Code]
  • Sparse Model Inversion: Efficient Inversion of Vision Transformers for Data-Free Applications. ICML 2024
    Zixuan Hu, Yongxian Wei, Li Shen, Zhenyi Wang, Chun Yuan, Dacheng Tao
    [Paper] [Code]
  • FREE: Faster and Better Data-Free Meta-Learning. CVPR 2024
    Yongxian Wei, Zixuan Hu, Zhenyi Wang, Li Shen, Chun Yuan, Dacheng Tao
    [Paper] [Code]

πŸ”₯ Honors

  • National Scholarship for Graduate Students (second award), 2025.10
  • National Scholarship for Graduate Students (first award), 2024.10
  • Undergraduate President’s Medal (Top 10 university-wide), 2022.10
  • National Scholarship for Undergraduate Students, 2021.10
  • The First Runner-Up (2/116) of ICCV LargeFineFoodAI, 2021.10

πŸ“– Education

  • PhD student, Tsinghua University, 2023 – Present
  • Undergraduate, Nanjing University of Science and Technology, 2019 – 2023

πŸ’Ό Academic Service

Reviewer for ICML/ICLR/NeurIPS/CVPR/ICCV/ECCV/AAAI