About me

I am a PhD student in the Department of Computing at Imperial College London, learning to train energy-based models under the supervision of Yingzhen Li. I completed my undergraduate studies in the School of Computer Science and Engineering at Sun Yat-sen University, and I previously worked as a research intern.

My PhD research is on deep generative modelling, currently focusing on diffusion post-training and test-time scaling, as well as few-step generation and distillation:

  • Test-time scaling and post-training for diffusion models [paper]
  • Few-step generation with optimal covariance matching [paper]; one-step distillation [paper1, paper2]
  • Neural flow sampler for unnormalized distributions [paper1, paper2]
  • Training energy-based models without MCMC [paper1, paper2]

Publications

* denotes equal contribution; see the full list here

Services

  • Conference Reviewing: ICLR (2024-2026), ICML (2022-2025), NeurIPS (2022-2025), AISTATS (2023-2026), IJCAI (2022-2024), ACL (2022), NAACL (2022)