Yuchi Wang

Master's Student in Artificial Intelligence

Lanco Lab

Peking University

Biography

Hi! I’m a final-year master’s student at the Academy for Advanced Interdisciplinary Studies (AAIS), Peking University, where I am a member of Lanco Lab, led by Prof. Xu Sun. Previously, I earned my bachelor’s degree from the School of Data Science, Fudan University.

My research interests encompass (1) multimodal learning, including visual understanding and text-guided visual generation (e.g., image/video generation and editing, talking face generation); (2) generative models such as diffusion models and VAEs; and (3) LLMs (especially multimodal large language models) and their applications, such as embodied AI.

I’m currently seeking a PhD position. Feel free to reach out if you are interested!

Interests
  • Multimodal learning
  • Generative AI (AIGC)
  • Large language models
  • Diffusion models
Education
  • Master's in Data Science, Peking University, 2025 (Expected)

    Lanco Lab; Advisor: Xu Sun; AAIS

  • BSc in Data Science, Fudan University, 2022

    School of Data Science; GPA Rank: 3/85

News

  • [2024.05] One paper accepted by ACL 2024 (Findings)
  • [2024.03] One paper accepted by NAACL 2024
  • [2024.01] One paper accepted by ICLR 2024
  • [2023.10] We release GAIA demo!
  • [2023.10] One paper accepted by FMDM@NeurIPS 2023
  • [2023.05] Started internship at MSRA ML Group
  • [2022.09] Joined Lanco Lab, Peking University

All Publications

Work Experience

Microsoft Research Asia (MSRA)
Research Intern
May 2023 – Present · Beijing, China
  • Excited to join the ML group, advised by Junliang Guo and Xu Tan.
  • My internship so far has primarily focused on generative learning, including talking-avatar generation and video generation/editing.

Academic Service

Teaching Assistant

I am currently serving as a teaching assistant for the course Large Language Model in Decision Intelligence (PKU, Spring 2024). The course is tailored for undergraduates and aims to give them a foundational understanding of large language models and effective strategies for using them.