Welcome to my homepage

I’m a Senior Research Scientist on the Post-Training Applied Research team at NVIDIA, where I focus on supervised fine-tuning of large language models (LLMs), with an emphasis on synthetic data generation. Before joining NVIDIA, I worked at AWS AI Labs on code generation for Amazon Q Developer.

I obtained my PhD in Computer Science at the University of California, Los Angeles, advised by Dr. Kai-Wei Chang. During my PhD, I was fortunate to work as a research intern at Meta AI, Yahoo Research, Microsoft Research, and Walmart Labs.

News and Announcements

  1. [10.2025] We introduce GenCluster, achieving IOI Gold with open-weight LLMs.
  2. [10.2025] We released BigCodeArena; check it out!
  3. [08.2025] We released Nemotron-Nano-v2.
  4. [04.2025] We released Nemotron-H, a family of Mamba-Transformer models.
  5. [04.2025] We released OpenCodeInstruct and OpenCodeReasoning.
  6. [03.2025] I will serve as a senior area chair for EMNLP 2025 and IJCNLP-AACL 2025.
  7. [01.2025] LibEvolutionEval was accepted at NAACL 2025.