Welcome to my homepage
I’m a Senior Research Scientist on the Post-Training Applied Research team at NVIDIA, where I focus on supervised fine-tuning of large language models (LLMs), with an emphasis on synthetic data generation. Before NVIDIA, I worked at AWS AI Labs, where I focused on code generation for Amazon Q Developer.
I obtained my PhD in Computer Science at the University of California, Los Angeles, advised by Prof. Kai-Wei Chang. During my PhD, I was fortunate to work as a research intern at Meta AI, Yahoo Research, Microsoft Research, and Walmart Labs.
News and Announcements
- [10.2025] We introduce GenCluster, achieving IOI Gold with open-weight LLMs.
- [10.2025] We released BigCodeArena; check it out!
- [08.2025] We released Nemotron-Nano-v2.
- [04.2025] We released Nemotron-H, a family of Mamba-Transformer models.
- [04.2025] We released OpenCodeInstruct and OpenCodeReasoning.
- [03.2025] I will serve as a Senior Area Chair for EMNLP 2025 and IJCNLP-AACL 2025.
- [01.2025] LibEvolutionEval got accepted at NAACL 2025.
