Yongtong Wu | 吴永彤

Ph.D. Student @Peking University

wuyongtong@stu.pku.edu.cn

🫡 Hi everyone!

I am currently a Ph.D. student at Peking University (PKU), fortunate to be advised by Prof. Xin Jin. My research focuses on system software, especially infrastructure for LLMs.

Previously, I earned my bachelor's degree in Informatics and Computing Science at Peking University in 2025, where I was mentored by Prof. Qun Huang on RDMA middleware development.

Feel free to drop me an email anytime about anything.

Experiences

DeepSeek-AI
2025/07-Present
System Group

At DeepSeek-AI, I help build the inference infrastructure for the next generation of DeepSeek models. A key part of my work is optimizing our large-scale internal software systems to ensure peak performance across various hardware platforms.

Tencent WXG
2025/02-2025/07 @Beijing
Research Intern

I worked with excellent peers and engineers to build an inference system for state-of-the-art large language models such as DeepSeek-R1. The system, deployed for WeChat, serves a billion users.

We also worked closely with open-source communities such as SGLang and vLLM, contributing our industrial insights and optimizations for extremely large-scale deployments.

Syslab, University of Washington
2024/06-2024/10 @Seattle
Research Intern

Microsoft Research Asia
2023/04-2023/07 @Beijing
Research Intern

Teaching

Talks

Publications

  1. DeepSeek-V3.2: Pushing the Frontier of Open Large Language Models

    [arXiv] 2025-12
  2. High-level Programming for Application Networks

    Xiangfeng Zhu, Yuyao Wang, Banruo Liu, Yongtong Wu, Nikola Bojanic, Jingrong Chen, Gilbert Bernstein, Arvind Krishnamurthy, Sam Kumar, Ratul Mahajan, Danyang Zhuo

    NSDI 2025, acceptance rate: 55/401 = 13.7%
  3. RB^2: Narrow the Gap between RDMA Abstraction and Performance via a Middle Layer

    Haifeng Sun, Yixuan Tan, Yongtong Wu, Jiaqi Zhu, Qun Huang, Xin Yao, Gong Zhang

    INFOCOM 2024, acceptance rate: ~20%

    [PDF]