Chenze Shao

I am a researcher at WeChat AI, Tencent Inc.

I received my Ph.D. in Computer Science from the Institute of Computing Technology, Chinese Academy of Sciences, where I was advised by Professor Yang Feng.

I develop generative models for language modeling.

selected publications

  1. arXiv
    Continuous Autoregressive Language Models
    Chenze Shao, Darren Li, Fandong Meng, and 1 more author
    In arXiv preprint arXiv:2510.27688, 2025
  2. ICLR
    Patch-Level Training for Large Language Models
    Chenze Shao, Fandong Meng, and Jie Zhou
    In International Conference on Learning Representations, 2025
  3. CL
    Sequence-level Training for Non-autoregressive Neural Machine Translation
    Chenze Shao, Yang Feng, Jinchao Zhang, and 2 more authors
    In Computational Linguistics, 2021