PhD Student
Chenze Shao
  • Email:
  • Research Interests:

    Machine Translation, Natural Language Processing

  • Homepage:

Education

2018-present Institute of Computing Technology, Chinese Academy of Sciences, PhD student

2014-2018 University of Chinese Academy of Sciences, B.Eng.

Publications

[CL2021] Chenze Shao, Yang Feng, Jinchao Zhang, Fandong Meng, Jie Zhou. Sequence-Level Training for Non-Autoregressive Neural Machine Translation

[NAACL2022] Chenze Shao, Xuanfu Wu, Yang Feng. One Reference Is Not Enough: Diverse Distillation with Reference Selection for Non-Autoregressive Translation

[ACL2022] Chenze Shao, Yang Feng. Overcoming Catastrophic Forgetting beyond Continual Learning: Balanced Training for Neural Machine Translation

[AAAI2020] Chenze Shao, Jinchao Zhang, Yang Feng, Fandong Meng, Jie Zhou. Minimizing the Bag-of-Ngrams Difference for Non-Autoregressive Neural Machine Translation

[ACL2019] Chenze Shao, Yang Feng, Jinchao Zhang, Fandong Meng, Xilin Chen, Jie Zhou. Retrieving Sequential Information for Non-Autoregressive Neural Machine Translation

[EMNLP2018] Chenze Shao, Yang Feng, Xilin Chen. Greedy Search with Probabilistic N-gram Matching for Neural Machine Translation

[Journal of Chinese Information Processing 2020] Yang Feng, Chenze Shao. A Survey of Frontiers in Neural Machine Translation

[IJCNN2021] Yong Shan, Yang Feng, Chenze Shao. Modeling Coverage for Non-Autoregressive Neural Machine Translation

[ACL2021] Yang Feng, Shuhao Gu, Dengji Guo, Zhengxin Yang, Chenze Shao. Guiding Teacher Forcing with Seer Forcing for Neural Machine Translation

[EMNLP2020] Xuanfu Wu, Yang Feng, Chenze Shao. Generating Diverse Translation from Model Distribution with Dropout

[AAAI2020] Yang Feng, Wanying Xie, Shuhao Gu, Chenze Shao, Wen Zhang, Zhengxin Yang, Dong Yu. Modeling Fluency and Faithfulness for Diverse Neural Machine Translation

Awards and Honors

Research Projects