1. Chenze Shao, Yang Feng, Jinchao Zhang, Fandong Meng, Jie Zhou. Sequence-Level Training for Non-Autoregressive Neural Machine Translation. Computational Linguistics (CL), 47(4):891–925.
2. Jicheng Li, Pengzhi Gao, Xuanfu Wu, Yang Feng, Zhongjun He, Hua Wu, Haifeng Wang. Mixup Decoding for Diverse Machine Translation. The 2021 Conference on Empirical Methods in Natural Language Processing (Findings of EMNLP 2021), November 7-11, 2021, 312–320, Online and Punta Cana, Dominican Republic.
3. Shaolei Zhang, Yang Feng. Universal Simultaneous Machine Translation with Mixture-of-Experts Wait-k Policy. The 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), November 7-11, 2021, 7306–7317, Online and Punta Cana, Dominican Republic.
4. Shaolei Zhang, Yang Feng. Modeling Concentrated Cross-Attention for Neural Machine Translation with Gaussian Mixture Model. The 2021 Conference on Empirical Methods in Natural Language Processing (Findings of EMNLP 2021), November 7-11, 2021, 1401–1411, Online and Punta Cana, Dominican Republic.
5. Lei Shen, Fandong Meng, Jinchao Zhang, Yang Feng and Jie Zhou. GTM: A Generative Triple-wise Model for Conversational Question Generation. The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021), August 1-6, 2021, 3495–3506, online.
6. Zekang Li, Jinchao Zhang, Zhengcong Fei, Yang Feng, Jie Zhou. Addressing Inquiries about History: An Efficient and Practical Framework for Evaluating Open-domain Chatbot Consistency. The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Findings of ACL-IJCNLP 2021), August 1-6, 2021, 1057–1067, online.
7. Zekang Li, Jinchao Zhang, Zhengcong Fei, Yang Feng, Jie Zhou. Conversations Are Not Flat: Modeling the Dynamic Information Flow across Dialogue Utterances. The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021), August 1-6, 2021, 128–138, online.
8. Wanying Xie, Yang Feng, Shuhao Gu and Dong Yu. Importance-based Neuron Allocation for Multilingual Neural Machine Translation. The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021), August 1-6, 2021, 5725–5737, online.
9. Yang Feng, Shuhao Gu, Dengji Guo, Zhengxin Yang and Chenze Shao. Guiding Teacher Forcing with Seer Forcing for Neural Machine Translation. The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021), August 1-6, 2021, 2862–2872, online.
10. Jicheng Li, Yang Feng and Jiao Ou. SE-DAE: Style-Enhanced Denoising Auto-Encoder for Unsupervised Text Style Transfer. The International Joint Conference on Neural Networks (IJCNN 2021), July 18-22, 2021, 1–8, online.
11. Jiao Ou, Yang Feng. Better Learning and Fusing Multi-Granularity Context Representations for Relevant Response Generation. The International Joint Conference on Neural Networks (IJCNN 2021), July 18-22, 2021, 1–8, online.
12. Yong Shan, Yang Feng, Chenze Shao. Modeling Coverage for Non-Autoregressive Neural Machine Translation. The International Joint Conference on Neural Networks (IJCNN 2021), July 18-22, 2021, 1–8, online.
13. Shuhao Gu, Yang Feng, Wanying Xie. Pruning-then-Expanding Model for Domain Adaptation of Neural Machine Translation. The 2021 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2021), June 6–11, 2021, 3942–3952, online.