Machine Learning Lab PhD Student Forum Series (No. 16): Recent Progress on Non-Autoregressive Machine Translation
Speaker: Hao Cheng (PKU)
Time: 2021-10-13, 15:10-16:10
Venue: Room 1513, Science Building No. 1 & Tencent Meeting 761 4699 1810
Abstract: Autoregressive neural machine translation (AT) models based on the encoder-decoder framework have achieved great success. The decoder predicts target tokens step by step, conditioned on the source tokens and the previously predicted tokens. This dependency between target tokens inevitably incurs decoding latency that grows with the target length. Non-autoregressive neural machine translation (NAT) models remove the dependency between target tokens and generate all tokens in parallel, significantly improving inference speed. However, the resulting conditional-independence assumption over target tokens makes it harder for NAT models to learn the target distribution: their translations are often incomplete or repetitive, especially for long sentences. In this talk, we will survey the main lines of work on NAT from several perspectives, including model structures, training methods, loss functions, and decoding methods.
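To make the contrast concrete, here is one standard formulation from the NAT literature, a sketch rather than the talk's own notation (the symbols x for the source sentence, y = (y_1, ..., y_T) for the target sentence, and T for the target length are our assumptions):

    AT:  p(y \mid x) = \prod_{t=1}^{T} p(y_t \mid y_{<t}, x)
    NAT: p(y \mid x) = p(T \mid x) \prod_{t=1}^{T} p(y_t \mid x)

Under the AT factorization each token must wait for all earlier tokens, so decoding takes T sequential steps; under the NAT factorization the length T is predicted first and all T tokens are then emitted in a single parallel pass, at the cost of the conditional-independence assumption that underlies the incomplete and repetitive outputs mentioned above.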