NEWS

Machine Learning

RIT wins one of the shared tasks at WAT 2020

2020.12.08

RIT researcher Dongzhe Wang presented his co-authored paper, “Goku’s Participation in WAT 2020,” at the 7th Workshop on Asian Translation (WAT) at the AACL-IJCNLP 2020 conference. WAT is an open machine translation evaluation campaign focusing on Asian languages. Dongzhe presented RIT’s participation (team ID: goku20) in shared task III, the JPC/BSD translation tasks (short for Japanese Patent Corpus and Business Scene Dialogue, respectively). RIT’s submission outperformed strong competitors from the University of Tokyo (Japan), Dublin City University (Ireland), and TATA Consultancy Services (an IT company from India), winning first place on the BSD document-level translation task leaderboard. The presentation was attended by a wide range of researchers from industry and academia, and the recorded presentation video can be found here.

RIT’s research explored how robust Transformer models perform as translation moves from the sentence level to the document level, and from resource-rich to low-resource languages. Previous studies have mainly focused on traditional sentence-level neural machine translation (NMT) systems, whose performance tends to saturate. RIT’s work, in contrast, strove to make the most of context and improve the discourse coherence of document-level NMT systems. In addition, RIT’s participation in the mixed-domain task of WAT 2020 significantly outperformed the best results it achieved at WAT 2019 in the English-to-Myanmar translation task. Going forward, RIT is committed to applying these new research accomplishments to production. For instance, document-level NMT can contribute to subtitle translation in Rakuten Viki and chat translation in Rakuten Link. The paper is the outcome of RIT’s sustained efforts over the past several years to push the frontiers of machine translation, led by the RIT Singapore team and involving co-author Ohnmar Htun.

WAT 2020 was held as an online event, using Zoom for research paper and shared task presentations as well as Q&A sessions, and Rocket.Chat for paper discussions. Notably, several new tasks at WAT 2020 attracted much attention: document-level translation tasks between Japanese and English extended the translation content from the sentence level to the document level; more low-resource language tasks were introduced for Bengali/Hindi/Tamil/Telugu/Marathi/Gujarati/Thai/Malay/Indonesian from and to English; and image-aided translation tasks between Japanese and English were introduced with the new release of a multimodal dataset. Overall, WAT 2020 provided an excellent annual platform for researchers in the machine translation and NLP community to share their research achievements, methodologies, and insights on the study of Asian languages.

RIT would like to thank WAT and all attendees.
