Mimicking Infants' Bilingual Language Acquisition for Domain Specialized Neural Machine Translation

Research output: Contribution to journal › Journal article › Research › peer-review

Documents

  • Fulltext: Final published version, 1.37 MB, PDF document

Authors

  • Chanjun Park
  • Woo Young Go
  • Sugyeong Eo
  • Hyeonseok Moon
  • Seolhwa Lee
  • Heuiseok Lim

Existing methods for training domain-specialized neural machine translation (DS-NMT) models are based on the pretrain-finetuning approach (PFA). In this study, we reinterpret these methods from the perspective of cognitive science research on cross-language speech perception. We propose the cross communication method (CCM), a new DS-NMT training approach. Inspired by the way infants learn, we perform DS-NMT training by configuring and training DC and GC concurrently in batches. Quantitative and qualitative analyses of our experimental results show that CCM achieves superior performance compared to conventional methods. Additionally, we conducted an experiment considering DS-NMT as a service to meet industrial demands.
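The abstract does not expand DC and GC; reading them as the domain-specific and general-domain corpora, the contrast between the two training regimes can be sketched as below. This is a minimal illustration under that assumption, not the paper's implementation, and every name in it (pfa_training, ccm_training, train_step, model) is hypothetical.

```python
# Sketch of PFA (sequential phases) versus a batch-interleaved reading of CCM.
# Assumption: "DC"/"GC" denote the domain-specific and general-domain corpora;
# `train_step` and `model` are placeholders for one NMT parameter update.

from itertools import cycle

def pfa_training(model, gc_batches, dc_batches, train_step):
    """Pretrain-finetuning approach (PFA): two sequential training phases."""
    for batch in gc_batches:      # phase 1: pretrain on the general corpus
        train_step(model, batch)
    for batch in dc_batches:      # phase 2: finetune on the domain corpus
        train_step(model, batch)

def ccm_training(model, gc_batches, dc_batches, train_step):
    """Cross communication method (CCM), read here as interleaving DC and GC
    batches so the model sees both corpora concurrently in a single run."""
    for gc_batch, dc_batch in zip(cycle(gc_batches), dc_batches):
        train_step(model, gc_batch)   # general-domain update
        train_step(model, dc_batch)   # domain-specific update
```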

Original language: English
Journal: IEEE Access
Volume: 10
Pages (from-to): 38684-38693
ISSN: 2169-3536
DOIs:
Publication status: Published - 2022

Bibliographical note

Publisher Copyright:
© 2013 IEEE.

Research areas

  • Cross communication method, deep learning, domain-specialized neural machine translation, neural machine translation
