
Trenches in English




Jason Phang, Iacer Calixto, Phu Mon Htut, Yada Pruksachatkun, Haokun Liu, Clara Vania, Katharina Kann, and Samuel R. Bowman. 2020. English Intermediate-Task Training Improves Zero-Shot Cross-Lingual Transfer Too. In Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing, pages 557–575, Suzhou, China. Association for Computational Linguistics. Anthology ID: 2020.aacl-main.56.


Abstract: Intermediate-task training (fine-tuning a pretrained model on an intermediate task before fine-tuning again on the target task) often improves model performance substantially on language understanding tasks in monolingual English settings. We investigate whether English intermediate-task training is still helpful on non-English target tasks. Using nine intermediate language-understanding tasks, we evaluate intermediate-task transfer in a zero-shot cross-lingual setting on the XTREME benchmark.
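To make the recipe concrete, here is a minimal sketch of the two-stage procedure using the Hugging Face transformers and datasets libraries. The model size, the task pairing (MNLI as the English intermediate task, PAWS-X as the target task), the checkpoint names, and the hyperparameters are illustrative choices for this sketch, not the authors' exact configuration.

```python
# Illustrative two-stage fine-tuning sketch (not the paper's exact setup).
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

MODEL = "xlm-roberta-base"  # the paper uses XLM-R Large; base keeps this cheap
tok = AutoTokenizer.from_pretrained(MODEL)

# Stage 1: fine-tune on an English intermediate task (MNLI, 3-way entailment).
mnli = load_dataset("glue", "mnli")["train"].map(
    lambda b: tok(b["premise"], b["hypothesis"], truncation=True, max_length=128),
    batched=True)
model = AutoModelForSequenceClassification.from_pretrained(MODEL, num_labels=3)
Trainer(model=model, tokenizer=tok,
        args=TrainingArguments(output_dir="ckpt-mnli", num_train_epochs=1,
                               per_device_train_batch_size=16),
        train_dataset=mnli).train()
model.save_pretrained("ckpt-mnli")
tok.save_pretrained("ckpt-mnli")

# Stage 2: fine-tune the *intermediate* checkpoint on the English target task
# (PAWS-X "en" here). The classification head is re-initialized because the
# label space changes from 3 classes (NLI) to 2 (paraphrase or not).
pawsx_en = load_dataset("paws-x", "en")["train"].map(
    lambda b: tok(b["sentence1"], b["sentence2"], truncation=True, max_length=128),
    batched=True)
target = AutoModelForSequenceClassification.from_pretrained(
    "ckpt-mnli", num_labels=2, ignore_mismatched_sizes=True)
Trainer(model=target, tokenizer=tok,
        args=TrainingArguments(output_dir="ckpt-target", num_train_epochs=3,
                               per_device_train_batch_size=16),
        train_dataset=pawsx_en).train()
target.save_pretrained("ckpt-target")
tok.save_pretrained("ckpt-target")
```

The only change relative to ordinary fine-tuning is that stage 2 starts from the intermediate checkpoint rather than from the pretrained weights; the target-task training data is still English only, so any gains on other languages come from cross-lingual transfer.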


We see large improvements from intermediate training on the BUCC and Tatoeba sentence retrieval tasks and moderate improvements on question-answering target tasks. MNLI, SQuAD and HellaSwag achieve the best overall results as intermediate tasks, while multi-task intermediate training offers small additional improvements. Using our best intermediate-task models for each target task, we obtain a 5.4 point improvement over XLM-R Large on the XTREME benchmark, setting the state of the art as of June 2020. We also investigate continuing multilingual MLM during intermediate-task training and using machine-translated intermediate-task data, but neither consistently outperforms simply performing English intermediate-task training.
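As a companion to the sketch above, here is what the zero-shot evaluation step might look like: the model has only ever seen English labeled data, and it is tested directly on the other PAWS-X languages. The checkpoint path and dataset carry over the previous sketch's assumptions.

```python
# Zero-shot cross-lingual evaluation of the English-only fine-tuned model.
import numpy as np
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer)

ckpt = "ckpt-target"  # English-only fine-tuned checkpoint from the sketch above
tok = AutoTokenizer.from_pretrained(ckpt)
model = AutoModelForSequenceClassification.from_pretrained(ckpt)
trainer = Trainer(model=model, tokenizer=tok)  # used only for batched inference

for lang in ["de", "es", "fr", "ja", "ko", "zh"]:  # non-English PAWS-X test sets
    test = load_dataset("paws-x", lang)["test"].map(
        lambda b: tok(b["sentence1"], b["sentence2"],
                      truncation=True, max_length=128),
        batched=True)
    out = trainer.predict(test)
    acc = (np.argmax(out.predictions, axis=-1) == out.label_ids).mean()
    print(f"{lang}: zero-shot accuracy = {acc:.3f}")
```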





