On the Cross-lingual Transferability of Monolingual Representations
Mikel Artetxe, Sebastian Ruder, and Dani Yogatama

Abstract: State-of-the-art unsupervised multilingual models (e.g., multilingual BERT) have been shown to generalize in a zero-shot cross-lingual setting. This generalization ability has been attributed to the use of a shared subword vocabulary and joint training across multiple languages giving rise to deep multilingual abstractions. We evaluate this hypothesis by designing an alternative approach that transfers a monolingual model to new languages at the lexical level. More concretely, we first train a transformer-based masked language model on one language, and transfer it to a new language by learning a new embedding matrix with the same masked language modeling objective, freezing parameters of all other layers. This approach does not rely on a shared vocabulary or joint training. However, we show that it is competitive with multilingual BERT on standard cross-lingual classification benchmarks and on a new Cross-lingual Question Answering Dataset (XQuAD). Our results contradict common beliefs of the basis of the generalization ability of multilingual models and suggest that deep monolingual models learn some abstractions that generalize across languages. We also release XQuAD as a more comprehensive cross-lingual benchmark, which comprises 240 paragraphs and 1190 question-answer pairs from SQuAD v1.1 translated into ten languages by professional translators.

Anthology ID: 2020.acl-main.421
Volume: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month: July
Year: 2020
Address: Online
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 4623–4637
DOI: 10.18653/v1/2020.acl-main.421
Bibkey: artetxe-etal-2020-cross

Cite (ACL): Mikel Artetxe, Sebastian Ruder, and Dani Yogatama. 2020. On the Cross-lingual Transferability of Monolingual Representations. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 4623–4637, Online. Association for Computational Linguistics.
Cite (Informal): On the Cross-lingual Transferability of Monolingual Representations (Artetxe et al., ACL 2020)
Code: deepmind/xquad (additional community code available)
Data: XQuAD, MLDoc, PAWS-X, SQuAD, WiC
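The transfer recipe summarized in the abstract (freeze the pretrained transformer body and learn only a new embedding matrix for the target language with the masked language modeling objective) can be sketched roughly as follows. This is a minimal, hypothetical illustration using the Hugging Face Transformers API; the model name, tokenizer path, and training details are placeholders and are not the authors' released code.

```python
# Sketch of lexical-level transfer, assuming a pretrained monolingual masked LM
# and a tokenizer trained on the target language (both placeholders).
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model = AutoModelForMaskedLM.from_pretrained("bert-base-cased")               # monolingual source LM
target_tokenizer = AutoTokenizer.from_pretrained("path/to/target-tokenizer")  # placeholder path

# Freeze every parameter of the pretrained model first.
for param in model.parameters():
    param.requires_grad = False

# Swap in a fresh (randomly re-initialized) embedding matrix sized for the
# target-language vocabulary; the transformer body stays frozen. The newly
# created embedding (and the resized MLM-head bias, which is tied to it in
# BERT-style models) is the only part left trainable.
model.resize_token_embeddings(len(target_tokenizer))
embeddings = model.get_input_embeddings()
embeddings.weight.data.normal_(mean=0.0, std=0.02)
embeddings.weight.requires_grad = True

trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.AdamW(trainable, lr=1e-4)

# One illustrative MLM step on placeholder target-language text; in practice
# tokens are masked at random over a large target-language corpus.
batch = target_tokenizer("texto de ejemplo en la lengua objetivo", return_tensors="pt")
labels = torch.full_like(batch["input_ids"], -100)      # -100 = ignore in the loss
labels[0, 1] = batch["input_ids"][0, 1]                  # predict only the masked token
batch["input_ids"][0, 1] = target_tokenizer.mask_token_id

outputs = model(**batch, labels=labels)
outputs.loss.backward()                                   # gradients flow only to the new embeddings
optimizer.step()
optimizer.zero_grad()
```

Details such as whether positional and segment embeddings are also re-learned, the masking schedule, and the optimizer settings are not specified by the abstract and are treated as assumptions here.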