File list:
Romanization-based Adaptation of Multilingual Large-scale Language Models [English version].pdf
Resource summary
English title: Romanization-based Large-scale Adaptation of Multilingual Language Models

Chinese abstract (translated): This paper investigates using large-scale transliteration to substantially improve the performance of multilingual pretrained language models on low-resource languages, and finds that UROMAN-based transliteration delivers strong performance across many languages, particularly for unseen scripts and in settings with limited data.

English abstract: Large multilingual pretrained language models (mPLMs) have become the de facto state of the art for cross-lingual transfer in NLP. However, their large-scale deployment to many languages, besides pretraining data scarcity, is also hindered by the increase in vocabulary size and limitations in their parameter budget.
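To make the romanization step described in the abstracts concrete, here is a minimal sketch. The paper itself uses UROMAN (https://github.com/isi-nlp/uroman); the `unidecode` package is swapped in here purely as a simple, self-contained stand-in for a Latin-script transliterator, so the exact output below is only approximate, not what UROMAN would produce.

```python
# Minimal sketch of the romanization idea: map text in non-Latin scripts
# onto Latin characters so that an mPLM's existing, largely Latin-script
# subword vocabulary can tokenize it without any vocabulary extension.
# NOTE: `unidecode` (pip install unidecode) stands in for UROMAN here.
from unidecode import unidecode

samples = {
    "el": "Νεπάλ",        # Greek
    "ru": "Привет, мир",  # Cyrillic
    "hi": "नमस्ते",         # Devanagari, a script many mPLM vocabularies cover poorly
}

for lang, text in samples.items():
    romanized = unidecode(text)
    print(f"{lang}: {text!r} -> {romanized!r}")
```

The design point this illustrates is the one the English abstract raises: because every language collapses into a single Latin-script representation, adapting the model to a new language does not require growing the vocabulary or the parameter budget, even when the script was never seen in pretraining.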