Title: Improving Autoregressive NLP Tasks via Modular Linearized Attention

Abstract (translated from Chinese): This paper proposes a natural language processing technique based on modular linearized attention (MLA), which combines several efficient attention mechanisms and is shown to significantly improve both inference quality and efficiency on autoregressive tasks.

Abstract (English, truncated in the source): Various natural language processing (NLP) tasks necessitate models that are efficient and small based on their ultimate application at the edge or in other resource-constrained environments. While prior research has reduced the size of these models, increasing computational efficiency without considerable performance impacts remains difficult, especia…
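The core idea behind linearized attention, which MLA builds on, can be sketched briefly. Softmax attention costs O(N^2) in sequence length, because every query attends over all keys. Kernelized (linear) attention replaces softmax(QK^T)V with phi(Q)(phi(K)^T V) for a positive feature map phi; for autoregressive (causal) decoding, the products phi(K)^T V can be maintained as running sums, giving O(N) time per sequence. The sketch below is a minimal pure-Python illustration of this generic technique, not the paper's specific MLA method; the feature map `phi` (an ELU+1-style map) is an assumption chosen for illustration.

```python
import math

def phi(v):
    # Positive feature map (ELU+1 style); a common choice in linear-attention
    # work, assumed here for illustration, not necessarily the paper's choice.
    return [math.exp(x) if x < 0 else x + 1.0 for x in v]

def linearized_causal_attention(Q, K, V):
    """Causal attention via kernel feature maps and running sums.

    Replaces softmax(Q K^T) V, which costs O(N^2) in sequence length N, with
    phi(Q) (phi(K)^T V), maintained incrementally in O(N * d * d_v).
    Q, K: lists of length-d vectors; V: list of length-d_v vectors.
    """
    d, dv = len(Q[0]), len(V[0])
    S = [[0.0] * dv for _ in range(d)]   # running sum of outer(phi(k_t), v_t)
    z = [0.0] * d                        # running sum of phi(k_t) (normalizer)
    out = []
    for q, k, v in zip(Q, K, V):
        qf, kf = phi(q), phi(k)
        for i in range(d):               # update causal state with step t
            z[i] += kf[i]
            for j in range(dv):
                S[i][j] += kf[i] * v[j]
        denom = sum(qf[i] * z[i] for i in range(d))
        out.append([sum(qf[i] * S[i][j] for i in range(d)) / denom
                    for j in range(dv)])
    return out
```

Because the state (`S`, `z`) summarizes the whole prefix, each decoding step costs the same regardless of how long the sequence already is, which is what makes this attractive for autoregressive inference on constrained hardware.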