
Improving the Performance of Autoregressive NLP Tasks via Modular Linearized Attention (English Edition)

Uploader: wx****46
2023-04-21
4 MB, 36 pages
Artificial Intelligence (AI)
File list:
通过模块化线性化注意力提高自回归 NLP 任务的性能【英文版】.pdf
English title: Improving Autoregressive NLP Tasks via Modular Linearized Attention

Chinese abstract (translated): This paper proposes an approach to natural language processing based on modular linearized attention (MLA), which combines several efficient attention mechanisms, and shows that it significantly improves inference quality and efficiency on autoregressive tasks.

English abstract: Various natural language processing (NLP) tasks necessitate models that are efficient and small based on their ultimate application at the edge or in other resource-constrained environments. While prior research has reduced the size of these models, increasing computational efficiency without considerable performance impacts remains difficult, especia
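The abstract refers to linearized attention applied to autoregressive tasks. As a rough illustration only (this is not the paper's implementation, and the feature map is an assumption), the core idea is to replace softmax attention with a kernel feature map φ so that a causal pass runs in O(T·d²) via running sums instead of O(T²·d):

```python
import numpy as np

def elu_plus_one(x):
    # A common positive feature map for linearized attention: phi(x) = elu(x) + 1.
    return np.where(x > 0, x + 1.0, np.exp(x))

def causal_linear_attention(Q, K, V, eps=1e-6):
    """Causal (autoregressive) linearized attention.

    out_t = phi(q_t) @ S_t / (phi(q_t) @ z_t), where
      S_t = sum_{i<=t} outer(phi(k_i), v_i)   (running d x d_v state)
      z_t = sum_{i<=t} phi(k_i)               (running normalizer)
    Each step only updates the running sums, so no T x T attention
    matrix is ever materialized.
    """
    Qf, Kf = elu_plus_one(Q), elu_plus_one(K)
    T, d = Q.shape
    S = np.zeros((d, V.shape[1]))
    z = np.zeros(d)
    out = np.zeros_like(V, dtype=float)
    for t in range(T):
        S += np.outer(Kf[t], V[t])
        z += Kf[t]
        out[t] = (Qf[t] @ S) / (Qf[t] @ z + eps)
    return out
```

Because the state (S, z) summarizes the entire prefix, generation reuses it across steps instead of re-attending to all previous tokens, which is the efficiency argument the abstract makes for resource-constrained deployment.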


