2023-03-18 [Deep Learning] Memory Compressed Transformer

Index
- Memory Compressed Transformer
- References

Memory Compressed Transformer

A method for document summarization tasks.

Text summarization: yhayato1320.hatenablog.com

It introduces Memory Compressed Attention, which shortens the key/value memory so attention scales to long input documents.

References
- Generating Wikipedia by Summarizing Long Sequences [2018]
  - 2 RELATED WORK
    - 2.3 TRANSFORMER MODELS
  - 3 ENGLISH WIKIPEDIA AS A MULTI-DOCUMENT SUMMARIZATION DATASET
  - 4 METHODS AND MODELS
    - 4.2 ABSTRACTIVE STAGE
      - 4.2.4 TRANSFORMER DECODER WITH MEMORY-COMPRESSED ATTENTION (T-DMCA)
        - Local attention
        - Memory-compressed attention
  - arxiv.org
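To make the idea concrete, here is a minimal NumPy sketch of memory-compressed attention in the spirit of the T-DMCA paper: keys and values are compressed along the sequence axis with a strided convolution (stride 3, as in the paper) before standard scaled dot-product attention. The weight shapes, the single-head setup, and the absence of padding/masking are simplifying assumptions of this sketch, not the paper's exact implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def compress(x, w, stride=3):
    # Strided 1-D convolution along the sequence axis:
    # (n, d) -> (n // stride, d). w has shape (stride, d, d).
    n, d = x.shape
    out = []
    for start in range(0, n - stride + 1, stride):
        window = x[start:start + stride]              # (stride, d)
        out.append(np.einsum('sd,sde->e', window, w))  # mix the window into one slot
    return np.stack(out)

def memory_compressed_attention(q, k, v, wk, wv, stride=3):
    # Compress keys and values, then attend as usual:
    # softmax(Q K_c^T / sqrt(d)) V_c over the reduced memory.
    kc = compress(k, wk, stride)                      # (n // stride, d)
    vc = compress(v, wv, stride)
    d = q.shape[-1]
    scores = q @ kc.T / np.sqrt(d)                    # (n_q, n // stride)
    return softmax(scores) @ vc

rng = np.random.default_rng(0)
n, d, stride = 12, 8, 3
q = rng.standard_normal((n, d))
k = rng.standard_normal((n, d))
v = rng.standard_normal((n, d))
wk = rng.standard_normal((stride, d, d)) * 0.1        # hypothetical conv weights
wv = rng.standard_normal((stride, d, d)) * 0.1

out = memory_compressed_attention(q, k, v, wk, wv, stride)
print(out.shape)  # query length is preserved; memory shrinks from 12 to 4 slots
```

Note the cost of the score matrix drops from O(n^2) to O(n * n/stride), which is what makes attention over long documents feasible.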