
Getting Started with Attention

1. Understanding Self-Attention, Step by Step
https://www.cnblogs.com/jclian91/p/12846772.html
This article is a Chinese translation of the English original:
Illustrated Self-Attention, Step-by-step guide to self-attention with illustrations and code
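The step-by-step computation that guide walks through (queries, keys, values, scaled scores, softmax, weighted sum) can be sketched in a few lines. Below is a minimal single-head self-attention in NumPy; the weight matrices, shapes, and the random toy input are illustrative, not taken from the article:

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Minimal single-head scaled dot-product self-attention (illustrative)."""
    q = x @ w_q                                  # queries, one row per token
    k = x @ w_k                                  # keys
    v = x @ w_v                                  # values
    scores = q @ k.T / np.sqrt(k.shape[-1])      # scaled dot-product scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v                           # attention-weighted sum of values

# Toy example: 3 tokens, embedding dim 4, head dim 3 (all sizes arbitrary)
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
w_q, w_k, w_v = (rng.normal(size=(4, 3)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (3, 3): one output row per token
```

Each output row is a mixture of the value vectors of all tokens, weighted by how strongly that token's query matches the other tokens' keys.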
2. The Illustrated Transformer [translated]
https://blog.csdn.net/yujianmin1990/article/details/85221271
This article is likewise a Chinese translation of the English original:
https://jalammar.github.io/illustrated-transformer/
3. Harvard's NLP group created a guide annotating the paper with a PyTorch implementation.
http://nlp.seas.harvard.edu/2018/04/03/attention.html
Jiqizhixin's (Synced) Chinese translation of this article:
https://www.jiqizhixin.com/articles/2018-11-06-10
A detailed walkthrough of the MultiHeadAttention implementation in it:
https://finisky.github.io/2020/05/25/multiheadattention/
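The multi-head mechanism those posts dissect boils down to: project the input, split the model dimension into several heads, run scaled dot-product attention in each head in parallel, then concatenate and project back. A minimal NumPy sketch of that idea (the linked posts give the real PyTorch versions; every name and shape here is illustrative):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads, w_q, w_k, w_v, w_o):
    """Sketch of multi-head self-attention: split d_model into heads,
    attend per head, concatenate, and apply the output projection."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    # Project, then reshape to (heads, seq_len, d_head) so heads run in parallel
    q = (x @ w_q).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    k = (x @ w_k).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    v = (x @ w_v).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)  # (heads, seq, seq)
    ctx = softmax(scores) @ v                            # per-head context
    ctx = ctx.transpose(1, 0, 2).reshape(seq_len, d_model)  # concat heads
    return ctx @ w_o                                     # output projection

# Toy sizes: 5 tokens, d_model=8 split into 2 heads of 4 dims each
rng = np.random.default_rng(1)
d_model, heads, seq = 8, 2, 5
x = rng.normal(size=(seq, d_model))
w_q, w_k, w_v, w_o = (rng.normal(size=(d_model, d_model)) for _ in range(4))
y = multi_head_attention(x, heads, w_q, w_k, w_v, w_o)
print(y.shape)  # (5, 8)
```

The reshape/transpose pair is the whole trick: one big matrix multiply does all heads' projections at once, and the per-head split only happens when computing the attention scores.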
4. Attention Models in Deep Learning (2017 edition), by Zhang Junlin
https://zhuanlan.zhihu.com/p/37601161
