Related links for learning about attention mechanisms
1. Lectures on RNNs + Attention + self-Attention + Transformer (Wang Shusen):
Transformer: https://www.bilibili.com/video/BV17y4y1m737?from=search&seid=6849796848863853866
Full lecture: https://youtu.be/XhWdv7ghmQQ
Slides: https://github.com/wangshusen/DeepLearning
2. Natural Language Processing (NLP), part 16: Illustrated self-attention. I found this article's explanation of self-attention clear and easy to follow.
https://wenjie.blog.youkuaiyun.com/article/details/106523650?utm_medium=distribute.pc_relevant.none-task-blog-BlogCommendFromBaidu-5.control&depth_1-utm_source=distribute.pc_relevant.none-task-blog-BlogCommendFromBaidu-5.control
3. Detailed explanation of the Transformer model
https://blog.youkuaiyun.com/u012526436/article/details/86295971
4. Highly detailed self-attention knowledge points, with source code analysis
https://blog.youkuaiyun.com/songyunli1111/article/details/109255242?utm_medium=distribute.pc_relevant.none-task-blog-title-7&spm=1001.2101.3001.4242
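
As a quick reference alongside these links, below is a minimal sketch of scaled dot-product self-attention, the core operation the articles above illustrate. It is written with NumPy under assumed toy dimensions (4 tokens, 8-dimensional embeddings, 4-dimensional projections); the matrices and sizes are illustrative and not taken from any of the linked posts.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over one sequence.

    X:          (seq_len, d_model) input embeddings
    Wq, Wk, Wv: (d_model, d_k) projection matrices
    Returns the (seq_len, d_k) context vectors and the attention weights.
    """
    Q = X @ Wq                            # queries
    K = X @ Kw if False else X @ Wk       # keys
    V = X @ Wv                            # values
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)       # (seq_len, seq_len) similarity scores
    weights = softmax(scores, axis=-1)    # each row sums to 1
    return weights @ V, weights

# Toy example: 4 tokens, 8-dim embeddings, 4-dim query/key/value projections.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
context, attn = self_attention(X, Wq, Wk, Wv)
print(context.shape, attn.shape)          # (4, 4) (4, 4)
```

Each row of `attn` shows how much one token attends to every other token, which is exactly what the "illustrated self-attention" article visualizes.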