attention
Self-Attention Algorithm
Helper functions and demonstration vignettes of increasing depth on how to construct the Self-Attention algorithm. This is based on Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>, Dan Jurafsky and James H. Martin (2022, ISBN:978-0131873216) "Speech and Language Processing (3rd ed.)" <https://web.stanford.edu/~jurafsky/slp3/>, and Alex Graves (2020) "Attention and Memory in Deep Learning" <https://www.youtube.com/watch?v=AIiwuClvH6k>.
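For orientation, below is a minimal sketch in base R of the scaled dot-product self-attention described in Vaswani et al. (2017). The function names (`softmax_rows`, `self_attention`) are illustrative only and are not the package's exported API; consult the package vignettes for the actual helper functions.

```r
# A minimal sketch of scaled dot-product self-attention
# (Vaswani et al., 2017), written in base R for illustration.
# Function names here are hypothetical, not the package's API.

# Row-wise softmax: exponentiate and normalise each row.
softmax_rows <- function(x) {
  e <- exp(x - apply(x, 1, max))  # subtract row max for numerical stability
  e / rowSums(e)
}

# Q, K, V: n x d matrices of queries, keys, and values.
self_attention <- function(Q, K, V) {
  d_k <- ncol(K)
  scores <- (Q %*% t(K)) / sqrt(d_k)  # scaled dot products
  weights <- softmax_rows(scores)     # attention weights per query
  weights %*% V                       # weighted sum of value vectors
}

# Example: self-attention over 3 tokens with 4-dimensional embeddings,
# using the same matrix as queries, keys, and values.
set.seed(1)
X <- matrix(rnorm(12), nrow = 3, ncol = 4)
self_attention(X, X, X)
```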
Versions across snapshots
| Version | Repository | File | Size |
|---|---|---|---|
| 0.4.0 | 2026-04-09 windows/windows R-4.5 | attention_0.4.0.zip | 34.6 KiB |