- The Annotated Transformer by Harvard NLP, and the "Attention Is All You Need" paper.
- Peter Bloem has a nice from-scratch implementation of the transformer in PyTorch.
- Lilian Weng has a nice blog with a few posts on attention and transformers.
- Yannic Kilcher has lots of videos on deep learning papers, including a playlist for NLP.
- Jay Alammar's blog has several posts on attention and transformers.

#Torch permute series