Transformers from Scratch
This project consists of code for a Transformer Block, Single-Head Attention, Multi-Head Attention, and a Causal Mask, all written from scratch.
Model Details
Model Description
To solidify knowledge and serve as a reference, the attention block is based on the paper "Attention Is All You Need".
- Developed by: Michael Peres
- Model type: Vanilla Transformer from Scratch
- Language(s) (NLP): English
- License: MIT
Model Sources
- Paper: [Attention Is All You Need](https://arxiv.org/abs/1706.03762)
Uses
[More Information Needed]
How to Get Started with the Model
Use the code below to get started with the model.
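The repository's actual entry point is not documented in this card, so the snippet below is only a minimal, self-contained sketch of the components described above (single-head attention, multi-head attention, causal mask, transformer block), assuming PyTorch. The class names, constructor arguments, and pre-norm layout are illustrative assumptions, not the project's exact API.

```python
# Minimal sketch of the from-scratch components, assuming PyTorch.
# Names and signatures are illustrative, not the repository's exact API.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SingleHeadAttention(nn.Module):
    """Scaled dot-product self-attention for one head, with an optional causal mask."""

    def __init__(self, embed_dim: int, head_dim: int):
        super().__init__()
        self.q = nn.Linear(embed_dim, head_dim, bias=False)
        self.k = nn.Linear(embed_dim, head_dim, bias=False)
        self.v = nn.Linear(embed_dim, head_dim, bias=False)

    def forward(self, x: torch.Tensor, causal: bool = True) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.transpose(-2, -1) / (k.size(-1) ** 0.5)  # (batch, seq, seq)
        if causal:
            # Causal mask: each position attends only to itself and earlier positions.
            seq_len = x.size(1)
            mask = torch.tril(torch.ones(seq_len, seq_len, device=x.device)).bool()
            scores = scores.masked_fill(~mask, float("-inf"))
        return F.softmax(scores, dim=-1) @ v


class MultiHeadAttention(nn.Module):
    """Several single-head attentions run in parallel, concatenated, then projected."""

    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0
        head_dim = embed_dim // num_heads
        self.heads = nn.ModuleList(
            SingleHeadAttention(embed_dim, head_dim) for _ in range(num_heads)
        )
        self.proj = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.proj(torch.cat([h(x) for h in self.heads], dim=-1))


class TransformerBlock(nn.Module):
    """Pre-norm transformer block: multi-head attention followed by a feed-forward MLP."""

    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        self.ln1 = nn.LayerNorm(embed_dim)
        self.attn = MultiHeadAttention(embed_dim, num_heads)
        self.ln2 = nn.LayerNorm(embed_dim)
        self.mlp = nn.Sequential(
            nn.Linear(embed_dim, 4 * embed_dim),
            nn.GELU(),
            nn.Linear(4 * embed_dim, embed_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x + self.attn(self.ln1(x))  # residual connection around attention
        x = x + self.mlp(self.ln2(x))   # residual connection around the MLP
        return x


# Quick smoke test on random data.
block = TransformerBlock(embed_dim=64, num_heads=4)
out = block(torch.randn(2, 16, 64))  # (batch=2, seq_len=16, embed_dim=64)
print(out.shape)  # torch.Size([2, 16, 64])
```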
Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type: RTX 3070Ti
- Hours used: 0.1
Model Architecture and Objective
The objective of this model was to understand Transformers and the basic self-attention module. The implementation covers Self-Attention, Multi-Head Attention, a Causal Mask, and a Transformer Block.
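For reference, each attention head follows the scaled dot-product formulation from the paper, with the causal mask added to the attention scores before the softmax:

$$
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}} + M\right)V
$$

where $d_k$ is the key dimension and $M$ is the causal mask, with $M_{ij} = 0$ for $j \le i$ and $M_{ij} = -\infty$ for $j > i$, so that position $i$ cannot attend to future positions. Multi-head attention concatenates the outputs of several such heads and applies an output projection.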