Decoder Language Model
A small autoregressive decoder-only Transformer trained on Tiny Shakespeare.
Architecture
- d_model=128, num_layers=2, nhead=4
- ~500k parameters (see the sketch below this list)
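
The repository's model.py is not reproduced on this card, so the following is only a minimal sketch of a decoder-only block matching the listed hyperparameters; the internals (learned positional embeddings, feed-forward width, the max_len context length) are assumptions, not the author's code.

import torch
import torch.nn as nn

class DecoderLanguageModel(nn.Module):
    # Sketch: a decoder-only LM built from TransformerEncoder layers
    # run with a causal mask (equivalent to masked self-attention blocks).
    def __init__(self, vocab_size, d_model=128, nhead=4, num_layers=2, max_len=256):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)   # token embeddings
        self.pos_emb = nn.Embedding(max_len, d_model)      # assumed learned positions
        layer = nn.TransformerEncoderLayer(
            d_model, nhead, dim_feedforward=4 * d_model, batch_first=True
        )
        self.blocks = nn.TransformerEncoder(layer, num_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, idx):
        # idx: (batch, seq_len) token ids
        t = idx.size(1)
        x = self.tok_emb(idx) + self.pos_emb(torch.arange(t, device=idx.device))
        causal = nn.Transformer.generate_square_subsequent_mask(t).to(idx.device)
        x = self.blocks(x, mask=causal)
        return self.lm_head(x)  # (batch, seq_len, vocab_size) logits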
Metrics
- Loss (Train): 0.6342
- Perplexity (Train): 1.8854
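
As a consistency check (not from the card): perplexity is the exponential of the mean cross-entropy loss, and the two reported numbers agree to rounding.

import math
print(math.exp(0.6342))  # ≈ 1.8855, matching the reported perplexity of 1.8854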
Loading
from transformers import GPT2Tokenizer
import torch

from model import DecoderLanguageModel  # model definition shipped with the repo

# The tokenizer is hosted on the Hub; the weights file is expected locally.
tokenizer = GPT2Tokenizer.from_pretrained("ahmadisakina/decoder-language-model")
model = DecoderLanguageModel(vocab_size=tokenizer.vocab_size, d_model=128, nhead=4, num_layers=2)
model.load_state_dict(torch.load("pytorch_model.bin", map_location="cpu"))
model.eval()
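
The card stops at model.eval(). As a usage illustration, a simple temperature-sampling loop might look like the following; it assumes forward() returns per-position logits of shape (batch, seq_len, vocab_size), which the card does not show.

prompt = "ROMEO:"
ids = tokenizer.encode(prompt, return_tensors="pt")
with torch.no_grad():
    for _ in range(50):                               # generate 50 new tokens
        logits = model(ids)[:, -1, :]                 # logits at the last position
        probs = torch.softmax(logits / 0.8, dim=-1)   # temperature 0.8
        next_id = torch.multinomial(probs, num_samples=1)
        ids = torch.cat([ids, next_id], dim=1)        # append sampled token
print(tokenizer.decode(ids[0]))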