
HarryPotterGPT

A GPT model trained on Harry Potter books, created by Camilo Vega, AI Consultant and Professor. The model generates text in the style of the Harry Potter saga.

Model Information

  • Architecture: GPT (decoder-only Transformer)
  • Training: Trained from scratch on the Harry Potter books
  • Tokenizer: SentencePiece (unigram model)
  • Parameters: Approx. 124M (12 layers, 768-dimensional embeddings, 12 attention heads; see the sketch below)
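
These hyperparameters match a GPT-2-small-sized model. A minimal sketch of how such a model could be instantiated, assuming a GPT-2-style configuration with the default GPT-2 vocabulary size and context length (neither is confirmed by this card):

from transformers import GPT2Config, GPT2LMHeadModel

# Hypothetical reconstruction of the architecture described above:
# 12 layers, 768-dimensional embeddings, 12 attention heads.
# Vocabulary size and context length are assumptions; with GPT-2
# defaults this lands at roughly 124M parameters.
config = GPT2Config(n_layer=12, n_embd=768, n_head=12)
model = GPT2LMHeadModel(config)
print(f"{model.num_parameters():,} parameters")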

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load model and tokenizer
tokenizer = AutoTokenizer.from_pretrained("CamiloVega/HarryPotterGPT-v2")
model = AutoModelForCausalLM.from_pretrained("CamiloVega/HarryPotterGPT-v2")

# Generate text; sampling must be enabled for temperature/top_k to take effect
prompt = "Harry looked at Hermione and"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_length=100, do_sample=True, temperature=0.7, top_k=50)
generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(generated_text)
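
If the repository works with the standard text-generation pipeline (an assumption, not something this card states), the same generation can be written more compactly:

from transformers import pipeline

# One-liner equivalent; model and tokenizer are loaded from the same repo
generator = pipeline("text-generation", model="CamiloVega/HarryPotterGPT-v2")
result = generator("Harry looked at Hermione and",
                   max_length=100, do_sample=True, temperature=0.7, top_k=50)
print(result[0]["generated_text"])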

Examples

  • "Harry looked at Hermione and saw that she was already searching through her books."
  • "The castle of Hogwarts was illuminated by the moonlight, its towers reaching into the night sky."
  • "Ron took out his wand and pointed it at the creature, his hand trembling slightly."
  • "Dumbledore's eyes twinkled as he gazed at Harry over his half-moon spectacles."

Limitations

This model was trained exclusively on Harry Potter books, so its knowledge is limited to that context. It works best with prompts related to the Harry Potter universe.

Original Project

This model is part of an educational project on building language models from scratch. More details are available at https://github.com/CamiloVga/HarryPotterGPT
