GPT2 in Swahili

This model was trained using Hugging Face's Flax framework and is part of the JAX/Flax Community Week organized by Hugging Face. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.

How to use

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("flax-community/gpt2-swahili")
model = AutoModelForCausalLM.from_pretrained("flax-community/gpt2-swahili")

print(round(model.num_parameters() / 1_000_000), "Million Parameters")
```

124 Million Parameters
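The 124M figure matches the standard GPT-2 small configuration (12 layers, 768-dimensional embeddings, a 50,257-token vocabulary, and a 1,024-position context). Assuming this checkpoint uses those default shapes, the count can be derived by hand as a sanity check (the variable names below are illustrative, not part of the model's API):

```python
# GPT-2 small shapes (an assumption: the card's ~124M matches these defaults).
n_layer, n_embd, n_vocab, n_ctx = 12, 768, 50257, 1024

embeddings = n_vocab * n_embd + n_ctx * n_embd  # token + position embeddings
per_layer = (
    2 * n_embd                           # ln_1 (weight + bias)
    + n_embd * 3 * n_embd + 3 * n_embd   # attention QKV projection
    + n_embd * n_embd + n_embd           # attention output projection
    + 2 * n_embd                         # ln_2
    + n_embd * 4 * n_embd + 4 * n_embd   # MLP up-projection
    + 4 * n_embd * n_embd + n_embd       # MLP down-projection
)
final_ln = 2 * n_embd                    # final layer norm
total = embeddings + n_layer * per_layer + final_ln

print(total)                             # 124439808
print(round(total / 1_000_000), "Million Parameters")  # 124 Million Parameters
```

Note that the output head shares its weights with the token embedding matrix, so it adds no extra parameters.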

Training Data:

This model was trained on the Swahili Safi dataset.

More Details:

For more details and a demo, please check the HF Swahili Space.
