---
license: apache-2.0
datasets:
- HuggingFaceTB/smollm-corpus
language:
- en
base_model:
- mittagessen/bytellama_random
---

This is a 101M-parameter [ByteLlama](https://github.com/mittagessen/bytellama) model pretrained for 2 epochs on the Cosmopedia v2 portion of the [SmolLM corpus](https://huggingface.co/datasets/HuggingFaceTB/smollm-corpus).
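
Below is a minimal generation sketch. It assumes the checkpoint can be loaded with Hugging Face `transformers` via `AutoTokenizer` and `AutoModelForCausalLM`; the repository id used here is a placeholder, and a byte-level model may ship a custom tokenizer that requires `trust_remote_code=True`. Check the files in this repository for the actual loading path.

```python
# Sketch only: assumes a transformers-compatible checkpoint.
# "mittagessen/bytellama_cosmopedia" is a placeholder for this repository's id.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "mittagessen/bytellama_cosmopedia"  # placeholder; replace with this repo's id

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```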