---
license: apache-2.0
datasets:
- enjalot/fineweb-edu-sample-10BT-chunked-500-nomic-text-v1.5
language:
- en
---

# Latent SAE

A series of SAEs trained on embeddings from [nomic-embed-text-v1.5](https://huggingface.co/nomic-ai/nomic-embed-text-v1.5).
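
The input embeddings can be produced with, e.g., sentence-transformers (a minimal sketch based on the nomic model card, which uses a task prefix such as `search_document:` on every input):

```python
from sentence_transformers import SentenceTransformer

# nomic-embed-text-v1.5 expects a task prefix on each input string.
model = SentenceTransformer("nomic-ai/nomic-embed-text-v1.5", trust_remote_code=True)
emb = model.encode(["search_document: The quick brown fox."])
print(emb.shape)  # (1, 768)
```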

The SAEs were trained on the 100BT sample of FineWeb-Edu; see the [10BT sample of FineWeb-Edu](https://huggingface.co/datasets/enjalot/fineweb-edu-sample-10BT-chunked-500) for an example of the chunked input data.
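
For example, the pre-embedded chunks can be streamed straight from the Hub (a minimal sketch; the `embedding` column name is an assumption about the dataset schema):

```python
from datasets import load_dataset

# Stream the pre-embedded 10BT sample without downloading it in full.
# NOTE: the "embedding" column name is an assumption about the schema.
ds = load_dataset(
    "enjalot/fineweb-edu-sample-10BT-chunked-500-nomic-text-v1.5",
    split="train",
    streaming=True,
)
row = next(iter(ds))
print(len(row["embedding"]))  # nomic-embed-text-v1.5 embeddings are 768-d
```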

Run the models or train your own with [Latent SAE](https://github.com/enjalot/latent-sae), which borrows heavily from https://github.com/EleutherAI/sae.
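
To give a sense of what the models do at inference time, here is a minimal sketch of a top-k SAE encoding pass under the hyperparameters used below. The weight names are hypothetical stand-ins, not the checkpoint layout; use the latent-sae repo for actual loading:

```python
import torch

# Hypothetical stand-in weights; real checkpoints are loaded via latent-sae.
d_in = 768                 # nomic-embed-text-v1.5 embedding size
expansion, k = 32, 64      # matches the training command below
d_sae = d_in * expansion   # 24576 latents

W_enc = torch.randn(d_in, d_sae) / d_in**0.5
b_enc = torch.zeros(d_sae)

def encode_topk(x: torch.Tensor) -> torch.Tensor:
    """Encode embeddings into a sparse latent vector with at most k nonzeros."""
    pre = x @ W_enc + b_enc            # (batch, d_sae) pre-activations
    vals, idx = pre.topk(k, dim=-1)    # keep the k largest pre-activations
    z = torch.zeros_like(pre)
    z.scatter_(-1, idx, torch.relu(vals))
    return z

emb = torch.randn(2, d_in)             # stand-in for two nomic embeddings
z = encode_topk(emb)
print(z.shape, (z != 0).sum(dim=-1))   # [2, 24576], at most 64 active each
```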

# Training

The models were trained using Modal Labs infrastructure with the command:

```bash
modal run train_modal.py --batch-size 512 --grad-acc-steps 4 --k 64 --expansion-factor 32
```
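
With `--batch-size 512` and `--grad-acc-steps 4`, each optimizer step sees an effective batch of 512 × 4 = 2048 embeddings; `--expansion-factor 32` sizes the dictionary at 32 × 768 = 24,576 latents over the 768-d nomic embeddings, and `--k 64` keeps the top 64 latents active per sample.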

Error and dead-latent charts can be seen here:

