---
license: mit
datasets:
  - EleutherAI/pile
language:
  - en
---

These SAEs were trained on the output of each MLP in EleutherAI/pythia-70m, using 8.2 billion tokens from the Pile training set at a context length of 2049. Each SAE has 32,768 latents.
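
For readers unfamiliar with the setup, here is a minimal conceptual sketch of what each SAE computes: an MLP output vector is encoded into a 32,768-dimensional sparse latent vector and then decoded back into a reconstruction of that vector. The architecture shown (ReLU encoder with a decoder-bias pre-subtraction) and the parameter names are illustrative assumptions, not the exact layout of the released checkpoints; Pythia-70m's hidden size of 512 is assumed for the input dimension.

```python
# Illustrative sketch only -- parameter names (W_enc, W_dec, b_enc, b_dec)
# and the exact architecture are assumptions, not the released checkpoint format.
import torch
import torch.nn as nn

D_MODEL = 512       # pythia-70m hidden size
N_LATENTS = 32_768  # number of SAE latents

class SparseAutoencoder(nn.Module):
    def __init__(self, d_model: int = D_MODEL, n_latents: int = N_LATENTS):
        super().__init__()
        self.W_enc = nn.Parameter(torch.randn(d_model, n_latents) * 0.01)
        self.W_dec = nn.Parameter(torch.randn(n_latents, d_model) * 0.01)
        self.b_enc = nn.Parameter(torch.zeros(n_latents))
        self.b_dec = nn.Parameter(torch.zeros(d_model))

    def forward(self, mlp_out: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
        # Encode the MLP output into a (mostly sparse) latent vector...
        latents = torch.relu((mlp_out - self.b_dec) @ self.W_enc + self.b_enc)
        # ...then reconstruct the original activation from the latents.
        recon = latents @ self.W_dec + self.b_dec
        return latents, recon

sae = SparseAutoencoder()
mlp_out = torch.randn(1, D_MODEL)  # stand-in for one MLP activation vector
latents, recon = sae(mlp_out)
print(latents.shape, recon.shape)  # torch.Size([1, 32768]) torch.Size([1, 512])
```

In practice you would load the trained weights from this repository into whatever module layout the checkpoints use, hook the corresponding MLP in EleutherAI/pythia-70m, and pass its output through the encoder to obtain the latent activations.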