tim-lawson committed · verified
Commit 2d4eb0c · Parent(s): aecd78c

Push model using huggingface_hub.

Files changed (3)
  1. README.md +41 -3
  2. config.json +1 -1
  3. model.safetensors +1 -1
README.md CHANGED
@@ -1,9 +1,47 @@
 ---
+language: en
+library_name: mlsae
+license: mit
 tags:
+- arxiv:2409.04185
 - model_hub_mixin
 - pytorch_model_hub_mixin
 ---
 
-This model has been pushed to the Hub using the [PytorchModelHubMixin](https://huggingface.co/docs/huggingface_hub/package_reference/mixins#huggingface_hub.PyTorchModelHubMixin) integration:
-- Library: [More Information Needed]
-- Docs: [More Information Needed]
+# Model Card for tim-lawson/sae-pythia-160m-deduped-x64-k32-layers-11
+
+A Multi-Layer Sparse Autoencoder (MLSAE) trained on the residual stream activation
+vectors from [EleutherAI/pythia-160m-deduped](https://huggingface.co/EleutherAI/pythia-160m-deduped) with an
+expansion factor of R = 64 and sparsity k = 32, over 1 billion
+tokens from [monology/pile-uncopyrighted](https://huggingface.co/datasets/monology/pile-uncopyrighted).
+
+
+This model is a PyTorch TopKSAE module, which does not include the underlying
+transformer.
+
+
+### Model Sources
+
+- **Repository:** <https://github.com/tim-lawson/mlsae>
+- **Paper:** <https://arxiv.org/abs/2409.04185>
+- **Weights & Biases:** <https://wandb.ai/timlawson-/mlsae>
+
+## Citation
+
+**BibTeX:**
+
+```bibtex
+@misc{lawson_residual_2024,
+  title = {Residual {{Stream Analysis}} with {{Multi-Layer SAEs}}},
+  author = {Lawson, Tim and Farnik, Lucy and Houghton, Conor and Aitchison, Laurence},
+  year = {2024},
+  month = oct,
+  number = {arXiv:2409.04185},
+  eprint = {2409.04185},
+  primaryclass = {cs},
+  publisher = {arXiv},
+  doi = {10.48550/arXiv.2409.04185},
+  urldate = {2024-10-08},
+  archiveprefix = {arXiv}
+}
+```
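A quick consistency check between the model card and this commit's `config.json`: the "x64" and "k32" in the model name should correspond to `n_latents = n_inputs * R` and the `k` value. A minimal sketch in plain Python, using only numbers that appear in this commit:

```python
# SAE geometry from the model card and config.json in this commit.
n_inputs = 768   # residual-stream width of pythia-160m
R = 64           # expansion factor ("x64" in the model name)
k = 32           # active latents per token ("k32" in the model name)

n_latents = n_inputs * R
print(n_latents)  # 49152, the n_latents value in config.json
```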
config.json CHANGED
@@ -5,5 +5,5 @@
   "k": 32,
   "n_inputs": 768,
   "n_latents": 49152,
-  "standardize": true
+  "standardize": false
 }
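The config describes a top-k SAE with 768 inputs, 49,152 latents, and k = 32 (this commit also flips `standardize` to `false`). For illustration only, here is a generic top-k sparse-autoencoder forward pass sketched in NumPy — this is not the `mlsae` implementation, and the dimensions are scaled down to keep the example light:

```python
import numpy as np

# Generic top-k SAE sketch (NOT the mlsae implementation). Dimensions are
# scaled down from the config's n_inputs=768, n_latents=49152, k=32.
rng = np.random.default_rng(0)
n_inputs, n_latents, k = 16, 64, 4

W_enc = rng.standard_normal((n_inputs, n_latents)) * 0.1
b_enc = np.zeros(n_latents)
W_dec = rng.standard_normal((n_latents, n_inputs)) * 0.1
b_dec = np.zeros(n_inputs)

def topk_sae_forward(x):
    """Encode, keep only the k largest pre-activations, decode."""
    pre = x @ W_enc + b_enc
    # TopK activation: zero out everything except the k largest latents.
    idx = np.argpartition(pre, -k)[-k:]
    latents = np.zeros_like(pre)
    latents[idx] = np.maximum(pre[idx], 0.0)  # ReLU on the survivors
    return latents @ W_dec + b_dec, latents

x = rng.standard_normal(n_inputs)
recon, latents = topk_sae_forward(x)
print(np.count_nonzero(latents) <= k)  # True: at most k latents are active
```

The `standardize` flag presumably controls whether inputs are normalized before encoding; the sketch above omits it.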
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:47837fa0b51039b32e6da02f5b839d6a92c0f0e750f16cbfb27ba8271ad6b096
+oid sha256:044c358d9def910f9138a1d8e880211eb8b06d493e24e456d93fdd0461d21d2a
 size 301993232
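The weights file itself lives in Git LFS; the pointer above records only the blob's SHA-256 oid and byte size. A generic integrity check of a downloaded file against such a pointer might look like this (standard-library sketch, not Hugging Face or Git LFS tooling; demonstrated on a throwaway file):

```python
import hashlib
import os
import tempfile

def verify_lfs_pointer(path, expected_oid, expected_size):
    """Check a local file against a Git LFS pointer's sha256 oid and size."""
    h = hashlib.sha256()
    size = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
            size += len(chunk)
    return h.hexdigest() == expected_oid and size == expected_size

# Demonstrate on a throwaway file; a real check would pass the downloaded
# model.safetensors with the oid and size from the pointer above.
payload = b"example bytes"
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(payload)
ok = verify_lfs_pointer(tmp.name, hashlib.sha256(payload).hexdigest(), len(payload))
os.unlink(tmp.name)
print(ok)  # True
```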