Bochkov committed on
Commit 9b1f048 · verified · 1 Parent(s): 6aab3eb

Update README.md

Files changed (1):
  1. README.md +10 -0
README.md CHANGED
@@ -63,6 +63,16 @@ If you use this model or the underlying concepts in your research, please cite o
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2507.04886},
  }
+
+ @misc{bochkov2025growingtransformersmodularcomposition,
+ title={Growing Transformers: Modular Composition and Layer-wise Expansion on a Frozen Substrate},
+ author={A. Bochkov},
+ year={2025},
+ eprint={2507.07129},
+ archivePrefix={arXiv},
+ primaryClass={cs.LG},
+ url={https://arxiv.org/abs/2507.07129},
+ }
  ```

  This work demonstrates that transformer blocks, not token embeddings, carry the semantic burden in LLMs — a step toward modular, fusable, multilingual LMs.