---
license: apache-2.0
datasets:
- cerebras/SlimPajama-627B
language:
- en
---
This is the model from the paper [MoM: Linear Sequence Modeling with Mixture-of-Memories](https://arxiv.org/abs/2502.13685).
The model was trained on a 15B-token sample of SlimPajama, using Gated DeltaNet as the memory update mechanism.
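A minimal usage sketch, assuming the checkpoint exposes a standard `transformers` causal-LM interface via `trust_remote_code`; the repository id below is a placeholder, not the model's confirmed path:

```python
# Minimal sketch: load the checkpoint and generate text with transformers.
# Assumption: the repo ships custom modeling code loadable via trust_remote_code.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "path/to/this-model"  # placeholder: replace with the actual repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

# Simple greedy generation as a smoke test.
inputs = tokenizer("Linear sequence modeling with mixture-of-memories", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```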