---
license: apache-2.0
datasets:
- cerebras/SlimPajama-627B
language:
- en
---

This model accompanies the papers [MoM: Linear Sequence Modeling with Mixture-of-Memories](https://arxiv.org/abs/2502.13685) and [Gated Delta Networks: Improving Mamba2 with Delta Rule](https://arxiv.org/abs/2412.06464). It was trained on a 15B-token sample of SlimPajama.
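
A minimal usage sketch, under assumptions not confirmed by this card: checkpoints of this kind are typically loaded through the Hugging Face `transformers` AutoModel API with custom modeling code (often shipped via a package such as `flash-linear-attention`). The repository id below is a hypothetical placeholder; replace it with this repo's actual id.

```python
# Minimal sketch, assuming a standard transformers-compatible checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "<org>/<model-name>"  # hypothetical placeholder; use this repository's id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,  # assumed: linear-attention models usually ship custom modeling code
).cuda()

# Generate a short continuation.
inputs = tokenizer("Linear sequence modeling is", return_tensors="pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```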