---
license: mit
task_categories:
- fill-mask
tags:
- pretraining
- encoder
- multilingual
---

# mmBERT Training Data (Ready-to-Use)

[License: MIT](https://opensource.org/licenses/MIT)
[Paper](https://arxiv.org/abs/2509.06888)
[Models](https://huggingface.co/collections/jhu-clsp/mmbert-a-modern-multilingual-encoder-68b725831d7c6e3acc435ed4)
[Code](https://github.com/jhu-clsp/mmBERT)

> **Complete Training Dataset**: Pre-randomized and ready-to-use multilingual training data (3T tokens) for encoder model pre-training.

This dataset is part of the complete, pre-shuffled training data used to train the [mmBERT encoder models](https://huggingface.co/collections/jhu-clsp/mmbert-a-modern-multilingual-encoder-68b725831d7c6e3acc435ed4). Unlike the individual phase datasets, this version is ready for immediate use, but **the mixture cannot be modified easily**. The data is provided in **decompressed MDS format**, ready for use with [MosaicML Composer](https://github.com/mosaicml/composer) and the [ModernBERT training repository](https://github.com/answerdotai/ModernBERT).
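
Because the shards are already in decompressed MDS format, they can be read directly with MosaicML's `streaming` library, which Composer builds on. The sketch below is a minimal illustration rather than the official training setup: the local path is a placeholder, the batch size is arbitrary, and the per-sample field names depend on how the shards were written, so inspect `dataset[0].keys()` first.

```python
# Minimal sketch: reading MDS shards with mosaicml-streaming
# (pip install mosaicml-streaming). The local path below is a
# placeholder, not part of this dataset card.
from streaming import StreamingDataset
from torch.utils.data import DataLoader

dataset = StreamingDataset(
    local="/path/to/mmbert-training-data",  # placeholder: wherever the shards were downloaded
    shuffle=False,                          # keep the pre-randomized order the data ships with
    batch_size=32,                          # should match the DataLoader batch size below
)

# Inspect one sample; available fields depend on how the shards were written.
print(dataset[0].keys())

# Wrap in a standard PyTorch DataLoader for training.
loader = DataLoader(dataset, batch_size=32)
```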

## Licensing & Attribution

This dataset aggregates multiple open-source datasets under permissive licenses. See individual source datasets for specific attribution requirements.

## Related Resources

- **Models**: [mmBERT Model Suite](https://huggingface.co/collections/jhu-clsp/mmbert-a-modern-multilingual-encoder-68b725831d7c6e3acc435ed4)
- **Individual Phases**: [Pre-training](https://huggingface.co/datasets/jhu-clsp/mmbert-pretrain-p1-fineweb2-langs) | [Mid-training](https://huggingface.co/datasets/jhu-clsp/mmbert-midtraining) | [Decay](https://huggingface.co/datasets/jhu-clsp/mmbert-decay)
- **Checkpoints**: [Training Checkpoints](https://huggingface.co/datasets/jhu-clsp/mmbert-checkpoints)
- **Paper**: [arXiv:2509.06888](https://arxiv.org/abs/2509.06888)
- **Code**: [GitHub Repository](https://github.com/jhu-clsp/mmBERT)

## Citation

```bibtex
@misc{marone2025mmbertmodernmultilingualencoder,
      title={mmBERT: A Modern Multilingual Encoder with Annealed Language Learning},
      author={Marc Marone and Orion Weller and William Fleshman and Eugene Yang and Dawn Lawrie and Benjamin Van Durme},
      year={2025},
      eprint={2509.06888},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2509.06888},
}
```