orionweller committed on
Commit 6efda93 · verified · 1 Parent(s): 54e5531

Create README.md

Files changed (1):
  1. README.md +46 -0

README.md ADDED
---
license: mit
task_categories:
- fill-mask
tags:
- pretraining
- encoder
- multilingual
---

# mmBERT Training Data (Ready-to-Use)

[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Paper](https://img.shields.io/badge/Paper-Arxiv-red)](https://arxiv.org/abs/2509.06888)
[![Models](https://img.shields.io/badge/🤗%20Hugging%20Face-2%20Models-blue)](https://huggingface.co/collections/jhu-clsp/mmbert-a-modern-multilingual-encoder-68b725831d7c6e3acc435ed4)
[![GitHub](https://img.shields.io/badge/GitHub-Code-black)](https://github.com/jhu-clsp/mmBERT)

> **Complete Training Dataset**: Pre-randomized and ready-to-use multilingual training data (3T tokens) for encoder model pre-training.

This dataset is part of the complete, pre-shuffled training data used to train the [mmBERT encoder models](https://huggingface.co/collections/jhu-clsp/mmbert-a-modern-multilingual-encoder-68b725831d7c6e3acc435ed4). Unlike the individual phase datasets, this version is ready for immediate use, but **the mixture cannot be easily modified**. The data is provided in **decompressed MDS format**, ready for use with [MosaicML's Composer](https://github.com/mosaicml/composer) (the trainer used by ModernBERT) and the [ModernBERT training repository](https://github.com/answerdotai/ModernBERT).
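Because the shards are plain MDS directories, they can be read directly with [mosaicml-streaming](https://github.com/mosaicml/streaming), the data-loading library that Composer builds on. Below is a minimal sanity-check sketch, not the official training entry point; the repo id, download path, and sample field names are placeholder assumptions, not details confirmed by this card:

```python
# Minimal sketch for verifying a download (assumptions flagged inline).
from huggingface_hub import snapshot_download
from streaming import StreamingDataset  # pip install mosaicml-streaming
from torch.utils.data import DataLoader

# Fetch the MDS shards. Placeholder repo id and path -- substitute this
# dataset's actual id and a disk with several TB free.
local_dir = snapshot_download(
    repo_id="jhu-clsp/mmbert-pretraining-data",  # placeholder
    repo_type="dataset",
    local_dir="/path/to/mmbert-data",
)

# The mixture is already pre-shuffled, so shuffle=False keeps the
# intended training order.
dataset = StreamingDataset(local=local_dir, shuffle=False)

# Wrap in a standard PyTorch DataLoader for training-style iteration.
loader = DataLoader(dataset, batch_size=8)
batch = next(iter(loader))
print(batch.keys())  # field names depend on how the shards were written
```

For an actual pre-training run you would instead point the ModernBERT training repository's data configuration at the downloaded directory; the snippet above is only meant to confirm the shards are readable before launching a job.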
## Licensing & Attribution

This dataset aggregates multiple open-source datasets under permissive licenses. See individual source datasets for specific attribution requirements.

## Related Resources

- **Models**: [mmBERT Model Suite](https://huggingface.co/collections/jhu-clsp/mmbert-a-modern-multilingual-encoder-68b725831d7c6e3acc435ed4)
- **Individual Phases**: [Pre-training](https://huggingface.co/datasets/jhu-clsp/mmbert-pretrain-p1-fineweb2-langs) | [Mid-training](https://huggingface.co/datasets/jhu-clsp/mmbert-midtraining) | [Decay](https://huggingface.co/datasets/jhu-clsp/mmbert-decay)
- **Checkpoints**: [Training Checkpoints](https://huggingface.co/datasets/jhu-clsp/mmbert-checkpoints)
- **Paper**: [arXiv link](https://arxiv.org/abs/2509.06888)
- **Code**: [GitHub Repository](https://github.com/jhu-clsp/mmBERT)

## Citation

```bibtex
@misc{marone2025mmbertmodernmultilingualencoder,
      title={mmBERT: A Modern Multilingual Encoder with Annealed Language Learning},
      author={Marc Marone and Orion Weller and William Fleshman and Eugene Yang and Dawn Lawrie and Benjamin Van Durme},
      year={2025},
      eprint={2509.06888},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2509.06888},
}
```