|
--- |
|
language: |
|
- en |
|
license: mit |
|
task_categories: |
|
- text-generation |
|
- fill-mask |
|
- text-classification |
|
- text-retrieval
|
tags: |
|
- pretraining |
|
- language-modeling |
|
- encoder |
|
- decoder |
|
- foundation-model |
|
- transformer |
|
--- |
|
|
|
# Ettin Pre-training Data |
|
|
|
[License: MIT](https://opensource.org/licenses/MIT) | [Paper](https://arxiv.org/abs/2507.11412) | [Models](https://huggingface.co/jhu-clsp) | [Code](https://github.com/jhu-clsp/ettin-encoder-vs-decoder)
|
|
|
> **Phase 1 of 3**: Diverse pre-training data mixture (1.7T tokens) used to train the Ettin model suite. |
|
|
|
This dataset contains the pre-training phase data used to train all [Ettin encoder and decoder models](https://huggingface.co/jhu-clsp). The data is provided in **MDS format** ready for use with [Composer](https://github.com/mosaicml/composer) and the [ModernBERT training repository](https://github.com/answerdotai/ModernBERT). |
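
The shards can be pulled from the Hub like any other dataset repository. Because the full mixture is very large (1.7T tokens), fetching a single source folder first is usually more practical. Below is a minimal sketch using the standard `huggingface_hub` client; the choice of the `wikipedia/` folder and the local destination path are illustrative, not prescribed by this card.

```python
# Minimal sketch: fetch one source folder of MDS shards with huggingface_hub.
# The target folder (wikipedia/) and local path are illustrative choices.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="jhu-clsp/ettin-pretraining-data",
    repo_type="dataset",
    allow_patterns=["wikipedia/*"],       # restrict to a single data source
    local_dir="./ettin-pretraining-data"  # example destination
)
print(local_dir)
```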
|
|
|
## Data Composition
|
|
|
| Data Source | Tokens (B) | Percentage | Description | |
|
|:------------|:-----------|:-----------|:------------| |
|
| DCLM | 837.2 | 49.1% | High-quality web crawl data | |
|
| CC Head | 356.6 | 20.9% | Common Crawl head documents | |
|
| Starcoder | 263.9 | 15.5% | Code repositories and files | |
|
| Reddit | 80.3 | 4.7% | Social discussion threads | |
|
| PeS2o | 57.3 | 3.4% | Scientific papers | |
|
| Arxiv | 28.0 | 1.6% | Academic preprints | |
|
| StackExchange | 19.6 | 1.2% | Q&A forums | |
|
| Tulu Flan | 16.6 | 1.0% | Instruction-following data | |
|
| Open-Web-Math | 12.7 | 0.7% | Mathematical content | |
|
| Algebraic StackExchange | 12.6 | 0.7% | Math Q&A | |
|
| CC News | 7.3 | 0.4% | News articles | |
|
| Wikipedia | 7.3 | 0.4% | Encyclopedia articles | |
|
| **Total** | **1,704.7** | **100.0%** | Diverse mixture for foundation training | |
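
To rebuild a weighted mixture like the one above with mosaicml-streaming, one option is to give each source its own `Stream` with a `proportion`. The sketch below mixes just two sources and assumes their folders were already downloaded locally (e.g. with `snapshot_download` pointed at `dclm/` and `starcoder/`); the paths are illustrative and the proportions are relative sampling weights taken from the table.

```python
# Sketch: weighted mixing of two sources with mosaicml-streaming.
# Paths assume the per-source folders were downloaded beforehand (illustrative).
from streaming import Stream, StreamingDataset

streams = [
    # Relative sampling weights from the table above (normalized by streaming)
    Stream(local="./ettin-pretraining-data/dclm", proportion=0.491),
    Stream(local="./ettin-pretraining-data/starcoder", proportion=0.155),
]

mixed_dataset = StreamingDataset(streams=streams, shuffle=True)
```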
|
|
|
## Usage
|
|
|
For pre-training with this data, see the [ModernBERT training repository](https://github.com/AnswerDotAI/ModernBERT).
|
|
|
### Direct Access |
|
|
|
```python
from streaming import StreamingDataset

# Load the streaming dataset
dataset = StreamingDataset(
    remote='https://huggingface.co/datasets/jhu-clsp/ettin-pretraining-data',
    local='/tmp/ettin-pretraining-data',
    shuffle=True
)

# Access samples
for sample in dataset:
    text = sample['text']
    # Process your data...
```
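
`StreamingDataset` is built to plug into a standard PyTorch `DataLoader`, so batched iteration over the `dataset` object above is straightforward; the batch size in this sketch is arbitrary.

```python
# Sketch: batched iteration over the streaming dataset with PyTorch.
from torch.utils.data import DataLoader

loader = DataLoader(dataset, batch_size=8)
for batch in loader:
    texts = batch['text']  # list of raw text strings, one per sample in the batch
    break
```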
|
|
|
## Structure
|
|
|
Each folder contains one data source in MDS (Mosaic Data Shard) format; a sketch for inspecting a folder's shard index follows the list:
|
- `arxiv/` - Academic papers from arXiv
|
- `books/` - Literature and reference books |
|
- `cc_head/` - High-quality Common Crawl documents |
|
- `cc_news/` - News articles from Common Crawl |
|
- `dclm/` - DataComp-LM filtered web data |
|
- `open_web_math/` - Mathematical web content |
|
- `algebraic_stackexchange/` - Math Q&A from StackExchange |
|
- `pes2o/` - Scientific papers (PeS2o dataset) |
|
- `reddit/` - Reddit discussion threads |
|
- `stackexchange/` - General StackExchange Q&A |
|
- `starcoder/` - Code from GitHub repositories |
|
- `tulu_flan/` - Instruction-following examples |
|
- `wikipedia/` - Wikipedia articles |
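
Every MDS folder ships an `index.json` describing its shards, so you can check how many samples a source contains before streaming it. A small sketch, assuming the `wikipedia/` folder was downloaded to the example path used earlier:

```python
# Sketch: read an MDS folder's index.json and count its shards and samples.
# The path assumes the wikipedia/ folder was downloaded as in the earlier snippet.
import json

with open("./ettin-pretraining-data/wikipedia/index.json") as f:
    index = json.load(f)

num_shards = len(index["shards"])
num_samples = sum(shard["samples"] for shard in index["shards"])
print(f"{num_shards} shards, {num_samples} samples")
```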
|
|
|
## Related Resources
|
|
|
- **Models**: [Ettin Model Suite](https://huggingface.co/collections/jhu-clsp/encoders-vs-decoders-the-ettin-suite-686303e16142257eed8e6aeb) (17M-1B parameters) |
|
- **Phase 2**: [Mid-training Data](https://huggingface.co/datasets/jhu-clsp/ettin-extension-data) (250B tokens) |
|
- **Phase 3**: [Decay Phase Data](https://huggingface.co/datasets/jhu-clsp/ettin-decay-data) (50B tokens) |
|
- **Training Order**: [Batch-level Data Order](https://huggingface.co/datasets/jhu-clsp/ettin-data-order) |
|
- **Paper**: [arXiv:2507.11412](https://arxiv.org/abs/2507.11412)
|
- **Code**: [GitHub Repository](https://github.com/jhu-clsp/ettin-encoder-vs-decoder) |
|
|
|
## Citation |
|
|
|
```bibtex
@misc{weller2025seqvsseqopen,
  title={Seq vs Seq: An Open Suite of Paired Encoders and Decoders},
  author={Orion Weller and Kathryn Ricci and Marc Marone and Antoine Chaffin and Dawn Lawrie and Benjamin Van Durme},
  year={2025},
  eprint={2507.11412},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2507.11412},
}
```