orionweller committed (verified)
Commit 33ba201 · 1 Parent(s): a70b848

Update README.md

Files changed (1): README.md (+8 -6)
README.md CHANGED
@@ -6,13 +6,13 @@ language:
 # Ettin Decay Phase Data
 
 [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
-[![Paper](https://img.shields.io/badge/Paper-Coming%20Soon-red)](https://github.com/jhu-clsp/ettin-encoder-vs-decoder)
+[![Paper](https://img.shields.io/badge/Paper-Arxiv-red)](https://arxiv.org/abs/2507.11412)
 [![Models](https://img.shields.io/badge/🤗%20Hugging%20Face-12%20Models-blue)](https://huggingface.co/jhu-clsp)
 [![GitHub](https://img.shields.io/badge/GitHub-Code-black)](https://github.com/jhu-clsp/ettin-encoder-vs-decoder)
 
 > **Phase 3 of 3**: Premium data sources for final training phase (100B tokens) following the ProLong recipe.
 
-This dataset contains the decay phase data used to train all [ETTIN encoder and decoder models](https://huggingface.co/jhu-clsp). This final phase uses **premium data sources** with emphasis on **long-form content** and **educational materials**. The data is provided in **MDS format** ready for use with [Composer](https://github.com/mosaicml/composer) and the [ModernBERT training repository](https://github.com/answerdotai/ModernBERT).
+This dataset contains the decay phase data used to train all [Ettin encoder and decoder models](https://huggingface.co/jhu-clsp). This final phase uses **premium data sources** with emphasis on **long-form content** and **educational materials**. The data is provided in **MDS format** ready for use with [Composer](https://github.com/mosaicml/composer) and the [ModernBERT training repository](https://github.com/answerdotai/ModernBERT).
 
 ## 📊 Data Composition
 
@@ -89,17 +89,19 @@ This decay phase data is also used for **cross-objective training** experiments:
 - **Phase 1**: [Pre-training Data](https://huggingface.co/datasets/jhu-clsp/ettin-pretraining-data) (1.7T tokens)
 - **Phase 2**: [Mid-training Data](https://huggingface.co/datasets/jhu-clsp/ettin-extension-data) (250B tokens)
 - **Training Order**: [Batch-level Data Order](https://huggingface.co/datasets/jhu-clsp/ettin-data-order)
-- **Paper**: Coming Soon
+- **Paper**: [Arxiv link](https://arxiv.org/abs/2507.11412)
 - **Code**: [GitHub Repository](https://github.com/jhu-clsp/ettin-encoder-vs-decoder)
 
 ## Citation
 
 ```bibtex
-@misc{weller2025seqvsseq,
+@misc{weller2025seqvsseqopen,
   title={Seq vs Seq: An Open Suite of Paired Encoders and Decoders},
   author={Orion Weller and Kathryn Ricci and Marc Marone and Antoine Chaffin and Dawn Lawrie and Benjamin Van Durme},
   year={2025},
-  note={Paper coming soon},
-  url={https://github.com/jhu-clsp/ettin-encoder-vs-decoder},
+  eprint={2507.11412},
+  archivePrefix={arXiv},
+  primaryClass={cs.CL},
+  url={https://arxiv.org/abs/2507.11412},
 }
 ```