---
license: agpl-3.0
---

# BabyLM 10M pretraining corpus

Raw data is from: https://babylm.github.io/guidelines.html

Short lines are merged into a single line, separated by tabs. All lines are shuffled before being split into train, validation, and test files. The validation and test files contain only 128 lines each.

The original text files are: `open_subtitles.train`, `switchboard.train`, `simple_wiki.train`, `gutenberg.train`, `childes.train`, `bnc_spoken.train`.
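
A minimal sketch of the preprocessing described above, assuming a simple character-length threshold for what counts as a "short" line; the actual merging rule and output file names are assumptions, not the original script:

```python
import random
from pathlib import Path

SOURCE_FILES = [
    "open_subtitles.train", "switchboard.train", "simple_wiki.train",
    "gutenberg.train", "childes.train", "bnc_spoken.train",
]
MIN_CHARS = 64  # assumed threshold below which lines count as "short"

merged = []
buffer = []
for name in SOURCE_FILES:
    for line in Path(name).read_text(encoding="utf-8").splitlines():
        line = line.strip()
        if not line:
            continue
        buffer.append(line)
        # Merge consecutive short lines into one tab-separated line.
        if sum(len(part) for part in buffer) >= MIN_CHARS:
            merged.append("\t".join(buffer))
            buffer = []
    if buffer:  # flush any remaining short lines at the end of a file
        merged.append("\t".join(buffer))
        buffer = []

# Shuffle before splitting; validation and test get 128 lines each.
random.shuffle(merged)
splits = {
    "validation": merged[:128],
    "test": merged[128:256],
    "train": merged[256:],
}
for split, lines in splits.items():
    Path(f"{split}.txt").write_text("\n".join(lines) + "\n", encoding="utf-8")
```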