---
datasets:
  - togethercomputer/RedPajama-Data-1T
language:
  - en
library_name: transformers
license: apache-2.0
pipeline_tag: text-generation
---

# PDS-470M

paper | code | project page

PDS-470M is a 470M-parameter Mistral-architecture model pre-trained from scratch with the PDS framework on data selected from the CC split of RedPajama.
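A minimal loading sketch with the `transformers` library is shown below. The repository id used here is a placeholder assumption; replace it with this model's actual Hugging Face Hub id.

```python
# Minimal usage sketch (the model id below is a placeholder; substitute the
# actual Hub id of this repository).
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "PDS-470M"  # placeholder id, adjust to the actual repo path

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

# Greedy generation of a short continuation
inputs = tokenizer("Language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```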

The PDS framework applies Pontryagin's maximum principle to optimal pre-training data selection, providing strong theoretical grounding and scalability for training large language models.

Please refer to our paper for more details.
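As a rough sketch of the optimal-control view (schematic notation, not the paper's exact formulation): per-example data weights act as the control, the model parameters follow the gradient-descent dynamics those weights induce, and Pontryagin's maximum principle supplies necessary conditions, via a costate variable, that the selected data must satisfy.

```latex
% Schematic optimal-control view of data selection
% (illustrative notation, not the paper's exact formulation).
\min_{\gamma}\; J(\gamma) = L_{\mathrm{dev}}(\theta_T)
\quad \text{s.t.} \quad
\theta_{t+1} = \theta_t - \eta \sum_{n} \gamma_n \,\nabla \ell(x_n, \theta_t),
\qquad \gamma_n \ge 0,\;\; \textstyle\sum_n \gamma_n = 1.
% Pontryagin's maximum principle introduces a costate \lambda_t and yields
% necessary conditions linking the optimal weights \gamma^* to \lambda_t.
```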

Overview of the theory:

Overview of the PDS framework:

## Evaluation

PDS-selected data improves the performance of language models pre-trained from scratch and reduces pre-training computation. The improvement scales up to large model sizes.

Baseline for comparison: conventional pre-training.

## Citation

```bibtex
@article{gu2024data,
  title={Data Selection via Optimal Control for Language Models},
  author={Gu, Yuxian and Dong, Li and Wang, Hongning and Hao, Yaru and Dong, Qingxiu and Wei, Furu and Huang, Minlie},
  journal={arXiv preprint arXiv:2410.07064},
  year={2024}
}
```