---
language:
- en
tags:
- causal-lm
library_name: transformers
license: apache-2.0
datasets:
- allenai/dolma
---

# Gemstone-512x16_lr_ablation
Gemstone-512x16_lr_ablation is part of the [Gemstone Suite of Models](https://huggingface.co/collections/tomg-group-umd/gemstone-models-679408ee3f19f1d4d00e8b10), a set of models trained with varying widths and depths. This particular version, denoted by the `_lr_ablation` suffix, corresponds to an ablation detailed in the paper, in which we train the same suite of models but with half the original learning rate.

## Training
We train using [litgpt](https://github.com/Lightning-AI/litgpt) and [AxoNN](https://github.com/axonn-ai/litgpt) on AMD MI250X GPUs on [Frontier](https://www.olcf.ornl.gov/olcf-resources/compute-systems/frontier/) at Oak Ridge National Laboratory, with a global batch size of 2048.

## Data
Training and validation data are taken from non-overlapping subsets of [dolma](https://huggingface.co/datasets/allenai/dolma); as such, this is _not_ an instruction-tuned model.
This model is trained for 100 billion tokens (in contrast to the main suite, which is trained to 350 billion tokens); we upload checkpoints every 2 billion tokens (477 steps).
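
With a global batch size of 2048 sequences and an assumed 2048-token context, 477 steps work out to roughly 2 billion tokens (2048 × 2048 × 477 ≈ 2.0 × 10^9), consistent with the stated checkpoint interval. As a sketch, intermediate checkpoints published as Hub revisions can be selected with the `revision` argument of `from_pretrained`; the repository id and branch name below are illustrative assumptions, not documented identifiers, so check the repository's branch list for the actual naming scheme.

```
from transformers import AutoModelForCausalLM

# Hypothetical branch name: at 477 steps per 2 billion tokens, step 4770
# would correspond to roughly the 20-billion-token checkpoint.
model = AutoModelForCausalLM.from_pretrained(
    "tomg-group-umd/Gemstone-512x16_lr_ablation",  # assumed repo id (Gemstone collection)
    revision="step_4770",  # hypothetical revision; verify against the repo
)
```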

## Using Gemstone-512x16_lr_ablation
The Gemstones are based on the [gemma-2b](https://huggingface.co/google/gemma-2b) architecture and use [modeling_gemma.py](https://github.com/huggingface/transformers/blob/main/src/transformers/models/gemma/modeling_gemma.py) to run with the transformers library.
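
A minimal loading sketch with the transformers library follows; the repository id is an assumption based on the collection namespace.

```
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tomg-group-umd/Gemstone-512x16_lr_ablation"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# This is a base (non-instruction) model, so prompt it with plain text.
inputs = tokenizer("Scaling laws for language models", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```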

## Licence
This model is released under the [apache-2.0](https://choosealicense.com/licenses/apache-2.0/) licence.

## Contact
Please feel free to contact us with any questions, or open a discussion thread.

## Citation
```
@article{mcleish2024gemstones,
title={Gemstones: A Model Suite for Multi-Faceted Scaling Laws},
author={Sean McLeish and John Kirchenbauer and David Yu Miller and Siddharth Singh and Abhinav Bhatele and Micah Goldblum and Ashwinee Panda and Tom Goldstein},
journal={arXiv preprint arXiv:2502.},
year={2025}
}
```