---
language:
- en
tags:
- causal-lm
library_name: transformers
license: apache-2.0
datasets:
- allenai/dolma
---

# Gemstone-512x16_lr_ablation
Gemstone-512x16_lr_ablation is part of the [Gemstone Suite of Models](https://huggingface.co/collections/tomg-group-umd/gemstone-models-679408ee3f19f1d4d00e8b10), a set of models trained with varying widths and depths. This particular version, denoted by the `_lr_ablation` suffix, corresponds to an ablation detailed in the paper in which we train the same suite of models with a learning rate half that of the original.

## Training
We train with [litgpt](https://github.com/Lightning-AI/litgpt) and [AxoNN](https://github.com/axonn-ai/litgpt) on AMD MI250X GPUs on [Frontier](https://www.olcf.ornl.gov/olcf-resources/compute-systems/frontier/) at Oak Ridge National Laboratory, with a global batch size of 2048.

## Data
Train and validation data are taken from non-overlapping subsets of [dolma](https://huggingface.co/datasets/allenai/dolma); as such, this is _not_ an instruction-tuned model.
This model is trained for 100 billion tokens (in contrast to the main suite, which is trained to 350 billion), and we upload checkpoints every 2 billion tokens (477 steps).
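
Intermediate checkpoints are stored as revisions of the model repository. A minimal sketch of loading one, assuming the repository id `tomg-group-umd/Gemstone-512x16_lr_ablation` and a hypothetical step-based revision name; check the repository's branch list for the actual naming scheme:

```python
from transformers import AutoModelForCausalLM

# "step_00011921" is a hypothetical revision name used for illustration;
# the real branch names are listed under the repo's "Files and versions" tab.
model = AutoModelForCausalLM.from_pretrained(
    "tomg-group-umd/Gemstone-512x16_lr_ablation",  # assumed repo id
    revision="step_00011921",  # one checkpoint per 2 billion tokens
)
```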

## Using Gemstone-512x16_lr_ablation
The Gemstones are based on the [gemma-2b](https://huggingface.co/google/gemma-2b) architecture and use [modeling_gemma.py](https://github.com/huggingface/transformers/blob/main/src/transformers/models/gemma/modeling_gemma.py) to run with the transformers library.
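
A minimal sketch of running the model with transformers, again assuming the repository id `tomg-group-umd/Gemstone-512x16_lr_ablation`:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tomg-group-umd/Gemstone-512x16_lr_ablation"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Plain causal-LM completion; this is a base model, not instruction-tuned.
inputs = tokenizer("Scaling laws for language models", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```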

## License
This model is released under the [apache-2.0](https://choosealicense.com/licenses/apache-2.0/) license.

## Contact
Please feel free to contact us with any questions or to open a discussion thread.

## Citation
```
@article{mcleish2024gemstones,
    title={Gemstones: A Model Suite for Multi-Faceted Scaling Laws}, 
    author={Sean McLeish and John Kirchenbauer and David Yu Miller and Siddharth Singh and Abhinav Bhatele and Micah Goldblum and Ashwinee Panda and Tom Goldstein},
    journal={arXiv preprint arXiv:2502.},
    year={2025}
}
```