Text Generation · Transformers · PyTorch · English · olmo2 · conversational · Inference Endpoints
hamishivi committed (verified) · Commit fd51358 · Parent: 296bb96

Update README.md

Files changed (1): README.md (+4 −2)
README.md CHANGED
```diff
@@ -4,15 +4,17 @@ language:
 - en
 pipeline_tag: text-generation
 base_model:
-- allenai/OLMo-2-1124-7B-DPO
+- allenai/OLMo-2-1124-7B-SFT
 library_name: transformers
+datasets:
+- allenai/olmo-2-1124-7b-preference-mix
 ---
 
 <img src="https://allenai.org/olmo/olmo-7b-animation.gif" alt="OLMo Logo" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
 
 # OLMo-2-1124-7B-DPO
 
-OLMo-2 7B DPO November 2024 is a finetuned variant of the [OLMo-2 7B November 2024](https://huggingface.co/allenai/OLMo2-7B-1124) model, which has undergone supervised finetuning on the [Tülu 3 dataset](https://huggingface.co/datasets/allenai/tulu-3-sft-mixture) and further DPO training.
+OLMo-2 7B DPO November 2024 is a finetuned variant of the [OLMo-2 7B November 2024](https://huggingface.co/allenai/OLMo2-7B-1124) model, which has undergone supervised finetuning on the [Tülu 3 dataset](https://huggingface.co/datasets/allenai/tulu-3-sft-mixture) and further DPO training on [this dataset](https://huggingface.co/datasets/allenai/olmo-2-1124-7b-preference-mix).
 Tülu 3 is designed for state-of-the-art performance on a diversity of tasks in addition to chat, such as MATH, GSM8K, and IFEval.
 Check out [the OLMo-2 paper](https://TODO) or [Tülu 3 paper](https://arxiv.org/abs/2411.15124) for more details!
 
```