License: Apache 2.0

This model is the result of continued pre-training on WorldPM-72B, using a multilingual dataset of mixed code and text.

We are providing this model to the community as a "base" model for further SFT; it is not intended for direct inference.
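For reference, below is a minimal loading sketch for starting your own SFT run on this checkpoint. It assumes the model loads as a standard causal language model via the Hugging Face transformers library; the model class and loading options shown are assumptions, not part of this card.

```python
# Minimal sketch: load the checkpoint as a starting point for SFT.
# Assumption: the weights load with AutoModelForCausalLM; adapt the class
# if your SFT framework expects something different.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "OpenBuddy/OpenBuddy-WorldPM-72B-Base"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # weights are distributed in BF16
    device_map="auto",           # shard the 72B parameters across available GPUs
)

# Pass `model` and `tokenizer` to your SFT training loop (e.g. TRL's SFTTrainer
# or a custom trainer); the checkpoint is not intended for direct inference.
```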


Model tree for OpenBuddy/OpenBuddy-WorldPM-72B-Base

- Base model: Qwen/Qwen2.5-72B
- Finetuned from Qwen/Qwen2.5-72B: Qwen/WorldPM-72B
- Finetuned from Qwen/WorldPM-72B (4 models): this model
- Finetunes of this model: 1 model