License: Apache 2.0
This model is the result of continued pre-training on WorldPM-72B, using a multilingual dataset of mixed code and text.
We are providing this model to the community as a "base" model for further SFT; it is not intended for direct inference.