CharLLama-1.3B Pretrained Language Model

This language model follows the same training process and usage as CharLLama-2.6B, but with a smaller capacity: 1,369,634,304 parameters.

For detailed information, including pretraining data, tokenization, and usage examples, please refer to the CharLLama-2.6B documentation.
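As a rough orientation before consulting that documentation, the snippet below is a minimal loading-and-generation sketch using the Hugging Face transformers library. It assumes the checkpoint is published on the Hub in a transformers-compatible format like its 2.6B sibling; the repo id is a placeholder, and the prompt format and tokenization details are described in the CharLLama-2.6B documentation.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- substitute the actual Hub path for CharLLama-1.3B.
model_id = "CharLLama-1.3B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

# Encode a short prompt and sample a continuation.
inputs = tokenizer("Hello", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```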
