A fishy model
Trained on the ChatML format with a maximum context length of 32k tokens.
The average sample length across the training datasets is around 4-8k tokens.
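For reference, here is a minimal sketch of the ChatML layout the model expects; the helper function, system prompt, and message contents are illustrative placeholders, not part of this card:

```python
# Minimal sketch of the ChatML prompt layout (placeholder contents).
def to_chatml(messages):
    """Render a list of {"role": ..., "content": ...} dicts as a ChatML string."""
    prompt = ""
    for m in messages:
        prompt += f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
    # Open an assistant turn so the model generates the reply.
    return prompt + "<|im_start|>assistant\n"

print(to_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]))
```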
Uploaded model
- Developed by: TheTsar1209
- License: apache-2.0
- Finetuned from model: unsloth/llama-3-8b-bnb-4bit
This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.
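A hedged example of loading and prompting the model with transformers and bitsandbytes follows; it assumes the repo contains merged weights rather than standalone LoRA adapters, and the prompt text is a placeholder:

```python
# Sketch: load the fine-tune in 4-bit and generate from a ChatML prompt.
# Assumes merged weights in the repo; adjust if it ships LoRA adapters.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "TheTsar1209/llama3-carp-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=BitsAndBytesConfig(load_in_4bit=True),  # match the bnb-4bit base
    device_map="auto",
)

prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nHello!<|im_end|>\n"
    "<|im_start|>assistant\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```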
Model tree for TheTsar1209/llama3-carp-v0.1
- Base model: meta-llama/Meta-Llama-3-8B
- Quantized as: unsloth/llama-3-8b-bnb-4bit