This is a tiny random Llama model derived from "meta-llama/Llama-2-7b-hf".
See make_tiny_model.py for how this was done.
It is useful for functional testing only, not for quality generation: its weights are random and the tokenizer has been shrunk to 3k items. A minimal smoke-test sketch is shown below.
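
As a rough illustration, a functional test might load the model and run a short generation just to confirm the forward pass works; the repo id below is a placeholder, so substitute this model's actual Hub id.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-org/tiny-random-llama"  # placeholder: replace with this repo's Hub id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Outputs are meaningless (random weights); we only check that tokenization,
# the forward pass, and generate() run without error.
inputs = tokenizer("Hello, world", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```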