This is a tiny random Llama model derived from "meta-llama/Llama-2-7b-hf".

See make_tiny_model.py for how this was done.
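
The snippet below is a minimal sketch of how such a tiny, randomly initialized Llama model can be produced with `transformers`; it is not the actual `make_tiny_model.py`, and the specific dimensions are illustrative assumptions.

```python
# Minimal sketch (not the actual make_tiny_model.py): build a tiny Llama
# config, let transformers random-initialize the weights, and save them.
import torch
from transformers import LlamaConfig, LlamaForCausalLM

# Hypothetical tiny dimensions chosen for illustration only.
config = LlamaConfig(
    vocab_size=3000,           # matches a tokenizer shrunk to ~3k items
    hidden_size=16,
    intermediate_size=64,
    num_hidden_layers=2,
    num_attention_heads=4,
    num_key_value_heads=4,
    max_position_embeddings=256,
)

model = LlamaForCausalLM(config)        # weights are randomly initialized
model = model.to(torch.bfloat16)        # the published checkpoint is BF16
print(f"{model.num_parameters():,} parameters")

model.save_pretrained("tiny-random-llama-2", safe_serialization=True)
```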

This is useful for functional testing, not for quality generation: the weights are random and the tokenizer has been shrunk to 3k items.
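
As an example of the kind of functional smoke test this model is intended for, the following sketch loads it from the Hub and runs a short generation pass; the generated text is meaningless, only the fact that the pipeline runs end to end matters.

```python
# Smoke test: verify that loading, tokenization, and generation all work.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stas/tiny-random-llama-2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

inputs = tokenizer("Hello, world", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```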

Model size: ~104k parameters, stored as BF16 safetensors.