Merged LLaMA Model

This is a pruned and merged version of LLaMA2-13b built using hyperboloid (hyperbolic-space) projections. The pruned model retains 31 of the base model's 40 decoder layers, reducing it to 10.2B parameters while preserving most of the base model's benchmark performance.
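The card does not spell out the pruning algorithm, but hyperboloid-projection-based clustering typically means lifting per-layer feature vectors onto the Lorentz (hyperboloid) model of hyperbolic space, measuring pairwise hyperbolic distances, and keeping one representative layer per cluster. The sketch below illustrates that generic idea with NumPy; the function names, the toy layer vectors, and the distance threshold are all hypothetical and not taken from this model's actual recipe.

```python
import numpy as np

def lift_to_hyperboloid(v):
    # Lift a Euclidean vector v onto the hyperboloid
    # {x : -x0^2 + ||x_1:||^2 = -1, x0 > 0} (Lorentz model).
    x0 = np.sqrt(1.0 + float(np.dot(v, v)))
    return np.concatenate(([x0], v))

def lorentz_inner(x, y):
    # Lorentzian inner product <x, y>_L = -x0*y0 + <x_1:, y_1:>.
    return -x[0] * y[0] + float(np.dot(x[1:], y[1:]))

def hyperbolic_distance(x, y):
    # Geodesic distance on the hyperboloid: arccosh(-<x, y>_L).
    # Clip guards against arguments slightly below 1 from rounding.
    return float(np.arccosh(np.clip(-lorentz_inner(x, y), 1.0, None)))

# Toy per-layer feature vectors (hypothetical stand-ins for whatever
# layer statistics the real method clusters).
layers = [np.array([0.0, 0.0]), np.array([0.05, 0.0]), np.array([3.0, 3.0])]
points = [lift_to_hyperboloid(v) for v in layers]

# Greedy clustering: keep a layer only if it is far (in hyperbolic
# distance) from every layer already kept; near-duplicates are pruned.
threshold = 0.5
kept = []
for i, p in enumerate(points):
    if all(hyperbolic_distance(p, points[j]) > threshold for j in kept):
        kept.append(i)
# Layers 0 and 1 are nearly identical, so only layer 0 survives;
# layer 2 is distant and is also kept.
```

Under this sketch, pruning 40 layers down to 31 would correspond to nine layers falling inside another layer's cluster and being dropped or merged.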

Model size: 10.2B params · Tensor type: FP16 · Format: Safetensors

Model tree for namannn/llama2-13b-hyperbolic-cluster-pruned

Finetunes of this model: 6
Quantizations: 1 model

1 Space uses namannn/llama2-13b-hyperbolic-cluster-pruned.
