ReluLLaMA-7B-PowerInfer-GGUF

This model is a downstream distribution of SparseLLM/ReluLLaMA-7B in the PowerInfer GGUF format, consisting of the LLM model weights and the predictor weights.

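For reference, a minimal sketch of fetching the weights with the `huggingface_hub` Python library; the local directory name is an arbitrary choice for illustration, and the downloaded GGUF files are then passed to PowerInfer's inference binary as described in the PowerInfer repository.

```python
# Minimal sketch: download the model and predictor weight files from this repo.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="PowerInfer/ReluLLaMA-7B-PowerInfer-GGUF",
    local_dir="ReluLLaMA-7B-PowerInfer-GGUF",  # hypothetical local directory name
)
print(f"Model files downloaded to: {local_path}")
```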