---
license: apache-2.0
language:
  - es
tags:
  - falcon-fine-tune
  - gguf
  - llama.cpp
  - lince-zero-quantized
model_name: LINCE-ZERO
base_model: clibrain/lince-zero
inference: false
model_creator: Clibrain
model_type: falcon
pipeline_tag: text-generation
prompt_template: >
  A continuación hay una instrucción que describe una tarea, junto con una
  entrada que proporciona más contexto. Escriba una respuesta que complete
  adecuadamente la solicitud.\n\n### Instrucción: {prompt}\n\n### Respuesta:
quantized_by: alvarobartt
---

# Model Card for LINCE-ZERO-7B-GGUF

LINCE-ZERO is an instruction-following LLM fine-tuned from Falcon 7B by Clibrain. According to Clibrain's model card, the fine-tuning data combines the Alpaca and Dolly datasets, both translated into Spanish and augmented to 80k examples.

This repository contains quantized variants of LINCE-ZERO in the GGUF format, introduced by the llama.cpp team.

You may wonder: why not just use TheBloke/lince-zero-GGUF? Those files work fine for running inference over LINCE-ZERO on limited resources via llama.cpp, but if you want to use the model via LM Studio on macOS you may run into issues, as LM Studio may only support the q4_k_s, q4_k_m, q5_k_s, and q5_k_m quantization formats, and those are not included in TheBloke's repository.
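For reference, the prompt template defined in the metadata can be applied programmatically. Below is a minimal Python sketch (the `build_prompt` helper name is our own, not part of any official API) that fills the template with a user instruction:

```python
# Prompt template used by LINCE-ZERO, copied from the model card metadata.
PROMPT_TEMPLATE = (
    "A continuación hay una instrucción que describe una tarea, junto con una "
    "entrada que proporciona más contexto. Escriba una respuesta que complete "
    "adecuadamente la solicitud.\n\n### Instrucción: {prompt}\n\n### Respuesta:"
)

def build_prompt(instruction: str) -> str:
    """Fill the LINCE-ZERO template with a user instruction."""
    return PROMPT_TEMPLATE.format(prompt=instruction)

print(build_prompt("Dame una lista de sitios a visitar en España."))
```

The text generated by the model should then be read as the completion following the final `### Respuesta:` marker.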

## Model Details

### Model Description

### Model Sources

## Model Files

| Name | Quant method | Bits | Size | Max RAM required | Use case |
| ---- | ------------ | ---- | ---- | ---------------- | -------- |
| lince-zero-7b-q4_k_s.gguf | Q4_K_S | 4 | 7.41 GB | 9.91 GB | small, greater quality loss |
| lince-zero-7b-q4_k_m.gguf | Q4_K_M | 4 | 7.87 GB | 10.37 GB | medium, balanced quality - recommended |
| lince-zero-7b-q5_k_s.gguf | Q5_K_S | 5 | 8.97 GB | 11.47 GB | large, low quality loss - recommended |
| lince-zero-7b-q5_k_m.gguf | Q5_K_M | 5 | 9.23 GB | 11.73 GB | large, very low quality loss - recommended |

**Note:** the above RAM figures assume no GPU offloading. If layers are offloaded to the GPU, this will reduce RAM usage and use VRAM instead.

## Uses

### Direct Use

[More Information Needed]

### Downstream Use [optional]

[More Information Needed]

### Recommendations

Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]
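Until official examples are added, here is a hedged sketch using the llama-cpp-python bindings (`pip install llama-cpp-python`). The file name matches the q4_k_m variant listed above, and the context size and sampling parameters are illustrative defaults, not values recommended by Clibrain:

```python
# Assumes the q4_k_m GGUF file has been downloaded from this repository.
MODEL_PATH = "lince-zero-7b-q4_k_m.gguf"

# Prompt template from the model card metadata.
PROMPT_TEMPLATE = (
    "A continuación hay una instrucción que describe una tarea, junto con una "
    "entrada que proporciona más contexto. Escriba una respuesta que complete "
    "adecuadamente la solicitud.\n\n### Instrucción: {prompt}\n\n### Respuesta:"
)

def main() -> None:
    from llama_cpp import Llama  # pip install llama-cpp-python

    # n_gpu_layers=0 keeps everything in RAM; raise it to offload layers to the GPU.
    llm = Llama(model_path=MODEL_PATH, n_ctx=2048, n_gpu_layers=0)
    prompt = PROMPT_TEMPLATE.format(
        prompt="Dame una lista de sitios a visitar en España."
    )
    output = llm(prompt, max_tokens=256, temperature=0.7, stop=["### Instrucción:"])
    print(output["choices"][0]["text"])

if __name__ == "__main__":
    main()
```

The same files can also be run directly with the llama.cpp CLI or loaded in LM Studio, as noted above.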

## Training Details

All the training details can be found at Falcon 7B - Training Details, and the fine-tuning details at LINCE-ZERO - Training Details.