SipánGPT 0.5 Llama 3.2 1B GGUF

  • Pre-trained model to answer questions about the Universidad Señor de Sipán in Lambayeque, Peru.

Testing the model

  • Trained on 304,000 conversations; the model may still generate hallucinations.

Uploaded model

  • Developed by: ussipan
  • License: apache-2.0
  • Finetuned from model: unsloth/llama-3.2-1b-instruct-bnb-4bit

This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.



Made with ❤️ by Jhan Gómez P.
GGUF

  • Model size: 1.24B params
  • Architecture: llama
  • Downloads last month: 149

  • Available quantizations: 4-bit, 5-bit, 8-bit, 16-bit
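Since the GGUF quantizations above are meant to be run through llama.cpp-style runtimes, a prompt must follow the Llama 3.2 instruct chat template when using a raw completion API. The sketch below assembles that template by hand; the system-message wording and the example question are assumptions for illustration, not part of the model card.

```python
def build_llama32_prompt(user_msg: str,
                         system_msg: str = "Eres SipánGPT, un asistente de la Universidad Señor de Sipán.") -> str:
    """Assemble the standard Llama 3.2 instruct chat template.

    The special tokens (<|begin_of_text|>, <|start_header_id|>, <|eot_id|>)
    come from the Llama 3 family tokenizer; the default system message here
    is an assumption for illustration.
    """
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system_msg}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user_msg}<|eot_id|>"
        # Trailing assistant header cues the model to generate its reply.
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )


prompt = build_llama32_prompt("¿Dónde queda la Universidad Señor de Sipán?")
print(prompt)
```

The resulting string can be passed as the prompt to a llama.cpp completion call against one of the quantized GGUF files; chat-style wrappers such as llama-cpp-python's `create_chat_completion` apply this template automatically.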

Inference Providers

This model is not currently available via any supported third-party Inference Provider, and it is not deployed on the HF Inference API.

Dataset used to train ussipan/Llama-3.2-SipanGPT-v0.5-GGUF