qrit-2

Model Details

Model Description

  • Developed by: samdak93
  • Model type: Causal Language Model
  • Language(s): English
  • License: MIT
  • Finetuned from model: openai-community/gpt2

This model generates food recipes with instructions based on the user's nutritional preferences, such as "around 400 calories, high protein, low fat".

Model Sources

  • Repository: https://huggingface.co/samdak93/qrit-2

Uses

Direct Use

The model can be used to generate recipes directly via text prompts like:

Generate a high-protein, low-fat recipe with around 400 calories.

Out-of-Scope Use

This model is not intended for medical diagnosis, treatment planning, or diet prescriptions requiring professional approval.

Bias, Risks, and Limitations

The model was trained on a custom dataset built by the author. It may not generalize well to all types of cuisines, dietary needs, or nutritional guidelines. It does not replace professional dietary advice.

Recommendations

Always consult a certified nutritionist or dietitian before following specific diets, especially if you have health conditions.

How to Get Started with the Model

from transformers import pipeline

# Load the fine-tuned GPT-2 recipe generator from the Hugging Face Hub
generator = pipeline("text-generation", model="samdak93/qrit-2")

# Describe the desired nutrition profile in the prompt
prompt = "Healthy dinner recipe under 400 calories, high protein"
output = generator(prompt, max_new_tokens=200)
print(output[0]["generated_text"])
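
Greedy decoding with a GPT-2-sized model can produce repetitive recipes; the sampling settings below are illustrative assumptions rather than the author's recommended values.

# Sampling settings are illustrative assumptions, not the author's defaults.
output = generator(
    prompt,
    max_new_tokens=200,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
    repetition_penalty=1.2,
)
print(output[0]["generated_text"])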

Training Details

Training Data

The model was trained on a custom dataset of food recipes with nutrition tags and instructions built by the author.

Training Procedure

  • Platform: Google Colab (free tier)
  • Compute: Colab-provided GPU and RAM
  • Training regime: fp16 mixed precision (a representative setup is sketched below)
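
The exact training script is not published in this card; the sketch below is one plausible setup consistent with the details above (Hugging Face Trainer, fp16 mixed precision, a single Colab GPU). The dataset file name, sequence length, and hyperparameters are assumptions.

# Illustrative fine-tuning sketch; dataset path and hyperparameters are assumptions.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("openai-community/gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("openai-community/gpt2")

# Hypothetical text file of "nutrition tags + recipe instructions" examples
dataset = load_dataset("text", data_files={"train": "recipes.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

args = TrainingArguments(
    output_dir="qrit-2",
    per_device_train_batch_size=4,
    num_train_epochs=3,
    fp16=True,  # matches the stated fp16 mixed-precision regime
    logging_steps=100,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
trainer.save_model("qrit-2")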

Evaluation

The model's output was evaluated manually for relevance, nutrition tag accuracy, and coherence of recipe instructions.
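
No automated metrics are reported. A simple spot-check loop like the one below reproduces this kind of manual review; the prompts are illustrative and generator is the pipeline from the getting-started snippet, not the author's actual evaluation set.

# Illustrative prompts for manual spot-checking; not the author's evaluation set.
eval_prompts = [
    "High-protein breakfast under 350 calories",
    "Low-fat vegetarian dinner around 500 calories",
    "Low-carb snack under 200 calories, high protein",
]
for p in eval_prompts:
    text = generator(p, max_new_tokens=200)[0]["generated_text"]
    print(f"--- {p} ---\n{text}\n")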

Environmental Impact

  • Hardware Type: Google Colab (free tier GPU)
  • Hours used: Approx. 6 hours
  • Cloud Provider: Google
  • Compute Region: Unknown
  • Carbon Emitted: Not precisely measured; assumed to be low given the short training run on shared Colab infrastructure

Technical Specifications

Model Architecture and Objective

The model is a fine-tuned version of GPT-2 (openai-community/gpt2, 124M parameters), trained with a causal language modeling (next-token prediction) objective to generate recipes conditioned on nutrition-oriented prompts.
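
Because the objective is standard next-token prediction, the causal language modeling loss on a recipe-style string can be inspected directly; the example text below is an assumption, not a sample from the training data.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("samdak93/qrit-2")
model = AutoModelForCausalLM.from_pretrained("samdak93/qrit-2")

# Illustrative recipe-style text; not taken from the training set.
text = "Around 400 calories, high protein, low fat: grilled chicken salad with quinoa"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs, labels=inputs["input_ids"])  # shifted next-token loss
print(f"Causal LM loss: {outputs.loss.item():.3f}")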

Compute Infrastructure

  • Hardware: Google Colab free GPU
  • Software: Python, Transformers, PyTorch

Citation

BibTeX:

@misc{qrit2,
  author = {samdak93},
  title = {qrit-2: Nutrition-based Recipe Generator},
  year = {2025},
  howpublished = {\url{https://huggingface.co/samdak93/qrit-2}},
}

Model Card Contact
