Gemma-3-1B Event-Planner (4-bit QLoRA)

Adapter-only repo for a culturally sensitive event-planning assistant fine-tuned via LoRA on google/gemma-3-1b-it.
This adapter (~50 MB) can be applied to the 4-bit base model at inference time, so you don’t need to ship multi-GB merged weights.

Base model: google/gemma-3-1b-it
Fine-tuned with: LoRA r=8, α=32, dropout=0.05, 4-bit NF4 quantization.

Intended use

Generates culturally sensitive event plans (weddings, baby-naming ceremonies, college fests, and more).
Asks clarifying questions about culture, guest count, budget, dietary needs.

Training data

  • dair-ai/emotion (3 k / 0.5 k)
  • ciol-research/global-festivals-wiki (9 k / 1 k)
  • corbt/all-recipes (15 k / 1.5 k)
  • WorkWithData/cities (6 k / 1 k)
  • Yelp/yelp_review_full (12 k / 2 k)

Model Details

  • Base model: google/gemma-3-1b-it (1 B parameters, instruction-tuned)
  • Quantization: 4-bit NF4 via bitsandbytes
  • LoRA config (see the sketch after this list):
    • rank r = 8
    • α = 32
    • dropout = 0.05
    • target modules = ["q_proj","v_proj"]
  • Trainable params: ~0.75 M (0.07 % of base)
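
For reference, here is a minimal sketch of how the configuration above maps onto the bitsandbytes and peft APIs. The actual training script is not part of this repo, so anything beyond the listed hyperparameters (e.g. the task type and the k-bit preparation step) is an assumption.

import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# 4-bit NF4 quantization, as listed above
bnb_cfg = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
)

base = AutoModelForCausalLM.from_pretrained(
    "google/gemma-3-1b-it",
    quantization_config=bnb_cfg,
    device_map="auto",
    token=True,
)
base = prepare_model_for_kbit_training(base)  # standard QLoRA preparation step (assumed)

# LoRA hyperparameters matching the bullet list; task_type is assumed
lora_cfg = LoraConfig(
    r=8,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_cfg)
model.print_trainable_parameters()  # should report roughly 0.75 M trainable parameters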

Fine-tuning data (≈ 45 k train / 6 k validation examples)

| Domain | Dataset | Train / Val | Why included |
|---|---|---|---|
| Emotion & Tone | dair-ai/emotion | 3 k / 0.5 k | Adapt style & follow-up questioning |
| Cultural Festivals | ciol-research/global-festivals-wiki | 9 k / 1 k | Rituals, symbols, dates across cultures |
| Cuisine & Menus | corbt/all-recipes | 15 k / 1.5 k | Authentic recipes for menu planning |
| Venue / Geodata | WorkWithData/cities | 6 k / 1 k | Real cities + coords for location tips |
| Vendors & Services | Yelp/yelp_review_full | 12 k / 2 k | Business vocabulary & recommendation tone |
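
The preprocessing pipeline is not published with the adapter. Purely as an illustration, the split sizes above could be drawn with the datasets library roughly as below; the slice boundaries, the split names for the smaller sources, and any prompt formatting are assumptions, not the exact recipe used.

from datasets import load_dataset

# Illustrative slices matching the table above (exact preprocessing not documented)
emotion_train = load_dataset("dair-ai/emotion", split="train[:3000]")
emotion_val   = load_dataset("dair-ai/emotion", split="validation[:500]")

recipes_train = load_dataset("corbt/all-recipes", split="train[:15000]")

yelp_train = load_dataset("Yelp/yelp_review_full", split="train[:12000]")
yelp_val   = load_dataset("Yelp/yelp_review_full", split="test[:2000]")  # no separate validation split

print(len(emotion_train), len(recipes_train), len(yelp_train))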

Local Usage

# Install runtime dependencies:
pip install accelerate==1.7.0 bitsandbytes==0.45.5 peft==0.15.2 sentencepiece==0.2.0 torch==2.7.0 transformers==4.51.3 trl==0.17.0

# Load the 4-bit base + adapter:
import torch
from transformers import (
    AutoTokenizer,
    AutoModelForCausalLM,
    BitsAndBytesConfig,
    pipeline,
)
from peft import PeftModel

# 1. Quant config
bnb_cfg = BitsAndBytesConfig(
    load_in_4bit              = True,
    bnb_4bit_quant_type       = "nf4",
    bnb_4bit_use_double_quant = True,
    bnb_4bit_compute_dtype    = torch.float16,
)

# 2. Tokenizer
BASE = "google/gemma-3-1b-it"
tokenizer = AutoTokenizer.from_pretrained(
    BASE,
    token=True,  # Gemma-3 is gated on the Hub; run `huggingface-cli login` first
)

# 3. Base model
base = AutoModelForCausalLM.from_pretrained(
    BASE,
    quantization_config=bnb_cfg,
    device_map="auto",
    torch_dtype=torch.float16,
    token=True,
)

# 4. LoRA adapter
ADAPTER = "PranavKeshav/event-planner-gemma-4bit"
model = PeftModel.from_pretrained(
    base,
    ADAPTER,
    token=True,  # the adapter repo is gated as well
)
model.eval()

# 5. Pipeline
pipe = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    max_new_tokens=150,
    do_sample=True,  # enable sampling so temperature/top_p take effect
    temperature=0.7,
    top_p=0.9,
)

# 6. Test
print(pipe("Plan a Gujarati wedding for 120 guests in Ahmedabad.")[0]["generated_text"])
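
Gemma-3 is an instruction-tuned chat model, so results are usually better if the request is wrapped in the tokenizer's chat template instead of being passed as raw text. A small optional sketch reusing the pipe and tokenizer loaded above:

# 7. (Optional) Format the request with Gemma's chat template
messages = [
    {"role": "user", "content": "Plan a Gujarati wedding for 120 guests in Ahmedabad."},
]
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)
out = pipe(prompt)[0]["generated_text"]
print(out[len(prompt):])  # print only the newly generated continuation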

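If you would rather ship a single standalone checkpoint instead of base + adapter, the LoRA weights can be folded into the base model with peft's merge_and_unload. This sketch reuses the imports and the BASE / ADAPTER / tokenizer names from the script above, and assumes you reload the base in fp16, since merging into a 4-bit-quantized base is lossy; the output directory name is just a placeholder:

# Merge the adapter into an fp16 copy of the base and save a standalone model
full_base = AutoModelForCausalLM.from_pretrained(
    BASE,
    torch_dtype=torch.float16,
    token=True,
)
merged = PeftModel.from_pretrained(full_base, ADAPTER, token=True)
merged = merged.merge_and_unload()              # fold LoRA deltas into the base weights
merged.save_pretrained("event-planner-merged")  # placeholder output directory
tokenizer.save_pretrained("event-planner-merged")
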
Model Card & Citation

  1. Intended use: Generate culturally sensitive event plans; ask clarifying questions about dates, budgets, dietary needs.
  2. Limitations: May hallucinate or miss rare cultural details; verify all critical recommendations.
  3. License: The adapter inherits the Gemma Terms of Use from google/gemma-3-1b-it; the training datasets carry their own licenses (see the individual dataset cards).
  4. Citation:
    @misc{gemma_event_planner_2025,
       title = {Gemma-3-1B Event-Planner LoRA Adapter},
       author = {Keshav, Pranav},
       year = {2025},
       howpublished = {\url{https://huggingface.co/PranavKeshav/event-planner-gemma-4bit}}
     }
    
