πŸ•‰οΈ Murli Assistant - DistilGPT-2 Ultra-Lite

An ultra-lightweight spiritual AI assistant trained on Brahma Kumaris murli content. Perfect for free Colab and low-resource environments!

🎯 Why This Model?

  • 82M parameters (30x smaller than Phi-2)
  • RAM: ~1-2 GB (fits easily in free Colab)
  • Fast inference: 0.5-1 second per response
  • No quantization needed: Runs in full precision
  • Perfect for free tier: No crashes, no OOM errors

Model Details

  • Base Model: DistilGPT-2 (82M parameters)
  • Fine-tuning: LoRA (Low-Rank Adaptation)
  • Training Data: 150 authentic murlis
  • Training Examples: 153+
  • Max Length: 256 tokens
  • LoRA Rank: 4

Usage

Quick Start (Colab)

from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

# Load base model and tokenizer
tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
base_model = AutoModelForCausalLM.from_pretrained("distilgpt2")

# Load LoRA adapter on top of the base model
model = PeftModel.from_pretrained(base_model, "eswarankrishnamurthy/murli-assistant-distilgpt2-lite")
model.eval()

# Chat function using the "Q: ... A:" format the adapter was trained on
def chat(message):
    prompt = f"Q: {message}\nA:"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=150,
        pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Try it
response = chat("Om Shanti")
print(response)
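Note that `tokenizer.decode` returns the prompt along with the generated answer. A small post-processing step keeps only the answer; the `extract_answer` helper below is an illustrative sketch, not part of the model's published code:

```python
def extract_answer(full_text: str) -> str:
    # The decoded output echoes the "Q: ...\nA:" prompt; keep only what
    # follows the first "A:" marker, and stop at any follow-up "Q:" the
    # model may generate.
    _, _, answer = full_text.partition("A:")
    answer = answer.split("\nQ:")[0]
    return answer.strip()

print(extract_answer("Q: Om Shanti\nA: Om Shanti, sweet child!\nQ: next"))
# Om Shanti, sweet child!
```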

Use in Production

See the full Colab notebook: murli-distilgpt2-colab.ipynb

Comparison with Other Models

| Model              | Parameters | RAM     | Inference | Colab Free   |
|--------------------|------------|---------|-----------|--------------|
| DistilGPT-2 (this) | 82M        | ~1-2 GB | 0.5-1 s   | βœ… Perfect   |
| Phi-2              | 2.7B       | ~10 GB  | 1-3 s     | ❌ Crashes   |
| Phi-2 (4-bit)      | 2.7B       | ~3-4 GB | 1-3 s     | ⚠️ Tight fit |
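The ~1-2 GB figure can be sanity-checked with back-of-the-envelope math. This is an estimate, not a measurement: it assumes float32 weights, and the remainder of the budget goes to activations, the tokenizer, and the Python/framework runtime:

```python
params = 82_000_000   # DistilGPT-2 parameter count
bytes_per_param = 4   # full-precision float32, no quantization

weights_mb = params * bytes_per_param / 1024**2
print(round(weights_mb))  # 313 -> ~313 MB for the raw weights
# Activations, KV cache, tokenizer, and framework overhead account for
# the rest, which is why total RAM lands in the ~1-2 GB range.
```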

Advantages

  β€’ βœ… Ultra-lightweight: 30x smaller than Phi-2
  β€’ βœ… Low RAM: only 1-2 GB needed
  β€’ βœ… Fast training: 5-10 minutes
  β€’ βœ… Fast inference: sub-second responses
  β€’ βœ… Free Colab: perfect fit, no crashes
  β€’ βœ… Easy deployment: simple integration
  β€’ βœ… Good quality: excellent for basic Q&A

Training Details

  β€’ 30x smaller than Phi-2
  β€’ Fits in free Colab RAM easily
  β€’ Fast training (5-10 min)
  β€’ Fast inference
  β€’ Good for basic Q&A

Example Responses

Q: Om Shanti
A: Om Shanti, sweet child! πŸ™ I'm your Murli Helper. How can I guide you today?

Q: What is soul consciousness?
A: Soul consciousness is experiencing yourself as an eternal, pure soul with peace, love, and purity. Om Shanti πŸ™

Q: Who is Baba?
A: Baba is the Supreme Soul, the Ocean of Knowledge who teaches Raja Yoga through Brahma. Om Shanti πŸ™

Limitations

  • Shorter context (256 tokens vs Phi-2's 512)
  • Simpler responses compared to larger models
  • Best for focused Q&A, not long essays
  • Limited reasoning compared to billion-parameter models

License

MIT License - Free to use and modify

Citation

@misc{murli-distilgpt2-lite,
  author = {eswarankrishnamurthy},
  title = {Murli Assistant - DistilGPT-2 Ultra-Lite},
  year = {2025},
  publisher = {HuggingFace},
  url = {https://huggingface.co/eswarankrishnamurthy/murli-assistant-distilgpt2-lite}
}

Acknowledgments

  • Brahma Kumaris World Spiritual University for murli teachings
  • HuggingFace for model hosting
  • DistilGPT-2 team for the base model

Om Shanti! πŸ™
