---
title: MIK TSE - TikTok Viral Mastery Assistant
sdk: gradio
emoji: 📚
colorFrom: blue
colorTo: green
pinned: false
---
# MIK TSE - TikTok Viral Mastery Assistant
Welcome to MIK TSE, your AI-powered chatbot for TikTok growth, onboarding, and creator support. Built by Michael using Gradio and custom JSON flows, this assistant helps Ethiopian creators navigate viral strategies, FAQs, and community tools.
## 🔥 Features
- 🎯 TikTok growth tips
- 📦 Onboarding flows
- 💬 Telegram support integration
- 🧠 Local slang and brand tone
## 🚀 How to Use
Type your question in the chat box and MIK TSE will respond instantly with branded advice and support.
## 📡 Connect
- Telegram: [@Miktsegrp](https://t.me/MikTse)
- Website: [https://miktse.gumroad.com/l/TikTokViralMastery](https://miktse.gumroad.com/l/TikTokViralMastery)
## πŸ› οΈ Tech Stack
- Gradio
- Python
- Hugging Face Spaces
---
# TikTok Viral Mastery - Question-Answering Chatbot (Hugging Face Space)
A Gradio-based chatbot that answers user questions using your **TikTok Viral Mastery** course content as its knowledge base.
This repo contains:
- `app.py` - the Gradio app and retrieval + generation pipeline
- `knowledge/` - put your course files here (plain `.txt`, `.md`, or `.pdf` converted to text)
- `.gitattributes`
---
## Features
- Simple TF-IDF retrieval over course chunks (a short sketch follows this list).
- Optional LLM generation via Hugging Face Inference API for fluent answers.
- Works offline (returns top-k context excerpts) if you don't provide an API key.
- Upload course files via `/knowledge` (for local usage) or push to the repo.
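
To make the retrieval step concrete, here is a minimal sketch of TF-IDF retrieval over course chunks. The real pipeline lives in `app.py`; the function names, chunk size, and file handling below are illustrative assumptions, not the app's exact code.

```python
# Illustrative sketch of TF-IDF retrieval; names and chunking strategy are assumptions.
from pathlib import Path

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def load_chunks(knowledge_dir="knowledge", chunk_size=800):
    """Read .txt/.md files from knowledge/ and split them into fixed-size character chunks."""
    chunks = []
    for path in Path(knowledge_dir).rglob("*"):
        if path.suffix.lower() in {".txt", ".md"}:
            text = path.read_text(encoding="utf-8", errors="ignore")
            chunks += [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    return chunks


def top_k_chunks(question, chunks, k=3):
    """Rank chunks against the question by TF-IDF cosine similarity and return the best k."""
    vectorizer = TfidfVectorizer(stop_words="english")
    matrix = vectorizer.fit_transform(chunks + [question])
    scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
    return [chunks[i] for i in scores.argsort()[::-1][:k]]
```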
---
## Quick start (recommended for Hugging Face Spaces)
1. Create a new Space (Gradio) on Hugging Face.
2. Add this repo's files.
3. Add your course text files inside a `knowledge/` directory at the repo root (e.g. `knowledge/course.txt`, `knowledge/chapter1.md`).
4. (Optional) In the Space settings, set secrets (see the sketch after this list):
   - `HF_TOKEN` - your Hugging Face API token (required for calling the Inference API)
   - `HF_MODEL` - model ID to use for generation (defaults to `google/flan-t5-large` if not set)
5. Run the Space. The app will index `knowledge/` on startup.
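
The sketch below shows how the app can pick up those secrets and call the Inference API, falling back to plain context excerpts when `HF_TOKEN` is absent. The environment variable names come from step 4; the request and response handling is an assumed, typical Inference API call, not a guaranteed copy of `app.py`.

```python
# Sketch of the optional generation step; env var names from step 4, request details are assumptions.
import os

import requests

HF_TOKEN = os.environ.get("HF_TOKEN")                          # Space secret, may be unset
HF_MODEL = os.environ.get("HF_MODEL", "google/flan-t5-large")  # default model from step 4


def answer(question, context_chunks):
    """Generate a fluent answer from retrieved context, or fall back to raw excerpts offline."""
    if not HF_TOKEN:  # offline mode: no API key, just return the top-k excerpts
        return "\n\n".join(context_chunks)
    context = "\n".join(context_chunks)
    prompt = f"Answer the question using only this context:\n{context}\n\nQuestion: {question}"
    response = requests.post(
        f"https://api-inference.huggingface.co/models/{HF_MODEL}",
        headers={"Authorization": f"Bearer {HF_TOKEN}"},
        json={"inputs": prompt},
        timeout=60,
    )
    response.raise_for_status()
    return response.json()[0]["generated_text"]
```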
---
## Environment & Requirements
Install locally for testing:
```bash
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
python app.py  # runs a local Gradio server
```
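
The repo's `requirements.txt` is the authoritative dependency list; as a rough, assumed baseline, the stack described above (Gradio UI, TF-IDF retrieval, Inference API calls) usually means something like:

```text
gradio
scikit-learn
requests
```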
---
Created by Michael, a student, entrepreneur, and designer empowering creators through automation.