---
title: MIK TSE – TikTok Viral Mastery Assistant
sdk: gradio
emoji: 🚀
colorFrom: blue
colorTo: green
pinned: false
---
# MIK TSE – TikTok Viral Mastery Assistant

Welcome to MIK TSE, your AI-powered chatbot for TikTok growth, onboarding, and creator support. Built by Michael using Gradio and custom JSON flows, this assistant helps Ethiopian creators navigate viral strategies, FAQs, and community tools.
## 🔥 Features

- 🎯 TikTok growth tips
- 📦 Onboarding flows
- 💬 Telegram support integration
- 🧠 Local slang and brand tone
## 🚀 How to Use

Type your question in the chat box – MIK TSE will respond instantly with branded advice and support.
## 💡 Connect

- Telegram: [@Miktsegrp](https://t.me/MikTse)
- Website: [https://miktse.gumroad.com/l/TikTokViralMastery](https://miktse.gumroad.com/l/TikTokViralMastery)
## 🛠️ Tech Stack

- Gradio
- Python
- Hugging Face Spaces

# TikTok Viral Mastery – Question-Answering Chatbot (Hugging Face Space)

A Gradio-based chatbot that answers user questions using your **TikTok Viral Mastery** course content as its knowledge base.

This repo contains:
- `app.py` – the Gradio app and the retrieval + generation pipeline
- `knowledge/` – put your course files here (plain `.txt`, `.md`, or `.pdf` converted to text)
- `.gitattributes`

---
## Features | |
- Simple TF-IDF retrieval over course chunks. | |
- Optional LLM generation via Hugging Face Inference API for fluent answers. | |
- Works offline (returns top-k context excerpts) if you don't provide an API key. | |
- Upload course files via `/knowledge` (for local usage) or push to the repo. | |
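
To make the retrieval step concrete, here is a minimal sketch of indexing `knowledge/` and pulling the top-k chunks with TF-IDF. It is illustrative rather than the actual `app.py`: the chunk size, helper names, and file handling are assumptions.

```python
# Illustrative sketch of the TF-IDF retrieval described above, not the actual app.py.
# Chunk size, helper names, and file handling are assumptions.
from pathlib import Path

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def load_chunks(knowledge_dir="knowledge", chunk_chars=800):
    """Read .txt/.md files and split them into roughly fixed-size chunks."""
    chunks = []
    for path in Path(knowledge_dir).glob("**/*"):
        if path.is_file() and path.suffix.lower() in {".txt", ".md"}:
            text = path.read_text(encoding="utf-8", errors="ignore")
            chunks += [text[i:i + chunk_chars] for i in range(0, len(text), chunk_chars)]
    return chunks


def retrieve(question, chunks, vectorizer, matrix, k=3):
    """Return the k chunks most similar to the question under TF-IDF cosine similarity."""
    query_vec = vectorizer.transform([question])
    scores = cosine_similarity(query_vec, matrix)[0]
    top = scores.argsort()[::-1][:k]
    return [chunks[i] for i in top]


chunks = load_chunks()
vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(chunks)

print(retrieve("How do I pick a trending sound?", chunks, vectorizer, matrix))
```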
---
## Quick start (recommended for Hugging Face Spaces)

1. Create a new Space (Gradio) on Hugging Face.
2. Add this repo's files.
3. Add your course text files inside a `knowledge/` directory at the repo root (e.g. `knowledge/course.txt`, `knowledge/chapter1.md`).
4. (Optional) In the Space settings, set the following secrets (see the generation sketch after this list):
   - `HF_TOKEN` – your Hugging Face API token (required for calling the Inference API)
   - `HF_MODEL` – model ID to use for generation (defaults to `google/flan-t5-large` if not set)
5. Run the Space. The app will index `knowledge/` on startup.
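
The helper below shows one way these secrets could be wired up. It is a sketch, not the repo's `app.py`: the prompt template, the `answer()` helper, and the error handling are assumptions. With no `HF_TOKEN` it falls back to returning the retrieved excerpts, matching the offline mode described above.

```python
# Sketch of how HF_TOKEN / HF_MODEL could drive generation, with the offline
# fallback described in the Features section. Prompt format and helper names are assumptions.
import os

import requests

HF_TOKEN = os.getenv("HF_TOKEN")  # Space secret; optional
HF_MODEL = os.getenv("HF_MODEL", "google/flan-t5-large")


def answer(question, context_chunks):
    context = "\n\n".join(context_chunks)
    if not HF_TOKEN:
        # Offline mode: no API key, so return the top-k excerpts verbatim.
        return "Relevant course excerpts:\n\n" + context

    prompt = f"Answer the question using the course context.\n\nContext:\n{context}\n\nQuestion: {question}"
    resp = requests.post(
        f"https://api-inference.huggingface.co/models/{HF_MODEL}",
        headers={"Authorization": f"Bearer {HF_TOKEN}"},
        json={"inputs": prompt},
        timeout=60,
    )
    resp.raise_for_status()
    data = resp.json()
    # text2text models typically return [{"generated_text": "..."}]
    return data[0].get("generated_text", str(data))
```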
---
## Environment & Requirements

Install locally for testing:
```bash
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
python app.py  # runs a local Gradio server
```
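
For reference, a plausible minimal `requirements.txt` for the stack described above (Gradio for the UI, scikit-learn for TF-IDF retrieval, `requests` for the Inference API call); the file shipped in the repo is authoritative and may differ:

```text
gradio
scikit-learn
requests
```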
---

Created by Michael – student, entrepreneur, and designer empowering creators through automation.