---
title: Llm Bill Chat
emoji: 🥸🧮
colorFrom: indigo
colorTo: red
sdk: streamlit
sdk_version: 1.41.1
app_file: One_model.py
pinned: false
license: apache-2.0
short_description: 'LLM app'
---
# LLM Bill Chat App
This project is a proof of concept for a chat application that uses a Large Language Model (LLM) to help users with telecom billing inquiries. It is built with Python and Streamlit, which provide an interactive web interface for users to chat with the model.
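The core loop looks roughly like the minimal sketch below, which keeps the conversation history in `st.session_state` and forwards it to an LLM on each turn. The `OpenAI` client, model name, and system prompt are assumptions for illustration; the actual app (`One_model.py` / `bill.py`) may be organized differently.

```python
# Minimal sketch of a Streamlit chat loop with an LLM (illustrative only).
# Assumes the `openai` package and an OPENAI_API_KEY in the environment;
# the real app may use a different client, model, or prompt.
import streamlit as st
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Keep the conversation history across Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state.messages = [
        {"role": "system", "content": "Answer only from the user's own billing data."}
    ]

# Replay previous turns so the chat history stays visible.
for msg in st.session_state.messages:
    if msg["role"] != "system":
        with st.chat_message(msg["role"]):
            st.markdown(msg["content"])

if prompt := st.chat_input("Ask about your bill"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=st.session_state.messages,
    )
    answer = response.choices[0].message.content
    st.session_state.messages.append({"role": "assistant", "content": answer})
    with st.chat_message("assistant"):
        st.markdown(answer)
```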
## Features
- Maintain chat conversation context (history)
- Allow users to query their billing information
- Compare the most recent bills and provide insights
- Respond exclusively with the user's own billing information
- Augment the prompt instructions with named entities (NER) recognized in the user's text (see the sketch after this list)
- Save user information and conversation history
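One way the NER augmentation could work is sketched below using spaCy; the library choice, model name, and helper function are assumptions for illustration, not necessarily what this project uses.

```python
# Hypothetical sketch of NER-based prompt augmentation (assumes spaCy and
# the en_core_web_sm model; the project may use a different NER approach).
import spacy

nlp = spacy.load("en_core_web_sm")

def augment_prompt(user_text: str, base_instructions: str) -> str:
    """Append entities recognized in the user's text to the prompt instructions."""
    doc = nlp(user_text)
    entities = [f"{ent.text} ({ent.label_})" for ent in doc.ents]
    if not entities:
        return base_instructions
    return (
        base_instructions
        + "\nEntities mentioned by the user: "
        + ", ".join(entities)
    )

# Example:
# augment_prompt("Why is my March bill 20 EUR higher?",
#                "Answer only from the user's own billing data.")
```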
## Project Structure
```
llm-bill-chat-app
├── src
│   ├── chat
│   │   ├── __init__.py
│   │   ├── context.py
│   │   ├── bill_comparison.py
│   │   ├── user_info.py
│   │   └── conversation.py
│   └── utils
│       └── __init__.py
├── bill.py            # Streamlit app initialization
├── requirements.txt
└── README.md
```
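As a rough idea of what `src/chat/bill_comparison.py` might contain, the sketch below compares the two most recent bills per cost category. The data shape and function name are illustrative assumptions, not the module's actual interface.

```python
# Illustrative sketch of a bill-comparison helper (assumed data shape:
# each bill is a dict mapping cost categories to amounts in the user's currency).
def compare_bills(previous: dict[str, float], current: dict[str, float]) -> str:
    """Summarize per-category differences between the last two bills."""
    lines = []
    for category in sorted(set(previous) | set(current)):
        delta = current.get(category, 0.0) - previous.get(category, 0.0)
        if abs(delta) > 0.005:
            direction = "up" if delta > 0 else "down"
            lines.append(f"{category}: {direction} by {abs(delta):.2f}")
    return "\n".join(lines) or "No significant changes between the last two bills."

# Example:
# compare_bills({"data": 10.0, "voice": 5.0}, {"data": 12.5, "voice": 5.0})
# -> "data: up by 2.50"
```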
## Installation
Clone the repository:
```bash
git clone https://github.com/serbantica/llm-bill-chat.git
cd llm-bill-chat
```
Create and activate a virtual environment (Windows example):
```bash
python -m venv .venv
.venv\Scripts\activate
```
Install the required dependencies:
```bash
pip install -r requirements.txt
```
## Usage
To run the application, execute the following command:
```bash
streamlit run bill.py
```
Open your web browser and navigate to http://localhost:8501 to interact with the chat application.
## Contributing
Contributions are welcome! Please feel free to submit a pull request or open an issue for any suggestions or improvements.
## License
This project is licensed under the Apache 2.0 License. See the LICENSE file for more details.