# 🔧 Audi AI Diagnosis Agent

AI-powered assistant for diagnosing chronic issues in Audi vehicles.
Built with Streamlit, powered by Transformers, and fine-tuned on real-world repair patterns.
## 🚗 What is this?

Audi AI Diagnosis helps identify possible chronic faults in Audi vehicles.
Using a fine-tuned mBART model, the app turns natural-language symptom descriptions into likely diagnoses.
## Features

- Optimized for Audi-specific problem phrases
- Responds in natural, technical language
- Real-time inference with Hugging Face Transformers
## 🚀 Try it out

The app runs as a Hugging Face Space: https://huggingface.co/spaces/MahmutCanBoran/audi-ai-diagnosis
## 🧩 Model

This app uses the publicly available model `MahmutCanBoran/mbart-audi-diagnosis-agent`:

- Architecture: facebook/mbart-large-50
- Task: Text2Text generation (symptom → diagnosis)
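For a quick check outside Streamlit, the model can be loaded with the Transformers `pipeline` API. This is a minimal sketch; the example symptom string is illustrative, not taken from the model card.

```python
from transformers import pipeline

# Text2Text pipeline around the fine-tuned checkpoint
diagnoser = pipeline(
    "text2text-generation",
    model="MahmutCanBoran/mbart-audi-diagnosis-agent",
)

# Illustrative symptom description
symptom = "in my A4 40 TDI there is a rattling noise during acceleration"
result = diagnoser(symptom, max_new_tokens=96)
print(result[0]["generated_text"])
```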
## 📦 Dependencies

Create a `requirements.txt` with:

```text
streamlit>=1.36
transformers>=4.41
torch>=2.2
sentencepiece>=0.2  # Required for mBART tokenization
accelerate>=0.31    # Optional but recommended (for device_map="auto")
```
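After `pip install -r requirements.txt`, a quick sanity check can confirm the stack imports cleanly (a minimal sketch; the printed versions depend on your environment):

```python
# Verify the core dependencies are installed and importable
import sentencepiece  # noqa: F401  (needed by the mBART tokenizer)
import streamlit
import torch
import transformers

print("streamlit", streamlit.__version__)
print("transformers", transformers.__version__)
print("torch", torch.__version__, "| CUDA available:", torch.cuda.is_available())
```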
## 💻 Example app.py

```python
import streamlit as st
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_ID = "MahmutCanBoran/mbart-audi-diagnosis-agent"


@st.cache_resource(show_spinner=True)
def load_model():
    # Cached so the model is downloaded and loaded only once per session
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)
    if torch.cuda.is_available():
        model = model.to("cuda")
    return tokenizer, model


st.set_page_config(page_title="Audi AI Diagnosis", page_icon="🚗")
st.title("🚗 Audi AI Diagnosis Agent")
st.caption("mBART-50 based: symptom → likely diagnosis")

symptom = st.text_area(
    "Enter symptom:",
    height=110,
    placeholder="in my A4 40 TDI there is a rattling noise during acceleration",
)

if st.button("Diagnose", type="primary", use_container_width=True):
    if not symptom.strip():
        st.warning("Please enter a symptom description first.")
        st.stop()
    tokenizer, model = load_model()
    inputs = tokenizer(symptom.strip(), return_tensors="pt")
    if torch.cuda.is_available():
        inputs = {k: v.to("cuda") for k, v in inputs.items()}
    with torch.inference_mode():
        outputs = model.generate(**inputs, max_new_tokens=96)
    st.success("Likely diagnosis")
    st.write(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
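The example above uses greedy decoding. If outputs look terse or repetitive, beam search is a common alternative; the settings below are an illustrative sketch, not values specified by the model card.

```python
# Beam-search variant of the generate() call above (illustrative settings)
outputs = model.generate(
    **inputs,
    max_new_tokens=96,
    num_beams=4,             # explore several candidate diagnoses
    no_repeat_ngram_size=3,  # reduce repeated phrases in the output
    early_stopping=True,
)
```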
## 🔧 Local Setup & Run

```bash
# Clone the repo
git clone https://huggingface.co/spaces/MahmutCanBoran/audi-ai-diagnosis
cd audi-ai-diagnosis

# Install dependencies
pip install -r requirements.txt

# Run the app
streamlit run app.py
```
## Model tree for MahmutCanBoran/mbart-audi-diagnosis-agent

Base model: facebook/mbart-large-50-many-to-many-mmt