🔧 Audi AI Diagnosis Agent

AI-powered assistant for diagnosing chronic issues in Audi vehicles.
Built with Streamlit, powered by Transformers, and fine-tuned on real-world repair patterns.


🚘 What is this?

Audi AI Diagnosis helps identify possible chronic faults in Audi vehicles.
Using a fine-tuned mBART model, the app turns natural language symptom descriptions into likely diagnoses.

Features

  • Optimized for Audi-specific problem phrases
  • Responds in natural, technical language
  • Real-time inference with Hugging Face Transformers

🚀 Try it out

Open in Spaces


🧩 Model

This app uses the publicly available model:

🔗 MahmutCanBoran/mbart-audi-diagnosis-agent

Architecture: facebook/mbart-large-50 (611M parameters, FP32 safetensors)
Task: Text2Text generation (Symptom ➝ Diagnosis)
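
For quick experiments outside the Streamlit app, the model can also be queried through the Transformers pipeline API. This is a minimal sketch: the `build_input` helper and the `max_new_tokens` value are illustrative choices, not part of the published app.

```python
MODEL_ID = "MahmutCanBoran/mbart-audi-diagnosis-agent"

def build_input(symptom: str) -> str:
    # Collapse stray whitespace so the model sees a clean one-line prompt.
    return " ".join(symptom.split())

def diagnose(symptom: str, max_new_tokens: int = 96) -> str:
    # Heavy import kept local so build_input stays usable without
    # Transformers installed; the ~2.4 GB checkpoint downloads on first call.
    from transformers import pipeline
    diagnoser = pipeline("text2text-generation", model=MODEL_ID)
    result = diagnoser(build_input(symptom), max_new_tokens=max_new_tokens)
    return result[0]["generated_text"]
```

A call like `diagnose("rattling noise during acceleration on my A4 40 TDI")` should return the model's most likely diagnosis as a plain string.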


📦 Dependencies

Create a requirements.txt with:

streamlit>=1.36
transformers>=4.41
torch>=2.2
sentencepiece>=0.2  # Required for mBART tokenization
accelerate>=0.31    # Optional but recommended (for device_map="auto")

💻 Example app.py

import streamlit as st
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_ID = "MahmutCanBoran/mbart-audi-diagnosis-agent"

@st.cache_resource(show_spinner=True)
def load_model():
    # Cached across Streamlit reruns, so the weights load only once per process.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)
    if torch.cuda.is_available():
        model = model.to("cuda")
    model.eval()  # inference only; disables dropout
    return tokenizer, model

st.set_page_config(page_title="Audi AI Diagnosis", page_icon="🚘")
st.title("🚘 Audi AI Diagnosis Agent")
st.caption("mBART-50 based: symptom → likely diagnosis")

symptom = st.text_area(
    "Enter symptom:",
    height=110,
    placeholder="in my A4 40 TDI there is a rattling noise during acceleration"
)

if st.button("Diagnose", type="primary", use_container_width=True):
    if not symptom.strip():
        st.warning("Please describe a symptom first.")
        st.stop()
    tokenizer, model = load_model()
    inputs = tokenizer(symptom.strip(), return_tensors="pt")
    if torch.cuda.is_available():
        inputs = {k: v.to("cuda") for k, v in inputs.items()}
    with torch.inference_mode():
        outputs = model.generate(**inputs, max_new_tokens=96)
    st.success("Likely diagnosis")
    st.write(tokenizer.decode(outputs[0], skip_special_tokens=True))
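
The `generate` call above uses greedy decoding. As an assumption worth testing rather than a documented recommendation for this model, beam search with an n-gram repetition block often yields more stable diagnosis text; the values below are starting points, not tuned settings.

```python
# Drop-in alternative for the model.generate(...) call in app.py above.
# All values here are illustrative defaults, not tuned for this model.
GEN_KWARGS = {
    "max_new_tokens": 96,
    "num_beams": 4,             # keep 4 hypotheses instead of greedy decoding
    "no_repeat_ngram_size": 3,  # avoid repeated phrases in the output
    "early_stopping": True,     # stop once all beams have finished
}

def generate_diagnosis(model, tokenizer, inputs) -> str:
    # Same decode step as app.py, with beam search swapped in.
    import torch
    with torch.inference_mode():
        outputs = model.generate(**inputs, **GEN_KWARGS)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```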

🧭 Local Setup & Run

# Clone repo
git clone https://huggingface.co/spaces/MahmutCanBoran/audi-ai-diagnosis
cd audi-ai-diagnosis

# Install dependencies
pip install -r requirements.txt

# Run the app
streamlit run app.py