PierreBrunelle
committed
Commit • 50f5b14
Parent(s): 210dd42
Update app.py
app.py
CHANGED
@@ -140,6 +140,28 @@ with gr.Blocks(theme=Monochrome()) as demo:
         """
     )
 
+    with gr.Row():
+        with gr.Column():
+            with gr.Accordion("What This Demo Does", open = True):
+                gr.Markdown("""
+                This AI Chatbot application uses Retrieval-Augmented Generation (RAG) to provide intelligent responses based on the content of uploaded PDF documents. It allows users to:
+                1. Upload multiple PDF documents
+                2. Process and index the content of these documents
+                3. Ask questions about the content
+                4. Receive AI-generated answers that are grounded in the uploaded documents
+                """)
+        with gr.Column():
+            with gr.Accordion("How does it work?", open = True):
+                gr.Markdown("""
+                **Question Answering:**
+                - When a user asks a question, the system searches for the most relevant chunks of text from the uploaded documents.
+                - It then uses these relevant chunks as context for a large language model (LLM) to generate an answer.
+                - The LLM (in this case, GPT-4) formulates a response based on the provided context and the user's question.
+                **Pixeltable Integration:**
+                - Pixeltable is used to manage the document data, chunks, and embeddings efficiently.
+                - It provides a declarative interface for complex data operations, making it easier to build and maintain this RAG system.
+                """)
+
     with gr.Row():
         with gr.Column(scale=1):
             pdf_files = gr.File(label="Upload PDF Documents", file_count="multiple")
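The "How does it work?" accordion describes the retrieval step only in prose. As a rough sketch of what "searching for the most relevant chunks" can mean, the snippet below embeds document chunks and ranks them by cosine similarity against the question. It is illustrative only: the embedding model name, the in-memory index, and the top_k_chunks helper are assumptions, and the app itself delegates this work to Pixeltable's embedding index rather than to code like this.

# Minimal retrieval sketch, assuming a sentence-transformers model and an
# in-memory chunk list (not the committed app.py implementation).
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

def build_index(chunks: list[str]) -> np.ndarray:
    # Encode every chunk into a normalized embedding vector.
    return model.encode(chunks, normalize_embeddings=True)

def top_k_chunks(question: str, chunks: list[str], index: np.ndarray, k: int = 5) -> list[str]:
    # With normalized vectors, cosine similarity reduces to a dot product.
    q = model.encode([question], normalize_embeddings=True)[0]
    scores = index @ q
    best = np.argsort(scores)[::-1][:k]
    return [chunks[i] for i in best]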
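The second half of that flow, using the retrieved chunks as context for GPT-4, could look roughly like the sketch below. The prompt wording and the answer helper are assumptions for illustration and are not part of the committed app.py.

# Minimal answer-generation sketch using the OpenAI chat completions API.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def answer(question: str, context_chunks: list[str]) -> str:
    # Concatenate the retrieved chunks into a single context block.
    context = "\n\n".join(context_chunks)
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system",
             "content": "Answer using only the provided context. "
                        "If the context is insufficient, say so."},
            {"role": "user",
             "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content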