Update README.md

* **Stay Up-to-Date:** Get the latest news and updates on LangGraph development.
* **Showcase:** Share your Agent Chat UI projects and graphs with other user group members.
## LangGraph Agent Chat UI: Your Gateway to Agent Interaction

The Agent Chat UI, built by the community *for* the community, is a React/Vite application that provides a clean, chat-based interface for interacting with your LangGraph agents. Here's why it's a valuable tool:

* **Easy Connection:** Connect to local or deployed LangGraph agents with a simple URL and graph ID.
* **Intuitive Chat:** Interact naturally with your agents, sending and receiving messages in a familiar chat format.
* **Visualize Agent Actions:** See tool calls and their results rendered directly in the UI.
* **Human-in-the-Loop Made Easy:** Seamlessly integrate human input using LangGraph's `interrupt` feature. The UI handles the presentation and interaction, allowing for approvals, edits, and responses.
* **Explore Execution Paths:** Use the UI to travel through time, inspect checkpoints, and fork conversations, all powered by LangGraph's state management.
* **Debug and Understand:** Inspect the full state of your LangGraph thread at any point.
## Get Started with the Agent Chat UI (and LangGraph!)

You have several options to start using the UI:

### 1. Try the Deployed Version (No Setup Required!)

* **Visit:** [agentchat.vercel.app](https://agentchat.vercel.app/)
* **Connect:** Enter your LangGraph deployment URL and graph ID (the `path` you set with `langserve.add_routes`). If using a production deployment, also include your LangSmith API key.
* **Chat!** You're ready to interact with your agent.

### 2. Run Locally (for Development and Customization)

* **Option A: Clone the Repository:**

  ```bash
  git clone https://github.com/langchain-ai/agent-chat-ui.git
  cd agent-chat-ui
  pnpm install # Or npm install/yarn install
  pnpm dev # Or npm run dev/yarn dev
  ```

* **Option B: Quickstart with `npx`:**

  ```bash
  npx create-agent-chat-app
  cd agent-chat-app
  pnpm install # Or npm install/yarn install
  pnpm dev # Or npm run dev/yarn dev
  ```

Open your browser to `http://localhost:5173` (or the port indicated in your terminal).
## LangGraph Code Examples

To use the Agent Chat UI, you'll need a running LangGraph agent served via LangServe. Below are two Python examples (basic and with human-in-the-loop).

We encourage you to join the LangGraph User Group to connect with fellow developers, ask questions, share your projects, and contribute to the LangGraph ecosystem. Click the "Join Group" button to become a member! We look forward to seeing what you build with LangGraph!
# LangGraph Agent Chat UI

This project provides a simple, intuitive user interface (UI) for interacting with LangGraph agents. It's built with React and Vite, offering a responsive, chat-like experience for testing and demonstrating your LangGraph deployments. It's designed to work seamlessly with LangGraph's core concepts, including checkpoints, thread management, and human-in-the-loop capabilities.

## Features

* **Easy Connection:** Connect to both local and production LangGraph deployments by simply providing the deployment URL and graph ID (the path used when defining the graph).
* **Chat Interface:** Interact with your agents through a familiar chat interface, sending and receiving messages in real time. The UI manages the conversation thread, automatically using checkpoints for persistence.
* **Tool Call Rendering:** The UI automatically renders tool calls and their results, making it easy to visualize the agent's actions. This is compatible with LangGraph's [tool calling and function calling capabilities](https://python.langchain.com/docs/guides/tools/custom_tools).
* **Human-in-the-Loop Support:** Seamlessly integrate human intervention using LangGraph's `interrupt` function. The UI presents a dedicated interface for reviewing, editing, and responding to interrupt requests (e.g., for approval or modification of agent actions), following the standardized schema.
* **Thread History:** View and navigate through past chat threads, enabling you to review previous interactions. This leverages LangGraph's checkpointing for persistent conversation history.
* **Time Travel and Forking:** Leverage LangGraph's powerful state management features, including [checkpointing](https://python.langchain.com/docs/modules/agents/concepts#checkpointing) and thread manipulation. Run the graph from specific checkpoints, explore different execution paths, and edit previous messages.
* **State Inspection:** Examine the current state of your LangGraph thread for debugging and understanding the agent's internal workings. This allows you to inspect the full state object managed by LangGraph.
* **Multiple Deployment Options:**
  * **Deployed Site:** Use the hosted version at [agentchat.vercel.app](https://agentchat.vercel.app/)
  * **Local Development:** Clone the repository and run it locally for development and customization.
  * **Quick Setup:** Use `npx create-agent-chat-app` for a fast, streamlined setup.
* **LangSmith API Key:** When using a production deployment, you must provide a LangSmith API key.
## Getting Started

There are three main ways to use the Agent Chat UI:

### 1. Using the Deployed Site (Easiest)

1. **Navigate:** Go to [agentchat.vercel.app](https://agentchat.vercel.app/).
2. **Enter Details:**
   * **Deployment URL:** The URL of your LangGraph deployment (e.g., `http://localhost:2024` for a local deployment using LangServe, or the URL provided by LangSmith for a production deployment).
   * **Assistant / Graph ID:** The path of the graph you want to interact with (e.g., `chat`, `email_agent`). This is defined when adding routes with `add_routes(..., path="/your_path")`.
   * **LangSmith API Key** (Production Deployments Only): If you are connecting to a deployment hosted on LangSmith, you will need to provide your LangSmith API key for authentication. *This is NOT required for local LangGraph servers.* The key is stored locally in your browser's storage.
3. **Click "Continue":** You'll be taken to the chat interface, ready to interact with your agent.
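The Deployment URL and graph ID together determine the endpoint the UI talks to. As a rough illustration, here is a small helper that joins them into LangServe-style routes (`/{path}/invoke`, `/{path}/stream`); the helper name is our own, not part of the Agent Chat UI:

```python
# Sketch: how a deployment URL and graph ID map to LangServe-style endpoints.
# The helper is illustrative, not part of the Agent Chat UI codebase.

def endpoint_for(deployment_url: str, graph_id: str, action: str = "invoke") -> str:
    """Join a deployment URL and graph ID into a LangServe-style endpoint."""
    base = deployment_url.rstrip("/")   # tolerate a trailing slash
    path = graph_id.strip("/")          # tolerate a leading slash in the graph ID
    return f"{base}/{path}/{action}"

print(endpoint_for("http://localhost:2024", "chat"))
# http://localhost:2024/chat/invoke
```

This is just string plumbing, but it shows why the graph ID must match the `path` you passed to `add_routes`.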
### 2. Local Development (Full Control)

1. **Clone the Repository:**

   ```bash
   git clone https://github.com/langchain-ai/agent-chat-ui.git
   cd agent-chat-ui
   ```

2. **Install Dependencies:**

   ```bash
   pnpm install # Or npm install, or yarn install
   ```

3. **Start the Development Server:**

   ```bash
   pnpm dev # Or npm run dev, or yarn dev
   ```

4. **Open in Browser:** The application will typically be available at `http://localhost:5173` (the port may vary; check your terminal output). Follow the instructions in "Using the Deployed Site" to connect to your LangGraph.
### 3. Quick Setup with `npx create-agent-chat-app`

This method creates a new project directory with the Agent Chat UI already set up.

1. **Run the Command:**

   ```bash
   npx create-agent-chat-app
   ```

2. **Follow Prompts:** You'll be prompted for a project name (the default is `agent-chat-app`).

3. **Navigate to the Project Directory:**

   ```bash
   cd agent-chat-app
   ```

4. **Install and Run:**

   ```bash
   pnpm install # Or npm install, or yarn install
   pnpm dev # Or npm run dev, or yarn dev
   ```

5. **Open in Browser:** The application will be available at `http://localhost:5173`. Follow the instructions in "Using the Deployed Site" to connect.
## LangGraph Setup (Prerequisites)

Before using the Agent Chat UI, you need a running LangGraph agent served via LangServe. Below are examples using both a simple agent and an agent with human-in-the-loop.

### Basic LangGraph Example (Python)
```python
# agent.py (Example LangGraph agent - Python)
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables import chain
from langchain_openai import ChatOpenAI
from langchain_core.messages import AIMessage, HumanMessage
from langgraph.prebuilt import create_agent_executor
from langchain_core.tools import tool

# FastAPI and LangServe for serving the graph
from fastapi import FastAPI
from langserve import add_routes


@tool
def get_weather(city: str):
    """Gets the weather for a specified city."""
    if city.lower() == "new york":
        return "The weather in New York is nice today with a high of 75F."
    else:
        return "The weather for that city is not supported"


# Define the tools
tools = [get_weather]

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant"),
        MessagesPlaceholder(variable_name="messages"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)

model = ChatOpenAI(temperature=0).bind_tools(tools)


@chain
def transform_messages(data):
    messages = data["messages"]
    if not isinstance(messages[-1], HumanMessage):
        messages.append(
            AIMessage(
                content="I don't know how to respond to messages other than a final answer"
            )
        )
    return {"messages": messages}


agent = (
    {
        "messages": transform_messages,
        "agent_scratchpad": lambda x: [],  # No prior tool steps to replay in this simple example
    }
    | prompt
    | model
)

# Wrap the agent in an executable graph
app = create_agent_executor(agent, tools)

# Serve the graph using FastAPI and LangServe
fastapi_app = FastAPI(
    title="LangGraph Agent",
    version="1.0",
    description="A simple LangGraph agent server",
)

# Mount LangServe at the /chat endpoint
add_routes(
    fastapi_app,
    app,
    path="/chat",  # Matches the graph ID we'll use in the UI
)

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(fastapi_app, host="localhost", port=2024)
```
To run this example:

1. Save the code as `agent.py`.
2. Install the necessary packages: `pip install langchain langchain-core langchain-openai langgraph fastapi uvicorn "langserve[all]"` (plus any other packages your tools need).
3. Set your OpenAI API key: `export OPENAI_API_KEY="your-openai-api-key"`
4. Run the script: `python agent.py`
5. Your LangGraph agent will be running at `http://localhost:2024/chat`, and the graph ID to enter in the UI is `chat`.
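If you want to sanity-check the running server outside the UI, LangServe's `/invoke` route accepts a JSON body with an `input` key. A hedged sketch of building that body follows; the message dicts mirror the common `{"type": ..., "content": ...}` serialization, but treat the exact shape as an assumption and consult your server's auto-generated `/docs` page for the authoritative schema:

```python
import json

# Sketch: a JSON body for a LangServe /invoke call against the /chat route.
# The message serialization shown here is an assumption, not a guarantee.

def build_invoke_payload(user_text: str) -> str:
    """Build the request body for POST http://localhost:2024/chat/invoke."""
    payload = {
        "input": {
            "messages": [
                {"type": "human", "content": user_text},
            ]
        }
    }
    return json.dumps(payload)

print(build_invoke_payload("What's the weather in New York?"))
```

You could POST this body with `curl` or `requests` to confirm the agent responds before connecting the UI.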
### LangGraph with Human-in-the-Loop Example (Python)

```python
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables import chain
from langchain_openai import ChatOpenAI
from langchain_core.messages import AIMessage, HumanMessage
from langgraph.prebuilt import create_agent_executor, ToolInvocation, interrupt
from langchain_core.tools import tool
from fastapi import FastAPI
from langserve import add_routes


@tool
def write_email(subject: str, body: str, to: str):
    """Drafts an email with a specified subject, body, and recipient."""
    print(f"Writing email with subject '{subject}' to '{to}'")  # Debugging
    return f"Draft email to {to} with subject {subject} sent."


tools = [write_email]

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant that drafts emails."),
        MessagesPlaceholder(variable_name="messages"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)

model = ChatOpenAI(temperature=0, model="gpt-4-turbo-preview").bind_tools(tools)


@chain
def transform_messages(data):
    messages = data["messages"]
    if not isinstance(messages[-1], HumanMessage):
        messages.append(
            AIMessage(
                content="I don't know how to respond to messages other than a final answer"
            )
        )
    return {"messages": messages}


def handle_interrupt(state):
    """Handles human-in-the-loop interruptions."""
    print("---INTERRUPT---")  # Debugging
    messages = state["messages"]
    last_message = messages[-1]

    if isinstance(last_message, AIMessage) and isinstance(last_message.content, list):
        # Find the tool call
        for msg in last_message.content:
            if isinstance(msg, ToolInvocation):
                tool_name = msg.name
                tool_args = msg.args
                if tool_name == "write_email":
                    # Construct the human interrupt request
                    interrupt_data = {
                        "type": "interrupt",
                        "args": {
                            "type": "response",
                            "studio": {  # optional
                                "subject": tool_args["subject"],
                                "body": tool_args["body"],
                                "to": tool_args["to"],
                            },
                            "description": "Response Instruction: \n\n- **Response**: Any response submitted will be passed to an LLM to rewrite the email. It can rewrite the email body, subject, or recipient.\n\n- **Edit or Accept**: Editing/Accepting the email.",
                        },
                    }
                    # Call the interrupt function and return the new state
                    return interrupt(messages, interrupt_data)
    return {"messages": messages}


agent = (
    {
        "messages": transform_messages,
        "agent_scratchpad": lambda x: x.get("agent_scratchpad", []),
    }
    | prompt
    | model
    | handle_interrupt  # Add the interrupt handler
)

# Wrap the agent in an executable graph
app = create_agent_executor(agent, tools)

# Serve the graph using FastAPI and LangServe
fastapi_app = FastAPI(
    title="LangGraph Agent",
    version="1.0",
    description="A simple LangGraph agent server",
)

# Mount LangServe at the /email_agent endpoint
add_routes(
    fastapi_app,
    app,
    path="/email_agent",  # Matches the graph ID we'll use in the UI
)

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(fastapi_app, host="localhost", port=2024)
```
To run this example:

1. Save the code as `agent.py`.
2. Install the necessary packages: `pip install langchain langchain-core langchain-openai langgraph fastapi uvicorn "langserve[all]"` (plus any other packages your tools need).
3. Set your OpenAI API key: `export OPENAI_API_KEY="your-openai-api-key"`
4. Run the script: `python agent.py`
5. Your LangGraph agent will be running at `http://localhost:2024/email_agent`, and the graph ID to enter in the UI is `email_agent`.
## Key Concepts (LangGraph Integration)

* **Messages Key:** The Agent Chat UI expects your LangGraph state to include a `messages` key, which holds a list of `langchain_core.messages.BaseMessage` instances (e.g., `HumanMessage`, `AIMessage`, `SystemMessage`, `ToolMessage`). This is standard practice in LangChain and LangGraph for conversational agents.
* **Checkpoints:** The UI automatically utilizes LangGraph's checkpointing mechanism to save and restore the conversation state. This ensures that you can resume conversations and explore different branches without losing progress.
* **`add_routes` and `path`:** The `path` argument in `add_routes` (from `langserve`) determines the "Graph ID" that you'll enter in the UI. This is crucial for the UI to connect to the correct LangGraph endpoint.
* **Tool Calling:** If you use `bind_tools` with your LLM, tool calls and tool results will be rendered in the UI, with clear labels showing the function call and the response.
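The checkpointing and forking behavior these concepts describe can be pictured with a toy model. This is plain Python with no LangGraph APIs; `ToyThread` and its methods are purely illustrative, and LangGraph's real checkpointer interface differs:

```python
from copy import deepcopy

# Toy model of checkpointing and forking. Illustrative only: it shows the
# idea the UI builds on, not LangGraph's actual checkpointer API.

class ToyThread:
    def __init__(self):
        self.checkpoints = []  # each checkpoint is a snapshot of the messages list

    def append(self, message: str):
        """Add a message and record a new checkpoint of the full history."""
        history = self.checkpoints[-1] + [message] if self.checkpoints else [message]
        self.checkpoints.append(history)

    def fork(self, checkpoint_index: int) -> "ToyThread":
        """Start a new thread from an earlier checkpoint (time travel)."""
        forked = ToyThread()
        forked.checkpoints = deepcopy(self.checkpoints[: checkpoint_index + 1])
        return forked

thread = ToyThread()
thread.append("user: hi")
thread.append("ai: hello!")
thread.append("user: draft an email")

fork = thread.fork(1)            # rewind to just after "ai: hello!"
fork.append("user: tell a joke")

print(thread.checkpoints[-1])    # original path keeps the email request
print(fork.checkpoints[-1])      # forked path diverges
```

Forking copies the history up to a checkpoint, so the two threads can evolve independently, which is exactly what the UI's "edit a previous message and re-run" flow relies on.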
## Human-in-the-Loop Details

The Agent Chat UI supports human-in-the-loop interactions using the standard LangGraph interrupt schema. Here's how it works:

1. **Interrupt Schema:** Your LangGraph agent should call the `interrupt` function (from `langgraph.prebuilt`) with a specific schema to pause execution and request human input. The schema should include:
   * `type`: `interrupt`.
   * `args`: A dictionary containing information about the interruption. This is where you provide the data the human needs to review (e.g., a draft email, a proposed action).
     * `type`: One of `"response"`, `"accept"`, or `"ignore"`. This indicates the type of human interaction expected.
     * `args`: Further arguments specific to the interrupt type. For instance, if the interrupt type is `response`, the `args` could contain a message to show the user.
     * `studio`: *Optional.* If included, this must contain `subject`, `body`, and `to` keys for interrupt requests.
     * `description`: *Optional.* A static prompt to the user that describes the fields the human needs to complete.
   * `name` (optional): A name for the interrupt.
   * `id` (optional): A unique identifier for the interrupt.

2. **UI Rendering:** When the Agent Chat UI detects an interrupt with this schema, it automatically renders a user-friendly interface for human interaction. This interface allows the user to:
   * **Inspect:** View the data provided in the `args` of the interrupt (e.g., the content of a draft email).
   * **Edit:** Modify the data (if the interrupt schema allows for it).
   * **Respond:** Provide a response (if the interrupt type is `"response"`).
   * **Accept/Reject:** Approve or reject the proposed action (if the interrupt type is `"accept"`).
   * **Ignore:** Ignore the interrupt (if the interrupt type is `"ignore"`).

3. **Resuming Execution:** After the human interacts with the interrupt, the UI sends the response back to the LangGraph via LangServe, and execution resumes.
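Putting the schema above together, an interrupt payload is just a nested dict. The sketch below shows one such payload plus a minimal validity check; the checker function is our own illustration, not part of the Agent Chat UI:

```python
# Sketch: the interrupt payload shape described above, plus a minimal check.
# is_valid_interrupt is illustrative, not part of the Agent Chat UI.

ALLOWED_TYPES = {"response", "accept", "ignore"}

def is_valid_interrupt(payload: dict) -> bool:
    """Check the minimal structure the schema above requires."""
    if payload.get("type") != "interrupt":
        return False
    args = payload.get("args")
    if not isinstance(args, dict):
        return False
    return args.get("type") in ALLOWED_TYPES

example = {
    "type": "interrupt",
    "args": {
        "type": "response",
        "description": "Review the draft email before it is sent.",
    },
}

print(is_valid_interrupt(example))  # True
```

A payload whose inner `type` is not one of `"response"`, `"accept"`, or `"ignore"` would be rejected by this check, mirroring the interaction modes the UI can render.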
## Contributing

Contributions are welcome! Please see the [GitHub repository](https://github.com/langchain-ai/agent-chat-ui) for issues and pull requests.