
AI Chat

A composable AI chat UI component for the Apollo Design System. Built with React, TypeScript, and Tailwind CSS. Designed to work with TanStack AI: you bring useChat and a connection adapter; the component handles the chrome (scroll, input, loading, suggestions, errors) while you control how messages and tool calls render.

Features

  • TanStack AI Integration — Works with useChat from @tanstack/ai-react and UIMessage types
  • Composable — AiChat is the shell, AiChatMessage renders messages, you iterate parts and render tools inline
  • Type-Safe Tool Rendering — Check part.name in the parts loop and TypeScript narrows part.output automatically
  • AgentHub Adapter — Built-in adapter for the UiPath AgentHub normalized LLM endpoint (OpenAI + Anthropic models)
  • Markdown Rendering — Renders assistant responses with GitHub Flavored Markdown
  • Suggestion Buttons — Interactive choice buttons rendered from tool results
  • Error Display — Inline error banner for API and network errors
  • i18n Support — Built-in internationalization via react-i18next
  • Accessible — WCAG 2.1 compliant with keyboard navigation and ARIA live regions

Installation

npx shadcn@latest add @uipath/ai-chat

Install peer dependencies:

npm install @tanstack/ai @tanstack/ai-client @tanstack/ai-react eventsource-parser zod lucide-react react-i18next react-markdown remark-gfm

Quick Start

```tsx
import { useChat } from '@tanstack/ai-react';
import { AiChat } from '@/components/ui/ai-chat/components/ai-chat';
import { AiChatMessage } from '@/components/ui/ai-chat/components/ai-chat-message';
import { createAgentHubConnection } from '@/components/ui/ai-chat/adapters/agenthub/adapter';

function BasicChat() {
  const connection = createAgentHubConnection({
    baseUrl: 'https://cloud.uipath.com/{org}/{tenant}/agenthub_/llm/api',
    model: { vendor: 'openai' as const, name: 'gpt-4o' },
    accessToken: () => getAccessToken(),
    systemPrompt: 'You are a helpful assistant.',
  });

  const { messages, sendMessage, isLoading, stop, clear, error } = useChat({
    connection,
  });

  return (
    <AiChat
      messages={messages}
      isLoading={isLoading}
      onSendMessage={(text) => sendMessage(text)}
      onStop={stop}
      onClearChat={clear}
      error={error}
      title="AI Assistant"
    >
      {messages.map((message) => (
        <AiChatMessage key={message.id} message={message} />
      ))}
    </AiChat>
  );
}
```

Tool Rendering

Render tool output inline in the chat — just like TanStack AI’s own examples. Define tools with toolDefinition, pass the input through as output in your client tool, then check part.name in the parts loop. TypeScript narrows part.output automatically.

```tsx
import { z } from 'zod';
import { toolDefinition } from '@tanstack/ai';
import { clientTools } from '@tanstack/ai-client';
import { useChat } from '@tanstack/ai-react';
import { AiChat } from '@/components/ui/ai-chat/components/ai-chat';
import { AiChatMessage } from '@/components/ui/ai-chat/components/ai-chat-message';

// 1. Define tools — output passes input through for rendering
const showResultsInput = z.object({
  entityName: z.string(),
  columns: z.array(z.string()),
});

const showResultsDef = toolDefinition({
  name: 'show_results',
  description: 'Display a results table',
  inputSchema: showResultsInput,
  outputSchema: showResultsInput,
});

const showResults = showResultsDef.client((input) => input);
const toolDefs = clientTools(showResults);

// 2. Wire it up — iterate parts, render tools inline
function ChatWithTools() {
  const { messages, sendMessage, isLoading, stop } = useChat({
    connection, // created with createAgentHubConnection (see Quick Start)
    tools: toolDefs,
  });

  return (
    <AiChat
      messages={messages}
      isLoading={isLoading}
      onSendMessage={(text) => sendMessage(text)}
      onStop={stop}
    >
      {messages.map((message) => (
        <AiChatMessage key={message.id} message={message}>
          {message.parts.map((part) => {
            // TypeScript narrows part.output when you check part.name
            if (part.type === 'tool-call' && part.name === 'show_results' && part.output) {
              return (
                <ResultsTable
                  key={part.id}
                  entity={part.output.entityName}
                  columns={part.output.columns}
                />
              );
            }
            return null;
          })}
        </AiChatMessage>
      ))}
    </AiChat>
  );
}
```

AgentHub Adapter

The built-in adapter for the UiPath AgentHub normalized LLM endpoint. It converts TanStack AI UIMessage arrays to the AgentHub wire format, calls the endpoint, and parses the SSE response back into AG-UI StreamChunk events.

```ts
import {
  createAgentHubConnection,
  type AgentHubAdapterConfig,
} from '@/components/ui/ai-chat/adapters/agenthub/adapter';

const connection = createAgentHubConnection({
  baseUrl: 'https://cloud.uipath.com/{org}/{tenant}/agenthub_/llm/api',
  model: { vendor: 'openai', name: 'gpt-4o' },
  accessToken: () => getAccessToken(),
  systemPrompt: 'You are a helpful assistant.',
  maxTokens: 2048,
  temperature: 0.7,
  tools: toolDefs,
});
```
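The SSE side of this can be sketched without the adapter. An OpenAI-compatible stream is a series of `data:` lines, each carrying a JSON chunk with a content delta, terminated by a `[DONE]` sentinel. The parser below is illustrative only — the real adapter uses eventsource-parser and emits AG-UI StreamChunk events rather than plain strings — but it shows the wire format being consumed:

```typescript
// Illustrative only: minimal extraction of text deltas from an
// OpenAI-compatible SSE body. The real adapter uses eventsource-parser
// and converts chunks into AG-UI StreamChunk events.
function extractDeltas(sse: string): string[] {
  const deltas: string[] = [];
  for (const line of sse.split('\n')) {
    if (!line.startsWith('data:')) continue; // only SSE payload lines
    const payload = line.slice(5).trim();
    if (payload === '[DONE]') break; // end-of-stream sentinel
    const json = JSON.parse(payload);
    const text = json.choices?.[0]?.delta?.content; // OpenAI chunk shape
    if (typeof text === 'string') deltas.push(text);
  }
  return deltas;
}

const sample = [
  'data: {"choices":[{"delta":{"content":"Hel"}}]}',
  'data: {"choices":[{"delta":{"content":"lo"}}]}',
  'data: [DONE]',
].join('\n');

console.log(extractDeltas(sample).join('')); // "Hello"
```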

The model.vendor field controls wire-format differences:

  • "openai" — flat tool definitions ({ name, description, parameters })
  • "anthropic" — Anthropic tool format ({ type: "custom", input_schema }), non-empty assistant content on tool-call messages
  • The X-UiPath-LlmGateway-NormalizedApi-ModelName header is always sent for routing
  • Responses are always OpenAI-compatible SSE regardless of the underlying model
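The two tool-definition shapes described above look roughly like the object literals below. These are illustrative sketches assembled from the bullet points, not the adapter's exact wire output — the adapter derives them for you from your client tools:

```typescript
// Illustrative shapes only, following the bullets above — the adapter
// derives these automatically from ReadonlyArray<AnyClientTool>.
const jsonSchema = {
  type: 'object',
  properties: { entityName: { type: 'string' } },
  required: ['entityName'],
};

// vendor: "openai" — flat tool definition
const openaiTool = {
  name: 'show_results',
  description: 'Display a results table',
  parameters: jsonSchema,
};

// vendor: "anthropic" — custom tool with input_schema
const anthropicTool = {
  type: 'custom',
  name: 'show_results',
  description: 'Display a results table',
  input_schema: jsonSchema,
};
```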

Suggestion Buttons

Return a choices object as the tool result content to render interactive suggestion buttons. Buttons disappear after the user sends another message.

The choices format:

```json
{
  "type": "choices",
  "prompt": "How would you like to proceed?",
  "options": [
    { "id": "approve", "label": "Approve Document", "recommended": true },
    { "id": "reject", "label": "Reject Document" }
  ]
}
```
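In practice a client tool produces this object as its output. The sketch below is hypothetical — the `askApproval` handler is invented for illustration, and the two interfaces are local stand-ins for the `ChoiceOption` / `ToolResultChoices` types exported from `@/components/ui/ai-chat/types`:

```typescript
// Local stand-ins for the types exported from
// '@/components/ui/ai-chat/types' (shapes inferred from the JSON above).
interface ChoiceOption {
  id: string;
  label: string;
  recommended?: boolean;
}

interface ToolResultChoices {
  type: 'choices';
  prompt: string;
  options: ChoiceOption[];
}

// Hypothetical client tool handler: returning this object as the tool
// result renders suggestion buttons in the chat.
function askApproval(): ToolResultChoices {
  return {
    type: 'choices',
    prompt: 'How would you like to proceed?',
    options: [
      { id: 'approve', label: 'Approve Document', recommended: true },
      { id: 'reject', label: 'Reject Document' },
    ],
  };
}

console.log(askApproval().options[0].label); // "Approve Document"
```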

Handle selection explicitly or let the default behavior send option.label as a message:

```tsx
<AiChat
  messages={messages}
  isLoading={isLoading}
  onSendMessage={(text) => sendMessage(text)}
  onStop={stop}
  onChoiceSelect={(option) => {
    sendMessage(option.value ?? option.label);
  }}
>
  {messages.map((message) => (
    <AiChatMessage key={message.id} message={message} />
  ))}
</AiChat>
```

Error Display

Pass an Error object to show an inline error banner:

```tsx
<AiChat
  messages={messages}
  isLoading={isLoading}
  onSendMessage={(text) => sendMessage(text)}
  onStop={stop}
  error={error}
>
  {messages.map((message) => (
    <AiChatMessage key={message.id} message={message} />
  ))}
</AiChat>
```

API Reference

<AiChat>

Chat shell component. Handles layout, scroll, input, loading indicator, suggestions, and errors. Render messages as children.

| Prop | Type | Default | Description |
| --- | --- | --- | --- |
| `messages` | `UIMessage[]` | required | Messages from `useChat` |
| `isLoading` | `boolean` | required | Loading state from `useChat` |
| `onSendMessage` | `(content: string) => void` | required | Send handler |
| `onStop` | `() => void` | required | Stop/abort handler |
| `children` | `ReactNode` | — | Message list (typically `messages.map(...)`) |
| `onClearChat` | `() => void` | — | Clear handler |
| `onChoiceSelect` | `(option: ChoiceOption) => void` | — | Suggestion button handler (default: sends `option.label`) |
| `assistantName` | `string` | `"AI Assistant"` | Assistant display name |
| `title` | `string` | — | Chat title in the header |
| `emptyState` | `ReactNode` | — | Custom empty state |
| `placeholder` | `string` | — | Input placeholder |
| `showClearButton` | `boolean` | `true` | Show the clear button |
| `error` | `Error \| null` | — | Inline error banner |

<AiChatMessage>

Renders a single message with avatar, name, markdown text, and children for custom content (tool output).

| Prop | Type | Default | Description |
| --- | --- | --- | --- |
| `message` | `UIMessage` | required | The message to render |
| `assistantName` | `string` | `"AI Assistant"` | Assistant display name |
| `children` | `ReactNode` | — | Custom content rendered below the message text (tool output, etc.) |

AgentHubAdapterConfig

Configuration for the AgentHub adapter.

| Property | Type | Default | Description |
| --- | --- | --- | --- |
| `baseUrl` | `string` | required | AgentHub base URL (`/chat/completions` is appended) |
| `model` | `{ vendor: 'openai' \| 'anthropic'; name: string }` | required | Model config |
| `accessToken` | `string \| (() => string \| null)` | required | Bearer token (refreshed per request if a function) |
| `systemPrompt` | `string \| (() => string)` | — | System prompt prepended to messages (function form is called per request) |
| `maxTokens` | `number` | `2048` | Max response tokens |
| `temperature` | `number` | `0.7` | Sampling temperature |
| `tools` | `ReadonlyArray<AnyClientTool>` | — | Client tools — wire-format definitions are derived automatically |
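The function forms of `accessToken` and `systemPrompt` exist so values can be re-read on every request. The sketch below shows how such per-request resolution plausibly works — an assumption consistent with the table above, not the adapter's actual implementation:

```typescript
// Assumed behavior, not the adapter's source: string values are used
// as-is, function values are re-invoked on each request.
type MaybeFn<T> = T | (() => T);

function resolve<T>(value: MaybeFn<T>): T {
  return typeof value === 'function' ? (value as () => T)() : value;
}

let tokenVersion = 0;
const config = {
  accessToken: () => `token-${++tokenVersion}`, // refreshed per request
  systemPrompt: 'You are a helpful assistant.', // static string
};

console.log(resolve(config.accessToken)); // "token-1"
console.log(resolve(config.accessToken)); // "token-2" — re-invoked
console.log(resolve(config.systemPrompt)); // "You are a helpful assistant."
```

Passing a function for `accessToken` is the safer default whenever tokens can expire mid-session; a plain string is captured once at connection creation.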

TypeScript

```ts
// TanStack AI types (messages, parts)
import type { UIMessage } from '@tanstack/ai-client';
import type { ToolCallPart, TextPart, ToolResultPart } from '@tanstack/ai-client';

// Choice types (app-specific, not from TanStack AI)
import type { ChoiceOption, ToolResultChoices } from '@/components/ui/ai-chat/types';

// AgentHub adapter
import {
  createAgentHubConnection,
  type AgentHubAdapterConfig,
  type AgentHubVendor,
} from '@/components/ui/ai-chat/adapters/agenthub/adapter';

// Standalone markdown renderer
import { AiChatMarkdown } from '@/components/ui/ai-chat/components/ai-chat-markdown';
```