Building an AI Chat Interface
The useChat hook from the Vercel AI SDK handles all the complexity of building a chat interface — message state, streaming, error handling, and API communication. In this lesson, you will build a complete chat UI from scratch.
The useChat Hook
The useChat hook is the foundation of AI chat interfaces in Next.js. It manages the entire conversation lifecycle:
TypeScript - app/page.tsx

```tsx
'use client';

import { useChat } from 'ai/react';

export default function ChatPage() {
  const { messages, input, handleInputChange, handleSubmit, isLoading } =
    useChat();

  return (
    <div className="chat-container">
      <div className="messages">
        {messages.map((m) => (
          <div key={m.id} className={`message ${m.role}`}>
            <strong>{m.role === 'user' ? 'You' : 'AI'}:</strong>
            <p>{m.content}</p>
          </div>
        ))}
      </div>
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Type a message..."
          disabled={isLoading}
        />
        <button type="submit" disabled={isLoading}>
          Send
        </button>
      </form>
    </div>
  );
}
```
How It Works: The useChat hook automatically sends messages to /api/chat (your route handler), receives the streaming response, and updates the messages array in real time. You do not need to manage WebSockets or polling.
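Under the hood, useChat serializes the conversation into a JSON body and POSTs it to the endpoint. A minimal sketch of that shape, which is what your route handler later destructures with `await req.json()` (the exact wire format is an internal SDK detail; `buildChatRequestBody` is a hypothetical helper for illustration only):

```typescript
// Shape of a message as useChat tracks it (simplified).
type ChatMessage = { id: string; role: 'user' | 'assistant'; content: string };

// Hypothetical helper: build the JSON body the hook POSTs to /api/chat.
function buildChatRequestBody(messages: ChatMessage[]): string {
  return JSON.stringify({ messages });
}

const body = buildChatRequestBody([
  { id: '1', role: 'user', content: 'Hello!' },
]);
console.log(body);
// {"messages":[{"id":"1","role":"user","content":"Hello!"}]}
```

Every request carries the full message history, which is why the model can answer with conversational context even though HTTP itself is stateless.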
The Route Handler
The route handler receives messages and returns a streaming response:
TypeScript - app/api/chat/route.ts

```ts
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: openai('gpt-4o'),
    system: 'You are a helpful assistant.',
    messages,
  });

  return result.toDataStreamResponse();
}
```
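The handler trusts whatever `req.json()` returns; in practice you may want to reject malformed bodies before handing them to the model. A minimal validation sketch using only standard TypeScript (`isChatBody` is a hypothetical type guard, not part of the AI SDK):

```typescript
// Minimal shape we expect from the client.
type ChatMessage = { role: string; content: string };

// Hypothetical type guard: returns true only if `body` looks like
// { messages: [{ role, content }, ...] }.
function isChatBody(body: unknown): body is { messages: ChatMessage[] } {
  if (typeof body !== 'object' || body === null) return false;
  const messages = (body as { messages?: unknown }).messages;
  return (
    Array.isArray(messages) &&
    messages.every(
      (m) =>
        typeof m === 'object' &&
        m !== null &&
        typeof (m as ChatMessage).role === 'string' &&
        typeof (m as ChatMessage).content === 'string',
    )
  );
}
```

In the route handler you could then return a 400 response when `isChatBody(body)` is false instead of passing bad input to streamText.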
Customizing useChat
The useChat hook accepts many configuration options:
TypeScript

```ts
const {
  messages,
  input,
  handleInputChange,
  handleSubmit,
  error,
  reload,
  stop,
} = useChat({
  api: '/api/custom-chat',   // Custom endpoint
  initialMessages: [],       // Pre-populate messages
  body: { model: 'gpt-4o' }, // Extra data sent with each request
  onFinish: (message) => {
    // Called when the response completes
    console.log('Completed:', message);
  },
  onError: (error) => {
    // Handle errors
    console.error('Chat error:', error);
  },
});
```
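onFinish is a natural place to capture the completed assistant message outside React state, for example to build a transcript you later save to a database. A minimal sketch (`appendToTranscript` is a hypothetical helper, not an SDK API):

```typescript
// Simplified message shape for the transcript.
type ChatMessage = { role: 'user' | 'assistant'; content: string };

// Hypothetical helper: append a finished message without mutating
// the existing transcript, so callers can treat it as immutable.
function appendToTranscript(
  transcript: ChatMessage[],
  message: ChatMessage,
): ChatMessage[] {
  return [...transcript, message];
}

// Sketch of wiring it into useChat:
// let transcript: ChatMessage[] = [];
// useChat({
//   onFinish: (message) => {
//     transcript = appendToTranscript(transcript, message);
//   },
// });
```

Returning a new array keeps the helper safe to call from React code, where mutating shared state in a callback can cause stale or inconsistent renders.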
Adding Message Features
Auto-Scroll to Latest Message
Use a ref to automatically scroll to the latest message as it streams in:
TypeScript

```tsx
import { useRef, useEffect } from 'react';

// Inside your component:
const messagesEndRef = useRef<HTMLDivElement>(null);

useEffect(() => {
  messagesEndRef.current?.scrollIntoView({ behavior: 'smooth' });
}, [messages]);

// Add at the bottom of your messages container:
<div ref={messagesEndRef} />
```
Stop and Regenerate
The hook provides stop and reload functions for user control:
TypeScript

```tsx
// Stop the current stream
<button onClick={stop}>Stop Generating</button>

// Regenerate the last response
<button onClick={() => reload()}>Regenerate</button>
```
Chat Interface Complete!
You now have a fully functional AI chat interface. In the next lesson, you will learn to use Server Actions for server-side AI operations.
Lilly Tech Systems