Advanced LangChain.js
Build sophisticated LLM applications with chains, agents, retrieval-augmented generation (RAG), memory, and the Vercel AI SDK.
Getting Started
Terminal
npm install langchain @langchain/openai @langchain/anthropic
Basic Chat
TypeScript
import { ChatOpenAI } from '@langchain/openai';
import { HumanMessage, SystemMessage } from '@langchain/core/messages';

const model = new ChatOpenAI({ modelName: 'gpt-4o', temperature: 0.7 });

const response = await model.invoke([
  new SystemMessage('You are a helpful data science tutor.'),
  new HumanMessage('Explain gradient descent in simple terms.'),
]);
console.log(response.content);
Chains and Prompt Templates
TypeScript
import { ChatPromptTemplate } from '@langchain/core/prompts';
import { StringOutputParser } from '@langchain/core/output_parsers';

const prompt = ChatPromptTemplate.fromMessages([
  ['system', 'You are an expert in {topic}. Be concise.'],
  ['human', '{question}'],
]);

// LCEL: LangChain Expression Language
const chain = prompt.pipe(model).pipe(new StringOutputParser());

const result = await chain.invoke({
  topic: 'machine learning',
  question: 'What is overfitting?',
});
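Conceptually, the template step is just named-variable interpolation over the message strings. A minimal pure-TypeScript sketch of the idea (this is an illustration, not LangChain's actual implementation, and it ignores escaping and partial-variable edge cases):

```typescript
// Stand-in for what a prompt template does with {variables}:
// replace each {name} placeholder with the matching input value,
// leaving unknown placeholders untouched.
function formatTemplate(template: string, values: Record<string, string>): string {
  return template.replace(/\{(\w+)\}/g, (match, name) =>
    name in values ? values[name] : match
  );
}

const system = formatTemplate('You are an expert in {topic}. Be concise.', {
  topic: 'machine learning',
});
console.log(system); // "You are an expert in machine learning. Be concise."
```

The real `ChatPromptTemplate` additionally validates input variables and produces typed message objects rather than raw strings, but the substitution step works the same way.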
RAG (Retrieval-Augmented Generation)
TypeScript
import { MemoryVectorStore } from 'langchain/vectorstores/memory';
import { OpenAIEmbeddings } from '@langchain/openai';
import { RecursiveCharacterTextSplitter } from 'langchain/text_splitter';

// 1. Split documents into overlapping chunks
const splitter = new RecursiveCharacterTextSplitter({
  chunkSize: 1000,
  chunkOverlap: 200,
});
const docs = await splitter.splitDocuments(documents);

// 2. Embed the chunks and index them in a vector store
const vectorStore = await MemoryVectorStore.fromDocuments(
  docs,
  new OpenAIEmbeddings()
);

// 3. Retrieve the chunks most relevant to the question
const retriever = vectorStore.asRetriever({ k: 3 });
const question = 'How does attention work?';
const relevantDocs = await retriever.invoke(question);

// 4. Pass the retrieved context to an LLM chain whose prompt
//    includes {context} and {question} placeholders
const context = relevantDocs.map((d) => d.pageContent).join('\n');
const answer = await chain.invoke({ context, question });
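To make `chunkSize` and `chunkOverlap` concrete, here is a deliberately simplified character-based splitter in plain TypeScript. The real `RecursiveCharacterTextSplitter` first tries to split on separators such as paragraphs and sentences before falling back to raw character positions; this sketch shows only the size/overlap arithmetic (it assumes `chunkOverlap < chunkSize`):

```typescript
// Naive fixed-size chunker: each chunk is up to `chunkSize` characters,
// and consecutive chunks share `chunkOverlap` characters. Assumes
// chunkOverlap < chunkSize, otherwise the start index never advances.
function splitText(text: string, chunkSize: number, chunkOverlap: number): string[] {
  const step = chunkSize - chunkOverlap;
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
  }
  return chunks;
}

const chunks = splitText('a'.repeat(25), 10, 3);
// chunk starts advance by 10 - 3 = 7 characters, so lengths are [10, 10, 10, 4]
console.log(chunks.map((c) => c.length));
```

The overlap exists so that a sentence falling on a chunk boundary still appears whole in at least one chunk, which keeps retrieval from missing answers that straddle a cut.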
Vercel AI SDK
TypeScript - Next.js API Route
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o'),
    system: 'You are a helpful assistant.',
    messages,
  });

  return result.toDataStreamResponse();
}
The Vercel AI SDK pairs a framework-agnostic core with UI bindings for React, Next.js, Svelte, and Vue. It handles streaming, token usage reporting, and tool calling out of the box, making it one of the fastest ways to build AI chat UIs.
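On the client, the SDK's `useChat` hook consumes the streamed response from the route above and manages message state for you. To illustrate the underlying pattern without any framework, here is a sketch that reads a text `ReadableStream` chunk by chunk (the `readTextStream` helper is hypothetical, not part of the SDK):

```typescript
// Generic consumer for a streamed text response: read chunks as they
// arrive, hand each one to a callback (e.g. to append to the UI), and
// return the full transcript once the stream closes.
async function readTextStream(
  stream: ReadableStream<string>,
  onChunk: (text: string) => void
): Promise<string> {
  const reader = stream.getReader();
  let full = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    full += value;
    onChunk(value);
  }
  return full;
}
```

With `fetch`, the raw `response.body` is a byte stream, so in practice you would pipe it through a `TextDecoderStream` first; `useChat` additionally parses the SDK's data-stream framing before exposing plain message text.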
Lilly Tech Systems