Introduction to LangChain
LangChain is the most popular open-source framework for building applications powered by large language models. It provides composable components for prompts, models, chains, memory, agents, and retrieval.
What is LangChain?
LangChain is a Python and JavaScript framework that simplifies building LLM-powered applications. Instead of writing raw API calls, LangChain gives you reusable building blocks that snap together like LEGO pieces — prompts, models, output parsers, chains, memory, retrievers, and agents.
```
LangChain = Framework for building LLM applications
# Key idea: compose building blocks

Prompt    →    Model    →    Parser    →    Output
  |              |             |              |
Template      ChatGPT        JSON         Structured
Variables     Claude         Pydantic       Data
              Gemini         String
```
Key Components
LangChain organizes LLM application development into six core component areas:
Models
Unified interface for OpenAI, Anthropic, Google, Ollama, and 50+ LLM providers. Swap models with one line of code.
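The payoff of a unified interface can be sketched in plain Python. The classes below are illustrative stand-ins, not LangChain's real model classes: the point is that when every provider exposes the same `invoke` method, switching providers is a one-line change.

```python
from dataclasses import dataclass

@dataclass
class FakeOpenAI:
    """Illustrative stand-in for a provider-specific chat model."""
    model: str = "gpt-4o-mini"

    def invoke(self, prompt: str) -> str:
        return f"[{self.model}] echo: {prompt}"

@dataclass
class FakeAnthropic:
    """A second provider exposing the identical interface."""
    model: str = "claude-sonnet"

    def invoke(self, prompt: str) -> str:
        return f"[{self.model}] echo: {prompt}"

# Because both models share the same interface, the application code
# below never changes -- only this one line does:
llm = FakeOpenAI()  # swap to FakeAnthropic() with a one-line edit

answer = llm.invoke("What is LangChain?")
print(answer)
```

The same idea is what lets LangChain applications stay provider-agnostic.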
Prompts
Template system for dynamic prompt construction with variables, few-shot examples, and message formatting.
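A prompt template is essentially a string with named variables, optionally preceded by few-shot examples. The tiny class below is a plain-Python sketch of that idea, not LangChain's actual `PromptTemplate` implementation:

```python
class PromptTemplate:
    """Sketch: fill named variables at call time; prepend few-shot examples."""
    def __init__(self, template, examples=None):
        self.template = template
        self.examples = examples or []

    def format(self, **variables):
        shots = "\n".join(self.examples)
        body = self.template.format(**variables)
        return f"{shots}\n{body}" if shots else body

prompt = PromptTemplate(
    "Translate to {language}: {text}",
    examples=["Translate to French: cat -> chat"],
)
print(prompt.format(language="French", text="dog"))
```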
Chains
Compose components into pipelines using the pipe operator (|). LCEL makes chains declarative and streamable.
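The pipe operator works because components overload Python's `|`. The toy `Runnable` below sketches that mechanism with a fake model; it is a simplified illustration, not LCEL's real implementation:

```python
class Runnable:
    """Toy runnable: wraps a function and composes with the | operator."""
    def __init__(self, fn):
        self.fn = fn

    def invoke(self, value):
        return self.fn(value)

    def __or__(self, other):
        # (a | b).invoke(x) == b.invoke(a.invoke(x))
        return Runnable(lambda value: other.invoke(self.invoke(value)))

prompt = Runnable(lambda topic: f"Tell me a joke about {topic}")
model = Runnable(lambda text: {"content": text.upper()})  # fake LLM
parser = Runnable(lambda msg: msg["content"])

chain = prompt | model | parser
print(chain.invoke("bears"))  # TELL ME A JOKE ABOUT BEARS
```

Because each stage only agrees on `invoke`, any stage can be swapped without touching the others, which is what makes chains declarative.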
Memory
Maintain conversation history across interactions. Buffer, summary, window, entity, and vector store memory types.
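Window memory, for example, keeps only the last k turns so the context stays bounded. The sketch below is a plain-Python illustration of that behavior, not LangChain's memory classes:

```python
from collections import deque

class WindowMemory:
    """Sketch of 'window' memory: retain only the last k message pairs."""
    def __init__(self, k=2):
        self.turns = deque(maxlen=k)

    def save(self, user, ai):
        self.turns.append((user, ai))

    def as_context(self):
        return "\n".join(f"Human: {u}\nAI: {a}" for u, a in self.turns)

memory = WindowMemory(k=2)
memory.save("Hi", "Hello!")
memory.save("What is LangChain?", "An LLM framework.")
memory.save("Thanks", "You're welcome.")  # oldest turn is evicted
print(memory.as_context())
```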
Retrieval (RAG)
Load documents, split text, embed, store in vector databases, and retrieve relevant context for generation.
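The retrieval pipeline (split, embed, score, return top-k) can be sketched end to end with a toy word-overlap "embedding" in place of a real embedding model and vector database. Everything here is an illustrative stand-in, not LangChain's loaders or retrievers:

```python
def split(text, chunk_size=100):
    """Naive character splitter (real splitters respect word boundaries)."""
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]

def embed(chunk):
    """Toy 'embedding': the set of lowercase words in the chunk."""
    return set(chunk.lower().split())

def retrieve(query, chunks, k=1):
    """Return the k chunks sharing the most words with the query."""
    q = embed(query)
    scored = sorted(chunks, key=lambda c: len(embed(c) & q), reverse=True)
    return scored[:k]

docs = [
    "LangChain composes prompts models and parsers",
    "Paris is the capital of France",
    "Vector stores index embeddings for retrieval",
]
chunks = [c for doc in docs for c in split(doc)]
print(retrieve("which store indexes embeddings", chunks))
```

A real RAG setup swaps `embed` for a dense embedding model and the sorted list for a vector store, but the retrieve-then-generate shape is the same.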
Agents
Autonomous LLM-powered agents that use tools, reason step-by-step, and take actions to accomplish goals.
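The agent loop itself is simple: the model decides to act or to answer, a tool runs, and the observation feeds the next step. Below is a ReAct-flavored sketch where `fake_llm` is a scripted stand-in for the model's reasoning, so no real LLM is involved:

```python
def calculator(expression):
    """Toy tool: only handles 'a + b' for this sketch."""
    a, _, b = expression.split()
    return str(int(a) + int(b))

TOOLS = {"calculator": calculator}

def fake_llm(question, observations):
    """Scripted stand-in for the model's reasoning step."""
    if not observations:  # first step: decide to call a tool
        return {"action": "calculator", "input": "2 + 3"}
    return {"final": f"The answer is {observations[-1]}"}

def run_agent(question, max_steps=5):
    observations = []
    for _ in range(max_steps):
        decision = fake_llm(question, observations)
        if "final" in decision:
            return decision["final"]
        tool = TOOLS[decision["action"]]          # act: call the chosen tool
        observations.append(tool(decision["input"]))  # observe the result
    return "gave up"

print(run_agent("What is 2 + 3?"))  # The answer is 5
```

Real agents replace `fake_llm` with an actual model call and register many tools, but the act-observe-repeat loop is the core pattern.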
LangChain vs Alternatives
How does LangChain compare to other popular LLM frameworks?
| Feature | LangChain | LlamaIndex | Haystack |
|---|---|---|---|
| Primary Focus | General LLM applications | Data indexing & RAG | Search & RAG pipelines |
| Agents | Excellent (ReAct, OpenAI, LangGraph) | Basic agent support | Limited agent support |
| RAG | Full support (loaders, splitters, retrievers) | Best-in-class RAG | Strong pipeline-based RAG |
| Ecosystem | LangGraph, LangSmith, LangServe | LlamaHub, LlamaParse | Haystack Hub |
| Community | Largest (90k+ GitHub stars) | Large (35k+ stars) | Growing (17k+ stars) |
| Learning Curve | Moderate (many abstractions) | Lower (focused scope) | Lower (pipeline-focused) |
The LangChain Ecosystem
LangChain is not just a single library — it is an ecosystem of tools for the full LLM application lifecycle:
- LangChain (Core): The main framework with chains, prompts, models, memory, retrievers, and agents. The foundation for everything else.
- LangGraph: Graph-based agent orchestration framework. Build stateful, multi-step agents with cycles, branching, and human-in-the-loop workflows.
- LangSmith: Observability and evaluation platform. Trace runs, debug chains, evaluate outputs, and monitor LLM applications in production.
- LangServe: Deploy LangChain chains as REST APIs with one command. Built on FastAPI with automatic documentation and playground UI.
History and Versions
LangChain has evolved rapidly since its creation:
- October 2022: Harrison Chase creates LangChain as an open-source project.
- 2023: Explosive growth. LangChain becomes the most popular LLM framework. Raises $25M+ in funding.
- Early 2024: LangChain v0.1 — first stable release. Introduction of LCEL (LangChain Expression Language).
- Mid 2024: LangChain v0.2 — deprecation of legacy chains in favor of LCEL. LangGraph reaches v0.1, its first stable release.
- Late 2024 – 2025: LangChain v0.3 — streamlined architecture, with langchain-core as the minimal dependency. Provider packages split out.
- 2025 – 2026: LangGraph becomes the recommended way to build agents. LangSmith adds advanced evaluation and prompt management.
When to Use LangChain
Use LangChain when you need:
- Chaining of multiple LLM calls with logic in between
- Agents that use tools and reason autonomously
- RAG with document loading, embedding, and retrieval
- Easy swapping between different LLM providers
- Production observability with LangSmith
- Complex multi-agent workflows with LangGraph

Consider alternatives when:
- You only need simple API calls: use the provider SDK directly
- You need maximum control over every API parameter
- You want a minimal dependency footprint
- Your project is purely RAG-focused: LlamaIndex may be a better fit
What's Next?
In the next lesson, we will install LangChain, set up API keys, and build our first "Hello World" chain using LCEL.
Lilly Tech Systems