Introduction to FastAI
Discover fast.ai's revolutionary approach to deep learning education and software — making state-of-the-art AI accessible to anyone with basic coding skills.
What is FastAI?
FastAI is both a deep learning library and a research lab founded by Jeremy Howard and Rachel Thomas. The library, built on top of PyTorch, provides high-level components that can quickly and easily provide state-of-the-art results in standard deep learning domains. It also provides researchers with low-level components that can be mixed and matched to build new approaches.
The Top-Down Learning Philosophy
Fast.ai's educational approach is unique. Instead of starting with mathematical theory and working upward, Jeremy Howard's courses begin with practical applications and then peel back the layers:
Start with a complete, working model
Get results immediately. See what deep learning can do before learning how it works.
Understand the high-level components
Learn what DataLoaders, Learners, and callbacks do and how they fit together.
Customize and experiment
Modify components, try different architectures, adjust hyperparameters.
Dive into the internals
Understand the math, implement from scratch, and contribute to research.
Built on PyTorch
FastAI is built entirely on PyTorch, which means you get all the benefits of PyTorch's dynamic computation graphs, easy debugging, and vast ecosystem. FastAI adds a layer of best practices on top:
| FastAI Feature | What It Does | PyTorch Equivalent |
|---|---|---|
| DataLoaders | Handles data preprocessing, augmentation, batching | DataLoader + custom transforms |
| Learner | Bundles model + data + optimizer + loss | Manual training loop |
| fine_tune() | Transfer learning with frozen/unfrozen stages | Manual freezing + multiple training phases |
| lr_find() | Automatic learning rate finder | Manual implementation |
| Callbacks | Extensible hooks into every training event | Custom code in training loop |
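To make the "PyTorch Equivalent" column concrete, here is the kind of manual training loop that `Learner` replaces. This is a sketch on synthetic data; the model, data, and hyperparameters are illustrative, not from the source.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Synthetic regression data: y = 2x + 1
X = torch.linspace(-1, 1, 100).unsqueeze(1)
y = 2 * X + 1
dl = DataLoader(TensorDataset(X, y), batch_size=16, shuffle=True)

model = nn.Linear(1, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

# The manual loop: everything Learner bundles, written by hand
for epoch in range(50):
    for xb, yb in dl:
        loss = loss_fn(model(xb), yb)
        loss.backward()   # compute gradients
        opt.step()        # update parameters
        opt.zero_grad()   # reset gradients for the next batch
```

In FastAI, the equivalent would be roughly `Learner(dls, model, loss_func=loss_fn).fit(50)`, with callbacks able to hook into every event of the loop above.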
Jeremy Howard and fast.ai
Jeremy Howard, co-founder of fast.ai, is a renowned data scientist and educator. He was previously president of Kaggle, founded Enlitic (one of the first companies to apply deep learning to medicine), and has made significant research contributions including ULMFiT (the foundation for modern NLP transfer learning). The free fast.ai course (course.fast.ai) has trained hundreds of thousands of practitioners worldwide.
Key Research Contributions
- ULMFiT — Universal Language Model Fine-tuning, the technique that enabled transfer learning for NLP (precursor to BERT, GPT)
- One-cycle policy — A learning rate schedule, proposed by Leslie Smith and championed by fast.ai, that achieves faster convergence with better generalization
- Progressive resizing — Start training on small images, gradually increase size for faster training
- Mixup augmentation — Early, influential adoption of mixup (introduced by Zhang et al.) as a standard training option for improved generalization in vision tasks
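The one-cycle policy's learning-rate half can be sketched as a pure function: warm up from `max_lr/div` to `max_lr` over the first `pct_start` fraction of training, then anneal down to `max_lr/final_div`, using cosine interpolation in both phases. The parameter names here are illustrative, and fastai's actual `fit_one_cycle` additionally schedules momentum; this is a simplified sketch.

```python
import math

def one_cycle_lr(step, total_steps, max_lr,
                 pct_start=0.25, div=25.0, final_div=1e4):
    """Simplified one-cycle learning rate schedule.

    Warms up from max_lr/div to max_lr over the first pct_start of
    training, then anneals down to max_lr/final_div. Both phases use
    cosine interpolation between their start and end values.
    """
    warmup_steps = int(total_steps * pct_start)
    if step < warmup_steps:
        pct = step / max(1, warmup_steps)
        start, end = max_lr / div, max_lr
    else:
        pct = (step - warmup_steps) / max(1, total_steps - warmup_steps)
        start, end = max_lr, max_lr / final_div
    # Cosine interpolation: pct=0 gives `start`, pct=1 gives `end`
    return end + (start - end) * (1 + math.cos(math.pi * pct)) / 2
```

Plotting this over training steps gives the characteristic one-cycle shape: a single rise to the peak learning rate followed by a long anneal toward near zero.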
Ready to Install FastAI?
Let's get FastAI set up on your machine so you can start building models.
Next: Installation →