Learn JAX
Master Google's high-performance numerical computing library. JAX combines NumPy's familiar API with automatic differentiation, JIT compilation via XLA, and seamless GPU/TPU acceleration. It is the framework behind cutting-edge AI research at Google DeepMind.
Your Learning Path
Follow these lessons in order, or jump to any topic that interests you.
1. Introduction
What is JAX? Functional transforms, XLA compilation, and the jit/grad/vmap triad.
2. Installation
Install JAX with pip, configure GPU/TPU support, and verify your setup.
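As a quick preview of the installation lesson, a typical setup looks like this (the CUDA extra name below assumes the current CUDA 12 wheels; check the official install guide for your platform):

```shell
# CPU-only install
pip install -U jax

# NVIDIA GPU install (CUDA 12 wheels; assumption: a compatible driver is present)
pip install -U "jax[cuda12]"

# Verify the setup by listing the devices JAX can see
python -c "import jax; print(jax.devices())"
```

On a CPU-only machine the last command prints a single `CpuDevice`; with GPU support configured it lists `CudaDevice` entries instead.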
3. Core Concepts
The jax.Array type, JIT compilation, automatic differentiation, and vmap vectorization.
4. Neural Networks
Build neural networks with Flax and Haiku, train with Optax optimizers.
5. Advanced Topics
Multi-GPU with pmap, sharding, custom gradients, and Hugging Face integration.
6. Best Practices
Performance tuning, debugging, functional programming patterns, and JAX pitfalls.
What You'll Learn
By the end of this course, you'll be able to:
Write Functional ML Code
Use JAX's pure functional approach with jit, grad, and vmap for composable, efficient computations.
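To make the composability claim concrete, here is a minimal sketch (the function `f` is an illustrative example, not from the course) showing the three transforms stacked on one pure function:

```python
import jax
import jax.numpy as jnp

def f(x):
    # A pure scalar function: sin(x)^2
    return jnp.sin(x) ** 2

# grad: derivative of a scalar function; analytically 2*sin(x)*cos(x) = sin(2x)
df = jax.grad(f)

# vmap: vectorize the derivative over a batch without a Python loop
batched_df = jax.vmap(df)

# jit: compile the whole composed pipeline with XLA
fast_batched_df = jax.jit(batched_df)

xs = jnp.linspace(0.0, jnp.pi, 5)
print(fast_batched_df(xs))  # elementwise sin(2x)
```

Because each transform takes a function and returns a function, they compose in any order, which is the core of JAX's functional style.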
Build Neural Networks
Create models with Flax or Haiku and train them with Optax optimizers on GPU/TPU.
Scale to Multiple Devices
Use pmap and sharding to distribute computation across multiple GPUs and TPU pods.
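The classic multi-device primitive is pmap, which maps a function over the leading axis of its input with one slice per device. This sketch degrades gracefully on a single-device machine, where local_device_count() is 1:

```python
import jax
import jax.numpy as jnp

# One leading-axis slice per local accelerator (1 on a CPU-only machine)
n = jax.local_device_count()
x = jnp.arange(n * 4.0).reshape(n, 4)

# Each device runs the function on its own slice in parallel
doubled = jax.pmap(lambda v: v * 2)(x)
```

On a TPU pod slice or multi-GPU host the same code fans out across all devices; the advanced lesson also covers the newer sharding-based approach.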
Optimize Performance
Leverage XLA compilation, vectorization, and JAX-specific patterns for maximum speed.
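One pattern the performance lesson drills in is benchmarking jitted code correctly: the first call pays compilation cost, and JAX dispatches asynchronously, so you must call block_until_ready before stopping the clock. A small sketch (the function and array sizes are illustrative):

```python
import time
import jax
import jax.numpy as jnp

def norm_rows(x):
    # Normalize each row to unit length
    return x / jnp.linalg.norm(x, axis=1, keepdims=True)

x = jax.random.normal(jax.random.PRNGKey(0), (1000, 1000))

jitted = jax.jit(norm_rows)
jitted(x).block_until_ready()  # warm-up call: triggers XLA compilation

start = time.perf_counter()
jitted(x).block_until_ready()  # block so async dispatch doesn't skew timing
elapsed = time.perf_counter() - start
```

Forgetting either the warm-up call or block_until_ready is a common way to measure the wrong thing entirely.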
Lilly Tech Systems