AI Frameworks
Master the frameworks that power modern AI. 50 topics covering deep learning (PyTorch, TensorFlow, JAX, MLX), LLM/RAG (LangChain, LlamaIndex, DSPy, HuggingFace), distributed training (DeepSpeed, FSDP, Megatron, NeMo, Ray), classical ML (sklearn, XGBoost, LightGBM, pandas, Polars), specialized libraries (RAPIDS, PyG, spaCy, OpenCV, YOLO), and MLOps (MLflow, W&B, ClearML, DVC, Kubeflow, Ray).
All Topics
50 topics organized into 6 categories, six lessons each, spanning the full AI framework landscape.
Deep Learning Frameworks
PyTorch Mastery
Master PyTorch end-to-end. Learn tensors, autograd, nn.Module, DataLoader, torch.compile, distributed training, and the patterns behind most production AI today.
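A taste of the core loop those lessons build on — tensors, autograd, and nn.Module. A minimal sketch; the model and sizes are illustrative:

```python
import torch
from torch import nn

# A tiny model: nn.Module holds parameters; autograd records the graph.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1)
        )

    def forward(self, x):
        return self.layers(x)

model = TinyNet()
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.randn(32, 4)   # a batch of 32 examples, 4 features each
y = torch.randn(32, 1)

loss = nn.functional.mse_loss(model(x), y)
loss.backward()          # autograd fills .grad on every parameter
opt.step()               # one gradient-descent update
opt.zero_grad()
print(loss.item())
```

Everything else in the track — DataLoader, torch.compile, distributed wrappers — layers on top of this forward/backward/step cycle.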
TensorFlow / Keras 3
Master TensorFlow 2.x and Keras 3 (multi-backend). Learn the functional API, tf.data, tf.function, distributed strategies, and TFLite for mobile/edge deployment.
JAX Mastery
Master JAX: XLA-compiled NumPy with autodiff. Learn jit, grad, vmap, pmap, sharding, and the patterns that power Google's most ambitious AI research.
Flax (JAX Neural Networks)
Master Flax: the most popular JAX neural network library. Learn the newer NNX and legacy Linen APIs, training loops, and porting PyTorch models to Flax.
Equinox (JAX)
Master Equinox: PyTorch-style neural networks for JAX with full pytree compatibility. Learn modules, filtered jit, and the patterns for clean JAX research code.
PyTorch Lightning
Master PyTorch Lightning: organize PyTorch code into LightningModule, Trainer, and DataModule. Get distributed training, mixed precision, and checkpointing for free.
fastai
Master fastai: high-level deep learning library on top of PyTorch. Learn DataBlock, Learner, callbacks, and the patterns that make state-of-the-art accessible.
Apple MLX
Master Apple MLX: array framework optimized for Apple Silicon. Learn unified memory, lazy evaluation, MLX-LM, and the patterns for fast AI on M-series chips.
PaddlePaddle (Baidu)
Master PaddlePaddle: Baidu's open-source deep learning framework. Learn dynamic and static graphs, PaddleNLP, PaddleOCR, and the deployment story.
MindSpore (Huawei)
Master MindSpore: Huawei's open-source AI framework. Learn the AI-native graph compiler, Ascend-optimized training, and when MindSpore beats alternatives.
LLM & RAG Frameworks
HuggingFace Transformers
Master HuggingFace Transformers: 1M+ pretrained models with one API. Learn AutoModel, AutoTokenizer, pipelines, Trainer, and the patterns for production HF use.
HuggingFace Diffusers
Master HuggingFace Diffusers: state-of-the-art diffusion models for image, video, and audio generation. Learn pipelines, schedulers, and custom training.
HuggingFace PEFT (LoRA, QLoRA)
Master HuggingFace PEFT: parameter-efficient fine-tuning. Learn LoRA, QLoRA, prefix tuning, prompt tuning, IA3, and the patterns for cheap LLM fine-tunes.
HuggingFace TRL (RLHF, DPO)
Master HuggingFace TRL: train LLMs with reinforcement learning. Learn SFTTrainer, DPOTrainer, PPOTrainer, KTOTrainer, and the alignment training patterns.
HuggingFace Accelerate
Master HuggingFace Accelerate: distributed training that 'just works'. Learn DeepSpeed/FSDP integration, mixed precision, and zero-code-change distributed launches.
LangChain
Master LangChain: the most popular LLM application framework. Learn LCEL, chains, retrievers, memory, and the patterns for production LangChain apps.
LlamaIndex
Master LlamaIndex: the data framework for LLM apps. Learn ingestion, indexing, query engines, agents, and the patterns for production RAG with LlamaIndex.
DSPy
Master DSPy: program LLMs declaratively, then optimize. Learn signatures, modules, optimizers (BootstrapFewShot, MIPRO), and the algorithmic prompt-engineering pattern.
Haystack
Master Haystack: production-ready LLM framework from deepset. Learn pipelines, components, document stores, and Haystack 2.0's modular architecture.
Semantic Kernel (Microsoft)
Master Microsoft Semantic Kernel: SDK for integrating LLMs into C#, Python, Java apps. Learn plugins, planners, and Microsoft's enterprise AI integration patterns.
Training & Distributed
DeepSpeed
Master Microsoft DeepSpeed: train trillion-parameter models. Learn ZeRO stages, optimizer offload, pipeline parallelism, and the patterns for memory-efficient training.
PyTorch FSDP
Master PyTorch FSDP (Fully Sharded Data Parallel). Learn auto-wrap policies, mixed precision, activation checkpointing, and FSDP2's per-parameter sharding.
Megatron-LM (NVIDIA)
Master Megatron-LM: NVIDIA's framework for training huge language models. Learn tensor parallelism, sequence parallelism, and the patterns powering frontier-scale training.
NVIDIA NeMo
Master NVIDIA NeMo: end-to-end framework for conversational AI, ASR, TTS, and LLMs. Learn NeMo 2.0, NeMo Curator, NeMo Aligner, and production patterns.
Ray Train
Master Ray Train: distributed training on Ray. Learn TorchTrainer, integration with FSDP/DeepSpeed, fault tolerance, and the patterns for elastic training.
Composer (MosaicML)
Master MosaicML Composer (now part of Databricks): efficient PyTorch training with algorithmic speedups. Learn Trainer, algorithms (e.g., ALiBi, EMA), and the deterministic recipes.
ColossalAI
Master ColossalAI: efficient large-scale model training with auto-parallelism. Learn the Booster API, ZeRO, tensor parallelism, and ColossalChat for RLHF.
Axolotl
Master Axolotl: YAML-driven LLM fine-tuning framework. Learn config-driven training, LoRA/QLoRA recipes, and the patterns for fast iteration on 7B-70B models.
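The config-driven style Axolotl teaches looks roughly like this QLoRA sketch — the model name, dataset path, and values below are illustrative placeholders, not a tested recipe:

```yaml
# Hypothetical QLoRA fine-tune of a 7B base model (illustrative values).
base_model: meta-llama/Llama-2-7b-hf
load_in_4bit: true
adapter: qlora
lora_r: 16
lora_alpha: 32
lora_dropout: 0.05
datasets:
  - path: my_dataset.jsonl   # placeholder dataset
    type: alpaca
sequence_len: 2048
micro_batch_size: 2
gradient_accumulation_steps: 8
num_epochs: 3
learning_rate: 0.0002
```

The appeal is that swapping a base model or a LoRA rank is a one-line YAML change rather than a code change.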
Classical ML
scikit-learn
Master scikit-learn: the bedrock of classical ML. Learn estimators, pipelines, cross-validation, hyperparameter search, and the patterns for production sklearn.
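The estimator-plus-Pipeline pattern at the heart of those lessons, in miniature — a sketch on synthetic data:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Pipeline chains preprocessing and model, so cross-validation fits the
# scaler only on each training fold -- no leakage into the held-out fold.
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
scores = cross_val_score(pipe, X, y, cv=5)
print(scores.mean())
```

Swap the final step for any estimator (or a GridSearchCV) and the rest of the code is unchanged — that uniform interface is what makes sklearn the bedrock.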
XGBoost
Master XGBoost: the gradient boosting framework that wins Kaggle. Learn DMatrix, parameters, GPU training, early stopping, and the production deployment patterns.
LightGBM
Master Microsoft LightGBM: faster gradient boosting via leaf-wise growth and histograms. Learn parameters, GPU support, and when LightGBM beats XGBoost.
CatBoost
Master Yandex CatBoost: gradient boosting with native categorical handling. Learn ordered boosting, target encoding, and when CatBoost shines on tabular data.
statsmodels
Master statsmodels: classical statistics in Python. Learn OLS, GLM, time series (ARIMA, SARIMAX), and the diagnostics and inference tools sklearn lacks.
NumPy
Master NumPy: the foundation of scientific Python. Learn ndarray, broadcasting, vectorization, einsum, and the patterns for fast numerical code.
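Two of those ideas side by side — broadcasting and einsum — in a minimal sketch:

```python
import numpy as np

# Broadcasting: a (3, 1) column and a (4,) row combine into a (3, 4)
# grid with no explicit loop or tiling.
col = np.arange(3).reshape(3, 1)   # shape (3, 1)
row = np.arange(4)                 # shape (4,)
grid = col * 10 + row              # shape (3, 4)

# einsum spells out the index contraction: "ik,kj->ij" is a plain
# matrix product, summing over the shared index k.
a = np.random.rand(3, 5)
b = np.random.rand(5, 4)
prod = np.einsum("ik,kj->ij", a, b)
assert np.allclose(prod, a @ b)
print(grid)
```

Vectorized code like this is typically orders of magnitude faster than the equivalent Python loops, which is the whole point of the library.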
pandas
Master pandas: dataframes for Python. Learn Series, DataFrame, groupby, merge, time series, and the patterns to handle 1M-1B rows efficiently.
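The split-apply-combine pattern behind groupby, as a minimal sketch with toy data:

```python
import pandas as pd

df = pd.DataFrame({
    "team":  ["a", "a", "b", "b", "b"],
    "score": [10, 20, 5, 15, 25],
})

# Split rows by team, apply one aggregation per named output column,
# combine into a new DataFrame indexed by team.
summary = df.groupby("team").agg(
    total=("score", "sum"),
    best=("score", "max"),
)
print(summary)
```

Named aggregation keeps the result's columns flat and self-describing, which matters once the real data has millions of rows and dozens of metrics.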
Polars
Master Polars: blazing-fast Rust-based dataframes. Learn lazy vs eager, expressions, query optimization, and when Polars destroys pandas on speed.
Specialized Frameworks
RAPIDS (cuDF, cuML)
Master NVIDIA RAPIDS: GPU-accelerated dataframes (cuDF) and ML (cuML). Learn pandas-like APIs at GPU speed and when RAPIDS gives 10-100x speedups.
PyTorch Geometric (PyG)
Master PyTorch Geometric: graph neural networks on PyTorch. Learn message passing, GCN, GAT, GraphSAGE, and the patterns for scalable graph learning.
Deep Graph Library (DGL)
Master DGL: framework-agnostic graph neural network library. Learn DGLGraph, message passing, sampling, and DGL on PyTorch/MXNet/TF.
spaCy (NLP)
Master spaCy: production-ready NLP in Python. Learn pipelines, transformers integration, custom components, and the patterns for fast NLP at scale.
NLTK
Master NLTK: the classic NLP library for Python. Learn tokenization, POS tagging, parsing, sentiment, and when NLTK still beats modern alternatives.
OpenCV
Master OpenCV: the dominant computer vision library. Learn image I/O, transformations, object detection, video, DNN module, and integration with PyTorch/ONNX.
Detectron2 (Meta CV)
Master Meta's Detectron2: state-of-the-art object detection and segmentation. Learn the model zoo, custom datasets, and production deployment patterns.
Ultralytics YOLO
Master Ultralytics YOLO (v8, v11): blazing-fast object detection, segmentation, and pose estimation. Learn the unified API, training, and edge deployment patterns.
MLOps & Experimentation
MLflow
Master MLflow: the open-source MLOps platform. Learn tracking, projects, models, model registry, and the patterns to make ML reproducible at scale.
Weights & Biases (W&B)
Master Weights & Biases: experiment tracking, hyperparameter sweeps, artifacts, and reports. Learn how W&B becomes the system of record for ML teams.
ClearML
Master ClearML: open-source MLOps platform. Learn experiment tracking, orchestration, datasets, model serving, and self-hosted MLOps patterns.
DVC (Data Version Control)
Master DVC: Git for data and ML pipelines. Learn dvc add, dvc push, pipelines (dvc.yaml), experiments, and the patterns for reproducible ML.
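The shape of a dvc.yaml pipeline those lessons cover — stage names, scripts, and file paths below are placeholders:

```yaml
# Two-stage pipeline: DVC reruns a stage only when its deps change.
stages:
  prepare:
    cmd: python prepare.py        # placeholder script
    deps:
      - data/raw.csv
    outs:
      - data/clean.csv
  train:
    cmd: python train.py
    deps:
      - data/clean.csv
      - train.py
    params:
      - train.lr
    outs:
      - model.pkl
```

`dvc repro` walks this graph like make: edit `train.py` and only the train stage reruns; the prepared data is reused from cache.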
Kubeflow
Master Kubeflow: ML toolkit for Kubernetes. Learn Pipelines, KServe (model serving), Katib (hyperparameter tuning), and the cloud-native MLOps stack.
Ray (Distributed Compute)
Master Ray: distributed compute for AI. Learn Ray Core, Ray Data, Ray Tune, Ray Serve, Ray Train, and the patterns for scaling Python AI workloads.
Why an AI Frameworks Track?
The right framework lets you focus on the problem; the wrong one becomes the problem.
Deep Learning Stack
PyTorch, TensorFlow, JAX, MLX, Flax, Equinox, Lightning, fastai, PaddlePaddle, MindSpore.
LLM Ecosystem
HuggingFace Transformers, Diffusers, PEFT, TRL, Accelerate; LangChain, LlamaIndex, DSPy, Haystack, Semantic Kernel.
Distributed Training
DeepSpeed, FSDP, Megatron-LM, NeMo, Ray Train, Composer, ColossalAI, Axolotl.
MLOps & More
Classical ML (sklearn, XGBoost, LightGBM, pandas, Polars), specialized (RAPIDS, PyG, spaCy, OpenCV, YOLO), MLOps (MLflow, W&B, ClearML, DVC, Kubeflow, Ray).
Lilly Tech Systems