Learn Hugging Face Spaces
Deploy machine learning demo apps for free on Hugging Face Spaces. Build interactive demos with Gradio, Streamlit, or Docker — with optional free GPU hosting and seamless integration with the Hugging Face ecosystem.
Your Learning Path
Follow these lessons in order, or jump to any topic that interests you.
1. Introduction
What are Hugging Face Spaces? Discover free GPU hosting, demo app deployment, and the Spaces ecosystem.
2. Creating Spaces
Create your first Space, set up Git integration, configure requirements.txt, and deploy an app.py.
3. Gradio Spaces
Build and deploy Gradio apps with Interface and Blocks APIs, add examples, and create rich ML demos.
4. Streamlit Spaces
Deploy Streamlit applications on Spaces, build multi-page apps, and integrate with HF models.
5. Docker Spaces
Use Docker for custom environments, persistent storage, and advanced deployment configurations.
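For full control over the environment, a Docker Space builds from a Dockerfile you provide. A hypothetical minimal sketch (the base image and file names are assumptions, not from the course):

```dockerfile
# Hypothetical Dockerfile for a Docker-SDK Space.
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so Docker can cache this layer.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Spaces routes traffic to port 7860 by default.
EXPOSE 7860
CMD ["python", "app.py"]
```

The app inside the container should listen on port 7860 (or the port declared in the Space's configuration) so the Spaces proxy can reach it.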
6. Best Practices
Optimize performance, manage secrets, handle GPU resources, and build production-quality demos.
What You'll Learn
By the end of this course, you'll be able to:
Deploy ML Demos
Create interactive machine learning demos and deploy them for free on Hugging Face Spaces.
Multiple Frameworks
Build apps with Gradio, Streamlit, or Docker — choosing the right tool for each use case.
GPU Acceleration
Leverage free and paid GPU tiers to run inference on large models directly in your Spaces.
HF Ecosystem
Integrate with Hugging Face models, datasets, and the Transformers library seamlessly.
Lilly Tech Systems