AI & Machine Learning · Advanced

What is Model Distillation?

Creating a smaller, faster AI model that mimics the behaviour of a larger, more capable one.

Why It Matters

Distillation makes it possible to run powerful AI capabilities on smaller devices, at lower latency, or at lower cost.

Real-World Example

Training a compact model to replicate the quality of a large language model on a specific task, such as email classification.
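To make the idea concrete, here is a minimal sketch of the core training signal used in distillation: the small "student" model is trained to match the large "teacher" model's softened output probabilities rather than just hard labels. This is an illustrative toy example using NumPy; the three class names and the logit values are hypothetical, not from any real email classifier.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # A temperature above 1 softens the distribution, exposing the
    # teacher's knowledge about how similar the classes are to each other.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence between the softened teacher and student distributions.
    # During training, the student's weights are updated to drive this toward zero.
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    return float(np.sum(t * (np.log(t) - np.log(s))))

# Hypothetical logits for a 3-class email task (e.g. spam / promotion / personal).
teacher_logits = np.array([4.0, 1.0, 0.5])
student_logits = np.array([3.5, 1.2, 0.4])
loss = distillation_loss(student_logits, teacher_logits)
```

The loss is zero only when the student's (softened) predictions exactly match the teacher's, so minimising it over many examples teaches the compact model to imitate the large one.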

“Understanding terms like Model Distillation matters because it helps you have better conversations with developers and make smarter decisions about your software. You do not need to be technical. You just need to know enough to ask the right questions.”

Callum Holt, Founder, 13Labs

Learn More at buildDay Melbourne

Want to understand these concepts hands-on? Join our one-day workshop and build a real web application from scratch.