AI & Machine Learning · Intermediate

What is Tokenisation?

The process of breaking text into smaller pieces, called tokens, that an AI model can process.

Why It Matters

Tokenisation determines how much text a model can handle at once, since a model's context window is measured in tokens, and it affects how well the model handles unfamiliar words and different languages.

Real-World Example

The word 'understanding' might be split into tokens like 'under', 'stand', and 'ing'.
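The split above can be sketched with a toy greedy subword tokeniser. The vocabulary below is invented purely for illustration; real models learn vocabularies of tens of thousands of subwords from large amounts of text.

```python
# Illustrative vocabulary: real tokenisers learn theirs from data.
VOCAB = {"under", "stand", "ing", "token", "is", "ation"}

def tokenise(word):
    """Greedily match the longest vocabulary entry from the left."""
    tokens = []
    i = 0
    while i < len(word):
        # Try the longest possible match first, shrinking one character at a time.
        for j in range(len(word), i, -1):
            if word[i:j] in VOCAB:
                tokens.append(word[i:j])
                i = j
                break
        else:
            # No vocabulary entry matches: fall back to a single character.
            tokens.append(word[i])
            i += 1
    return tokens

print(tokenise("understanding"))  # ['under', 'stand', 'ing']
```

Because common fragments like 'ing' appear in the vocabulary, the model can still represent words it has never seen whole, which is one reason subword tokenisation is so widely used.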

“Understanding terms like Tokenisation matters because it helps you have better conversations with developers and make smarter decisions about your software. You do not need to be technical. You just need to know enough to ask the right questions.”

Callum Holt, Founder, 13Labs

Learn More at buildDay Melbourne

Want to understand these concepts hands-on? Join our one-day workshop and build a real web application from scratch.