AI & Machine LearningAdvanced

What are Attention Heads?

Parallel attention mechanisms within a transformer that each focus on different types of relationships in the input.
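To make the idea concrete, here is a minimal sketch of multi-head attention in plain NumPy. The random weight matrices are placeholders for what a trained transformer would learn, and the function names and dimensions are illustrative rather than taken from any particular library.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over attention scores.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads, rng):
    # x: (seq_len, d_model) token embeddings. In a real model the
    # projection matrices below are learned; here they are random.
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0
    d_head = d_model // num_heads

    outputs = []
    for _ in range(num_heads):
        # Each head gets its own query/key/value projections, which is
        # what lets different heads attend to different relationships.
        W_q = rng.standard_normal((d_model, d_head)) / np.sqrt(d_model)
        W_k = rng.standard_normal((d_model, d_head)) / np.sqrt(d_model)
        W_v = rng.standard_normal((d_model, d_head)) / np.sqrt(d_model)
        q, k, v = x @ W_q, x @ W_k, x @ W_v

        # Scaled dot-product attention for this single head.
        weights = softmax(q @ k.T / np.sqrt(d_head))
        outputs.append(weights @ v)

    # Concatenate the per-head results back to the model dimension.
    return np.concatenate(outputs, axis=-1)

rng = np.random.default_rng(0)
tokens = rng.standard_normal((5, 64))  # 5 tokens, 64-dim embeddings
out = multi_head_attention(tokens, num_heads=8, rng=rng)
print(out.shape)  # (5, 64)
```

The key design point is that the heads run in parallel and their outputs are concatenated, so eight small heads cost roughly the same as one large one while covering more kinds of patterns.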

Why It Matters

Multiple attention heads let a model capture several different kinds of patterns in the same input at once, giving it a richer picture than any single attention mechanism could.

Real-World Example

One attention head might focus on grammar while another tracks which pronouns refer to which nouns.
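You can observe this yourself. The sketch below assumes the Hugging Face transformers library and the public bert-base-uncased model, and prints which token each head in the final layer attends to most from the word "it". Which relationships any given head actually tracks varies by model and layer, so treat the output as exploratory rather than guaranteed.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tok("The cat sat because it was tired", return_tensors="pt")
with torch.no_grad():
    out = model(**inputs, output_attentions=True)

# out.attentions: one tensor per layer, shape (batch, heads, seq, seq)
attn = out.attentions[-1][0]  # last layer, first item in the batch
tokens = tok.convert_ids_to_tokens(inputs["input_ids"][0])
it_pos = tokens.index("it")

for head in range(attn.shape[0]):
    # For each head, find the token that "it" attends to most strongly.
    target = tokens[attn[head, it_pos].argmax().item()]
    print(f"head {head}: 'it' attends most to '{target}'")
```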

“Understanding terms like Attention Heads matters because it helps you have better conversations with developers and make smarter decisions about your software. You do not need to be technical. You just need to know enough to ask the right questions.”

Callum Holt, Founder, 13Labs

Learn More at buildDay Melbourne

Want to understand these concepts hands-on? Join our one-day workshop and build a real web application from scratch.