What is a Masked Language Model?
An AI model trained by hiding random words in text and learning to predict what the missing words are.
Why It Matters
Masked language models such as BERT are especially good at understanding the meaning of text, and are widely used for search and classification.
Real-World Example
Given 'The ___ sat on the mat', the model learns to predict 'cat' based on the surrounding context.
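The training trick above can be sketched in a few lines. This is a toy illustration of the masking step only, not any real model's training code: the function name, the 15% default rate, and the `[MASK]` placeholder are illustrative conventions (BERT happens to use similar ones).

```python
import random

def mask_words(words, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """Hide a random subset of words, returning the masked sentence
    and the hidden answers the model would learn to predict."""
    rng = random.Random(seed)
    masked, answers = [], {}
    for i, word in enumerate(words):
        if rng.random() < mask_rate:
            masked.append(mask_token)   # hide this word from the model
            answers[i] = word           # the training target at this position
        else:
            masked.append(word)
    return masked, answers

sentence = "The cat sat on the mat".split()
masked, answers = mask_words(sentence, mask_rate=0.3)
```

During training, the model sees only `masked` and is scored on how well it recovers `answers` from the surrounding words.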
“Understanding terms like Masked Language Model matters because it helps you have better conversations with developers and make smarter decisions about your software. You do not need to be technical. You just need to know enough to ask the right questions.”
Learn More at buildDay Melbourne
Want to understand these concepts hands-on? Join our one-day workshop and build a real web application from scratch.
Related Terms
Large Language Model (LLM)
An AI system trained on massive amounts of text that can understand and generate human language.
Transformer
A type of AI architecture that processes text by paying attention to relationships between all words at once, rather than reading sequentially.
Autoregressive Model
An AI model that generates output one piece at a time, using each piece to predict the next.
Attention Mechanism
A technique that lets AI models focus on the most relevant parts of the input when generating output.
Tokenisation
The process of breaking text into smaller pieces called tokens that an AI model can process.
Embeddings
A way of representing words, sentences, or other data as lists of numbers that capture their meaning.
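Tokenisation and embeddings fit together as the first steps of any language model's pipeline: text is split into tokens, tokens become numeric IDs, and IDs look up vectors of numbers. A minimal sketch, with the caveat that real models use subword tokenisers (such as WordPiece or BPE) rather than splitting on spaces, and learn their embedding vectors during training rather than using the stand-in values below:

```python
def tokenise(text):
    """Toy tokeniser: lowercase the text and split on spaces."""
    return text.lower().split()

def token_ids(tokens, vocab):
    """Map each token to a numeric ID, growing the vocabulary as needed."""
    return [vocab.setdefault(t, len(vocab)) for t in tokens]

def embed(ids, table):
    """Look up the list-of-numbers vector for each token ID."""
    return [table[i] for i in ids]

vocab = {}
ids = token_ids(tokenise("The cat sat on the mat"), vocab)
# ids -> [0, 1, 2, 3, 0, 4]  (both occurrences of "the" share ID 0)
table = {i: [float(i), float(i) * 0.5] for i in range(len(vocab))}  # stand-in vectors
vectors = embed(ids, table)
```

The key property to notice: the same word always maps to the same ID, so it always gets the same embedding vector.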
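An autoregressive model's "one piece at a time" loop can also be sketched. The lookup-table "model" below is purely illustrative, just a stand-in for the neural network a real LLM uses to score every possible next word:

```python
def generate(next_piece, start, steps):
    """Autoregressive loop: each new piece is predicted from the sequence so far."""
    seq = [start]
    for _ in range(steps):
        seq.append(next_piece(seq))
    return seq

# Toy stand-in for a model: predicts the next word from a fixed lookup
# of the most recent word. A real LLM conditions on the whole sequence.
follow = {"the": "cat", "cat": "sat", "sat": "on", "on": "the"}
sentence = generate(lambda seq: follow[seq[-1]], "the", 5)
# sentence -> ["the", "cat", "sat", "on", "the", "cat"]
```

Contrast this with the masked approach: an autoregressive model only ever looks backwards at what it has produced so far, while a masked model fills a blank using context on both sides.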