What is a Context Window?
The maximum amount of text an AI model can consider at once when generating a response.
Why It Matters
A larger context window means the model can work with longer documents and remember more of your conversation.
Real-World Example
A model with a 100,000-token context window can read and discuss an entire book in one conversation.
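To make the idea concrete, here is a minimal sketch in Python of checking whether a piece of text fits in a context window. It assumes roughly 4 characters per English token, a common rule of thumb; real token counts depend on the specific model's tokeniser, and the 100,000-token limit is just the figure from the example above.

```python
# Rough sketch: will this text fit in a model's context window?
# Assumes ~4 characters per English token (a common rule of thumb);
# real counts depend on the model's actual tokeniser.

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: about one token per 4 characters."""
    return max(1, len(text) // 4)

def fits_in_context(text: str, context_window: int = 100_000) -> bool:
    """Check the estimate against a 100,000-token context window."""
    return estimate_tokens(text) <= context_window

print(estimate_tokens("Hello, world!"))       # 13 chars -> about 3 tokens
print(fits_in_context("x" * 1_000_000))       # ~250,000 tokens: too big
print(fits_in_context("A short question."))   # easily fits
```

The point of the sketch is only the arithmetic: once your documents plus conversation history exceed the window, the model can no longer "see" the oldest text.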
“Understanding terms like Context Window matters because it helps you have better conversations with developers and make smarter decisions about your software. You do not need to be technical. You just need to know enough to ask the right questions.”
Learn More at buildDay Melbourne
Want to understand these concepts hands-on? Join our one-day workshop and build a real web application from scratch.
Related Terms
Large Language Model (LLM)
An AI system trained on massive amounts of text that can understand and generate human language.
Tokenisation
The process of breaking text into smaller pieces called tokens that an AI model can process.
Prompt Engineering
The practice of crafting effective instructions for AI models to get the best possible responses.
Transformer
A type of AI architecture that processes text by paying attention to relationships between all words at once, rather than one word at a time.
Attention Mechanism
A technique that lets AI models focus on the most relevant parts of the input when generating output.
Embeddings
A way of representing words, sentences, or other data as lists of numbers that capture their meaning.
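To illustrate the Tokenisation entry above: here is a toy sketch in Python of splitting text into pieces a model can process. Real tokenisers use learned subword units (for example byte-pair encoding), so this whitespace-and-punctuation split is only an illustration of the idea, not how production models actually tokenise.

```python
# Toy tokenisation sketch: split text into word and punctuation tokens.
# Real tokenisers (e.g. BPE) use learned subword vocabularies;
# this regex split only illustrates the concept.
import re

def toy_tokenise(text: str) -> list[str]:
    """Break text into word tokens and single punctuation tokens."""
    return re.findall(r"\w+|[^\w\s]", text)

print(toy_tokenise("AI models read tokens, not words."))
# -> ['AI', 'models', 'read', 'tokens', ',', 'not', 'words', '.']
```

Notice that punctuation becomes its own token here, which is why token counts are usually higher than word counts.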