What is Temperature (AI)?
A sampling setting that controls how random or predictable an AI model's responses are: low values favour the most likely words, while high values give less likely words more of a chance.
Why It Matters
Adjusting temperature lets you choose between reliable, consistent answers and more creative, varied outputs.
Real-World Example
Use a low temperature for factual Q&A (consistent, repeatable answers) and a high temperature for creative writing (varied, imaginative output).
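Under the hood, temperature divides the model's raw scores (logits) before they are turned into probabilities. A minimal sketch, using made-up logits for four candidate next words (the values and function name here are illustrative, not from any specific model):

```python
import math

# Hypothetical raw scores (logits) for four candidate next words.
logits = [2.0, 1.0, 0.5, 0.1]

def softmax_with_temperature(logits, temperature):
    """Divide logits by temperature, then apply softmax.

    Low temperature sharpens the distribution (predictable output);
    high temperature flattens it (varied output).
    """
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

low = softmax_with_temperature(logits, 0.2)
high = softmax_with_temperature(logits, 2.0)
# At low temperature the top word dominates the probability mass;
# at high temperature the probabilities are much closer to uniform.
```

At temperature 0.2 the most likely word captures nearly all the probability; at 2.0 the other words become realistic picks, which is where the "creative" feel comes from.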
“Understanding terms like Temperature (AI) matters because it helps you have better conversations with developers and make smarter decisions about your software. You do not need to be technical. You just need to know enough to ask the right questions.”
Learn More at buildDay Melbourne
Want to understand these concepts hands-on? Join our one-day workshop and build a real web application from scratch.
Related Terms
Large Language Model (LLM)
An AI system trained on massive amounts of text that can understand and generate human language.
Prompt Engineering
The practice of crafting effective instructions for AI models to get the best possible responses.
Inference
The process of using a trained AI model to generate predictions or outputs from new input data.
Top-k Sampling
A method for controlling AI text generation by only considering the k most likely next words at each step.
Transformer
A type of AI architecture that processes text by paying attention to relationships between all words at once, rather than reading them strictly one at a time.
Attention Mechanism
A technique that lets AI models focus on the most relevant parts of the input when generating output.