What is Gradient Descent?
The mathematical process AI models use to gradually reduce errors and improve their predictions.
Why It Matters
Gradient descent is the core optimisation method behind most modern machine learning: it tells a model which direction to adjust its parameters to reduce error.
Real-World Example
Like walking downhill in fog by always stepping in the steepest downhill direction, the model improves its predictions step by step.
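The downhill analogy can be sketched in a few lines of Python. This is an illustrative toy, not from the original: the error surface f(x) = (x − 3)² and the learning rate of 0.1 are assumed choices, picked so the minimum is easy to see.

```python
# Minimal gradient descent sketch (illustrative assumptions):
# minimise f(x) = (x - 3)^2, whose gradient is f'(x) = 2 * (x - 3).

def gradient_descent(start, learning_rate=0.1, steps=100):
    x = start
    for _ in range(steps):
        gradient = 2 * (x - 3)         # slope of the error surface at x
        x -= learning_rate * gradient  # step "downhill", against the gradient
    return x

minimum = gradient_descent(start=0.0)
print(round(minimum, 4))  # converges towards 3.0, the true minimum
```

Each loop iteration is one "step in the fog": the gradient says which way is uphill, so the model moves the opposite way, and the learning rate controls how big each step is.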
“Understanding terms like Gradient Descent matters because it helps you have better conversations with developers and make smarter decisions about your software. You do not need to be technical. You just need to know enough to ask the right questions.”
Learn More at buildDay Melbourne
Want to understand these concepts hands-on? Join our one-day workshop and build a real web application from scratch.
Related Terms
Neural Network
A computing system inspired by the human brain, made up of layers of connected nodes that learn patterns from data.
Learning Rate
A number that controls how much an AI model adjusts its knowledge from each batch of training data.
Backpropagation
The algorithm that calculates how to adjust each part of a neural network to reduce errors.
Large Language Model (LLM)
An AI system trained on massive amounts of text that can understand and generate human language.
Transformer
A type of AI architecture that processes text by paying attention to relationships between all words at once, rather than one word at a time.
Attention Mechanism
A technique that lets AI models focus on the most relevant parts of the input when generating output.