What Is an Epoch?
An epoch is one complete pass an AI model makes through the entire training dataset during training.
Why It Matters
More epochs can improve model quality, but too many can cause overfitting to the training data.
Real-World Example
Training for 10 epochs means the model sees every training example 10 times.
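In code, an epoch is usually just the outer loop of training. Here is a minimal sketch in plain Python; the dataset, batch size, and update step are illustrative stand-ins, not any particular library's API:

```python
# Illustrative training loop: every name and number here is made up for the sketch.
def train(dataset, epochs=10, batch_size=4):
    updates = 0
    for epoch in range(epochs):  # one epoch = one full pass over the data
        for start in range(0, len(dataset), batch_size):
            batch = dataset[start:start + batch_size]
            # ...compute the loss on `batch` and update the model weights here...
            updates += 1  # one weight update per batch
    return updates

# With 100 examples, 10 epochs, and batches of 4, the model sees each
# example 10 times and performs (100 / 4) * 10 = 250 weight updates.
print(train(list(range(100))))  # prints 250
```

The two loops are where epochs and batch size meet: the outer loop counts epochs, and the inner loop steps through the data one batch at a time.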
“Understanding terms like Epochs matters because it helps you have better conversations with developers and make smarter decisions about your software. You do not need to be technical. You just need to know enough to ask the right questions.”
Related Terms
Hyperparameters
Settings you choose before training an AI model that control how the training process works.
Batch Size
The number of training examples an AI model processes at once before updating its knowledge.
Overfitting
When an AI model memorises its training data too closely and performs poorly on new, unseen data.
Training Data
The dataset used to teach an AI model patterns and knowledge during its initial training.
Large Language Model (LLM)
An AI system trained on massive amounts of text that can understand and generate human language.
Transformer
A type of AI architecture that processes text by paying attention to relationships between all words at once, rather than one at a time.
Learn More at buildDay Melbourne
Want to understand these concepts hands-on? Join our one-day workshop and build a real web application from scratch.