
Overfitting: When Your Model Is Too Good on Paper

Your model hits 99% accuracy on training data, then falls apart in production. This is overfitting — and it's one of the most important problems to understand.

**Overfitting** happens when a model memorises training data instead of learning general patterns.

**Signs:**

- Training accuracy far higher than validation accuracy (a large generalisation gap)

- Model 'knows' training examples but can't generalise
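The gap between training and validation error shows up even in a tiny experiment. A minimal sketch using NumPy polynomial fitting (the data, split, and polynomial degrees are illustrative assumptions): a high-degree polynomial memorises the noisy training points, while a simple model captures the underlying pattern.

```python
import numpy as np

# Toy data (illustrative): a noisy quadratic, split into train/validation.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
y = 2 * x**2 + rng.normal(0, 0.1, size=x.shape)
x_train, y_train = x[::2], y[::2]   # 15 training points
x_val, y_val = x[1::2], y[1::2]     # 15 held-out points

def mse(coeffs, x, y):
    # Mean squared error of a fitted polynomial on the given points.
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

# Degree 14 has enough capacity to memorise all 15 training points,
# noise included; degree 2 matches the true underlying pattern.
overfit = np.polyfit(x_train, y_train, deg=14)
simple = np.polyfit(x_train, y_train, deg=2)

print("train MSE:", mse(overfit, x_train, y_train), mse(simple, x_train, y_train))
print("val MSE:  ", mse(overfit, x_val, y_val), mse(simple, x_val, y_val))
```

The high-capacity fit posts a far lower training error but a far higher validation error than the simple fit: exactly the signature described above.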

**Analogy:** A student who memorises all past exam answers but can't answer novel questions.

**Solutions:**

- More training data

- Regularisation (L1/L2: penalise large weights to keep the model simple)

- Dropout (randomly disable neurons during training)

- Early stopping (halt training once validation performance stops improving)

- Cross-validation (evaluate on multiple held-out splits for a more reliable performance estimate)
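To make one of these remedies concrete, here is a hedged sketch of L2 regularisation as closed-form ridge regression in NumPy. The data, feature degree, and penalty strength are illustrative assumptions, not a prescription; the point is that the penalty shrinks the weights and trades a little training accuracy for better generalisation.

```python
import numpy as np

# Toy data (illustrative): a noisy sine wave, split into train/validation.
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 30)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, size=x.shape)
x_train, y_train = x[::2], y[::2]
x_val, y_val = x[1::2], y[1::2]

DEGREE = 9  # high enough to overfit 15 training points

def features(x):
    # Polynomial feature matrix [1, x, x^2, ..., x^DEGREE].
    return np.vander(x, DEGREE + 1, increasing=True)

def ridge_fit(X, y, lam):
    # Closed-form ridge solution: w = (X'X + lam*I)^(-1) X'y.
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def mse(w, x, y):
    return float(np.mean((features(x) @ w - y) ** 2))

# Unpenalised least squares vs. a small L2 penalty on the same features.
w_plain = np.linalg.lstsq(features(x_train), y_train, rcond=None)[0]
w_ridge = ridge_fit(features(x_train), y_train, lam=0.1)

# The penalty shrinks the weight norm and slightly raises training error.
print("weight norms:", np.sum(w_plain**2), np.sum(w_ridge**2))
print("val MSE:", mse(w_plain, x_val, y_val), mse(w_ridge, x_val, y_val))
```

The other remedies follow the same logic from different angles: dropout and early stopping limit how much the model can memorise during training, while more data and cross-validation make memorisation harder and easier to detect, respectively.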

