Overconfidence
Overconfidence is the systematic tendency to assign probabilities that are too extreme relative to reality. It shows up as high-confidence predictions that do not come true as often as stated.
Definition
Overconfidence means your probabilities are too extreme relative to outcomes. In practice, events you label as 80% happen less than 80% of the time, or events you label as 20% happen more than 20% of the time.
How it appears in calibration
Overconfidence is a calibration problem. On a reliability (calibration) diagram, your realized frequencies fall closer to 50% than your stated probabilities.
Example: your 0.90 bucket resolves at 0.75, and your 0.10 bucket resolves at 0.25.
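The bucket comparison above can be sketched directly: group forecasts by stated probability and compare each group's realized frequency. The forecast data below is made up for illustration and chosen to reproduce the 0.90/0.75 and 0.10/0.25 example.

```python
# Group binary forecasts by stated probability and compute realized frequency.
# The data is illustrative: each tuple is (stated probability, outcome).
from collections import defaultdict

forecasts = [(0.9, 1), (0.9, 0), (0.9, 1), (0.9, 1),   # stated 90%
             (0.1, 0), (0.1, 1), (0.1, 0), (0.1, 0)]   # stated 10%

buckets = defaultdict(list)
for p, outcome in forecasts:
    buckets[p].append(outcome)

for p in sorted(buckets):
    realized = sum(buckets[p]) / len(buckets[p])
    print(f"stated {p:.2f} -> realized {realized:.2f}")
# stated 0.10 -> realized 0.25
# stated 0.90 -> realized 0.75
```

Both realized frequencies sit closer to 50% than the stated probabilities, which is the overconfidence signature on a reliability diagram.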
Impact on Brier score
Overconfidence is punished strongly by the Brier score: because the score is the squared error between stated probability and outcome, a miss at an extreme probability costs far more than a miss at a moderate one.
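A quick calculation makes the penalty concrete. For a single binary forecast the Brier score is (p - outcome)^2; the probabilities below are chosen for illustration.

```python
# Brier score for one binary forecast: squared error between p and outcome.
def brier(p: float, outcome: int) -> float:
    return (p - outcome) ** 2

# A confident "yes" that resolves "no" versus a moderate "yes" that resolves "no".
miss_extreme = brier(0.95, 0)   # 0.9025
miss_moderate = brier(0.60, 0)  # 0.36
print(miss_extreme, miss_moderate)
```

The confident miss costs roughly 2.5 times as much as the moderate one, even though the stated probabilities differ by only 0.35.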
Common causes
• Treating a strong narrative as certainty.
• Forgetting base rates and prior probabilities.
• Cherry picking evidence and ignoring disconfirming signals.
How to reduce it
• Use probability ranges and update gradually as evidence arrives.
• Track your forecast buckets and review where you are miscalibrated.
• Compare against a benchmark or market consensus.
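One simple corrective consistent with the advice above is to shrink each stated probability toward a base rate before committing to it. This is a sketch, not a prescribed method: the base rate of 0.5 and the shrinkage weight `w` are hypothetical tuning parameters you would fit from your own calibration history.

```python
# Shrink a stated probability toward a base rate to counter overconfidence.
# base_rate and w are illustrative defaults, not recommended values.
def shrink(p: float, base_rate: float = 0.5, w: float = 0.3) -> float:
    """Move p a fraction w of the way toward base_rate."""
    return (1 - w) * p + w * base_rate

adjusted_high = shrink(0.90)  # pulled down toward 0.5
adjusted_low = shrink(0.10)   # pulled up toward 0.5
print(adjusted_high, adjusted_low)
```

If your tracked buckets show the opposite pattern (underconfidence), the same formula with a negative weight extremizes instead of shrinks.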
Related
Overconfidence is the mirror image of underconfidence and is closely linked to calibration and sharpness.