Everyday Philosophy: Bad at probability? That might be a blessing.

Very few of my friends understand the basic probabilities of life. They just see things in black and white, and almost always from hunches. Can you tell me why a lot of humans are so bad at probability, and why it’s important to be better?

Pratyush, India

I used to work with someone who called me Hopscotch Jonny. Hopscotch is a popular playground game where you jump from one square to another: one leg, two legs, one leg, two legs. Hop, hop, hop. The reason I was called Hopscotch Jonny is that I spend a lot of my life occupied with fads. I live in phases. I’ll become obsessed with a topic, buy five books on the subject, and then ramble on about it to anyone who will listen, boring them all. My life is consumed by a new obsession for a few months before I move on to the next thing, before I hop to the next box. My study is a jungle thanks to my pot-plant phase, my cupboard is filled with swimming artifacts from my triathlon months, and now, as I write this, my phone is nearly full of downloaded podcasts about data analysis. (Unpaid plug here, but “The Studies Show” is my far-and-away favorite.)

So, Pratyush’s question comes at a good time for me, because I’ve come to realize that I am one of those humans who is “bad at probability.” My dad, a psychologist, used to say, “Half of people are below average intelligence, you know,” to try to teach me the point. But whatever your dad says is worthless until you hit 20. So, let’s dig into why I, and many people, are so bad at probability.

To do that, I will try my layman’s best to explain Bayes’ theorem and how it relates to “hunches” and “black and white.” Then, we can explore the extent to which humans actually fit the mold of the Enlightenment ideal. Are humans actually that rational? Are we meant to be?

Bayes’ theorem: The rational ideal

Bayes’ theorem is arguably the single most important thing any wannabe rational person can learn. So many of the debates and disagreements we shout about arise because we don’t understand Bayes’ theorem, or how human rationality actually works.

Bayes’ theorem is named after the 18th-century mathematician Thomas Bayes, and essentially, it’s a formula that asks: When you are presented with all of the evidence for something, how much should you believe it?
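For readers who want to see it, the formula itself is compact. In the standard notation, where H is a hypothesis and E is a piece of evidence:

```latex
P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}
```

Here P(H) is your prior (how strongly you believed the hypothesis before seeing the evidence), P(E | H) is how likely that evidence would be if the hypothesis were true, and P(H | E) is the “posterior” — your updated degree of belief.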

Bayes’ theorem teaches us that our beliefs are not fixed; they are probabilities. Our beliefs change as we weigh up new evidence against our assumptions, or our “priors.” In other words, we all carry with us certain ideas about how the world works and new evidence will challenge us. For example, somebody might believe that “smoking is safe,” that “Vitamin C prevents sickness,” or that “human activity is unrelated to climate change.” These are their priors: their existing beliefs, formed by culture, biases, and information they’ve encountered.

Now, imagine a new study that challenges one of your priors. Well, a single study might not carry enough weight to overturn your existing beliefs, but imagine the studies accumulate and eventually the scales start to tip. Study by study, your prior becomes less and less plausible.
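To make that concrete, here is a minimal sketch of the updating process in Python. The numbers are illustrative assumptions, not real data: we start 95% confident in a prior (say, “Vitamin C prevents sickness”) and watch the belief erode as contrary studies accumulate.

```python
# Minimal sketch of repeated Bayesian updating with made-up numbers.
prior = 0.95  # initial confidence that the belief is true

# Assumed likelihoods: a "no effect" study is four times more likely
# to appear if the belief is false than if it is true.
likelihood_if_true = 0.2   # P(study finds no effect | belief is true)
likelihood_if_false = 0.8  # P(study finds no effect | belief is false)

for study in range(1, 9):
    numerator = likelihood_if_true * prior
    denominator = numerator + likelihood_if_false * (1 - prior)
    prior = numerator / denominator  # the posterior becomes the new prior
    print(f"After study {study}: belief = {prior:.2f}")
```

The point is structural rather than numerical: each posterior becomes the prior for the next study, which is why evidence compounds and the scales eventually tip.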

Bayes’ theorem argues that being rational is not about black and white, as Pratyush pointed out. It’s not even about true or false. It’s about what’s most reasonable based on the best available evidence. But for this to work, we need as much high-quality data as possible. Without evidence, without belief-forming data, we have only our priors and biases.

Priors and biases: Why we’re not all that rational

This column is called Everyday Philosophy, not Scientific Method 101. The job here is to look at human beliefs, the human condition, and how societies work more broadly. And when we look at it from this position, Bayes’ theorem definitely hits a wall. After all, while Bayes’ theorem is a great — possibly the greatest — way to interpret data and move the dial on scientific findings, it is not the only way to account for human belief.

In our day-to-day lives, “new evidence” is rarely, if ever, a peer-reviewed, double-blind study published in a reputable academic journal. It’s an indefinable blur of personal experience, trusted testimony, background hunches, and what that guy wrote on social media last week. We could present a heavy ledger of “cognitive biases” and “logical fallacies,” and many have. Take the authority bias, where we think opinions from certain authority figures count for more (even on topics outside their expertise). But these biases are not unwelcome shortcomings we should always purge. They are there for a reason.

One of the best and most familiar examples of this is in Daniel Kahneman’s work, such as Thinking, Fast and Slow. The human mind has evolved over hundreds of thousands of years in step with an environment that demanded particular cognitive abilities. Over evolutionary timescales, very little is “pointless”: survival of the fittest leaves little room for redundancy. These biases, these “bad at probability” mindsets, serve a purpose. Take the “optimism bias,” which makes us believe we’re less likely than others to experience negative events like illness or accidents. This can boost motivation and resilience, encouraging people to take risks and strive for goals they might otherwise avoid. If every new entrepreneur fully grasped the statistical likelihood of failure, they’d probably never start. Human agency, and especially human daring, demands a high degree of risk ignorance.

Bad but good

So, I agree with you, Pratyush. I think a lot of people are bad at probabilities. We often don’t understand how things like Bayesian statistics work, nor do we appreciate how much uncertainty surrounds so many things. But while being “bad at probability” might lead to flawed decisions sometimes, it’s also deeply tied to how we navigate an uncertain and messy world.

Rationality, like Bayes’ theorem, gives us an ideal to strive toward, perhaps, but our evolutionary quirks remind us that we’re still very much human.

The post “Everyday Philosophy: Bad at probability? That might be a blessing.” by Jonny Thomson was published on 12/20/2024 by bigthink.com