Today we look at Bayes’ theorem and common sense, highlighting the role of prior beliefs.
How Should We Think About Probabilities?
There is a big rift in the way people think about probabilities. (This is in addition to the normal challenges of doing the math.) Events either happen or they don’t. We can think of probabilities as what has happened in similar cases, but that leaves a lot of discretion in deciding what counts as a ‘similar case’.
This is especially true when looking at people, who clearly have multiple identities. If I want to know the chance that someone has a PhD, there are all sorts of indicators I could use:
- What country they are from
- Where they are from in the country
- Where they went to school
- How old they are, etc.
What is the appropriate cut? If we go too narrow, we can describe the person so precisely that they are unique. If they are the only person in their category, they either have a PhD or they don’t; calling it a probability is a bit odd. If we go too broad, e.g. using the percentage of people in the world with a PhD, we ignore potentially helpful information.
Bayesian Thinking
One important way of thinking about probability relates to Bayes’ Theorem. Thomas Bayes was an eighteenth-century Presbyterian minister whose theorem about probability made him famous. In many ways his theorem seems like common sense. Basically, you have a prior belief: what you think at the beginning of any process. Then you get new evidence, and this updates your prior belief.
You might think at the beginning of a primary that a certain candidate has a great chance to beat the incumbent president, whom you don’t like. This is your prior. As the primary process plays out you get more information. The candidate might have a great debate performance, so you raise your predicted odds. Something terrible might be revealed about the candidate’s past, so you lower the odds. Either way, you form a posterior belief. (This is a belief held after the new evidence emerges. It is not a belief about the candidate’s posterior.)
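The update described above can be sketched in a few lines of Python. This is a minimal illustration, not a model of any real primary: the prior of 40% and the likelihoods for a strong debate performance are made-up numbers, and `bayes_update` is a helper written here for the example.

```python
# A single Bayesian update, with illustrative (made-up) numbers.
# Prior: we think the candidate has a 40% chance of winning the primary.
# Evidence: a strong debate, which we assume is twice as likely
# if the candidate is a winner (0.6) than if they are not (0.3).

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior P(hypothesis | evidence) via Bayes' theorem."""
    numerator = likelihood_if_true * prior
    denominator = numerator + likelihood_if_false * (1 - prior)
    return numerator / denominator

prior = 0.40
posterior = bayes_update(prior, likelihood_if_true=0.6, likelihood_if_false=0.3)
print(round(posterior, 3))  # the debate raises our belief above 0.40
```

The same function run with bad news (evidence more likely if the candidate is *not* a winner) would push the posterior below the prior.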
Understanding Bayes’ Theorem And Prior Beliefs
I liked Scott Hartshorn’s Bayes Theorem Examples. To be honest, I couldn’t see how he could stretch the topic into a full book, but he does enough to justify it. (It is still a small book.) One of his most important points was about how much the prior matters. Pick a reasonable figure to start, but: “The initial probability estimate has the largest impact when there are few data points” (Hartshorn, 2016). This makes sense. When you get a load of new data, your prior belief matters less and less compared to the new data.
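Hartshorn’s point can be seen numerically with a standard Beta-Binomial example (my choice of illustration, not from his book): two people start with wildly different priors about a coin, and the gap between their posteriors shrinks as flips accumulate.

```python
# How much the prior matters shrinks as data accumulates.
# With a Beta(a, b) prior on a coin's heads probability, the
# posterior mean after observing flips is (a + heads) / (a + b + flips).

def posterior_mean(a, b, heads, flips):
    """Posterior mean of a Beta(a, b) prior after `heads` in `flips`."""
    return (a + heads) / (a + b + flips)

# Two very different priors: one expects heads 90% of the time, one 10%.
optimist, pessimist = (9, 1), (1, 9)

for flips in (10, 100, 10000):
    heads = flips // 2  # suppose the coin actually lands heads half the time
    gap = abs(posterior_mean(*optimist, heads, flips) -
              posterior_mean(*pessimist, heads, flips))
    print(flips, round(gap, 4))
```

After 10 flips the two posteriors still disagree badly (a gap of 0.4); after 10,000 flips the priors are all but washed out.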
Also critical was the advice to “Never set any probability to zero unless you are positive it cannot occur and you want to cut it from any future consideration.” Assigning something a zero probability means it stays at zero no matter what evidence the future turns up. If 2020 has taught us anything (and arguably the UK’s Brexit vote taught the same lesson), it is that anything can happen, however absurd it seems.
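The zero-prior trap falls straight out of the arithmetic: the prior multiplies the likelihood, so a zero prior zeroes out any evidence. A quick sketch (same toy update function as above, made-up likelihoods):

```python
# Why a zero prior is a trap: with prior = 0, Bayes' theorem returns 0
# for any evidence, however strong, because the prior multiplies the
# likelihood in the numerator.

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    numerator = likelihood_if_true * prior
    denominator = numerator + likelihood_if_false * (1 - prior)
    return numerator / denominator

print(bayes_update(0.0, 0.99, 0.01))    # 0.0: overwhelming evidence, no effect
print(bayes_update(0.001, 0.99, 0.01))  # a tiny non-zero prior can still recover
```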
Read: Scott Hartshorn (2016), Bayes Theorem Examples: A Visual Guide For Beginners. More material is available at the Fairly Nerdy site.