Tim's Share: on Bayes’ theorem
- Feb 23
- 2 min read
Updated: Mar 2

From Tim: Bayes’ theorem is really just a reminder that our beliefs aren’t fixed truths; they’re starting points. And they should shift, slowly and proportionally, as new evidence comes in.
So many arguments harden because we treat our priors as facts rather than probabilities.
If we held our views a little more lightly and weighed new information more honestly, our conversations might look very different.
In case you don't have an X account, it's a short one, so I'm copy-pasting it below:
Bayes’ theorem is probably the single most important thing any rational person can learn. So many of the debates and disagreements we shout about arise because we don’t understand Bayes’ theorem, or how human rationality works.

Bayes’ theorem is named after the 18th-century minister Thomas Bayes, and it is essentially a formula that asks: when you are presented with all of the evidence for something, how much should you believe it? Bayes’ theorem teaches us that our beliefs are not fixed; they are probabilities. Our beliefs change as we weigh new evidence against our assumptions, our priors.

In other words, we all carry certain ideas about how the world works, and new evidence can challenge them. For example, somebody might believe that smoking is safe, that stress causes mouth ulcers, or that human activity is unrelated to climate change. These are their priors, their starting points. They can be formed by our culture, our biases, or even incomplete information.

Now imagine a new study comes along that challenges one of your priors. A single study might not carry enough weight to overturn your existing beliefs. But as studies accumulate, eventually the scales may tip. At some point, your prior will become less and less plausible.

Bayes’ theorem argues that being rational is not about black and white. It’s not even about true or false. It’s about what is most reasonable based on the best available evidence. But for this to work, we need to be presented with as much high-quality data as possible. Without evidence, without belief-forming data, we are left only with our priors and biases. And those aren’t all that rational.
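The update rule behind this is simple enough to sketch in a few lines. Below is a minimal Python illustration of repeated Bayesian updating; the prior of 0.90 and the likelihoods are made-up numbers, chosen only to show how accumulating studies can tip the scales against a once-strong belief:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return the posterior P(H|E) given a prior P(H) and the
    likelihood of the evidence under H and under not-H."""
    numerator = p_e_given_h * prior
    evidence = numerator + p_e_given_not_h * (1 - prior)
    return numerator / evidence

# Hypothetical numbers: a strong prior that some claim is true (0.90),
# updated by a series of studies, each of which is four times more
# likely under the alternative than under the claim (0.2 vs 0.8).
belief = 0.90
for study in range(1, 6):
    belief = bayes_update(belief, p_e_given_h=0.2, p_e_given_not_h=0.8)
    print(f"after study {study}: P(claim) = {belief:.3f}")
```

One study drops the belief only to about 0.69; five studies in a row drop it below 0.01. No single piece of evidence settles the matter, but the weight accumulates, which is exactly the "scales tipping" the post describes.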
💬 A quick note: replies can be easy to miss here, so feel free to add a new comment rather than replying directly. This isn’t a fast-fire space; it’s intentionally slower, shaped for thoughtful engagement with the ideas themselves rather than back-and-forth responses.

Best exemplified by the famous Monty Hall Problem: always switch! Always revise and adapt upon receiving new information! A great mental model that helps in all kinds of situations.
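For anyone who wants to check the Monty Hall claim numerically rather than take it on faith, here's a quick Python simulation (the trial count is arbitrary). Switching wins whenever your first pick was wrong, which happens about two thirds of the time:

```python
import random

def monty_hall(trials=100_000):
    """Simulate the Monty Hall game; return win rates for staying and switching."""
    stay_wins = switch_wins = 0
    for _ in range(trials):
        car = random.randrange(3)    # door hiding the car
        pick = random.randrange(3)   # contestant's first choice
        # Host opens a door that is neither the pick nor the car
        opened = random.choice([d for d in range(3) if d != pick and d != car])
        # The switch strategy takes the one remaining closed door
        switched = next(d for d in range(3) if d != pick and d != opened)
        stay_wins += (pick == car)
        switch_wins += (switched == car)
    return stay_wins / trials, switch_wins / trials

stay, switch = monty_hall()
print(f"stay wins: {stay:.2f}, switch wins: {switch:.2f}")
```

With 100,000 trials the stay rate lands near 1/3 and the switch rate near 2/3: the host's reveal is new information, and the Bayesian move is to update on it.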
This is an interesting idea, but not how I think humans operate. Bayes was living in an age (the 1700s) when the Christian religion was being challenged by the rationalists. It was never a direct challenge, rather a slow erosion of the dogma of the day.
As a minister himself, he was trying to make sense of the world around him, where rationalists were seeking answers that were explainable by facts. Humans are not always rational and do not necessarily change their views based upon rational thought processes. Evidence, facts, and truth are all important, but so are subjective views of our world.
Descartes (1650 period) is known for his "I think therefore I am". Regrettably he seems to have…