This reminds me of a good joke. How can you identify an extroverted mathematician? He is the one looking at your shoes instead of his own. Pretty funny, right?
If you aren't afraid to get a little math on your hands, you might find yourself able to reach logical conclusions with the humility to know that we don't really know anything completely. Logical conclusions are more of a gray scale and less of a black-and-white certainty.
Even math gives us probabilities--not rock-solid truths. They are best guesses that can, and should, shift our perceptions and yes, even our conclusions, in the face of evolving data. I like this concept because we should all collectively exhale. Truth and knowledge are a continuum--a dynamic theorem that we update across our Bayes network when we receive new information. The only problem? Our brains are not Bayesian. If they were, every new piece of information would propagate across all the nodes, correcting the fallacies or inaccuracies in our previous conclusions.
The truth is more alarming. A litter of wrong conclusions and assumptions clutters up the network, and we don't really know which "truths" belong to which beliefs. It makes sense: we lack the ability to update our interconnected, tangential beliefs when presented with new information. These orphaned beliefs are part of the reason we so often hold inconsistent beliefs.
There is an application of Bayes' theorem that I find interesting. You have likely heard about the over-screening of women for breast cancer--not women with familial or genetic risk factors, but a routine screening population. Here is how probabilities, or data, can be used to help reach a conclusion.
Here is a specific example. You can use whatever data you want for each variable, but this one is pretty close to actual statistics. Approximately 1% of women aged 40-50 have breast cancer. Let's think of women going to be screened without any diagnostic information--just a routine evaluation. A woman with breast cancer has a 90% chance of a positive mammogram, while a woman who does not have cancer has a 10% chance of a false positive.
What is the probability a woman has breast cancer given that she just had a positive test? Here is the wiki definition "In probability theory and statistics, Bayes' theorem (alternatively Bayes' law or Bayes' rule) describes the probability of an event, based on conditions that might be related to the event."
A = “positive test”

B = “breast cancer”

Plugging the numbers into Bayes' theorem:

P(B|A) = P(A|B) × P(B) / [P(A|B) × P(B) + P(A|not B) × P(not B)]
= (0.90 × 0.01) / [(0.90 × 0.01) + (0.10 × 0.99)]
= 0.009 / 0.108, or about 8.3%
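The question above can be answered in a few lines of Python, using the numbers from the example (1% prevalence, 90% sensitivity, 10% false positives). The variable names are my own--just a minimal sketch of the arithmetic:

```python
p_cancer = 0.01               # P(B): prior, ~1% of women aged 40-50
p_pos_given_cancer = 0.90     # P(A|B): chance of a positive test if cancer
p_pos_given_no_cancer = 0.10  # P(A|not B): false-positive rate

# Total probability of a positive test, P(A)
p_pos = (p_pos_given_cancer * p_cancer
         + p_pos_given_no_cancer * (1 - p_cancer))

# Bayes' theorem: P(B|A) = P(A|B) * P(B) / P(A)
p_cancer_given_pos = p_pos_given_cancer * p_cancer / p_pos

print(f"{p_cancer_given_pos:.1%}")  # → 8.3%
```

Even with a positive test, the chance of actually having cancer is only about 8.3%--the tiny prior swamps the test's accuracy.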
One more consideration is the Bayesian principle of remembering your priors--forgetting them is the error formally known as "base-rate neglect." We must also consider the actual percentage of women who indeed get mammograms. The probability of 8.3% is valid only if EVERYONE is tested, or a random sample of women is tested (recall the 99% and the 1%). The odds are overwhelmingly in favor of not having cancer unless we factor in family history and other relevant demographic information. This is why guidelines have shifted away from mass screening and toward shared decision making at the point of care.
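To see why the prior matters so much, you can rerun the same calculation with different base rates. The higher-risk priors below are illustrative numbers I picked for the sketch, not actual risk-group statistics:

```python
def posterior(prior, sensitivity=0.90, false_positive_rate=0.10):
    """P(cancer | positive test) for a given base rate (prior)."""
    p_pos = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_pos

# Same test, different populations (priors are hypothetical):
for prior in (0.01, 0.05, 0.10):
    print(f"prior {prior:.0%} -> posterior {posterior(prior):.1%}")
# prior 1%  -> posterior 8.3%
# prior 5%  -> posterior 32.1%
# prior 10% -> posterior 50.0%
```

The test never changes--only the population being tested--yet the meaning of a positive result changes dramatically. That is the argument for targeted screening over mass screening in a nutshell.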
Join the newsletter to crowdsource the new book about surveys--publishing in July. Pre-order here...
I gave a quick 6-minute-and-40-second presentation on fallacies at CMEpalooza last week--if you would like to see the presentations, they are fast and fun! I think I am the third to go, so about 20 minutes in, but I think you will enjoy them all!
Thoughtful discussions about content development and outcomes analytics that apply the principles and frameworks of health policy and economics to persistent and perplexing health and health care problems.
Browse the archive...
Thank you for making a donation!
In a world of "evidence-based" medicine I am a bigger fan of practice-based evidence.
Remember the quote by Upton Sinclair...
“It is difficult to get a man to understand something, when his salary depends upon his not understanding it!”