This reminds me of a good joke: how can you identify an extroverted mathematician? He is the one looking at your shoes instead of his own. Pretty funny, right?
If you aren't afraid to get a little math on your hands, you might find yourself able to reach logical conclusions with the humility to know that we don't really know anything completely. Logical conclusions are more of a gray scale and less of a black-and-white certainty.
Even maths are probabilities, not rock-solid truths: best guesses that can, and should, shift our perceptions and, yes, even our conclusions in the face of evolving data. I like this concept because we should all collectively exhale. Truth and knowledge are a continuum, a dynamic theorem that we update across our Bayes network when we receive new information. The only problem? Our brains are not Bayesian. If they were, every new piece of information would propagate across all the nodes, correcting the fallacies or inaccuracies in our previous conclusions.
The truth is more alarming. The litter of wrong conclusions and assumptions clutters up the network. We don't really know which "truths" belong to which beliefs. It makes sense: we lack the ability to update our interconnected, tangential beliefs when presented with new information. These orphan beliefs are part of the reason we so often hold inconsistent beliefs.
There is an application of Bayes' theorem that I find interesting. You have likely heard about the overscreening of women for breast cancer. I am not talking about women with familial or genetic risk factors, but a routine screening population. Here is how probabilities and data can be used to help reach a conclusion.
Here is a specific example. You can use whatever data you want for each variable, but this one is pretty close to actual statistics. Approximately 1% of women aged 40 to 50 have breast cancer. Let's think of women going to be screened without any diagnostic information, just a routine evaluation. A woman with breast cancer has a 90% chance of a positive mammogram, while a woman who does not have cancer has a 10% chance of a false positive.
What is the probability a woman has breast cancer given that she just had a positive test? Here is the Wikipedia definition: "In probability theory and statistics, Bayes' theorem (alternatively Bayes' law or Bayes' rule) describes the probability of an event, based on conditions that might be related to the event."
A = "positive test"
B = "breast cancer"
Let's look at a tree diagram. Moving left to right, the first node splits on the statistic that 1% of women aged 40 to 50 have breast cancer.
Along the top branch, the second node indicates that 90% of women with breast cancer will have a positive mammogram and 10% will have a false negative. Of the remaining 99% who do not have breast cancer, 90% will have a negative mammogram but 10% will have a false positive.
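The four leaves of that tree can be enumerated in a short sketch (Python used purely for illustration; the prevalence, sensitivity, and false-positive figures are the ones from the example above, and the variable names are my own):

```python
# The article's numbers: 1% prevalence, 90% sensitivity, 10% false-positive rate.
prior, sens, fp_rate = 0.01, 0.90, 0.10

# Joint probability of each leaf of the tree diagram.
branches = {
    ("cancer", "positive"):    prior * sens,              # true positive
    ("cancer", "negative"):    prior * (1 - sens),        # false negative
    ("no cancer", "positive"): (1 - prior) * fp_rate,     # false positive
    ("no cancer", "negative"): (1 - prior) * (1 - fp_rate),  # true negative
}

for leaf, p in branches.items():
    print(leaf, round(p, 4))

# The four leaves are exhaustive and mutually exclusive, so they sum to 1.
assert abs(sum(branches.values()) - 1.0) < 1e-9
```

Notice that the false-positive leaf (0.099) dwarfs the true-positive leaf (0.009), which is the whole story of the calculation that follows.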
P(B | A) = (0.90)(0.01) / [(0.90)(0.01) + (0.10)(0.99)]
Plug in the numbers from the two positive-test paths and you should get about 8.3%, the probability from a Bayesian perspective of actually having cancer given a positive screen.
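The calculation itself is a one-liner. Here is a minimal sketch, assuming the example's numbers; the function name `posterior` is just illustrative:

```python
def posterior(prior, sens, false_pos):
    """Bayes' theorem: P(cancer | positive test)."""
    true_pos_mass = sens * prior               # women with cancer who test positive
    false_pos_mass = false_pos * (1 - prior)   # women without cancer who test positive
    return true_pos_mass / (true_pos_mass + false_pos_mass)

p = posterior(prior=0.01, sens=0.90, false_pos=0.10)
print(f"{p:.1%}")  # 8.3%
```

The denominator is just the total probability of a positive test, summed over both branches of the tree.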

One more consideration is the Bayesian principle of remembering your priors; forgetting them is officially termed "base-rate neglect". We must also consider the actual percentage of women who indeed get mammograms. The probability of 8.3% is valid only if EVERYONE is tested, or a random sample of women is tested (recall the 99% and the 1%). The odds are overwhelmingly in favor of not having cancer unless we factor in family history and other relevant demographic information. This is why guidelines have shifted away from mass screening and toward shared decision making at the point of care.
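To see how much the prior drives the answer, the same calculation can be rerun at a few prevalence levels. Only the 1% figure comes from the example; the other two prevalences are hypothetical, chosen to show the swing:

```python
def posterior(prior, sens=0.90, false_pos=0.10):
    """P(cancer | positive test), with the example's test characteristics as defaults."""
    tp = sens * prior
    fp = false_pos * (1 - prior)
    return tp / (tp + fp)

# 0.1% and 10% are hypothetical prevalences for illustration only.
for prior in (0.001, 0.01, 0.10):
    print(f"prevalence {prior:.1%} -> P(cancer | positive) {posterior(prior):.1%}")
```

With the identical test, the post-test probability runs from under 1% to 50% depending solely on the base rate, which is exactly why screening a low-risk population and screening a high-risk one are different decisions.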
Join the newsletter to crowdsource the new book about surveys, publishing in July. Preorder here...
I gave a quick 6-minute-and-40-second presentation on fallacies at CMEpalooza last week. If you would like to see the presentations, they are fast and fun! I think I am the third to go, so about 20 minutes in, but I think you will enjoy them all!
Thoughtful discussions about content development and outcomes analytics that apply the principles and frameworks of health policy and economics to persistent and perplexing health and health care problems.