If you aren't afraid to get a little math on your hands, you might find yourself able to reach logical conclusions with the humility to know that we never really know anything completely. Logical conclusions are more of a gray scale than a black-and-white certainty.
Even math deals in probabilities, not rock-solid truths: best guesses that can, and should, shift our perceptions and even our conclusions as the data evolve. I like this concept because it means we can all collectively exhale. Truth and knowledge are a continuum, a dynamic theorem that we update across our Bayes network when we receive new information. The only problem? Our brains are not Bayesian. If they were, every new piece of information would propagate across all the nodes, correcting the fallacies or inaccuracies in our previous conclusions.
The truth is more alarming. A litter of wrong conclusions and assumptions clutters up the network, and we don't really know which "truths" belong to which beliefs. It makes sense: we lack the ability to update our interconnected, tangential beliefs when presented with new information. These orphaned beliefs are part of the reason we so often hold inconsistent ones.
There is an application of Bayes' theorem that I find interesting. You have likely heard about the over-screening of women for breast cancer. I am not talking about women with familial or genetic risk factors, but a routine screening population. Here is how probabilities and data can be used to help reach a conclusion.
Here is a specific example. You can use whatever data you want for each variable, but these numbers are pretty close to actual statistics. Approximately 1% of women aged 40-50 have breast cancer. Consider women being screened without any diagnostic information, just a routine evaluation. A woman with breast cancer has a 90% chance of a positive mammogram, while a woman who does not have cancer has a 10% chance of a false positive.
What is the probability that a woman has breast cancer given that she just had a positive test? Here is the Wikipedia definition: "In probability theory and statistics, Bayes' theorem (alternatively Bayes' law or Bayes' rule) describes the probability of an event, based on conditions that might be related to the event."
A = "positive test"
B = "breast cancer"
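With those labels, Bayes' theorem for this problem reads:

P(B | A) = P(A | B) × P(B) / P(A)

where P(A), the overall chance of a positive test, expands to P(A | B) × P(B) + P(A | not B) × P(not B). This expansion is exactly what the tree diagram below computes branch by branch.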
Let's look at a tree diagram. Moving left to right, the first node is the statistic that 1% of women aged 40 to 50 have breast cancer.
The top branch and second node indicate that 90% of women with breast cancer will have a positive mammogram, and 10% will have a false negative.
Of the remaining 99% who do not have breast cancer, 90% will have a negative mammogram, but 10% will have a false positive.
P(B | A) = (.01)(.90) / [(.01)(.90) + (.99)(.10)]
Plug in the numbers from the "+" paths and you get .009 / .108, or about 8.3%: the actual probability, from a Bayesian perspective, that a woman with a positive screening test has cancer.
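The arithmetic above can be sketched in a few lines of Python. The function name and parameter names are my own choices for illustration; the numbers are the ones assumed in the text (1% prevalence, 90% sensitivity, 10% false-positive rate).

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(cancer | positive test) via Bayes' theorem."""
    # Numerator: the "+" path through the cancer branch of the tree.
    true_positives = prior * sensitivity
    # The other "+" path: positives from the much larger no-cancer branch.
    false_positives = (1 - prior) * false_positive_rate
    # Bayes' theorem: divide the true-positive mass by all positive mass.
    return true_positives / (true_positives + false_positives)

p = posterior(prior=0.01, sensitivity=0.90, false_positive_rate=0.10)
print(f"{p:.1%}")  # prints 8.3%
```

Playing with the inputs makes the intuition concrete: because the no-cancer branch is 99 times larger than the cancer branch, even a modest false-positive rate swamps the true positives.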