In figure skating there are two buckets of competition: the prized innovative routines, as varied and charismatic as the individual athletes, and the compulsories, a series of set maneuvers identical for every competitor. The compulsories let us evaluate how the foundational skills are performed and determine who deserves the highest marks.
I work in healthcare. You should notice that I did not say pharmaceuticals, health policy, or even health economics. I believe that when we firewall ourselves into distinct silos, we become part of the problem. How can you work within the pharmaceutical industry and not be curious about upstream prevention strategies, or about the influence of social determinants of health on clinical outcomes?
In the same manner, health policy does not exist in a vacuum. I have written about the historical precedent for framing healthcare as a "public good," but that thread is long ignored during debates. How can health economics ignore these determinations when discussing the cost-benefit of therapeutics? You would almost think we were ignoring the compulsory truths. Perhaps we are confusing skepticism with cynicism?
Each of us is unique in the interplay of genetic makeup and environment. The path to maintaining or regaining health is not the same for everyone. Choices in this gray zone are frequently not simple or obvious. For that reason, medicine involves personalized and nuanced decision-making by both the patient and the doctor...Although presented as scientific, formulas that reduce the experience of illness to numbers are flawed and artificial.
Many analyses require databases from multiple health plans or healthcare delivery systems. The bulk of my work is often spent harmonizing those databases--identifying comparable variables across sources. A structure is needed to convert the data. Luckily for me, Sentinel (the national medical product monitoring system) and PCORI share a common data model. This often is not the case.
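To make the harmonization step concrete, here is a minimal pandas sketch of mapping two hypothetical health-plan extracts into one shared schema. Every column name and coding below is invented for illustration; it is not the actual Sentinel or PCORnet layout.

```python
import pandas as pd

# Hypothetical extracts from two health plans: the same concepts appear
# under different variable names and codings.
plan_a = pd.DataFrame({
    "member_id": [1, 2],
    "sex_cd": ["M", "F"],          # coded as M/F
    "a1c_pct": [6.1, 7.4],
})
plan_b = pd.DataFrame({
    "patid": [3, 4],
    "gender": [1, 2],              # coded as 1=male, 2=female
    "hba1c": [5.9, 8.2],
})

# Rename each source's variables into a shared target schema, in the
# spirit of a common data model.
common_a = plan_a.rename(columns={
    "member_id": "person_id", "sex_cd": "sex", "a1c_pct": "hba1c_pct"})
common_b = plan_b.rename(columns={
    "patid": "person_id", "gender": "sex", "hba1c": "hba1c_pct"})

# Recode plan B's numeric sex values onto plan A's convention.
common_b["sex"] = common_b["sex"].map({1: "M", 2: "F"})

# Stack the harmonized sources into one analysis-ready table.
harmonized = pd.concat([common_a, common_b], ignore_index=True)
```

The renaming and recoding maps are exactly the "structure to convert data" that a common data model supplies up front, so each new data partner only has to write its own mapping once.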
We lack harmonization and interoperability. You certainly don't need me to tell you that. But you do need to understand how the lack of standardization may impact the data you generate, report, or consume. Here is a little bugaboo I recently noticed while reviewing biomarker prediction data. Biomarkers are endogenous, time-varying covariates: their future path depends on previous events, including whether the patient survives to generate the next measurement. Standard time-varying Cox models, which assume exogenous covariates, are not appropriate.
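Time-varying covariates for a Cox model are conventionally arranged in start-stop (counting-process) format, one row per interval during which the covariate value is in force. A small sketch with invented data for a single hypothetical patient:

```python
import pandas as pd

# Biomarker measurements for one hypothetical patient, in long format.
measurements = pd.DataFrame({
    "person_id": [7, 7, 7],
    "obs_time": [0, 30, 90],      # days on which the biomarker was drawn
    "biomarker": [1.2, 1.8, 2.5],
})
event_time, had_event = 120, 1    # follow-up ended at day 120 with an event

# Build (start, stop] intervals: each measurement's value applies until
# the next measurement; the last interval runs to end of follow-up.
rows = measurements.copy()
rows["start"] = rows["obs_time"]
rows["stop"] = rows["obs_time"].shift(-1).fillna(event_time)
rows["event"] = [0, 0, had_event]  # the event flag sits on the final interval
tv_data = rows[["person_id", "start", "stop", "biomarker", "event"]]
```

This is the input layout a time-varying Cox fitter expects; the essay's point is that for an endogenous biomarker even a correctly formatted table like this can mislead, because the covariate path itself is informed by survival.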
Because biomarker measurements are collected only at specific visits, the biomarker may be missing at other observed time points. If imputation approaches such as last observation carried forward are used, they may introduce bias into the model estimation. Joint modeling of time-dependent biomarkers is a new concept for me, but it was explained quite nicely at the recent Joint Statistical Meetings in Baltimore. A longitudinal submodel characterizes the unobservable trajectory as a latent, time-dependent covariate in the survival model used to predict failure times.
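To see what last observation carried forward actually does to the data, here is a minimal pandas sketch with invented values for two hypothetical patients. `NaN` marks visits where the biomarker was not drawn.

```python
import numpy as np
import pandas as pd

# Sparse biomarker measurements in long format for two hypothetical
# patients; NaN marks visits where no sample was taken.
long = pd.DataFrame({
    "person_id": [1, 1, 1, 2, 2, 2],
    "visit_day": [0, 30, 60, 0, 30, 60],
    "biomarker": [1.0, np.nan, 2.0, 3.0, 3.5, np.nan],
})

# Last observation carried forward, applied within each patient so that
# one patient's values never leak into another's record.
long["locf"] = long.groupby("person_id")["biomarker"].ffill()
```

Note what the fill does: patient 1's day-30 value is frozen at 1.0 even though the trajectory was rising toward 2.0, and patient 2's final value is held flat. That frozen-trajectory assumption is one route by which LOCF biases downstream survival estimates, and it is what the joint-modeling approach avoids by modeling the latent trajectory instead.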
A quote from Willful Ignorance: The Mismeasure of Uncertainty underscores why we can't rely on statisticians to be the only data-aware professionals at the table.
"Relying uncritically on statistics for answers has become so second-nature to us that we have forgotten how recent and revolutionary this way of thinking really is. That is the crux of problems we now face.
"Fortunately, there is a path out of the stagnation that plagues our research currently, and it is surprisingly simple, in theory...In a nutshell, we must learn to become more mindful in applying probability-based statistical methods and criteria."
I know Big Data is the new blockbuster insight. But does the data really inform?
Or is it the meaning we attach to the data? What is the old adage to be learned from the Tarascon Pocket Pharmacopoeia?
The dose is the poison...