I like the idea of a blueprint for developing a scalable data strategy. It isn't wise to prescribe a step-by-step recipe for something as abstract and heterogeneous as an analytics strategy. I suspect many organizations, and even individuals, cluster around the mean of a normal curve yet remain smug and self-congratulatory about their data frameworks, or even about their process for evaluating the data on hand.
A blueprint is drawn with the entire framework integrated into the evolving architectural plan. We need to see what is possible before selecting the "finishes" or design elements.
You will need to trust me up front, or at least stay with me while I walk through a case study, when I tell you that a large part of ineffective data strategy stems from "not knowing what we don't know." The types of questions, and even the answer options, built into a survey will distort your findings and diminish the insights pulled from later analyses. Remember the old adage, "garbage in, garbage out"?
Data out in the real world is complicated. Understandably, the patient populations in clinical trials need to be homogeneous so that we can draw conclusions about any signals observed during the trial period.
Unfortunately, this limits what the data will mean outside the trial. Your clinic waiting room is not curated. Patients have comorbidities and differences in sex, age, ethnicity, and social correlates, and they will likely respond differently in the real world, beyond the limits of small study populations and short evaluation periods.
In a world of "evidence-based" medicine I am a bigger fan of practice-based evidence.
Remember the quote by Upton Sinclair...
“It is difficult to get a man to understand something, when his salary depends upon his not understanding it!”