

immovable objects meeting unstoppable forces

8/30/2019

I usually discover misleading data collection and reporting the same way everybody else does.

An eye-catching headline or something similar limps across a social media feed--not unlike a wounded animal in the wild--and the eye is drawn.

In this recent instance, it was the claim that 60% of healthcare executives say they use predictive analytics. Clearly the headline referred to this recent report from the Society of Actuaries--2019 Predictive Analytics in Health Care Trend Forecast.

​I have "0" familiarity with the Society of Actuaries but I loosely know what their professional responsibilities include.

​Wikipedia defines an actuary as "a business professional who deals with the measurement and management of risk and uncertainty".
Let's embed the discussion of this report alongside a few steps to improve your survey game and post-survey analytics, shall we?

Full disclosure: I have not seen the survey, and for all I know some of these practices were implemented under the hood--but if they were, why not say so?

Define what you are measuring. How do respondents know whether predictive analytics are being used in their organization if they aren't given a baseline definition of predictive analytics, or told how to judge whether--and in what capacity--it is being used?

Report the raw numbers. When percentages are reported without accompanying numerators and denominators, how are we to evaluate whether the findings extrapolate to the real world? Perhaps they only apply to the 201* respondents to the survey. How many received the survey? If it was sent to 20,000 health payer and provider executives, how reflective are the findings of this single survey of the larger group? A quick sketch of the sampling math follows below.

*Stated on the last page of the report--100 health payer executives and 101 health provider executives were interviewed.

What is a health provider executive? I know what I think they are. Are they defined the same way across organizations?
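Here is a minimal sketch, in Python, of what reporting the raw numbers lets you do. It assumes a simple random sample (which the report doesn't claim), uses the 201 respondents from the footnote, and treats the 20,000 invitations as a purely illustrative figure.

import math

# Sampling uncertainty around the headline "60% use predictive analytics",
# assuming a simple random sample of the 201 interviewed executives.
n = 201          # 100 payer + 101 provider executives (report footnote)
p_hat = 0.60     # headline proportion

# 95% confidence interval via the normal approximation
se = math.sqrt(p_hat * (1 - p_hat) / n)
lower, upper = p_hat - 1.96 * se, p_hat + 1.96 * se
print(f"60% of {n} respondents -> 95% CI of roughly {lower:.1%} to {upper:.1%}")

# If the survey went out to 20,000 executives (an illustrative number only),
# the response rate alone is worth reporting.
invited = 20_000
print(f"Response rate: {n / invited:.1%}")

With n = 201 the interval spans roughly 53% to 67%--wide enough that "more than half" is about all the headline supports.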

You can't compare percentages from one year to the next without stating the raw numbers. Clearly the number of respondents varies from year to year, so how are we intended to evaluate a 13-percentage-point increase from last year? Or a 6-point increase from 2017?
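One way to see why the raw counts matter: with them you can at least run a back-of-the-envelope two-proportion test on the 13-point jump. The sample sizes below are made up for illustration; the report doesn't state them.

from math import sqrt

def two_prop_z(p1, n1, p2, n2):
    """Two-proportion z-statistic for the difference p2 - p1 (normal approximation)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# The same 13-point jump (47% -> 60%) under two hypothetical sample sizes:
print(two_prop_z(0.47, 201, 0.60, 201))   # ~2.6: plausibly a real change
print(two_prop_z(0.47, 50, 0.60, 50))     # ~1.3: could easily be noise

Without the denominators, a 13-point swing and ordinary year-to-year noise can look identical.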

How are they using predictive analytics, and are the uses comparable across organizations? What exactly are they trying to predict? Employee retention? Per member per month (PMPM) capitation payments?

You need specificity in your outcomes. "Nearly two-thirds of executives (61%) forecast that predictive analytics will save their organization 15% or more over the next five years."

Save their organization what?

Who are these organizations that are saying no to predictive analytics? Isn't that the foundational algorithm embedded in healthcare outcomes forecasting?

In the absence of a workable definition of predictive analytics presented to the respondents--what can we say for certain?
To be honest with you, I don't even know what I am looking at in one of the report's graphics. No idea. So I am simply going to skip it. Don't report low-value information. Not everything needs a graphic.
My confusion continues with the next graphic. Costs of what? You need to measure specifically in order to know if costs were reduced. And what does "Staffing/workforce needs Clinical outcomes" mean? Is it a typo? Is part of the chart missing?

At first glance I would also assume that the "actual results overall" are negative. Why? The color choice. Red in a chart can be misinterpreted because we all arrive at graphicacy with our own perceptions and biases.
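A small illustration of the color point, with invented numbers standing in for the chart's values: the same bars read very differently in red versus a neutral hue.

import matplotlib.pyplot as plt

# Same (invented) numbers, two color choices.
categories = ["Expected savings", "Actual results overall"]
values = [15, 9]

fig, axes = plt.subplots(1, 2, figsize=(8, 3), sharey=True)
axes[0].bar(categories, values, color="red")        # red reads as "bad"/negative
axes[0].set_title("Red invites a negative reading")
axes[1].bar(categories, values, color="#4C72B0")    # neutral blue, same data
axes[1].set_title("Neutral color, same numbers")
plt.tight_layout()
plt.show()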
Another problem with not seeing the actual survey: I am not sure whether the response choices were ranked or asked in a Likert or multiple-choice format. Ranking would be preferred and would add value even if I don't quite understand the responses. For example, what question wording would have you respond "Data visualization"?

If the question about future predictive capabilities yielded "Data visualization" at 23%, what are you all looking at now?

Ranking questions can yield probabilities, but any other format would be reporting descriptive statistics only.
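To illustrate that distinction, here is a toy sketch of one thing ranked responses can support that a single-choice item cannot: pairwise preference probabilities. The option names and rankings below are invented.

from itertools import combinations
from collections import Counter

# Invented ranked responses (most preferred first).
rankings = [
    ["Data visualization", "Cost forecasting", "Clinical outcomes"],
    ["Cost forecasting", "Data visualization", "Clinical outcomes"],
    ["Data visualization", "Clinical outcomes", "Cost forecasting"],
]

options = sorted({o for r in rankings for o in r})
wins = Counter()
for r in rankings:
    for a, b in combinations(r, 2):      # a is ranked above b in this response
        wins[(a, b)] += 1

for a, b in combinations(options, 2):
    p = wins[(a, b)] / len(rankings)
    print(f"P({a} preferred over {b}) = {p:.2f}")

A multiple-choice or Likert item on the same options would only give you the kinds of counts and percentages the report presents.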
This report--although well intentioned--lacks clarity in a way that seems at odds with the work of an actuarial organization.

My point isn't to blame but to demonstrate how we can all do better--myself included.

We need to slow down and take the time to make sure that we aren't introducing unsolvable paradoxes with our own data collection.

After all, when an unstoppable force meets an immovable object, the laws of physics are quite boring. Don't let your survey become a black hole...

