When healthcare is reduced to a single number
Not all numbers are the same.
I have already covered how medical Artificial Intelligence can depend on where you live, how even pulse oximeters can be racist, and the links between Amazon reviews and health insurance.
Three weeks ago I came across one more example of how bad healthcare can get when decision makers let algorithms and data run the shop without supervision.
I offered to give the story more exposure, and got the green light. So here it goes, with just minimal reformatting.
A telehealth tale by Mel Andrews
I had a “telehealth” doctor’s visit the other day. I was prescribed a scheduled medication without extensive questioning or documentation.
Why? The prescribing physician had been given a score for me, generated by a proprietary algorithm, indicating that I was at low risk for abuse and addiction.
The doctors using this system have access to neither the algorithm nor the patient data it employs. All I could think was, black patients are probably being denied life-saving medication because some rudimentary algorithm flagged them for abuse potential, on the basis of their income or the neighborhood they live in.
Life or death decisions are being placed into the hands of proprietary algorithms let loose on the public, and there is no opening of the black box or opting out.
[Said the algorithmically empowered doctor]…
“[proprietary algorithm] is able to generate a single number that quickly gives me a more accurate assessment of the patient’s…history. This number, coupled with the ease and speed of the web page, makes getting the information I need, instantaneous.”
“As such, my colleagues and I use this program much more frequently; allowing us to deliver better care to our patients.”
(added later) This is Ohio. It seems to now be state-wide. I'm new here; this is the first time I've encountered anything like it.
Algorithms should be like medicine
That is Mel Andrews' story. Me, I can only repeat what I say here, all the time: software is extremely powerful. Software can do a lot of good, even in healthcare. But it can do good only if we remain in control. No doctor would ever prescribe a drug that came in a black box with nothing but "This is good for you" written on it. Healthcare algorithms should be held to the same standard: at the very least, doctors should understand how they work, and remain ultimately responsible for what they tell patients.
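To make the danger concrete, here is a minimal, deliberately naive sketch in Python of how such a single-number "risk score" could be built. Nothing in it comes from the real product, whose inputs and weights nobody outside the vendor can inspect: the feature names, ZIP codes, thresholds, and weights are all invented for illustration. The point is how easily proxies like neighborhood and income smuggle bias into one "objective" number.

```python
# Hypothetical sketch only: NOT the actual proprietary algorithm, whose
# inputs and weights are secret. It shows how proxy variables can turn
# a "risk score" into a penalty for being poor or living in the wrong place.

HIGH_RISK_ZIP_CODES = {"45201", "45202"}  # invented "flagged" neighborhoods

def abuse_risk_score(patient: dict) -> int:
    """Return a 0-100 'abuse risk' number from crude, biased proxies."""
    score = 0
    if patient["zip_code"] in HIGH_RISK_ZIP_CODES:
        score += 40  # neighborhood acting as a proxy for race and income
    if patient["income"] < 30_000:
        score += 30  # poverty itself treated as a risk factor
    if patient["prior_prescriptions"] > 3:
        score += 30  # the only input with any medical content at all
    return score

# Two patients with identical medical histories:
patient_a = {"zip_code": "45201", "income": 25_000, "prior_prescriptions": 1}
patient_b = {"zip_code": "45999", "income": 80_000, "prior_prescriptions": 1}

print(abuse_risk_score(patient_a))  # 70 -> flagged, questioned, maybe denied
print(abuse_risk_score(patient_b))  # 0  -> prescribed without questioning
```

In this toy example, two patients with the same medical history get very different scores, and therefore very different care, purely because of where they live and what they earn. Whether the real system does anything like this is exactly what nobody can check while the box stays closed.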
Who writes this, why, and how to help
I am Marco Fioretti, tech writer and aspiring polymath doing human-digital research and popularization.
I do it because YOUR civil rights and the quality of YOUR life depend more every year on how software is used AROUND you.
To this end, I have already shared more than a million words on this blog, without any paywall or user tracking, and am sharing the next million through a newsletter, also without any paywall.
The more direct support I get, the more I can continue to inform, for free, parents, teachers, decision makers, and everybody else who should know more about things like this. You can support me with paid subscriptions to my newsletter, donations via PayPal (mfioretti@nexaima.net) or LiberaPay, or in any of the other ways listed here. THANKS for your support!