No One Can Take Your Call Right Now
From Michigan's unemployment algorithm to Australia's Robodebt to Dr. Oz's new Medicare AI — the same loop keeps gathering momentum: reward denial, remove humans, call it efficiency.
On Saturday I discussed this with Ali Velshi on MS Now; scroll down to the bottom to watch our conversation.
Keith Magnuson, an 83-year-old who was still rock climbing a year ago, is now trapped in a chair in his Seattle home, a prisoner of pain prolonged by AI.
The standard intervention for the pain caused by his lumbar spinal stenosis — beginning with a routine steroid injection recommended by his doctor — has been denied by a new AI-powered approval system rolled out this year. Magnuson is now forced to manage his pain with oxycodone, and the former athlete can’t walk more than a few yards at a time, according to The Seattle Times.
His experience is part of a longstanding pattern in the deployment of AI systems intended to detect fraud: The machine issues an accusation. The person accused either finds out too late to appeal, or has to find their way to a human who can review the machine’s decision. But the human capacity to review has often been diminished or even cut — an efficiency the machine was brought in to accomplish. It’s a Kafkaesque loop: The computer has spoken, no one can take your call, and thus the decision is final.

This is the structural problem with the system that denied Magnuson’s care: the Wasteful and Inappropriate Service Reduction (WISeR) model that CMS Administrator Dr. Mehmet Oz launched across six states on January 1, 2026 — a program that pays private tech vendors a percentage of whatever Medicare spending they avoid. The marketing language makes the whole thing sound like a new frontier of efficiency. But it’s part of a familiar pattern.
