Remember Watson?
Watson is the IBM supercomputer that was so good at Jeopardy that even Ken
Jennings couldn’t beat it. But it turns
out Watson has some limitations.
According to a
story on the online news site The
Correspondent:
IBM stated in 2016 that Watson would cause a "revolution in healthcare." Instead, several research centres have since cancelled their cooperation with the system because Watson’s recommendations were not only wrong but also dangerous.
Even more striking, because of what it says about how media
approach such supposed breakthroughs, is an example to which The Correspondent linked: This
story from Health News Review. Here’s how that story begins:
We often call out overly optimistic news coverage of drugs and devices. But information technology is another healthcare arena where uncritical media narratives can cause harm by raising false hopes and allowing costly and unproven investments to proceed without scrutiny.
The story goes on to cite one gushy news account after
another about how Watson would be a great leap forward in treating cancer.
It wasn’t.
So consider: Watson was being asked to help diagnose and
design treatments for something that already existed. It wasn’t even asked to predict much. And it was dealing in the area of hard
science.
Yet we are supposed to believe that algorithms can predict
who is going to abuse a child. And we
are supposed to believe that the humans who program these algorithms will
magically avoid incorporating into them any human biases. We are supposed to believe this because, just
as happened with Watson and helping to cure cancer, the use of predictive
analytics algorithms in child welfare has been the subject of an avalanche of gushy,
uncritical
news coverage. (The exception: This
story in Wired, an excerpt from
Prof. Virginia Eubanks’ book, Automating
Inequality.)
In her story for The
Correspondent, reporter Sanne Blauw writes:
We’re looking to technology as the sole solution to all our problems. Yet the best solutions could be in a completely different place. Climate change may require systemic change rather than a better algorithm. Teachers and healthcare workers would probably benefit more from a better salary than a robot assistant.
Similarly, child welfare agency caseworkers – and the
families they deal with – would benefit more from
a better salary than a predictive analytics algorithm. And child welfare definitely requires
systemic change rather than an algorithm.
So here’s your Final Jeopardy answer, Watson:
It doesn’t make children safer, it magnifies human biases,
it gathers vast troves of data on mostly poor people without their consent, and
the human beings in charge of it will never be able to live up to their
promises to strictly limit its use.
The question: What is predictive analytics in child welfare?
UPDATE: Looking on the bright side, at least Watson wasn't racially biased, which is more than can be said for another recent effort to use predictive analytics in medicine.