Marc Cherna is one of
the best human services leaders in America. But even he shouldn’t have the power to be the Mark Zuckerberg of child welfare.
Today, across America and much of the world, the big story
will be Facebook CEO Mark Zuckerberg testifying before Congress about how
personal data from millions of Americans wound up in the hands of Cambridge
Analytica. Although the data breach is outrageous, at least those data were originally
uploaded voluntarily – Facebook users have the right not to share their data in
the first place.
In Pittsburgh, Pa., poor people have no such choice. They
are forced to surrender their data. And their data can be used to decide whether
to take away their children. I’ve written about the implications here
and here. Another example comes courtesy of a
Pennsylvania dentist:
Last week, I published
a post about a dentist in Pennsylvania who sent threatening form letters to
some of his patients. The patients had dared not to schedule follow-up appointments
when the dentist thought they should. In the case that brought this to public
attention, the patient didn’t like the dental practice and had made clear her
intention to go elsewhere.
The letters threaten to report patients who don’t schedule
follow-up appointments to child protective services. According to at least one news account, the dentist
acknowledges following through on the threat 17 times last year.
The earlier post discusses the potentially devastating
consequences for children. If the report is “screened in” – as is likely
because it came from a medical professional – it means, at a minimum, a highly
intrusive investigation that could do lasting emotional harm to the
children. That harm can’t be undone even if
the child welfare agency later realizes the report was false.
The mere existence of a false report in a child welfare
agency file can increase the chances that, if there’s another false report, the
new report will be wrongly substantiated – because of a bizarre notion in child
welfare that enough false reports are bound to equal a true report. This
increases the odds that the children will be consigned to the chaos of foster
care.
And, of course, all those false reports steal time
caseworkers should be spending finding children in real danger.
The only good news here is that this dentist practices in
eastern Pennsylvania. At least in that
part of the state, a child abuse “hotline” operator deciding whether a case should be
“screened in” can check the file and, seeing a previous allegation based solely
on a missed dental appointment, might realize how absurd it was.
Were this dentist at the other end of the state, in
Allegheny County (metropolitan Pittsburgh), it could be far worse.
Automating absurdity
That’s because Allegheny County is home to the nation’s
most advanced experiment in using “predictive analytics” to decide when to
investigate if a child is in danger of being abused or neglected.
Whenever the county receives a report alleging that a child
is being abused or neglected, an algorithm known as the Allegheny Family
Screening Tool (AFST) uses more than 100 different data points to spit out a
secret “risk score” between 1 and 20: an invisible “scarlet number” that
tells the county how likely it is that the child is, in fact, being abused or
neglected, or is at risk of abuse or neglect.
The higher the number, the more likely the report will be “screened in”
and investigators will be sent out.
Though the investigators don’t know the risk score, they do
know that a high risk score is why they are being sent out in the first place.
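To make the mechanics concrete, here is a minimal sketch, in Python, of how a screening model of this general kind turns data points into a score and a screening decision. Everything in it is hypothetical – the feature names, the weights, the threshold – because the county has not published the actual AFST model or its more than 100 data points.

```python
# A minimal, hypothetical sketch of a call-screening risk model of the
# AFST's general kind. All feature names, weights, and the threshold
# below are invented for illustration; the real model is not public.

def risk_score(features: dict) -> int:
    """Turn a family's data points into a 1-20 risk score."""
    # A toy weighted sum standing in for the real model's 100+ inputs.
    weights = {
        "prior_reports": 2.0,           # each prior report, however it was resolved
        "report_from_medical": 3.0,     # a report made by a medical professional
        "public_benefits_record": 1.5,  # contact with other county systems
    }
    raw = sum(weights.get(name, 0.0) * value for name, value in features.items())
    return max(1, min(20, round(raw)))  # clamp to the 1-20 scale

def screen_in(features: dict, threshold: int = 12) -> bool:
    """A simple cutoff: scores at or above the threshold are screened in."""
    return risk_score(features) >= threshold
```

In the real system a human screener sees the score rather than a hard cutoff, but the direction is the same: the inputs drive the number, and the number drives who gets investigated.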
Prof. Virginia Eubanks offers a devastating critique of AFST
in her book, Automating
Inequality. Part of the chapter on AFST is excerpted
in Wired magazine. I discussed
her findings in
detail in Youth Today and I
discussed the ethically-challenged “ethics review” used to justify AFST on
this blog, so I won’t repeat that overall critique here.
But the case of the disgruntled dentist prompts me to focus on one particular piece of the Allegheny algorithm: The mere fact that a previous report exists – regardless of how stupid that report may have been – raises the risk score. No human being intervenes first to see if the report had any legitimacy.
In fact, it appears that the Allegheny County algorithm even
counts previous reports that were considered so absurd they were screened out
with no investigation at all.
So suppose, hypothetically, an Allegheny County dentist
reported someone just for missing a follow-up appointment. This was considered too absurd even to
investigate. A few months later someone
else calls the child abuse hotline about the same family. The existence of that previous,
uninvestigated report from the dentist raises the risk score. So now, the child has a higher scarlet
number.
Making it even worse: in the Allegheny algorithm, still
another factor that increases the risk score is whether a report, no matter how absurd,
was made by a medical professional – such as a dentist.
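Using the hypothetical sketch above, the effect is easy to see: a single screened-out report from a dentist raises the score twice over – once as a prior report, once as a report from a medical professional. The numbers here are invented, but the direction of the change is the point.

```python
# Continuing the hypothetical sketch: the same family, scored before and
# after a screened-out report from a dentist. Weights are invented.

family_before = {"prior_reports": 0, "report_from_medical": 0}
family_after = {"prior_reports": 1, "report_from_medical": 1}

print(risk_score(family_before))  # 1 -- no history on file
print(risk_score(family_after))   # 5 -- the junk report alone raised it
```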
In cases alleging actual abuse, there is a long, almost
impossible appeals process. And in a
bizarre twist of Pennsylvania law, in less serious cases there is no appeals
mechanism at all. In such cases, the
county keeps a record of the case until the child who was the subject of the
report turns 23.
This is where we find out how Marc Cherna feels about
keeping junk reports and using them in his algorithm. He told Eubanks: “The stuff stays in the
system.” And, of course, Cherna said what those who hold onto junk
reports of child abuse always say. In effect, enough false reports are bound to
equal a true report. Said Cherna: “A lot of times where there’s smoke there’s
fire.”
But a lot more often there’s someone who’s just blowing
smoke. So let’s just be grateful that a
certain dentist hasn’t opened a branch office in Pittsburgh.
And, as we watch what's happening in Congress today, let’s also remember one thing more. Marc Cherna is now the Mark Zuckerberg of
child welfare. Both Zuckerberg and Cherna amass huge quantities of data. Then
they decide what will happen to those data.
There are two key differences: Marc Cherna isn’t doing it to make money.
In fact, both his intentions and his track record as a child welfare leader are
excellent. On the other hand, Facebook
can’t use data to take away your children. Marc Cherna’s agency can.
UPDATE: 11:45 am: In an earlier version of this post, I asked whether inclusion of certain elements in AFST violated Pennsylvania law concerning expungement of records involving unfounded reports. This was based on a list of factors included in AFST. I noted that I had e-mailed Cherna and his deputy Erin Dalton on April 5 and received no response. I have just received a response from Cherna, in which he makes clear that, in fact, AFST does NOT include any information that legally should be expunged. Therefore, I have deleted that portion of the original post.