Laurie Tuff really,
really wanted her little
study of the local Court-Appointed Special Advocates (CASA) program to show
that it worked.
That’s understandable. When her study was completed in 2014, she’d been
associated with the local CASA program for 14 years, first as a volunteer,
then as a member of the staff. By the time she conducted the study she was a
program director.
But it didn’t work out that way. In fact, based on the outcomes she chose
herself, most of which are exactly the outcomes one would expect to measure to
see if CASA is effective, CASA did no good at all. On one measure, it did harm.
When the results didn’t turn out the way she wanted them
to, what was Tuff’s conclusion? She must
have chosen the wrong outcomes! If these
outcomes don’t show CASA is doing any good, she says, we need to find other
outcomes!
CASA’s poor track record
CASA is the program in which minimally trained volunteers,
overwhelmingly white and middle-class, are assigned to families who are
overwhelmingly poor and disproportionately nonwhite. Then these amateurs tell
judges if the children should be taken from those families, sometimes forever.
In more than 60 percent of cases, judges rubber-stamp every single
recommendation these amateurs make.
A law review article aptly describes CASA as “an exercise of white supremacy” – not just because of outrages
such as these, but because of the program’s very nature.
And the largest, most comprehensive study ever done of the
program, a study commissioned by the National CASA Association itself and
conducted by Caliber Associates, found that it does
nothing to make children safer. The study also found that CASA prolongs
foster care and reduces the chances children will be placed with relatives
instead of strangers. The trade journal Youth Today found that CASA’s efforts to
spin that study’s findings “can border on duplicity.”
So it’s no wonder a staffer with a CASA program would be
desperate to find evidence of effectiveness.
Enter Laurie Tuff and her “capstone project” for her
graduate work at the University of Washington.
Tuff didn’t work for just any CASA program. She worked for
the scandal-plagued program in Snohomish County, Washington – the one blasted by a county judge for “the blatant withholding and destruction of evidence
and … rampant continuing lying …” The
one in which the judge said a program volunteer “infiltrated” – the judge’s
word – a listserv for parental defense attorneys. The one which had as a
volunteer for 20 years someone whose
comments about the families he investigated (and about one entire religion
– you can probably guess which one) read as if they’d been written by Steve
Bannon.
First came the standard excuses
Tuff first offers the same excuse CASA always dredges up
when studies don’t go their way: the Caliber study supposedly didn’t
take into account that CASAs tend to be assigned to more difficult cases.
But the Caliber study did
take that into account – and took a series of steps to, in the study’s own
words, “level the playing field.” (It is
not clear if Tuff read the full Caliber study. Her bibliography mentions only
an eight-page summary issued by Caliber.)
Tuff was able to find exactly one study which found what she
wanted to find – a very small study from one county published in 1999. A small study means nothing unless it can be
replicated. Tuff set out to do just
that. And she bent over backwards to
make sure that the cases in her study in which children had CASAs and those in
which they did not were equally difficult.
She writes:
The purpose of the present research was to replicate [the] 1999 study … and to compare the results … The hypothesis was that cases assigned a CASA worker were more likely to have shorter dependencies [time in foster care], fewer out-of-home placements, obtain more services for the child and have more family contact after dismissal.
The actual results for the children with a CASA:
● No shorter time in foster care
● No more services
● No more family contact
● More moves from placement to placement
The study is very small, probably even smaller than the 1999
study. If this were all the evidence
that CASA didn’t work, it would prove very little (though the burden should be
on CASA to prove that it does work). But this study is in addition to the massive,
rigorous Caliber study.
Spinning the results
But the most amazing part of Tuff’s study is how she sought to
spin the results. She writes:
...[T]he present research was unable to replicate the findings of [the 1999 study], confirming that these measures of effectiveness may not properly assess the value of CASA involvement. … [T]raditional measures of effectiveness are too narrowly defined and miss the subtlety of the CASA’s contribution to the child’s well-being.
What if the study had come out the other way? If, by these measures, children with CASAs did
have better outcomes, would these magically become valid measures of children’s
well-being after all? If the measures
don’t really tell us anything about whether CASA helps children, why bother
trying to replicate a study that used them?
The answer is obvious: From the perspective of Tuff and
others in CASA, if the measures make CASA look good, then they’re valid; if
they make CASA look bad, get new measures!
After all, Tuff writes, there is “widespread anecdotal
evidence that CASAs are effective at representing the best interests of the
child…”
And if there’s “anecdotal evidence,” what else do you need?