|"Big Data" told us she would be president. (Photo by Gage Skidmore)|
● A predictive analytics model spreading all over the country, called Rapid Safety Feedback, has failed spectacularly in Illinois.
● In Los Angeles County, another experiment was hailed as a huge success despite a “false positive” rate of more than 95 percent. And that experiment was conducted by the private, for-profit software company that wanted to sell its wares to the county. (Los Angeles has since quietly dropped this experiment, but it is still pursuing predictive analytics.)
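A false-positive rate that high is exactly what base-rate arithmetic predicts when the event being forecast is rare. The sketch below uses hypothetical numbers (not the actual Los Angeles County figures) to show how even a model that looks accurate on paper ends up flagging mostly innocent families:

```python
# Illustrative base-rate arithmetic with hypothetical numbers:
# when the predicted event is rare, most positive flags are false alarms.

def false_positive_share(population, prevalence, sensitivity, specificity):
    """Fraction of all families flagged by the model that are false positives."""
    actual = population * prevalence                  # families who really would be involved
    true_pos = actual * sensitivity                   # of those, the model catches this many
    false_pos = (population - actual) * (1 - specificity)  # everyone else wrongly flagged
    return false_pos / (true_pos + false_pos)

# Suppose 1 family in 100 would actually be involved, and the model
# catches 80% of them while wrongly flagging 20% of everyone else.
share = false_positive_share(100_000, 0.01, 0.80, 0.80)
print(f"{share:.0%} of flagged families are false positives")  # prints "96% ..."
```

Under those assumptions, more than 95 percent of the families the model flags are false positives, which is consistent with the rate reported for the Los Angeles experiment.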
In Pittsburgh, they try to slap an invisible "scarlet number" risk score on every child - at birth.
How did the county solve the ethics review problem? By commissioning two more ethics reviews. But the information the county released about those other ethics reviews turned out to be incomplete.
At no point does the county address the issue of informed consent - the fact that vast amounts of data are being taken, disproportionately from poor people, without their permission and then potentially used against them.
When Facebook misused data it obtained from users voluntarily, it was fined more than $5 billion. But Allegheny County can take data from poor people and turn it against them with impunity.
As we said in 2019:
Predictive analytics is a fad that presents serious new dangers to children in impoverished families, especially children of color. That’s because predictive analytics does not eliminate the racial and class biases that permeate child welfare; it magnifies those biases. Predictive analytics amounts to data-nuking impoverished families. It is computerized racial profiling.
Risk factors associated with child maltreatment include extreme poverty, family unemployment, caregiver substance abuse, lack of understanding of child development, and neighborhood violence. However, each of these only weakly predicts the likelihood of maltreatment.
For example, although maltreatment is more common among families living in poverty than among other families, the majority of parents with low incomes do not maltreat their children. When risk factors are present, protective factors can mitigate the likelihood of maltreatment. Such protective factors include parental social connections, knowledge of parenting and child development, concrete support in times of need, and children’s social-emotional competence.
Because maltreatment is so difficult to predict, prevention approaches that strengthen protective factors among at-risk families broadly—even if the risk is low—are likely to be most effective in reducing maltreatment.
The bias already in the system
Three studies have found that 30 percent of foster children could be home right now if their parents just had adequate housing.
The criminal justice experience
Eric Holder warned of the dangers of
In a classic example of the disingenuous way Allegheny County has been selling predictive analytics, they falsely claimed to have fixed the poverty profiling issue in their “Hello Baby” algorithm, the one that seeks to stamp a scarlet number risk score on every child at birth.
Thus, they claim: “Unlike the Allegheny Family Screening Tool model, the Hello Baby model only relies on data where the County has the potential to have records for every family; it only uses universal (rather than means tested) data sources.”
But the key weasel word there is potential.
Because right before making this claim, the county acknowledges that they probably will use “child protective services, homeless services and justice system data.”
So unless Allegheny County’s jails are filled with wealthy white-collar corporate criminals, and its homeless shelters are filled with CEOs spending the night because they misplaced the keys to their McMansions, this is still poverty profiling.
Predictive analytics: The stop-and-frisk of child welfare
Too many liberals start to sound like Newt Gingrich when you whisper the words "child abuse" in their ears. (Photo by Gage Skidmore)
One self-proclaimed liberal used exactly the same logic, and almost the same words, as Gingrich to defend the use of predictive analytics in child welfare. Here's what she wrote for what was then known as the Chronicle of Social Change:
As to the concern about racial profiling that has haunted the predictive analytics debate, I find it very hypocritical. Letting more black children die to protect them from racial profiling cannot possibly be an ethical approach or one that is endorsed by responsible child welfare leaders.
As you watch the Daily Show clip, try this: Whenever Trevor Noah says “crime” or “criminal” substitute “child abuse” or “child abuser.” And whenever he says stop-and-frisk, substitute “predictive analytics.”
Other reasons the risk is greater in child welfare
The reality on the ground
A man storms into an office in downtown Pittsburgh. His 12-year-old daughter is with him. The man appears intoxicated. The man screams and yells and pounds his fists against a bulletin board. He demands to have his picture taken.
He forcefully grabs his daughter’s forearm, pulling her into the picture as she tries her best to pull away from him. She screams “Please, please Daddy, no!” multiple times. And multiple times he yanks on her arm, trying to pull her to his side so a photo can be taken of both of them. He yells at his daughter and repeatedly jabs his finger in her shoulder.
The daughter is crying hysterically and red-faced. The father rips a cell phone out of her hand because he thinks she is trying to call her mother.
As one eyewitness said:
I was extremely concerned for his daughter‘s safety, and I actually noticed that my heart was racing. … Having to watch as [the father] terrorized his teenage daughter — with his hair disheveled and his face twisted — was something I’m never going to forget.
The father denies all of this – but remember, an allegation is all it takes to run someone through the AFST database and have it cough up a risk score.
We don’t know if anyone ever called the Allegheny County family police agency. Indeed, we don’t know if the father actually lives in Allegheny County or even Pennsylvania. But if the answer to those questions is yes, and if the father’s name were run through AFST, odds are his risk score would be very low. That’s because the father is John Robinson Block. The office in question is that of the Pittsburgh Post-Gazette – which Block publishes. That makes him way too affluent for much to turn up in the AFST database. And, as we’ve seen, AFST doesn’t factor in the seriousness of the allegation.
The child welfare response: We can control our nukes
Big Data is like nuclear power at best, nuclear weapons at worst.
Efforts to abuse analytics already are underway
States frequently have significant year-to-year swings in the number and rate of fatalities. In small states, a single incident rather than a systemic issue can dramatically affect annual statistics. In addition, in small states an analysis of data from the past five years…would include too few cases to draw definitive conclusions.
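The point about small states is simple arithmetic: when annual counts are this small, one or two additional incidents produce an alarming-looking percentage swing even when nothing systemic has changed. A minimal sketch with hypothetical counts:

```python
# Hypothetical illustration of why small-state fatality counts swing:
# with counts this small, two extra incidents look like a 40% surge.
fatalities_year1 = 5
fatalities_year2 = 7   # just two more incidents than the year before

change = (fatalities_year2 - fatalities_year1) / fatalities_year1
print(f"Reported year-over-year change: {change:+.0%}")  # prints "+40%"
```

The same two-incident difference in a large state would barely register, which is why five-year analyses of small-state data include too few cases to support definitive conclusions.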
CECANF was the Keystone Kops of commissions.
Lessons from the elections of 2016
It was a rough night for number crunchers. And for the faith that people in every field … have increasingly placed in the power of data. [Emphasis added]
[The election results undercut] the belief that analyzing reams of data can accurately predict events. Voters demonstrated how much predictive analytics, and election forecasting in particular, remains a young science …
[D]ata science is a technology advance with trade-offs. It can see things as never before, but also can be a blunt instrument, missing context and nuance. … But only occasionally — as with Tuesday’s election results — do consumers get a glimpse of how these formulas work and the extent to which they can go wrong. … The danger, data experts say, lies in trusting the data analysis too much without grasping its limitations and the potentially flawed assumptions of the people who build predictive models.
Two years ago, the Samaritans, a suicide-prevention group in Britain, developed a free app to notify people whenever someone they followed on Twitter posted potentially suicidal phrases like “hate myself” or “tired of being alone.” The group quickly removed the app after complaints from people who warned that it could be misused to harass users at their most vulnerable moments.