Here's how an AI tool may flag parents with disabilities

  • 📰 AP


The Justice Department is investigating a Pittsburgh county’s child welfare system to determine whether its use of an artificial intelligence tool discriminates against people with disabilities or other protected groups.

The Hackneys stayed with their daughter around the clock when she was moved to a rehab center to regain her strength. Finally, the 8-month-old stopped batting away her bottles and started putting on weight again.

More than a year later, their daughter, now 2, remains in foster care. The Hackneys, who have developmental disabilities, are struggling to understand how taking their daughter to the hospital when she refused to eat could be seen as so neglectful that she’d need to be taken from her home. Her pediatrician first reassured them that babies sometimes can be fickle with feeding and offered ideas to help her get back her appetite, they said.

The Hackneys’ story — based on interviews, internal emails and legal documents — illustrates the opacity surrounding these algorithms. Even as they fight to regain custody of their daughter, they can’t question the “risk score” Allegheny County’s tool may have assigned to her case because officials won’t disclose it to them. And neither the county nor the people who built the tool have ever explained which variables may have been used to measure the Hackneys’ abilities as parents.

A number of local leaders have tapped into AI technology while under pressure to make systemic changes, such as in Oregon during a foster care crisis and in Los Angeles County after a series of high-profile child deaths in one of the nation’s largest county child welfare systems. In Larimer County, Colorado, one official acknowledged she didn’t know what variables were used to assess local families.

Vaithianathan and Putnam-Hornstein’s work has been hailed in reports published by UNICEF and the Biden administration alike for devising computer models that promise to lighten caseworkers’ loads by drawing from a set of simple factors. They have described using such tools as a moral imperative, insisting that child welfare officials should draw from all data at their disposal to make sure children aren’t maltreated.

Vaithianathan and Putnam-Hornstein declined the AP’s repeated interview requests to discuss how they choose the specific data that powers their models. But in a 2017 report, they detailed the methods used to build the first version of Allegheny’s tool, including a footnote that described a statistical cutoff as “rather arbitrary but based on trial and error.”

 


Can a logic circuit find a child adorable? Can it commit to protecting, loving, and raising a child? Can reason or emotion arise from it?

Glad some attorneys have stepped in to help them. CPS has proof the child has a medical issue and they still won't give the child back. Bet ya they are banking on using the parents' disabilities to TPR and sell that baby off.

Pointing back to this tweet 10 years from now: 'See, it all started with the best of intentions...'

Stop 🛑

I do not think most people realize how prevalent disability discrimination is. Especially if the disability affects cognitive ability, or (more commonly) if it is assumed to do so. It's not surprising that another ML model is producing biased results. Garbage in, garbage out!

How about we delete the Eugenics AI like right now, before anybody gets the idea to use it. Sounds like a better idea.

Frightening, not supportive at all

Excessively monitoring people with disabilities with the intent to take their children is literally eugenics.

Heartbreaking 😢

This is just the tip of the iceberg. Check out the story below and the book ‘Algorithms of Inequality.’

Ugh… 💩propaganda 💩 reporting

It's the offloading of decisions from humans to machines as the MAGAts refuse to tax the wealthy for their share. Imagine if you are elderly and AI screening takes you from your home 🏡 - is that the future you want?

Why is the judge allowing this to continue? They have the power to send the child back to the parents if AI is the reason.

I don't know if she'll ever see it, but I would love to hear Imani_Barbarin's thoughts here.
