Not magic: Opaque AI tool may flag parents with disabilities

CTVNews


The U.S. Justice Department is investigating a county's child welfare system to determine whether its use of an artificial intelligence tool that predicts which children could be at risk of harm is discriminating against parents with disabilities or other protected groups.

Lauren Hackney feeds her 1-year-old daughter chicken and macaroni during a supervised visit at their apartment in Oakdale, Pa., on Thursday, Nov. 17, 2022. Lauren and her husband, Andrew, wonder if their daughter’s own disability may have been misunderstood in the child welfare system. The girl was recently diagnosed with a disorder that can make it challenging for her to process her sense of taste, which they now believe likely contributed to her eating issues all along.

"They had custody papers and they took her right there and then," Lauren Hackney recalled. "And we started crying." Lauren Hackney has attention-deficit hyperactivity disorder that affects her memory, and her husband, Andrew, has a comprehension disorder and nerve damage from a stroke suffered in his 20s. Their baby girl was just 7 months old when she began refusing to drink her bottles. Facing a nationwide shortage of formula, they traveled from Pennsylvania to West Virginia looking for some and were forced to change brands. The baby didn't seem to like it.

Over the past six years, Allegheny County has served as a real-world laboratory for testing AI-driven child welfare tools that crunch reams of data about local families to try to predict which children are likely to face danger in their homes. Today, child welfare agencies in at least 26 states and Washington, D.C., have considered using algorithmic tools, and jurisdictions in at least 11 have deployed them, according to the American Civil Liberties Union.

What they measure matters. A recent analysis by ACLU researchers found that when Allegheny's algorithm flagged a family's use of county mental health and other behavioral health services, that flag could add up to three points to a child's risk score, a significant increase on a scale of 20.
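The scoring mechanics described above can be made concrete with a small sketch. The code below is a purely hypothetical illustration in Python, not the actual Allegheny Family Screening Tool (which is a statistical model whose weights are learned from county data, not a hand-written lookup table); every feature name and point weight except the up-to-three-point behavioral health flag reported by the ACLU is an assumption, included only to show how individual flags can add up on a capped 20-point scale.

```python
# Purely hypothetical illustration -- NOT the Allegheny Family Screening Tool.
# It only shows the arithmetic of feature flags adding points to a risk
# score on a capped 20-point scale, as described in the ACLU analysis.

FEATURE_POINTS = {
    "used_behavioral_health_services": 3,  # the up-to-3-point flag the ACLU reported
    "prior_welfare_referral": 2,           # hypothetical weight
    "public_benefits_record": 1,           # hypothetical weight
}

def risk_score(flags: set, base: int = 1, cap: int = 20) -> int:
    """Sum the points for each flagged feature, clamped to the 20-point scale."""
    score = base + sum(FEATURE_POINTS.get(flag, 0) for flag in flags)
    return min(score, cap)

# Accessing county behavioral health services raises this hypothetical score by 3:
print(risk_score({"prior_welfare_referral"}))                                     # 3
print(risk_score({"prior_welfare_referral", "used_behavioral_health_services"}))  # 6
```

The point of the sketch is the one the ACLU analysis makes: in an additive score, a flag tied to using mental health services moves a family measurably up the scale regardless of why those services were used.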

"We're sort of the social work side of the house, not the IT side of the house," Nichols said in an interview. "How the algorithm functions, in some ways is, I don't want to say is magic to us, but it's beyond our expertise and experience." In Pennsylvania, California and Colorado, county officials have opened up their data systems to the two academic developers who select data points to build their algorithms. Rhema Vaithianathan, a professor of health economics at New Zealand's Auckland University of Technology, and Emily Putnam-Hornstein, a professor at the University of North Carolina at Chapel Hill's School of Social Work, said in an email that their work is transparent and that they make their computer models public.

 
