Not magic: Opaque AI tool may flag parents with disabilities

PITTSBURGH (AP) — As part of a yearlong investigation, The Associated Press obtained the data points underpinning several algorithms deployed by child welfare agencies to understand how they predict which children could be at risk of harm. They offer rare insight into the mechanics driving these emerging technologies.

Among the factors the algorithms use to measure a family’s risk, whether outright or by proxy: race, poverty rates, disability status and family size. The tools’ developers say their work is transparent and that they make their models public.

The AP has learned that the U.S. Justice Department is investigating one Pennsylvania county’s child welfare system to determine whether its use of an algorithm discriminates against people with disabilities or other protected groups.
