Parmy Olson, Columnist

AI Needs a Babysitter, Just Like the Rest of Us

Police, schools and HR departments are too trusting of fallible algorithms. A special brand of human overseer is needed.  

A fine line. Photographer: Carl Court/Getty Images Europe

Back in 2018, Pete Fussey, a sociology professor at the University of Essex, began studying how police in London used facial recognition to look for suspects on the street. Over the next two years, he accompanied Metropolitan Police officers in their vans as they surveilled different pockets of the city using mounted cameras and facial-recognition software.

Fussey made two important discoveries on those trips, which he laid out in a 2019 study. First, the facial-recognition system was woefully inaccurate. Of the 42 computer-generated matches flagged across the six deployments he joined, just eight, or 19%, turned out to be correct.