Google Is Investigating Why It Trained Facial Recognition on ‘Dark Skinned’ Homeless People

Here we go again.

Yesterday, the New York Daily News reported that Google was funding a facial recognition project using “dubious tactics” to target “darker skin people.” Former workers who collected the facial scans told the Daily News that teams deceptively targeted homeless people in Atlanta, college students across the U.S., attendees of the BET Awards in Los Angeles, and others—lying to them about the nature of their participation in a project to scan and record their faces.

The project first surfaced in July 2019, when ZDNet and Android Police reported that Google employees were offering $5 gift cards to people on the street in exchange for a facial scan. Several anonymous sources have since spoken to the Daily News, shedding light on a host of “questionable and misleading methods” driving the project.

The workers (Google TVCs—temps, vendors, or contractors) said they were hired by a third-party firm named Randstad. The Daily News reported that Randstad specifically told contractors to go after people of color and lie if it would help them collect more data through facial scans.

A Google spokesperson told Motherboard that it is now investigating the claims.

“We regularly conduct volunteer research studies. For recent studies involving the collection of face samples for machine learning training, there are two goals. First, we want to build fairness into Pixel 4’s face unlock feature,” the spokesperson said. “It’s critical we have a diverse sample, which is an important part of building an inclusive product. And second, security. Face unlock will be a powerful new security measure, and we want to make sure it protects as wide a range of people as possible.”

“We’re taking these claims seriously and investigating them,” they added. “The allegations regarding truthfulness and consent are in violation of our requirements for volunteer research studies and the training that we provided.”

The Daily News reported that contractors were told to present the facial scan as a “selfie game” and tell participants to “play with the phone for a couple minutes and get a gift card.” Another former contractor said Randstad supervisors instructed teams to lie about whether participants were being recorded, saying “If the person were to look at that screen after the task had been completed, and say ‘Oh was it taking a video?’ … we were instructed to say, ‘Oh it’s not really.'”

The Daily News also reported that contractors were encouraged to target homeless people because they were likely to participate and were unlikely to talk to the media.

The Daily News reported that Google was not only involved in multiple conference calls about the data collection and review process, but that it was present for “the mandate to go after ‘darker skin tones.’”

Despite its rhetoric about wanting a diverse field of participants to train its technology, Google seemingly cut corners (and, presumably, costs) by once again turning to its shadow army of temps. Racist bias in algorithms and machine learning is a real problem that affects the dignity of those who deal with it, not a math problem to be solved with quotas targeting homeless people and students.