The Google Pixel 4 series will pack advanced facial recognition functionality, enabling users to unlock their devices and more with a glance. We already know that the company offered people $5 gift cards in exchange for face scans to improve this face unlock technology. Now, the New York Daily News reports that Google used some questionable methods to obtain these scans.
According to the outlet, Google has specifically been targeting people with darker skin. This makes a lot of sense, as it’s not unheard of for some products/services to have trouble with darker skin tones. Remember that video showing a soap dispenser working for a white person but not for a black person?
A Google spokesperson acknowledged the facial data collection drive and its aim to ensure compatibility with more people.
“We regularly conduct volunteer research studies. For recent studies involving the collection of face samples for machine learning training, there are two goals,” the spokesperson told the outlet.
“First, we want to build fairness into Pixel 4’s face unlock feature. It’s critical we have a diverse sample, which is an important part of building an inclusive product,” the spokesperson noted. “And second, security. Pixel 4’s face unlock will be a powerful new security measure, and we want to make sure it protects as wide a range of people as possible.”
How did it get these scans?
The New York Daily News spoke to several people who had worked on the project as contractors, and it seems like some subjects were misled about these scans.
The contractors were employed by third-party firm Randstad, which reportedly told the workers to specifically target people with darker skin, to conceal the fact that people's faces were being recorded, and to lie in a bid to get as much data as possible.
These contractors told the outlet that they were sent to target homeless Black people in Atlanta, "unsuspecting" college students, and attendees at the BET Awards. But it's the targeting of homeless people that might raise the most eyebrows.
“They said to target homeless people because they’re the least likely to say anything to the media,” one former contractor said. “The homeless people didn’t know what was going on at all.”
It's unclear whether Google was aware that homeless people were being targeted for facial data. However, another former worker said a Google manager was out of earshot when Randstad directed them to recruit homeless people for the scans, so it's possible the search company didn't know about this directive.
We’ve contacted Google about these claims and will update the article accordingly.