Illustration by Alex Castro / The Verge
In July, Google admitted it has employees pounding the pavement in a variety of US cities, looking for people willing to sell their facial data for a $5 gift certificate to help improve the Pixel 4’s face unlock system. But the New York Daily News reports that a Google contractor may be using some questionable methods to get those facial scans, including targeting groups of homeless people and tricking college students who didn’t know they were being recorded.
According to several sources who say they worked on the project, a contracting agency named Randstad sent teams to Atlanta explicitly to target homeless people and those with dark skin, often without saying they were working for Google, and without letting on that they were actually recording people’s faces.
Google wasn’t necessarily aware that Randstad was going after homeless people, but a Google manager reportedly did instruct the group to target people with darker skin, one source told the Daily News.
There are too many eyebrow-raising passages in the full story to print them all here, but here are a few:
“They said to target homeless people because they’re the least likely to say anything to the media,” the ex-staffer said. “The homeless people didn’t know what was going on at all.”
Some were told to gather the face data by characterizing the scan as a “selfie game” similar to Snapchat, they said. One said workers were told to say things like, “Just play with the phone for a couple minutes and get a gift card,” and, “We have a new app, try it and get $5.”
Another former TVC said team members in California were specifically told they could entice cash-strapped subjects by mentioning a state law that says gift cards less than $10 can be exchanged for cash.
The report includes pictures of the alleged device used to capture the facial data, too — a phone encased in a large, rugged metal frame and sealed with tamper-detection stickers and security screws.
It’s important to note there are legitimate reasons Google would want to make sure it’s testing its new face unlock feature broadly with people of color — it shouldn’t be biased against them because of a lack of data. Apple also collected facial recognition data ahead of the launch of Face ID for this reason, as we reported earlier. But while it’s not terribly creepy to walk up to someone and fully explain the situation, it sounds like Google and/or its contractor may have been taking some extreme and unsavory shortcuts to collect that data.
Google and Randstad didn’t immediately reply to requests for comment.