[Photo: Face Detection Turing Test to tell Computers and Humans Apart, Ars Electronica, June 9, 2013]
Imagine that you are a pastor of an American megachurch. You need to track attendance of your flock for spiritual and financial purposes, but your records are always inaccurate. How about syncing your security cameras with face recognition technology?
Spooky as it sounds, a company called Churchix is marketing software which tracks faces in churches and adds their names to a database. And, no, the churches don’t tell their parishioners.
If religious outfits are using it, you can imagine how eager commercial enterprises are to use this software. Face recognition is clearly a useful tool for fighting crime and terrorism. But other applications outrage privacy advocates.
Want to boost sales? One company promises to recognize “high-value customers each time they shop”. It also sends “alerts when known litigious individuals enter any of your locations”. Tom Cruise’s character in the Spielberg film Minority Report was bombarded with targeted advertising as he walked along the street.
This is no longer science fiction.
Curious about someone? An app now in development, NameTag, will allow anyone with a mobile phone to match a photo with a dating site profile. “NameTag can make the big, anonymous world we live in as friendly as a small town,” the company says. And just to be sure that everyone is friendly, it also scans half a million entries in the US National Sex Offender Registry and other criminal databases.
Is anyone alarmed by this? Most people don’t seem to be. In fact, they are quite complacent about allowing companies like Facebook and Google to store and analyse personal images. Facebook recently launched a new app, Moments, which identifies your friends in other users’ photos. How many complaints have you heard about it?
But thoughtful defenders of privacy are dismayed. They fear that face recognition software could be the final nail in the coffin of personal privacy. In a post-Snowden world, corporate databases can be siphoned into government databases, and government databases can be stolen by hackers from the Chinese government or the Russian mafia.
In May hackers stole 22 million social security numbers along with other personal information from the Office of Personnel Management in the United States. Imagine what they could do if they stole your face as well. You can change your password, or even your social security number. You cannot change your face.
Europeans are sensitive about privacy. The Moments app is not going to be rolled out in Europe because Facebook refuses to implement an opt-in system. In the US, the states of Texas and Illinois take the same position. They require companies to obtain permission before taking face scans or sharing biometric information.
As the Electronic Frontier Foundation, a digital rights advocacy group, declared recently:
“We believe that people have a fundamental right to privacy. People have the right to control who gets their sensitive information, and how that information is shared. And there is no question that biometric information is extremely sensitive. You can change your password and your credit card number; you cannot change your fingerprints or the precise dimensions of your face. Through facial recognition, these immutable, physical facts can be used to identify you, remotely and in secret, without any recourse.”
The US government wants to create a voluntary, enforceable code of conduct for commercial use of face recognition technology. However, recent discussions between privacy advocates and industry representatives broke down almost immediately. They could not agree on the answer to the simple question: “If you are walking down the street, a public street, should a company be able to identify you without your permission?”
“What facial recognition allows is a world without anonymity … You walk into a car dealership and the salesman knows your name and how much you make. That’s not a world I want to live in,” says Alvaro Bedoya of the Georgetown University Law Center in Washington DC.
From an ethical perspective the first question to be asked is: has the consumer given informed consent to face identification? Often the answer will be Yes, if the information is gathered through social media.
But that is not the end of the story. Almost no one will have read the consent form. Once the information has been gathered, unbeknownst to most consumers, it will be aggregated with big data techniques to create a personal profile. And the consumer will not have consented to this. But he may not be able to protest because he will not be able to access the profile and probably won’t even know that it exists.
This is why legislation for opt-in face-recognition is absolutely essential — despite yelps of pain from companies like Google and Facebook.
In any case, face recognition software makes some awful blunders. Google Photos recently classified two African-Americans as gorillas. “This is 100% Not OK,” Google tweeted. But it ought to give everyone a queasy feeling about what life in the Google global village could be like.
Michael Cook is editor of MercatorNet.