Convenience, security, and profit drive facial algorithm research, but privacy concerns grow
Facial recognition technology is already a part of our daily lives, from traffic cameras and police databases to social media sites and our phone security settings. Some concepts are the stuff of science fiction made real. For instance, as Geektime has previously reported, comedy clubs have experimented with using the technology to (literally) charge patrons by the laugh. Mood recognition technology, for medical or educational use, is within sight now, too.
There are already a number of companies that have rolled out commercial facial recognition technology, including Facebook and Snapchat, the Israeli “Churchix” technology used to track attendance at religious services, and Tzekwan Technology’s facial recognition ATMs that debuted in China last year.
But one of the most talked-about market entrants to date is the Russian firm NTechLab, with active customers in entertainment venues and social media platforms. Law enforcement agencies are also reportedly interested, as this is one area where the technology is already very well-established worldwide. A recent profile of the company in The Wall Street Journal highlights the potential for widespread application, as well as the legal and ethical challenges that await.
Millions and millions of images
The MegaFace competition is significant because it has scaled up the sample size for testing developers' facial recognition algorithms: 1 million Flickr images. Most algorithms trained on smaller datasets than FaceN or FaceNet performed below expectations in the competition. As IEEE Spectrum notes, this is an important finding: most current facial recognition technology achieves 95% accuracy when tested against the old industry standard of 13,000 images, so that benchmark is no longer a good predictor of real-world performance.
Algorithms trained on larger sets tend to perform better. FaceNet was built using a database of 10 million people (500 million images in total), while NTechLab's FaceN used "only" 200,000 people. FaceN beat FaceNet on the "FaceScrub" identification test, 73.3% to 70.5%, though FaceNet went on to win other tests during the challenge.
Both developers, though, found that their near-perfect accuracy on small datasets in the low thousands could not yet be replicated at the 1-million-image scale: scores fell from the high nineties to the low seventies. Tests at this scale will become more common across the industry, since algorithms will eventually have to handle samples of hundreds of millions of people to prove practical.
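The scaling effect described above, where near-perfect scores on small galleries collapse at the million-image scale, falls out of simple statistics: every extra candidate in the gallery is another chance for a look-alike to outscore the true match. The following toy Python simulation (random embedding vectors, not any real face model or any vendor's system; all parameters are arbitrary choices for illustration) sketches the idea:

```python
# Toy sketch: why rank-1 identification accuracy falls as the gallery grows.
# Each identity is a random unit vector in a small embedding space; a "probe"
# photo is its identity's vector plus noise. The probe is identified correctly
# only if its own identity scores highest (cosine similarity) in the gallery.
import math
import random

random.seed(0)
DIM = 16      # embedding dimension (arbitrary, kept small for speed)
SIGMA = 0.25  # per-coordinate noise; total noise norm is ~SIGMA * sqrt(DIM)

def unit(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def rand_identity():
    return unit([random.gauss(0, 1) for _ in range(DIM)])

def rank1_accuracy(gallery_size, trials=100):
    hits = 0
    for _ in range(trials):
        gallery = [rand_identity() for _ in range(gallery_size)]
        true_id = random.randrange(gallery_size)
        # Probe: the true identity's embedding, perturbed and renormalized.
        probe = unit([x + random.gauss(0, SIGMA) for x in gallery[true_id]])
        scores = [sum(p * g for p, g in zip(probe, cand)) for cand in gallery]
        if scores.index(max(scores)) == true_id:
            hits += 1
    return hits / trials

# Accuracy declines as the gallery grows, even though the matcher is unchanged.
for size in (10, 100, 1000):
    print(f"gallery={size:5d}  rank-1 accuracy={rank1_accuracy(size):.2f}")
```

Under these assumptions, accuracy that looks excellent against a handful of candidates degrades sharply against a thousand, which is the same qualitative pattern MegaFace exposed at the million-image scale.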
NTechLab's founders are forthright about the technology's usefulness and argue that privacy concerns should not derail further development. The company has, though, told Fusion that "We're now developing algorithms that will help us detect an inappropriate usage," since inappropriate use of the algorithm has already been documented.
The company successfully co-developed the FindFace app that can be used to run image searches across VKontakte, Russia’s most popular social media network. Unfortunately, some have already abused it to harass and dox users, women in particular.
FindFace users also proved that the technology could be used for both missed connections and stalking: A photographer was able to match random photographs he took of Moscow Metro commuters with people’s VKontakte data.
Of course, privacy organizations and civil liberties groups are among those most concerned about the future of facial recognition for these reasons. But even spy agencies that want to adopt the technology for real-time surveillance are worried about how it will change the landscape of their work.
No one really knows what the impact will be like, when even laughter can be commodified.
Facial recognition technology most obviously has law enforcement and intelligence-gathering applications. It has great potential to assist with investigations and avoid cases of mistaken identity. New York City is already set to begin rolling out "advanced cameras and sensors" at bridges and tunnels "to read license plates and test emerging facial recognition software and equipment," both to create a single mass transit monitoring system and to deter terrorism. Several malls and airports in the US are also exploring the technology.
The trade-off is that going undercover or engaging in whistleblowing becomes much more difficult in such an environment. The Director of MI6 has even said that such technology will make it much harder to build a cover story for agents or meet with sources in places where cameras might be recording. So, even as Big Brother watches, others will be watching back.
The world we’ll live in
The technology's commercial applications also pose significant ethical dilemmas. The pornography industry is an area where commercial applications are well-advanced but the laws governing them are vague. Pairing the technology with sex-cams allows consumers to find performers who resemble their favorite stars, or anyone, in fact. This is not illegal per se, as it simply automates a process people work out in their heads when they look for ideal men and women to ogle, but that automation is what makes it seem so alien and uncomfortable to many.
Facebook is already facing a potential lawsuit over its use, since 2011, of facial recognition algorithms that automatically detect people on your friend list in uploaded photos. Billed as a tagging time-saver, the feature may face a legal hurdle over consent concerns. On the other hand, the technology would be useful for dating apps, even helping to prevent scams and identity theft.
Current measures to defeat the technology make the wearer visibly conspicuous, hardly ideal for defeating surveillance if you want to remain unnoticed. If such camouflage techniques became commonplace, though, they could offer real anonymity: when everyone looks "strange," no one does.
So that may indeed be a "new normal." But it would be only one of several new normals. People have already learned to live with many technologies that erase their anonymity rather than give up the benefits those technologies bring. Facial recognition is not proving very different.