Critics Call for Regulating Facial Recognition


An unusual consensus has emerged among artificial intelligence researchers, activists, lawmakers and many of the largest technology companies: Facial recognition software breeds bias, risks fueling mass surveillance and should be regulated, Bloomberg reports. Deciding on effective controls and acting on them will be a lot harder. On Tuesday, the Algorithmic Justice League and the Center on Privacy & Technology at Georgetown University Law Center announced a Safe Face Pledge, which asks companies not to provide facial analysis AI for autonomous weapons or sell it to law enforcement unless laws are passed to allow it. Microsoft Corp. said the software carries significant risks and proposed rules to combat the threat. "Principles are great – they are starting points. Beyond the principles we need to be able to see actions," said Joy Buolamwini of the Algorithmic Justice League. None of the biggest makers of the software – companies like Microsoft, Google, Amazon.com Inc., Facebook Inc. and IBM – has yet signed the pledge.

Large tech companies may be reluctant to commit to a pledge like this because it could mean walking away from lucrative contracts for the emerging technology. The market for video surveillance gear is worth $18.5 billion a year, and AI-powered equipment for new forms of video analysis is an important emerging category. "There are going to be some large vendors who refuse to sign or are reluctant to sign because they want these government contracts," said Laura Moy of the Center on Privacy & Technology. Microsoft is still selling facial recognition software to governments. The American Civil Liberties Union asked Microsoft to halt the sales and join the organization's call for a federal moratorium on government use of the technology. The use of facial recognition for surveillance, policing and immigration is under scrutiny because researchers have shown the technology is not accurate enough for critical decisions and performs worse on darker-skinned people.
