Should we be afraid of facial recognition technology?

From our phones to banks to police, facial recognition technology is everywhere – should we be afraid of it?

Earlier this year, the Financial Times reported that King's Cross station was using facial recognition technology for security purposes. Shortly before that, the Metropolitan Police had been trialling the technology across London. Campaign groups such as Liberty have raised concerns about facial recognition, and in a world that is becoming increasingly data-centric, are they right to say this is something we should be worried about?

An explanation of how the software works can be found here, but in short, facial recognition technology digitally maps faces and compares them to the faces already stored in its database. This has obvious advantages for security. In crowded areas, known suspects or criminals can be spotted far more easily than by human eyesight alone. For businesses, it also means they are better able to keep track of who enters and exits their premises. With the introduction of 3D facial recognition, the technology is continually improving, and some software can even identify people when they're not looking at the camera.

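To make that matching step a little more concrete, here is a minimal, hypothetical sketch in Python. It assumes a face has already been reduced to a fixed-length numeric 'embedding' by some face-encoding model (the hard part, which is not shown here), and simply looks for the closest stored embedding within a distance threshold. The names, numbers and threshold below are illustrative only, not how any particular deployment actually works.

```python
import numpy as np

# Hypothetical, pre-computed "face embeddings": fixed-length vectors that a
# face-encoding model (not shown) would derive from a detected face.
database = {
    "person_a": np.array([0.12, 0.85, 0.33, 0.47]),
    "person_b": np.array([0.91, 0.10, 0.64, 0.22]),
}

def match_face(probe: np.ndarray, threshold: float = 0.35):
    """Return the closest identity in the database, or None if no stored
    face is within the distance threshold."""
    best_name, best_dist = None, float("inf")
    for name, stored in database.items():
        # Euclidean distance between the probe embedding and a stored one
        dist = np.linalg.norm(probe - stored)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

# A camera frame, reduced to an embedding, is compared against everyone stored.
print(match_face(np.array([0.14, 0.83, 0.30, 0.45])))  # close to person_a -> match
print(match_face(np.array([0.50, 0.50, 0.50, 0.50])))  # close to no one -> None
```

A lot hangs on that threshold: loosen it and more innocent passers-by get flagged as 'matches'; tighten it and more genuine matches slip through. That trade-off sits behind the accuracy problems discussed below.
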
Whilst the positives are clear, there are deep issues with the use of the technology.

Many of us have had the experience of an Instagram or Snapchat filter not working properly on our faces. Now imagine that error could lead to you being stopped and searched, or even arrested. Even though the margin of error is relatively small, a single misidentification could have serious consequences.

This potential for inaccuracy is even more pronounced for BAME people, especially Black people. Even some of the highest-quality facial recognition technology in the world misidentifies Black people up to 10 times more frequently than it does white people. Black people are already disproportionately more likely to be stopped and searched, and to have force used against them, than people of other ethnicities, so the inaccuracies in this technology could compound the institutional racism in British policing. This issue seems unlikely to disappear any time soon: whilst the Met Police are aware of these potential racialised inaccuracies, they are yet to investigate them.

Another issue is privacy. How do we opt out of having our faces stored? The facial recognition technology at King's Cross station, for example, was installed without any public consultation, request for consent, or even warning. There was also no mechanism for opting out of having your face stored, and no legal mechanism to guarantee that the data was being 'regularly deleted', as Argent (the private property developer at King's Cross) claimed.

And if these cameras do record our faces, how long is that data held before it is deleted? We know that 'false flags' (potential matches that turn out to be inaccurate) can be held by the police for weeks after being identified as false, and the situation with non-public bodies seems murky at best. Even if we accept that the government or public body holding these images can be trusted, what about the private companies all this data is outsourced to?

At a time when we are increasingly reliant on biometric technology, any leak poses a massive threat to people's livelihoods and security. Our phones, and the important apps that reside on them, are increasingly switching to facial recognition as the preferred form of biometric security, and private companies are storing our facial data in databases we have no way to audit. This threat is made even greater by the rise of 'deepfakes' - fake videos and photos that digitally replace one person's likeness with another's. It's not too far a leap to envisage a future where biometric databases are the next big hack.

The High Court, sitting in Cardiff, recently ruled that South Wales Police's use of facial recognition is lawful, so the technology is unlikely to disappear any time soon. In fact, we'll probably only see it become more prevalent and more widely accessible. Only time will tell whether this becomes another step on the road to authoritarian dystopia, or simply another security measure we all get used to.

Header Image Credit: Bree~commonswiki

Author

Oluwatayo Adewole

Hey there! I'm a wordy-type who's into all kinds of stuff, but especially: film, comics, theatre and trying to make the world a better place
