
Déjà Vu? The Ethics of Facial Recognition

In Freshers’ Week, you will meet thousands of faces you’ve never seen before. At a fundamentally cosmopolitan institution, these faces will come from all around the world, and you likely won’t recognise a single one. A few may ring a bell or make you do a double take. A couple you may well be certain you’ve seen somewhere before, but you can’t quite place them. Cue a migraine of thought dedicated to placing those people, to recognising their faces.

Yet, technology is evolving that can do all that in an instant. And with that comes another string of controversies.

King’s Cross is the latest of these. It was recently revealed that security cameras around the area had been using facial recognition technology. Privacy groups were reportedly outraged, with tech commentator Stephanie Hare criticising the technology, saying: “It allows us to be identified and tracked in real time, without our knowledge or our informed consent”. Critics have also pointed out that the emerging technology has higher error rates when identifying women and people with darker skin tones.

King’s Cross Station, part of the area at the centre of this facial recognition controversy.

There are several proposed reasons for this. Tech developers tend to train their algorithms on photos from publicly available databases (that is, the internet), which results in a white, Western skew. Lower colour contrast in images of people with darker complexions can leave the software struggling to identify the key features of the face, while the technology can also have difficulty with women wearing make-up.
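
To make that skew concrete, here is a minimal, hypothetical sketch of the kind of audit researchers run to expose it: test the system on labelled photos and compare error rates across demographic groups. The records, group labels and numbers below are invented purely for illustration and do not describe any real system.

```python
# Hypothetical audit of a face matcher's error rate by demographic group.
# All records below are invented for illustration only.
from collections import defaultdict

# Each record: (demographic group, was it truly a match?, did the system say match?)
trials = [
    ("lighter-skinned men",  True,  True),
    ("lighter-skinned men",  False, False),
    ("darker-skinned women", True,  False),  # a missed match
    ("darker-skinned women", False, True),   # a false match
    # ...a real audit would use thousands of trials per group
]

tallies = defaultdict(lambda: [0, 0])  # group -> [mistakes, total trials]
for group, truth, prediction in trials:
    tallies[group][0] += truth != prediction
    tallies[group][1] += 1

for group, (mistakes, total) in tallies.items():
    print(f"{group}: {mistakes}/{total} trials wrong ({mistakes / total:.0%})")
```

If one group consistently fares worse in such a tally, the training data, rather than the faces themselves, is usually the culprit.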

This being said, the technology is improving. Microsoft said that in the past year, it had “reduced the error rates for identifying women and darker-skinned men by up to 20 times”. We all know that technology evolves and improves at an incredible rate, and one can only imagine that a technology as potentially useful as this is going to have a lot of time and investment thrown into it. Thus, the accuracy is bound to improve to a level where such discrepancies disappear, so long as the technology is allowed to develop and does not become stuck in some messy legal quicksand.

“The real heart of the question here is whether our right to privacy in such a public setting outweighs the benefits of the technology.”

The real heart of the question here is whether our right to privacy in such a public setting outweighs the benefits of the technology.

I say it does not.

I am a firm believer in the idea that if you have done nothing wrong, you have nothing to fear. Hare’s remarks regarding “informed consent” are troubling to me, for why would a career criminal ever consent to having their face stored in a database? In fact, I struggle to comprehend just what “informed consent” would constitute in this instance, for a tool so beneficial to security, and so reliant on its potential for mass use, would lose all of its worth if there were any kind of ‘opt-out’ system.

Missing persons can be found. Criminals can be caught. The latter, in fact, was cited by King’s Cross’s developer, Argent, as the original reason for employing the technology: an attempt to catch previous offenders there. It later emerged that King’s Cross discontinued its use of the technology in March 2018, with the controversy only now being drawn into the public light.

“Missing persons can be found. Criminals can be caught.”

There is certainly potential for the technology’s misuse, and that is why calls for more regulation of the technology are understandable. It is an emerging field, and legally, we must proceed with caution, as any tool used to incriminate a person will be put under the greatest scrutiny in the courts. Indeed, legal challenges are already underway, though the technology has so far held firm in a court case in South Wales. What I fear, however, is that the potential benefits of such a tool, in a world flooded with unrest, terrorism and violence, will soon be drowned out by incessant cries that one’s privacy is being invaded.

I care little if an organisation has my face stored in some data-storing and cross-checking system (I would be a hypocrite if I did, given my face is plastered over my own Instagram), even if that organisation did so without my consent. Why? Because it means that the fellow across the road planning a terrorist attack, or the poor little child round the corner who has been horridly abducted, is having their face stored and cross-checked in that facial recognition system as well, which increases the chances of the former being caught, or the latter being saved.

That fact alone allows me to sleep peacefully at night, even if, God forbid, some Big Brother figure out there in the ether now knows what my face looks like.

Joe Paternoster

Featured image courtesy of rafael parr via Flickr. No changes were made to this image. Image use license here.

Body image courtesy of Artur Salisz via Flickr. No changes were made to this image. Image use license here.

For more Freshers content, as well as uni news, reviews, entertainment articles, lifestyle, features and so much more, follow us on Twitter and like our Facebook page for more articles and information on how to get involved! If you would like to write Science articles for Impact Lifestyle, drop us an email at lifestyle@impactnottingham.com.
