
What’s going on with your face?

Advances in AI and facial recognition software are exciting and frightening in equal measure, and the divided opinion is justified. Law enforcement agencies are using such software ever more widely, leading to more arrests. While it may not be keeping our streets altogether safer, it is improving conviction rates and could lead to a reduction in crime over time. It has, however, also produced a number of false positives: people picked up and questioned for crimes they supposedly committed when they may have been at the opposite end of the country. Facial recognition is useful when unlocking your phone or paying for shopping, but how accurate is it really?

In general, facial recognition is currently being used for three main purposes:

  • Detection (detecting the presence of a human face)
  • Verification (such as unlocking a phone)
  • Identification (to differentiate one person from another)
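Under the hood, verification and identification both come down to comparing numerical "embeddings" of faces. As a rough illustration only (not any particular vendor's method — the embeddings, names and 0.8 threshold below are invented for the example), a minimal sketch in Python:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def verify(probe, enrolled, threshold=0.8):
    """Verification: is this the one claimed person? (e.g. unlocking a phone)"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe, gallery, threshold=0.8):
    """Identification: who, out of everyone enrolled, best matches the probe?"""
    best_name, best_score = None, threshold
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name  # None means "no confident match"
```

In a real system the embeddings come from a deep neural network rather than being hand-made, and the threshold is exactly where the accuracy problems live: set it too loose and you get false positives, too strict and you get false negatives — and those error rates need not be the same for every demographic group.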

The promise of improved security is there. In fact, many wouldn’t argue against facial recognition being a perfectly acceptable and secure biometric for unlocking a device or gaining access to an app. The concerns aren’t with detection or verification, but with identification. You don’t have to spend long researching the successes of facial recognition to stumble upon its failures. Despite the technology having been around for some time, the accuracy of the results varies wildly depending on the subject’s ethnicity, age and gender. We may have come a long way from the infamous 2017/18 NIST studies, which found racial bias in around 20% of the 190 algorithms tested, but there is still a divide in the results. You simply can’t have equality unless the results are actually equal.

Artificial Intelligence (AI) and facial recognition are intrinsically linked. The draft of the proposed new EU AI Act (AIA) was released in April and is likely to be passed and come into effect next year. The AIA sets out a comprehensive list of standards and requirements that AI must conform to during its development, marketing and use in the European Union. It is designed to allow regulation of the industry, to provide legal certainty, and to build public trust that the technology will respect fundamental human rights. A copy of the official draft can be read here.

While the promise of regulation is welcome and likely to encourage further investment in the industry, it isn’t absurd to believe that for every technology designed with good intentions, there will be a handful designed without them. A report leaked online in China showed the results of a study that combined an assessment of people’s brainwaves with facial recognition software to determine whether the subjects were loyal to the Chinese Communist Party. Needless to say, the report was withdrawn very quickly, yet the thought may linger indefinitely with many. Facial recognition has been used for social scoring and public shaming in China and, more recently, in Russia to identify protesters against the invasion of Ukraine. It won’t be long before people change the way they behave in public, hiding their personalities, sexual orientation, religion… There’s a very real risk of changing the very nature of our public spaces.

When it comes to facial recognition, we may sit blindly in awe of apps like Facebook automatically detecting and tagging people in our photos, but we don’t always think about how they do it. For recognition to be possible, you need a database of known images to compare against. If you’ve ever set up facial recognition on your phone, you’ll have sat for a few minutes taking selfies from slightly different angles so the device has the data it needs to differentiate you from someone else. That may be fine on your own phone, and you may even be okay with Facebook using your camera roll for its tagging wizardry, but how is law enforcement able to identify you? Do you remember giving them permission to use hundreds of photos of you to build a profile of your face?

Like most people, I’m excited about the new technologies that await us. Let’s hope embracing them doesn’t mean forgoing our basic human rights the moment we step out of the front door.



Jason Abrahamse

Jason is ITbuilder's security expert and leads our information security project team. He provides consultancy and support on matters relating to cyber-resilience and data protection.

Something of an industry veteran, Jason has held a variety of roles over the years and draws on that expertise to consult with customers on security best practices.

Jason is a native of South Africa, but is now a fully naturalised Brit except for not being accustomed to the cold. He lives locally in Hertfordshire.

