Facial recognition and facial authentication sound similar, but there are distinct differences, and this article takes a broad look at how both are playing a growing role in our lives. So firstly, what’s the difference?
Facial Recognition
This refers to the biometric technology that maps facial features from a photograph or video of a person, e.g. taken while they are walking in the street or attending an event, and then compares them with the information stored in a database of faces to find a match. The key element here is that the cameras are separate from the database, which is stored on a server. The technology must, therefore, connect to the server and trawl through the database to find the face. Facial recognition is often involuntary, i.e. it is being used somewhere that a person happens to go – it has not been sought or requested.
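In outline, a facial recognition system reduces each detected face to a numerical ‘embedding’ and then searches the server-side database for the closest match (a one-to-many comparison). The short Python sketch below is purely illustrative – the embedding size, threshold and function names are assumptions rather than any particular vendor’s API – but it shows the shape of that search:

```python
# Illustrative sketch of one-to-many (1:N) facial recognition.
# The 128-dimension embeddings, the 0.6 threshold and all names here
# are assumptions for illustration, not a real vendor's implementation.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe_embedding: np.ndarray, database: dict, threshold: float = 0.6):
    """Search the whole database and return the best-matching identity, or None."""
    best_id, best_score = None, threshold
    for identity, stored_embedding in database.items():
        score = cosine_similarity(probe_embedding, stored_embedding)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id

# Example: a face captured by a camera is checked against every record
# in a server-side watchlist.
probe = np.random.randn(128)
watchlist = {
    "person_a": np.random.randn(128),                      # unrelated face
    "person_b": probe + np.random.normal(0, 0.05, 128),    # near-identical face
}
print(identify(probe, watchlist))  # -> "person_b"
```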
Facial recognition is generally used for purposes such as (police) surveillance and monitoring, crime prevention, law enforcement and border control.
Facial Authentication
Facial Authentication, on the other hand, is a “match on device” way of a person proving that they are who they claim to be. Unlike facial recognition, which requires details of faces to be stored on a server somewhere, a facial authentication scan compares the current face with the one that is already stored (encrypted) on the device. Typically, facial authentication is used by a person to gain access to their own device, account, or system. Apple’s FaceID is an example of a facial authentication system. Unlike facial recognition, it is not something that involuntarily happens to a person but is something that a person actively uses to gain entry/access.
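By contrast, facial authentication is a one-to-one check: the freshly captured face is compared only with the single template already stored (encrypted) on the device, and the result is simply a yes/no decision. A minimal, hypothetical sketch of that check (the distance metric and threshold are assumptions, not how Face ID works internally):

```python
# Illustrative sketch of one-to-one (1:1) facial authentication on the device.
# The threshold and the Euclidean-distance comparison are assumptions for illustration.
import numpy as np

def verify(live_embedding: np.ndarray, enrolled_embedding: np.ndarray,
           threshold: float = 0.8) -> bool:
    """Does the live face match the single enrolled template stored on the device?"""
    live = live_embedding / np.linalg.norm(live_embedding)
    enrolled = enrolled_embedding / np.linalg.norm(enrolled_embedding)
    return float(np.linalg.norm(live - enrolled)) <= threshold

# The enrolled template never leaves the device; no server database is consulted.
enrolled_template = np.random.randn(128)   # stored (encrypted) at enrolment
print("Unlocked" if verify(np.random.randn(128), enrolled_template) else "Denied")
```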
Facial recognition and facial authentication both use advanced technologies such as AI.
Facial Recognition – Advantages
The advantages of facial recognition technology include:
– Saving on Human Resources. AI-powered facial recognition systems can scan large areas and large moving crowds and pick out individuals of interest, reducing the need for human operators. They can also work 24/7, all year round.
– Flexibility. Cameras that link to facial recognition systems can be set up almost anywhere, fixed in place or as part of mobile units.
– Speed. The match with a face on the database happens very quickly (in real-time), thereby enabling those on the ground to quickly apprehend or stop an individual.
– Accuracy. Systems are very accurate on the whole, although police deployments in the UK have resulted in some mistaken arrests.
Facial Recognition Challenges
Some of the main challenges to the use of facial recognition in recent times have been a lack of public trust in how and why the systems are deployed, how accurate they are (leading to possible wrongful arrest), how they affect privacy, and the lack of clear regulations to effectively control their use.
For example, in the UK:
– In December 2018, Elizabeth Denham, the UK’s Information Commissioner, launched a formal investigation into how police forces were using facial recognition technology (FRT), following high failure rates, misidentifications and worries about legality, bias, and privacy. This stemmed from the trial of ‘real-time’ facial recognition by South Wales and Gwent Police on Champions League final day in Cardiff in June 2017, which was criticised for costing £177,000 yet resulting in only one arrest, of a local man, which was unconnected to the event.
– Trials of FRT at the 2016 and 2017 Notting Hill Carnivals led to the police facing criticism that FRT was ineffective and racially discriminatory, and that it confused men with women.
– In September 2018, a letter written by Big Brother Watch (a privacy campaign group) and signed by more than 18 politicians, 25 campaign groups, and numerous academics and barristers highlighted concerns that facial recognition was being adopted in the UK before it had been properly scrutinised.
– In September 2019, it was revealed that the owners of the King’s Cross Estate had been using FRT without telling the public, with London’s Metropolitan Police Service supplying images for the database.
– A recently published letter from London Assembly members Caroline Pidgeon MBE AM and Sian Berry AM to Metropolitan Police Commissioner Cressida Dick asked whether FRT could be withdrawn during the COVID-19 pandemic, on the grounds that it has been shown to be generally inaccurate and still raises questions about civil liberties. The letter pointed to the first two deployments of live facial recognition (LFR) this year, in which more than 13,000 faces were scanned, only six individuals were stopped, and five of those six were misidentified and incorrectly stopped by the police; of the eight people who triggered a ‘system alert’, seven were incorrectly identified. Concerns have also been raised about how the already questionable accuracy of FRT could be challenged further by people wearing face masks to curb the spread of COVID-19.
In the EU:
Back in January, the European Commission considered a ban on the use of facial recognition in public spaces for up to five years while new regulations for its use were put in place.
In the U.S.
In 2018, a report by the American Civil Liberties Union (ACLU) found that Amazon’s Rekognition software was racially biased after a trial in which it misidentified 28 members of Congress as people who had been arrested, with the false matches disproportionately affecting members of colour.
In December 2019, a US report showed that, in tests by the National Institute of Standards and Technology (NIST) of 189 algorithms from 99 developers, facial recognition technology was found to be less accurate at identifying African-American and Asian faces, and was particularly prone to misidentifying African-American women.
Backlash and Tech Company Worries
The killing of George Floyd and of other black people in the U.S. by police led to a backlash against facial recognition technology and strengthened fears among big tech companies that they may, in some way, be linked with its negative aspects.
Big tech companies such as Amazon (with Rekognition), Microsoft and IBM supply facial recognition software, but some have not sold it to police departments pending regulation, and most have had their own concerns for some years. For example, back in 2018, Microsoft said on its blog that “Facial recognition technology raises issues that go to the heart of fundamental human rights protections like privacy and freedom of expression. These issues heighten responsibility for tech companies that create these products. In our view, they also call for thoughtful government regulation and for the development of norms around acceptable uses”.
Keen to maintain an ethical and socially responsible public profile, to follow up on their previous concerns about problems with FRT systems and the lack of regulation, and to distance themselves from police racism/racial profiling or any connection to it (e.g. through supplying FRT software), four big tech companies recently announced the following:
– Amazon has announced that it is implementing a one-year moratorium on police use of its FRT in order to give Congress enough time to implement appropriate rules.
– After praising progress being made in the recent passing of “landmark facial recognition legislation” by Washington Governor Jay Inslee, Microsoft has announced that it will not sell its FRT to police departments until there is a federal law (grounded in human rights) to regulate its use.
– IBM’s CEO, Arvind Krishna, has sent a letter to the U.S. Congress setting out policy proposals to advance racial equality and stating that IBM will no longer offer its general-purpose facial recognition or analysis software.
– Google has also distanced itself from FRT with Timnit Gebru, leader of Google’s ethical artificial intelligence team, commenting in the media about why she thinks that facial recognition is too dangerous to be used for law enforcement purposes at the current time.
Masks
The need to wear masks during the pandemic has proven to be a real challenge for facial recognition technology. For example, recent research from the US National Institute of Standards and Technology showed that even the most advanced facial recognition algorithms were able to identify only between 2 and 50 per cent of faces when masks were worn.
Call For Clear Masks
There have been some calls for the development of clear or transparent masks, or masks with a kind of ‘window’, as highlighted by the National Deaf Children’s Society, to help the 12 million people in the UK who are deaf or have some degree of hearing loss, e.g. by assisting with lip-reading, visual cues and facial expressions. Some companies are now producing these masks, e.g. the FDA-approved ‘Leaf’ transparent mask by Redcliffe Medical Devices in Michigan. It remains to be seen how good facial recognition technology is at identifying people wearing a clear/transparent mask as opposed to a normal mask.
Authentication Challenges
The security, accuracy, and speed challenges of more traditional methods of authentication and verification have made facial authentication look like a more effective and attractive option. For example, passwords can be stolen/cracked and 2-factor authentication can be less convenient and can be challenging if, for example, more than one user needs access to an account.
Facial Authentication Advantages
Some of the big advantages of facial authentication include:
– Greater Accuracy Assurance. Setting up facial authentication typically requires matching a photo of a government-issued ID (e.g. a passport) with a selfie, combined with embedded 3D liveness detection, which means the system is likely to be accurate in identifying the user (see the sketch after this list).
– Ease of Use. Apple’s Face ID is easy to use and is likely to become a preferred method of authentication for users.
– Better Fraud Detection. Companies and individuals using facial authentication may have a better chance of detecting attempted fraud (as it happens) than with other current systems. Companies that use facial authentication in their systems may, therefore, be able to more confidently assess risk, minimise fraud losses, and provide better protection for company and customer data.
– Faster For Users. Face ID, for example, saves time compared to other methods.
– Cross-Platform Portability. 3D face maps can be created on many devices with a camera, and users can enrol using a laptop webcam and authenticate from a smartphone or tablet. Facial authentication can, therefore, be used for many different purposes.
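As a rough illustration of the enrolment flow mentioned above (ID photo, selfie and liveness check), here is a hypothetical outline in Python. Every helper here is a placeholder standing in for components a real identity-verification provider would supply; it sketches the flow rather than any specific product’s API:

```python
# Hypothetical enrolment flow: ID photo + selfie + liveness check.
# All helpers below are illustrative stubs, not a real provider's components.
from dataclasses import dataclass
import numpy as np

def extract_face(image) -> np.ndarray:
    """Stub: a real system would detect the face and compute an embedding."""
    return np.asarray(image, dtype=float)

def passes_liveness_check(selfie) -> bool:
    """Stub: a real system would run 3D liveness detection on the capture."""
    return True

def faces_match(a: np.ndarray, b: np.ndarray, threshold: float = 0.8) -> bool:
    """Stub 1:1 comparison, as in the earlier sketches."""
    a, b = a / np.linalg.norm(a), b / np.linalg.norm(b)
    return float(np.linalg.norm(a - b)) <= threshold

@dataclass
class EnrolmentResult:
    success: bool
    reason: str = ""

def enrol_user(id_document_photo, selfie) -> EnrolmentResult:
    """Match the government-ID photo with a live selfie before enrolling the user."""
    if not passes_liveness_check(selfie):
        return EnrolmentResult(False, "liveness check failed")
    if not faces_match(extract_face(id_document_photo), extract_face(selfie)):
        return EnrolmentResult(False, "selfie does not match ID photo")
    # At this point the template would be stored encrypted on the device.
    return EnrolmentResult(True)

# Demo: using the same sample for the ID photo and the selfie should enrol successfully.
sample = np.random.randn(128)
print(enrol_user(sample, sample))
```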
The Future – Biometric Authentication
The small and unique physical differences that we all have (which would be very difficult to copy) make biometric authentication something that’s likely to become more widely used going forward. For example, retinas, irises, voices, facial characteristics, and fingerprints are all ways to clearly tell one person from another. Biometric authentication works by comparing a set of biometric data that is preset by the owner of the device with a second set of biometric data that belongs to a device visitor. Many of today’s smartphones already have facial or fingerprint recognition.
The challenge, however, may come where biometric data is required for access to systems that are not “on device”, i.e. where a comparison has to be made with data stored on a server, thereby adding a possible security risk.
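To make that enrol-then-compare model concrete, here is a small, hypothetical sketch of a generic biometric check – the template could come from a fingerprint, iris or face, and the class name, distance metric and threshold are assumptions for illustration. The threshold is the key design decision: set it too loose and impostors are falsely accepted, too tight and the legitimate owner is falsely rejected.

```python
# Generic template-based biometric verification (illustrative assumptions throughout).
import numpy as np

class BiometricVerifier:
    """Enrol one template from the device owner, then compare visitors' samples to it."""

    def __init__(self, threshold: float = 0.4):
        # Looser threshold -> more false accepts; tighter -> more false rejects.
        self.threshold = threshold
        self._template = None  # in an on-device design, this never leaves the device

    def enrol(self, owner_sample: np.ndarray) -> None:
        self._template = owner_sample / np.linalg.norm(owner_sample)

    def verify(self, visitor_sample: np.ndarray) -> bool:
        probe = visitor_sample / np.linalg.norm(visitor_sample)
        return float(np.linalg.norm(probe - self._template)) <= self.threshold

# The owner enrols once; later samples are accepted only if close enough to the template.
verifier = BiometricVerifier()
owner = np.random.randn(256)
verifier.enrol(owner)
print(verifier.verify(owner + np.random.normal(0, 0.02, 256)))  # the owner again: True
print(verifier.verify(np.random.randn(256)))                    # someone else: False
```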
Human Micro-Chipping
There may be times when we do not have access to our devices, when fast identification is necessary, or when we need to carry data and information that can’t be easily learned or remembered, e.g. our medical files. For these situations, some people have presented the argument for human micro-chipping, where microchip implants (cylindrical ‘barcodes’) can be scanned through a layer of skin to transmit a unique signal.
Neuralink Implant
Elon Musk’s Neuralink idea to create an implantable device that can act as an interface between the human brain and a computer could conceivably be used in future for advanced authentication methods.
Looking Forward
The benefits of facial recognition, e.g. for crime prevention and law enforcement, are obvious, but for the technology to move forward, regulatory hurdles, matters of public trust and privacy, and technical challenges such as bias and accuracy need to be overcome.
Facial authentication provides a fast, accurate and easy way for people to prove that they are who they say they are. This benefits both businesses and their customers in terms of security, speed and convenience as well as improving efficiency.
More services where sensitive data is concerned, e.g. financial and medical services and government agency interactions, are likely to require facial authentication in the near future.