
21st Apr 2021

Tech News: MEPs Seek Ban On Public Biometric Surveillance

Following the recent leak of a draft of EU rules for regulating AI, 40 MEPs have called for a ban on the use of facial recognition and other types of biometric surveillance in public places.

Draft Rules

The leaked draft rules, drawn up by EU lawmakers to regulate the use of AI, prompted the MEPs to publish a letter outlining how the rules could be strengthened to offer greater privacy protection and to guard against discrimination and other harms.

Biometric Mass Surveillance

The MEPs noted that the draft rules do not include a ban on the use of facial recognition or similar remote biometric identification technologies in public places. The letter from the MEPs highlighted how this type of surveillance has been shown to lead to mistaken identification and wrongful reporting of subjects, discrimination against “under-represented groups”, and a “chilling effect” in a society that is diverse and accustomed to certain freedoms. The MEPs have, therefore, called for a total ban on this type of surveillance.

Automated Inference Warning

The letter also warned of how automated inference, such as predictive policing and indiscriminate monitoring using biometrics, could violate rights to privacy and data protection, suppress free speech, be counter-productive in the fight against corruption, and pose a particular risk to “LGBTQI+ communities, people of colour, and other discriminated-against groups”. The MEPs, therefore, request in the letter that the new rules should prohibit “automatic recognition of gender, sexuality, race/ethnicity, disability and any other sensitive and protected characteristics”.

Other Areas of Concern

The letter also says that the MEPs would like the wording of the proposed new rules to be tightened up to cover all untargeted and indiscriminate mass surveillance, and that they object to the proposed exemption from this prohibition for public authorities (or commercial entities working for them) acting on public-security grounds.

In the UK

The use of biometric public surveillance in the UK has also caused concern. For example:

– In December 2018, Elizabeth Denham, the UK’s Information Commissioner, launched a formal investigation into how police forces were using facial recognition technology (FRT), following high failure rates, misidentifications, and worries about legality, bias, and privacy. The investigation stemmed from a trial of ‘real-time’ FRT by the South Wales and Gwent Police forces on Champions League final day in Cardiff in June 2017, which was criticised for costing £177,000 yet resulting in only one arrest, of a local man, and one unconnected with the facial recognition trial.

– Trials of FRT at the 2016 and 2017 Notting Hill Carnivals led to the police facing criticism that the technology was ineffective, racially discriminatory, and confused men with women.

– In September 2018, a letter written by Big Brother Watch (a privacy campaign group) and signed by more than 18 politicians, 25 campaign groups, and numerous academics and barristers highlighted concerns that facial recognition was being adopted in the UK before it had been properly scrutinised.

– In May 2019, following controversial incidents in which facial recognition had been tested in some public places, Luciana Berger MP put forward a written parliamentary question about bringing forward ‘biometrics legislation’ covering how facial recognition was being used for immigration purposes at airports. Questions were also asked in Parliament about possible safeguards to protect the security and privacy of citizens’ data held as part of the Home Office’s biometrics programme.

– In September 2019, it was revealed that the owners of the King’s Cross Estate had been using FRT without telling the public, with London’s Metropolitan Police Service supplying images for the database.

– A letter from London Assembly members Caroline Pidgeon MBE AM and Sian Berry AM to Metropolitan Police Commissioner Cressida Dick asked whether FRT could be withdrawn during the COVID-19 pandemic, on the grounds that it had been shown to be generally inaccurate and still raised questions about civil liberties. The letter cited the first two deployments of live facial recognition (LFR) that year, in which more than 13,000 faces were scanned, only six individuals were stopped, and five of those six were misidentified and incorrectly stopped by the police; of the eight people who triggered a ‘system alert’, seven were incorrectly identified. Concerns have also been raised about how the already questionable accuracy of FRT could be challenged further by people wearing face masks to curb the spread of COVID-19.
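To put those error figures in context, they can be read as a simple precision measure: the share of alerts or stops in which the system identified the right person. Below is a minimal sketch in Python using the numbers reported above; the function name is illustrative only, not part of any official methodology.

```python
# Minimal sketch of the LFR error rates cited above. The figures come from
# the London Assembly letter as reported; this helper is illustrative only,
# not an official Met Police metric.

def alert_precision(correct: int, total: int) -> float:
    """Share of facial-recognition alerts/stops that matched the right person."""
    return correct / total

# Of six individuals stopped, five were misidentified, so one was correct:
print(f"Stop precision:  {alert_precision(1, 6):.1%}")   # 16.7%

# Of eight people who triggered a 'system alert', seven were wrong:
print(f"Alert precision: {alert_precision(1, 8):.1%}")   # 12.5%
```

On those reported figures, fewer than one in five stops involved the person the system thought it had matched, which is the basis of the “generally inaccurate” criticism in the letter.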

What Does This Mean For Your Business?

Biometric surveillance clearly has benefits as a tool for government agencies and law enforcement, but many now feel that its use is advancing too far ahead of the legislation. In a diverse society where data protection rights, and with them respect for privacy, have been tightened up (thanks to GDPR), mass surveillance of this kind feels to many people like a breach of those rights, with a ‘chilling’ effect on freedom and, if not properly regulated, the potential to be used to discriminate. The UK’s trials and use of facial recognition to date have also revealed areas where the technology is unreliable, and there may be issues of bias as well. It is not surprising, therefore, that a group of MEPs has chosen to apply pressure to tighten up the rules, although it remains to be seen whether the group’s concerns will affect the final legislation.
