Facial recognition cameras – what your rights are

Authored by DAS

Facial recognition cameras have been installed quietly across the country, sparking controversy among privacy campaigners and members of the public who are unsure about the growing use of the technology. These devices are just the latest additions to a growing system of increasingly intrusive snooping on the population.

Where does the law stand on the use of facial recognition software?

Elisa Ribeiro, legal adviser at DAS Law, tells you what you need to know…

What is facial recognition technology?

The Information Commissioner’s Office (ICO) defines facial recognition technology as “the process by which a person can be identified or otherwise recognised from a digital facial image”. In essence, a facial recognition camera captures a person’s image, records the details of that person’s specific facial characteristics, stores them, and compares them with images already captured to check whether there is a match.
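
For illustration only, the sketch below outlines that capture-and-compare process in Python. It is a toy example, not a description of any real system: the extract_face_features function is a hypothetical stand-in for the trained models used in practice, and the matching threshold is arbitrary.

```python
# Toy illustration of the capture-and-compare process described above.
# Not a real facial recognition system: extract_face_features is a
# hypothetical placeholder for the trained models used in practice.
import numpy as np

def extract_face_features(image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in: reduce an image to a fixed-length 'faceprint'."""
    rng = np.random.default_rng(seed=int(image.sum()) % (2**32))
    return rng.standard_normal(128)

def is_match(a: np.ndarray, b: np.ndarray, threshold: float = 0.8) -> bool:
    """Compare two faceprints by cosine similarity against a threshold."""
    score = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return score >= threshold

# Faceprints already captured and stored (the 'watch list').
stored_faceprints = [extract_face_features(np.ones((64, 64)))]

# A newly captured frame is reduced to a faceprint and compared with
# every stored faceprint to check whether there is a match.
new_faceprint = extract_face_features(np.ones((64, 64)))
print(any(is_match(new_faceprint, s) for s in stored_faceprints))
```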

Is the use of facial recognition technology legal in the UK?

There is no specific facial recognition law in the UK; however, the use of the technology is governed by the Data Protection Act 2018 and other legislation, and the ICO has set out specific responsibilities for organisations that use it.

On 18 June 2021, the Information Commissioner published an Opinion addressing public concern about the potential misuse of live automated facial recognition technology in public places. The Opinion clarified the legal requirements for the use of facial recognition technology, stating that the law requires controllers to “demonstrate that their processing can be justified as fair, necessary and proportionate”.

The Information Commissioner’s Opinion also highlighted that controllers seeking to deploy Live Facial Recognition must:

  • Comply with all relevant parts of the UK General Data Protection Regulation and Data Protection Act 2018. This includes the data protection principles:
    • lawfulness, fairness and transparency
    • purpose limitation
    • data minimisation
    • storage limitation
    • security and accountability
  • Identify a lawful basis and meet its requirements.
  • Identify, where required, appropriate conditions for processing special category data under UK GDPR Article 9 and criminal offence data under Article 10.
  • Ensure that data subjects are able to exercise their rights, including:
    • the right to be informed
    • the rights of access, rectification and erasure
    • the rights to restrict processing and to object and
    • rights in relation to automated decision making and profiling
  • Ensure clarity of controller, joint controller and processor roles and responsibilities where necessary, and be able to demonstrate compliance.
  • Take a data protection by design and default approach.
  • Undertake a Data Protection Impact Assessment (DPIA) where required, and if the DPIA identifies risks that cannot be mitigated by the controller, consult the ICO.

Whilst facial recognition technology can be legal, the lack of specific legislation has led to some organisations getting it wrong.

In 2020 the Court of Appeal in R (Bridges) v Chief Constable of South Wales Police held that the use of automated facial recognition technology by South Wales Police breached Article 8 of the European Convention on Human Rights, because there was no clear guidance on where automated facial recognition could be used or who could be put on a watch list; more guidance was therefore needed.

An update to the Surveillance Camera Code of Practice was laid before Parliament on 16 November 2021 and came into effect on 12 January 2022. Under the updated code, system operators should adopt the following 12 guiding principles:

  • Use of a surveillance camera system must always be for a specified purpose which is in pursuit of a legitimate aim and necessary to meet an identified pressing need.
  • The user of a surveillance camera system must take into account its effect on individuals and their privacy, with regular reviews to ensure its use remains justified.
  • There must be as much transparency in the use of a surveillance camera system as possible, including a published contact point for access to information and complaints.
  • There must be clear responsibility and accountability for all surveillance camera system activities including images and information collected, held and used.
  • Clear rules, policies and procedures must be in place before a surveillance camera system is used, and these must be communicated to all who need to comply with them.
  • No more images and information should be stored than that which is strictly required for the stated purpose of a surveillance camera system, and such images and information should be deleted once their purposes have been discharged.
  • Access to retained images and information should be restricted and there must be clearly defined rules on who can gain access and for what purpose such access is granted; the disclosure of images and information should only take place when it is necessary for such a purpose or for law enforcement purposes.
  • Surveillance camera system operators should consider any approved operational, technical and competency standards relevant to a system and its purpose and work to meet and maintain those standards.
  • Surveillance camera system images and information should be subject to appropriate security measures to safeguard against unauthorised access and use.
  • There should be effective review and audit mechanisms to ensure legal requirements, policies and standards are complied with in practice, and regular reports should be published.
  • When the use of a surveillance camera system is in pursuit of a legitimate aim, and there is a pressing need for its use, it should then be used in the most effective way to support public safety and law enforcement with the aim of processing images and information of evidential value.
  • Any information used to support a surveillance camera system which compares against a reference database for matching purposes should be accurate and kept up to date.

Can the public appeal a decision to install or use facial recognition cameras?

There is no legal basis on which the public can appeal against the installation of a camera; however, the Biometrics and Surveillance Camera Commissioner has set out some general ethical principles that should be applied to any trial of live facial recognition technology. These include:

  • The public should be informed and consulted when a trial, or series of trials, is to be conducted and the purpose and general approach to evaluation explained.
  • The public should be informed at a trial location that a trial is in progress and given a contact where further information can be found/requested.

If a member of the public felt that the technology was being used for unlawful recording, or that their privacy was being invaded, they could look to challenge the police (or other organisation). However, if the police can show that they have a legitimate aim and meet the oversight and regulation framework outlined by the Commissioner, their use of the technology is likely to be justified.

Am I allowed to cover my face when approaching a face recognition camera located in a public space?

There is no legislation prohibiting the use of face coverings; however, under the Criminal Justice and Public Order Act 1994 the police in England and Wales have the power to require the removal of face coverings if they believe they are being worn for the purpose of concealing identity and that incidents involving violence may take place in the locality.

Can I install face recognition cameras on my property?

It is highly unlikely that a member of the public would be able to install facial recognition technology at their property unless they have a specified purpose which is in pursuit of a legitimate aim and is necessary to meet an identified pressing need.

If a member of the public found that someone was invading their right to privacy through the use of facial recognition technology, they should report the potential breach to the Information Commissioner’s Office and, potentially, to the police to consider whether any criminal act has been committed.

Can I sue the police if I’m incorrectly identified as a suspect?

Anyone who believes that they have been misidentified by facial recognition technology, or that their image has been taken unlawfully, should seek legal advice and contact the Information Commissioner’s Office.

About DAS Group

The DAS UK Group comprises an insurance company (DAS Legal Expenses Insurance Company Ltd), a law firm (DAS Law), and an after the event (ATE) legal expenses division.

DAS UK introduced legal expenses insurance (LEI) in 1975, protecting individuals and businesses against the unforeseen costs involved in a legal dispute. In 2018 it wrote more than seven million policies.

The company offers a range of insurance and assistance add-on products suitable for landlords, homeowners, motorists, groups and business owners, while its after the event legal expenses insurance division offers civil litigation, clinical negligence and personal injury products. In 2013, DAS also acquired its own law firm – DAS Law – enabling it to leverage the firm’s expertise to provide its customers with access to legal advice and representation.

 DAS UK is part of the ERGO Group, one of Europe’s largest insurance groups (the majority shareholder in ERGO is Munich Re, one of the world’s largest reinsurers).