Transformative technologies and their ethical challenges for law enforcement

22nd Mar, 2021

Faced with pressure on law enforcement funding and increasingly digitised criminal activity, police forces are keen to use transformative technologies to help predict and solve crime.

In particular, a variety of AI and Big Data tools are being used in the UK and US to bring real efficiencies to police forces. For example, using robotic process automation to handle back-office filing and archiving frees officers to focus on tasks that require specialised skills and discretion. Police authorities from Kent to Los Angeles have also used predictive policing solutions such as PredPol, which rely on machine learning to identify trends in offences over time and flag hotspots of potential criminal activity.
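For a sense of what "identifying hotspots" can mean in practice, the sketch below clusters historical incident coordinates and reports the dense areas. It is a minimal illustration using scikit-learn's DBSCAN on made-up coordinates, not PredPol's actual method or data.

```python
# Illustrative sketch only: cluster historical incident coordinates to flag
# potential "hotspots". This is not how any specific product works internally;
# it simply shows the idea of learning spatial patterns from past offences.
from sklearn.cluster import DBSCAN
import numpy as np

# Hypothetical sample data: (latitude, longitude) of past recorded offences.
incidents = np.array([
    [51.2780, 0.5210], [51.2782, 0.5214], [51.2779, 0.5208],  # dense area A
    [51.2881, 0.5305], [51.2884, 0.5301],                     # smaller area B
    [51.2600, 0.5000],                                        # isolated report
])

# DBSCAN groups points that lie close together; eps is in degrees here,
# so a real system would project coordinates to metres first.
clustering = DBSCAN(eps=0.001, min_samples=2).fit(incidents)

for label in set(clustering.labels_):
    if label == -1:
        continue  # -1 marks noise (isolated incidents, not a hotspot)
    members = incidents[clustering.labels_ == label]
    centre = members.mean(axis=0)
    print(f"Hotspot {label}: {len(members)} incidents around {centre}")
```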

However, public attention is increasingly focused on data ethics and on transparency over how mass surveillance and Big Data are used, stemming from Edward Snowden’s global surveillance revelations and the Cambridge Analytica scandal. As a result, police authorities need to be conscious of public perception around transformative technologies and the discrimination and bias they can introduce. Awareness of these reputational risks was evident in June 2020, when IBM, Amazon and Microsoft suspended sales of facial recognition technologies (which use AI to match individuals in real-time CCTV frames against national identification databases) to law enforcement agencies.

Even more recently, it has also been seen in jurisprudence. In R (Bridges) v CC South Wales, the Court of Appeal held that South Wales Police’s use of facial recognition software to identify criminals was unlawful. The court found that police officers had not been given sufficient guidance on limiting the number of individuals placed under surveillance, and that the authority had not conducted a risk assessment of the inaccuracies (and thus false positives) the technology could produce across different skin tones and demographic groups.
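One concrete check such a risk assessment could include is comparing false-positive rates across demographic groups. The sketch below shows the calculation on made-up records; the group names and outcomes are hypothetical, not real evaluation data.

```python
# Illustrative only: a toy check of how false-positive rates can differ by
# demographic group. A real assessment would use a proper, audited dataset;
# these records are invented purely to show the calculation.
from collections import defaultdict

# Hypothetical records: (group, flagged_as_match, actually_on_watchlist)
records = [
    ("group_a", True,  False), ("group_a", False, False), ("group_a", True,  True),
    ("group_b", True,  False), ("group_b", True,  False), ("group_b", False, False),
]

false_pos = defaultdict(int)
negatives = defaultdict(int)
for group, flagged, on_watchlist in records:
    if not on_watchlist:              # only non-matches can yield false positives
        negatives[group] += 1
        if flagged:
            false_pos[group] += 1

for group in sorted(negatives):
    rate = false_pos[group] / negatives[group]
    print(f"{group}: false-positive rate = {rate:.0%}")
```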

Whilst deploying transformative technologies to improve efficiency and safety should be encouraged, law enforcement authorities must maintain a reputation for ethical and transparent conduct. Police forces are subject to public scrutiny and legal obligations (including the public sector equality duty under the Equality Act 2010), and it is they, rather than the software providers, who will ultimately bear accountability for any fallout from predictive or surveillance technologies that result in unfair or heavy-handed policing. As such, police authorities, like much of the public sector, should look towards conducting public-facing Data Protection Impact Assessments and developing and embedding a Code of Ethics for the use of data analytics systems, in order both to comply with legal obligations and to maintain the bond of trust with citizens.

Gemserv’s Cyber Security & Privacy Practice has a pedigree in offering a wide range of services, from risk management to cyber and privacy assurance, in our increasingly digitally driven world. Our consultants understand how markets are transitioning, the impact of future trends in cyber security and privacy, and ultimately the potential implications for you.

If you’d like to know more, visit our Public Sector page or get in touch at bd@gemserv.com.
