
Could data privacy have been endangered by the King’s Speech?

8th Nov, 2023

King Charles gave the first King’s Speech in over 70 years this week, announcing the bills the UK government will focus on during the next parliamentary year. The context was a little unusual: a general election, which the ruling Conservative party is widely expected to lose, could come as early as May 2024. Reaction to the speech has therefore been somewhat muted, with commentators sceptical that the proposals will actually make it into law.

However, the speech included some impactful proposals that deserve proper attention, including one that is likely to alarm privacy and security professionals.

Worrying news about the Investigatory Powers (Reform) Bill

The King’s Speech announced that ‘my Ministers will give security and intelligence services the powers they need’. This is a very soft way of saying that, as the Times reported, ‘the bill will force technology companies to inform the Home Office in advance of any security and privacy features they want to add – and force them to disable those the government objects to’. It will also increase the Home Office’s power to force non-UK companies to comply with the changes it wants them to make to security features, without the right of appeal that exists at present. Apple has said the law would give the UK an ‘authority that no other country has’ and that it would ‘remove services such as FaceTime and iMessage from the UK rather than weaken security’.

National security and privacy objectives have been in tension for as long as there have been international telecommunications. The entire intelligence gathering profession is designed to find dangerous information that those communicating it don’t want to share with the authorities. Intercepting electronic communications has been part of that since the first transatlantic cables were laid (by the British). Alan Turing’s cracking of the Enigma codes to enable the UK to intercept secret Nazi communications is one of the most famous stories from the Second World War. The Snowden files showed us just how far the US can go in tracking social media posts. So it’s no surprise that the UK should be interested in a bill like this.

So what’s the risk and why should this be an issue?

There are three main problems with a bill like this.

First, these kinds of powers are only safe in the hands of a benign government that uses them for the benefit of the population. We cannot assume that the UK will always have a government like this. A corrupt government could use powers like these to control and intimidate sections of the population, or worse.

Second, powers like this could threaten our ‘adequacy decision’ from the EU. The UK got an expedited decision after Brexit that will last until 27 June 2025, and the EU is due to start its review of whether to extend it during 2024. We already know that UK investigatory powers regulations are of concern to the EU. In 2021, Big Brother Watch and others took the UK to the European Court of Human Rights and asked it to consider whether the UK’s bulk surveillance rules comply with the European Convention on Human Rights. The court found that bulk surveillance is acceptable in principle, but that end-to-end safeguards must be in place to protect citizens. The judgement set out what those safeguards are, and it’s difficult to believe that the UK won’t need to consider them when drafting the bill. If we get this wrong and lose adequacy, it could be disastrous for the UK economy and for the UK’s ambition to be a global data-driven innovation hub.

Third, cyber security is a technical challenge, and anything that interferes with the technology can seriously undermine it. If the Home Office demanded ‘back doors’ to enable security services to intercept encrypted communications, for example, those back doors might become discoverable by bad actors. We have already seen famous examples of this, such as the WannaCry cyber attack, which is widely considered to have used an exploit developed by US security services that was stolen and then used by North Korea. The attack spread worldwide and took down information systems at major organisations, including large parts of the UK NHS.
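The structural weakness of a back door can be sketched in a few lines. The example below is a deliberately simplified toy (XOR stands in for a real cipher, and all names are hypothetical): if every message is also encrypted under a single escrow key, then whoever obtains that one key, lawfully or otherwise, can read every user’s traffic.

```python
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy stream 'cipher': XOR with a repeating key (illustration only)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Hypothetical design: each user has their own key, but the system also
# escrows a copy of every message under one mandated 'master' key.
MASTER_KEY = os.urandom(16)  # the back door

def send(message: bytes, user_key: bytes) -> tuple[bytes, bytes]:
    """Return (ciphertext for the recipient, escrow copy for the authority)."""
    return xor_bytes(message, user_key), xor_bytes(message, MASTER_KEY)

alice_key = os.urandom(16)
ciphertext, escrow_copy = send(b"meet at noon", alice_key)

# Anyone who steals only MASTER_KEY recovers every user's messages,
# without ever touching an individual user's key:
recovered = xor_bytes(escrow_copy, MASTER_KEY)
```

The point of the sketch is that the escrow key is a single point of failure for the whole system: the security of every conversation now depends on one secret that, as WannaCry showed, can be stolen.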

Privacy and cyber security professionals should watch the progress of this bill carefully and consider contributing to any consultation processes.

Potentially exciting news about artificial intelligence

The second King’s Speech announcement of interest to privacy and cyber security professionals was similarly brief: ‘The United Kingdom will continue to lead international discussions to ensure that Artificial Intelligence is developed safely’.

This process started last week with the UK’s AI Safety Summit at Bletchley Park (where Turing cracked the Enigma codes). This summit brought the UK’s AI elite together and finished with the landmark decision to set up an AI Safety Institute in the UK to assess frontier models and risks and support their safe development. This is excellent news for the UK’s emerging AI sector and supports the UK’s clear aim to focus on safe technological development.

However, ‘safety’ means different things to different people and cultures, so there is likely to be a lot of work to do to agree what it should mean at an international level, and what enforcement against ‘unsafe’ technologies should look like. These are essential discussions for our futures, and it’s great to see the UK taking a leading role in them.

Authors

Camilla Winlo

Head of Data Privacy
