Apple’s Advanced Data Protection Ambitions Clash with UK Policy

  • Writer: Ray Alner
  • Apr 20
  • 4 min read

Backstory

If you haven’t read my previous discussion of the UK’s request that Apple provide a backdoor into its international users’ data, you can read about it here.

While this news broke back in February, it’s still relevant, and developments are still emerging on this issue.

What’s New

So, what’s new? The UK government’s request actually succeeded in making Apple change its ways, but this time in a negative sense.

Apple’s response to the UK government’s demand for a backdoor into its systems was to remove Advanced Data Protection (ADP) from its offering in the UK.

New users are presented with a screen stating that ADP can no longer be offered in the UK.

Existing users will keep ADP for now, but Apple will require them to turn it off at a later date, likely by restricting new uploads or by signing out everyone who has ADP enabled and requiring them to disable it before using Apple services again. Apple can’t turn off ADP for them, because the whole point of ADP is that not even Apple holds the decryption keys.
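To see why, here is a minimal sketch of the client-side encryption idea behind ADP, using Swift’s CryptoKit. It’s a generic illustration, not Apple’s actual implementation; the deviceKey below stands in for keys that, under ADP, live only on the user’s trusted devices.

    import CryptoKit
    import Foundation

    // The key is generated and held only on the user's device.
    // In an ADP-style design, the provider never sees this key.
    let deviceKey = SymmetricKey(size: .bits256)

    let note = Data("private note synced to the cloud".utf8)

    do {
        // Encrypt on-device before upload; the server stores only this blob.
        let sealed = try AES.GCM.seal(note, using: deviceKey)
        let blobForServer = sealed.combined!  // nonce + ciphertext + auth tag

        // Only a device holding the key can recover the plaintext, which is
        // why the provider can't simply flip the feature off server-side.
        let box = try AES.GCM.SealedBox(combined: blobForServer)
        let recovered = try AES.GCM.open(box, using: deviceKey)
        print(String(decoding: recovered, as: UTF8.self))
    } catch {
        print("crypto error: \(error)")
    }

Turning ADP off would require each user’s device to decrypt and re-upload their data under keys Apple holds, which is presumably why Apple has to ask users to do it themselves.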

What does that mean?

It means that Apple can now be required by law to hand over its users’ stored data if the UK government compels it to through a court order.

From a privacy perspective, this is devastating. From a security perspective, it’s a step backwards. Let’s break it down.

Privacy

In my opinion, the privacy impact outweighs the security impact, because users have an innate right to privacy; security is merely the method by which privacy is delivered to a user. People are allowed to have thoughts and opinions of their own, and their devices should be an extension of that right.

While technology has shifted users’ data onto devices they may not own, that right should still extend to the devices users have allowed their data to sit on.

There are limits, though. “Ownership” is such a fluid term in the 21st century, and we haven’t updated it to cover data, so companies push users to store data on devices that, while not owned by the user, still house the user’s data. Society has not yet accepted that “my data” on “someone else’s device” is still “my data.”

Because of that, both companies and governments take advantage of this loophole to push their preferred notion of ownership: businesses profit from data that is not theirs, and governments find it easier to go after potential criminals.

For instance, a user in the UK still has the legal right to encrypt data on devices they own. If the government compelled the user to unlock an encrypted device, the user could refuse the request, though they would face penalties for doing so. For data stored in the cloud, users don’t have the option to tell businesses, “don’t comply with that request; it’s my data and I’ll take the penalty for it.” Instead, compliance depends on who physically owns the device storing the data.

This is wrong. It pushes society to break down the connection between data and the devices that hold it, eroding users’ privacy to better suit the purposes of businesses and governments, and it is a slippery slope for users and their privacy.

Security & Encryption

Security has its own looming issues, with quantum computing coming down the line. But these restrictions also affect general security and encryption practice, mainly by slowing the rollout of security mechanisms for people’s private data on services the user doesn’t own.

Rolling out security mechanisms that benefit the end user costs millions. Designing and developing these tools to be truly secure is an investment businesses don’t take lightly. If a country like the UK, or another major economic zone, decides to restrict the functionality of encryption, businesses are less likely to invest in those mechanisms, because the cost of, and pushback against, implementation outweigh the benefit in a region that doesn’t want them rolled out.

For instance, Meta slowed its rollout of end-to-end (E2E) encryption because several countries’ governments had problems with it, as it would hinder those governments’ ability to request access from the companies holding users’ data.

Restricting companies from enabling E2E encryption forces users to accept unnecessary risks with their data. Even if a user’s data isn’t under investigation and never will be, a breach of the company’s network means attackers can steal that data and damage the user’s reputation. Current data protection laws, still based on outdated concepts of ownership, are inadequate: when companies lose user data, they typically only need to provide basic monitoring services to affected users.
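Continuing the CryptoKit sketch from earlier, this shows why E2E-encrypted data is of little use to an attacker who breaches the provider: without the user’s key, decryption simply fails. Again, a generic illustration, not any particular company’s implementation.

    import CryptoKit
    import Foundation

    let userKey = SymmetricKey(size: .bits256)
    let secrets = Data("messages, photos, medical notes".utf8)

    // All a breach of an E2E service yields is the sealed blob;
    // the user's key never touches the server.
    let stolenBlob = try! AES.GCM.seal(secrets, using: userKey).combined!

    // An attacker trying any other key gets nothing.
    let attackerKey = SymmetricKey(size: .bits256)
    do {
        let box = try AES.GCM.SealedBox(combined: stolenBlob)
        _ = try AES.GCM.open(box, using: attackerKey)
        print("unreachable: plaintext recovered")
    } catch {
        // AES-GCM's authentication check fails; the data stays opaque.
        print("decryption failed: \(error)")
    }

Without E2E, that same breach hands the attacker plaintext, and the user’s only recourse is whatever monitoring service the company offers afterward.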

My Thoughts

These restrictions on user privacy and security exist primarily because governments want easier access to users' information during investigations. However, this approach weakens security for everyone else in the process.

To me, this is backwards. It flies in the face of “innocent until proven guilty,” makes it easier to gather information on unsuspecting users, gives bad actors an easier time harvesting data from companies and users with poor security practices, and gives companies an excuse to use data to train models, algorithms, and other data-gathering methods aimed at users.

If we really want to give privacy back to users, we have to give them the ability to use tools and products privately, as is their right.
