The UK Parliament is currently seeking feedback on its Online Safety Bill, a proposal intended to regulate online interactions and fight child abuse. But several companies, including Apple, are threatening to pull services from the UK if the bill becomes law.

It’s an odd situation, but it’s something we’ve anticipated for some time. The UK government began working on its Online Safety Bill a few years ago, and the proposal has regularly drawn criticism from security experts. The idea of the bill is pretty straightforward: all security features used by messaging apps must be pre-screened by a regulatory body, and messaging apps must be prepared to disable those features (without informing the public) at the government’s behest.

Essentially, the UK wants unencrypted access to messages that may be violent or abusive. This would allow the government to regulate online interactions for public safety. Note that one of the main goals of this proposal is to detect CSAM (child sexual abuse material), which is often shared through encrypted messaging services like Telegram. In its proposal, the UK claims that message scanning will be automated through the use of AI, which will (theoretically) detect things like CSAM or terrorist threats.

There are several valid criticisms of this bill, but the biggest is that it would effectively outlaw end-to-end encryption on a global scale. Apple and other companies can’t just “turn off” this encryption for the UK; in an end-to-end encrypted system, decryption keys exist only on the sender’s and recipient’s devices, so weakening the scheme for one country means weakening it for everyone. The bill would also hamper the development of new security technologies (which would need to be pre-screened), and it would place several (mostly foreign) companies under the thumb of UK regulators.

Apple raises all of these concerns in its statement to the UK government. And, notably, it takes a moral stance on the issue, implying that the proposed Online Safety Bill is a violation of basic human rights.

(Before reading the quote below, note that Apple regularly complies with subpoenas from the FBI and other government bodies. But it’s also refused to work with the FBI. At the end of the day, Apple is a corporation, and any moral appeal it makes should be viewed through a corporate lens.)

“By requiring non-U.K. technology companies to maintain the ability to produce unencrypted data for all of their users worldwide—without notifying their users of that ability—the IPA [Investigatory Powers Act] would include a worldwide gag order … That is deeply problematic, especially considering that the legal systems of most nations treat free speech as a fundamental individual right.”

If this bill passes, Apple says it will not comply. Instead, it will pull services like iMessage and FaceTime from the United Kingdom. Other encrypted messaging apps, including WhatsApp, have threatened similar action.

For reference, the estimated number of iPhone users in the UK floats around 20 million. There are about 1.5 billion iPhone users globally, meaning the UK accounts for only around 1.3% of all iPhone users (20 million ÷ 1.5 billion). Apple has a lot to lose here, but it has enough wiggle room for this to be a genuine threat.

But the UK government dismisses these concerns. It says that the Online Safety Bill has plenty of provisions to protect users, and that the power to disable security technologies would only be exercised as a “last resort.” It also wants to use automated tools (specifically AI) to scan for CSAM, terrorist threats, and other material, creating a privacy barrier between the government and the unencrypted data it wants to collect.

Frankly, the idea that AI can accurately detect this kind of material is a stretch (at least in this half of the decade). And even if this bill contains safeguards, there’s no way to know how those safeguards will be applied in practice. It seems that the UK is in a pickle: it wants to crack down on one of the most popular avenues for violence and abuse, but it can’t do so without “punishing” regular people. (For what it’s worth, Apple itself was recently criticized for trying to detect CSAM on users’ iCloud accounts. It’s a problem that’s difficult to solve, clearly.)