A Response to the eSafety Commission’s End-to-End Encryption Position Statement

With Australia’s eSafety Commission — a federal government statutory body — in the news lately, I thought I’d tackle a topic that’s been bothering me for many months.

In February 2020, the eSafety Commission released a position statement on end-to-end encryption. As I wrote earlier, the position statement could be renamed “Reasons Why the Government Must Be Able to Read Your Private Communications”. As I also wrote, the adoption of end-to-end encryption by services such as Signal, WhatsApp, and Facebook is a direct response to the Snowden revelations of mid-2013. Indeed, without Snowden, Signal may not exist, WhatsApp certainly wouldn’t have implemented Signal’s double-ratchet algorithm, and Facebook wouldn’t offer end-to-end encryption of messages.

It’s important to point out the irony: governments want to restrict end-to-end encryption, whose widespread adoption was itself a direct response to government over-reach. Yes, governments are responding to problems of their own making. This point should not be lost, although it is rarely mentioned by those pushing against end-to-end encryption.

As I also wrote earlier, governments, bureaucrats, and the media have long leaned on the “It’s for your [physical] safety” narrative as to why governments need to be able to read end-to-end encrypted messages. This narrative — including references to terrorism, organised crime, and physical child abuse — is oft-recited in the Crypto Wars: give up individual freedom so that the government can protect everyone. As many people have pointed out, a government with too much power is a far greater risk than private citizens committing crimes. Anyone who has even peeked at history knows this to be true.

However, the old narrative can only be trotted out a finite number of times, especially after the Snowden revelations. The new narrative, as I noted before, is around “misinformation”, censorship, “hate speech”, and “online safety”. Note the move away from physical safety to online safety, a shift that sidelines the very real concerns about children being physically harmed.

The eSafety Commission is a leading voice of the new narrative. Let’s take a look at the position statement’s wishlist:

  • Using certain types of encryption that allow proactive tools to function.
  • Implementing proactive detection tools at transmission, rather than on receipt.
  • Moving AI and proactive technical tools to the device level.

Wish 1) is probably referencing an idea called “homomorphic encryption”, which allows computations to be run over encrypted data without decrypting it: in effect, query results can be shared without revealing the underlying data itself. But reveal the query results to whom? Who writes the tools? Which types of queries are allowed? With whom can the query result be shared? If the data can’t be revealed, then what is revealed? Metadata?
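To make the idea concrete, here is a toy sketch of what “computing on encrypted data” means, using textbook Paillier (an additively homomorphic scheme). This is my own illustrative example, not anything from the position statement, and the fixed small primes make it wildly insecure — it only demonstrates that a server can combine two ciphertexts into an encryption of their sum without ever seeing the plaintexts.

```python
# Toy Paillier cryptosystem (textbook form, NOT secure: small fixed
# primes, no padding). Demonstrates additive homomorphic encryption:
# multiplying two ciphertexts yields an encryption of the plaintext sum.
import math
import secrets

def keygen(p=104729, q=1299709):
    # p and q are small, fixed, well-known primes for illustration only
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)        # valid because we fix g = n + 1
    return (n,), (lam, mu, n)   # (public key, private key)

def encrypt(pub, m):
    (n,) = pub
    n2 = n * n
    while True:                 # pick r with gcd(r, n) = 1
        r = secrets.randbelow(n - 1) + 1
        if math.gcd(r, n) == 1:
            break
    # c = (1 + n)^m * r^n mod n^2
    return (pow(1 + n, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    lam, mu, n = priv
    n2 = n * n
    # L(x) = (x - 1) // n, then m = L(c^lam mod n^2) * mu mod n
    return ((pow(c, lam, n2) - 1) // n) * mu % n

pub, priv = keygen()
c1, c2 = encrypt(pub, 20), encrypt(pub, 22)
combined = (c1 * c2) % (pub[0] ** 2)  # ciphertext product = plaintext sum
print(decrypt(priv, combined))        # 42
```

Note that whoever computes `combined` learns nothing about the individual values — but whoever holds the private key learns everything, which is exactly why “who decrypts, and who decides what is queried?” is the crux of the questions above.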

Telecommunications metadata collection (without a warrant) by the federal government has already over-reached. Why should we trust further metadata collection? Either app vendors would open a complete can of worms by building such query systems themselves, or they would outsource the problem to the government.

I assume that wish 2) is to stop certain messages at the source, before the recipient ever receives them. Given how easily users can already control who is allowed to message them, I don’t see the point of this wish. It’s very easy to ensure that messages are only received from trusted, known people.

For wish 3), moving AI and proactive technical tools to the device assumes that such tools already exist elsewhere. They don’t, not when end-to-end encryption is correctly implemented. And, as I wrote above, there are serious issues with such suggestions. Ultimately, either data is shared with or accessible by the app vendor or a third party (read: government), or it’s not.

One key aspect of secure messaging apps is that they are open source. Even if the wishlist were implemented by one or a few app vendors, the free market would provide apps without such measures. Likewise, the pushback from the digital privacy industry would be immense.

What’s Missing / Makes No Sense

What’s oddly missing from the eSafety Commission’s position statement? Parents. Parents are not mentioned once, which is odd for a position statement so concerned about children. Parents are responsible for their children, and hence parents have a part to play in educating their children about, and monitoring their use of, social media and messaging apps. The government is not a replacement for good parenting.

The position statement incorrectly claims that WhatsApp and iMessage are fully encrypted. Neither app encrypts metadata, and backups may not be encrypted either.

The position statement claims to strike a balance between “security, privacy and safety”. Either the government can read messages and metadata sent by secure messaging apps or it can’t. There is no middle ground, no balance. In practice, “balance” means the government having access by default.

Finally, the position statement is essentially meaningless. Most secure messaging apps have no legal entity in Australia, and hence the federal government cannot regulate them. Even if they did — I’m not a lawyer, but I can imagine regulations on apps in Google’s and Apple’s stores — it’d still be possible to download an APK, sideload the app, and even use a VPN or Tor if the app’s Internet endpoints were blocked. Ultimately, the eSafety Commission can make as many recommendations as it likes; however, no app vendor is under any real pressure to implement anything.

Ironically, Signal is mentioned in a positive manner on the eSafety Commission’s website.

At its core, the main philosophical question is about an individual’s right to talk to another individual without the government a) knowing that they’re talking, or b) being able to read or listen to the conversation. Freedom of conscience and freedom of speech are bedrocks of liberal democracies. They are not optional and certainly do not need the “balance” proposed in the position statement.

Governments lost this fight decades ago, yet the crypto war continues under a new narrative.