The problem is not only that it's impossible to make cryptography that's only secure when the good guys use it, it's that once cryptography is made insecure, it's insecure for everyone, forever.
I'm not a privacy hardliner, and I think the socially acceptable tradeoff between privacy and security was well established before the computer era - if the police have a sufficiently well-founded suspicion against you, they can get a warrant and search your home. That's due process.
I would accept it if there were a digital version of that - one which targeted not the encryption itself (which could be as strong as possible) but the endpoints, like smartphones and computers.
Let's say police had a device which they could plug into your phone, which would send a specially signed message - a digital warrant, containing all the info a real warrant would - which would be permanently burned into the ROM of your phone, after which the phone would surrender its encryption keys, and the police could dump your unencrypted disk.
The phone would then be presented as evidence at trial, and not following due process would be cause for a mistrial, no matter what they find there.
The general public would be safe in the knowledge that as long as the police aren't hauling them in, their secrets are safe, and the government would get what it claimed it wanted - a way to catch bad guys with digital tools.
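To make it concrete, here's a rough sketch of what such a signed "digital warrant" handshake could look like. This is purely illustrative - the court key, the warrant fields, the use of Ed25519 and the helper names are my assumptions, not a real protocol:

```python
# Illustrative sketch only: warrant fields, signature scheme and helpers are assumptions.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The issuing court holds the signing key; every phone ships with the public key.
court_key = Ed25519PrivateKey.generate()
court_public = court_key.public_key()

# A digital warrant carries the same information a paper warrant would.
warrant = json.dumps({
    "case_id": "2024-CR-0417",            # hypothetical example values
    "device_serial": "PHONE-123456",
    "issued_by": "Example District Court",
    "valid_until": "2024-12-31",
}, sort_keys=True).encode()
signature = court_key.sign(warrant)

def burn_to_rom(blob: bytes) -> None:
    """Stand-in for writing the warrant to one-time-programmable storage."""
    print("warrant permanently recorded:", blob.decode())

def release_disk_key() -> bytes:
    """Stand-in for the phone surrendering its disk-encryption key."""
    return b"\x00" * 32

# On the phone: verify the signature, record the warrant forever, then hand over the key.
def handle_warrant(blob: bytes, sig: bytes) -> bytes:
    court_public.verify(sig, blob)        # raises InvalidSignature if forged
    burn_to_rom(blob)                     # the permanent, auditable record
    return release_disk_key()

handle_warrant(warrant, signature)
```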
While I appreciate the aspects of (A) requiring physical rather than remote access and (B) the intrinsic "paper trail"... as others have pointed out, the trust and revocation problems of a Master Key still remain, and the device could easily be "lost" so that you can't tell whether the Master Key was abused.
If we accept that a back-door must exist (if!), then it would be far better to have a unique key baked into every single device at manufacture time (a rough sketch follows the list):
1. It tightens the "need to know" since a prosecutor with a warrant only needs the one key corresponding to the device they actually seized, not a skeleton key for all things everywhere.
2. It creates a stronger paper-trail against abuse, since requests for a bajillion keys might get noticed.
3. Hackers would need to copy entire databases, instead of exfiltrating a much smaller amount of information, making it a little easier to detect.
4. It impairs any mass remote exploit, since the exploit code would need to include the keys of every possible device it would encounter and want to compromise.
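A minimal sketch of what that per-device escrow could look like, assuming an ECIES-style key wrap with the `cryptography` package; the key types and derivation details are my own choices, not anyone's actual design:

```python
# Sketch: each unit gets its own keypair at manufacture; the vendor keeps the
# private half per serial number, the device keeps only the public half.
import os
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# At the factory: a unique escrow keypair for this one device.
escrow_private = X25519PrivateKey.generate()     # stored by the vendor, per device
escrow_public = escrow_private.public_key()      # baked into this one device

# On the device: wrap the disk-encryption key to the device's escrow public key.
disk_key = os.urandom(32)
ephemeral = X25519PrivateKey.generate()
wrap_key = HKDF(hashes.SHA256(), 32, None, b"escrow-wrap").derive(
    ephemeral.exchange(escrow_public))
nonce = os.urandom(12)
wrapped = AESGCM(wrap_key).encrypt(nonce, disk_key, None)

# With a warrant for *this* serial number only, the vendor releases escrow_private,
# which unwraps this device's disk key and nothing else.
unwrap_key = HKDF(hashes.SHA256(), 32, None, b"escrow-wrap").derive(
    escrow_private.exchange(ephemeral.public_key()))
assert AESGCM(unwrap_key).decrypt(nonce, wrapped, None) == disk_key
```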
> Let's say police had a device which they could plug into your phone, which would send a specially signed message - a digital warrant, containing all the info a real warrant would - which would be permanently burned into the ROM of your phone, after which the phone would surrender its encryption keys, and the police could dump your unencrypted disk.
And when (not if) that device leaks, whoever steals your phone will be able to access everything in there.
I'd imagine such devices would be very tightly controlled, hard for civilians to get at, and let's say limited to one device per 1M people (which would also give you an idea of how frequently this is supposed to be used).
The keys for every phone would be stored in a central repo, with a separate key for every phone × every decryptor (each of which has its own private key). Meaning you'd need both a device and the central repo to access users' data.
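Roughly, the repo could hold each phone's key wrapped separately to each decryptor's public key, so a repo dump alone or a stolen decryptor alone is useless. A toy sketch - the structure and primitives are assumed, not a real scheme:

```python
# Toy sketch: repo stores one wrapped entry per (phone, decryptor) pair; only a
# decryptor holding its private key can unwrap a given phone's key.
import os
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

phones = {"PHONE-1": os.urandom(32), "PHONE-2": os.urandom(32)}   # per-phone keys
decryptors = {"DEC-A": X25519PrivateKey.generate()}               # ~1 per 1M people

def wrap(key: bytes, recipient_public) -> dict:
    eph = X25519PrivateKey.generate()
    wk = HKDF(hashes.SHA256(), 32, None, b"repo").derive(eph.exchange(recipient_public))
    nonce = os.urandom(12)
    return {"eph": eph.public_key(), "nonce": nonce,
            "ct": AESGCM(wk).encrypt(nonce, key, None)}

# Central repo: a separate wrapped key for every phone x every decryptor.
repo = {(p, d): wrap(k, dk.public_key())
        for p, k in phones.items() for d, dk in decryptors.items()}

# Access requires both the repo entry and the decryptor's private key.
entry = repo[("PHONE-1", "DEC-A")]
wk = HKDF(hashes.SHA256(), 32, None, b"repo").derive(
    decryptors["DEC-A"].exchange(entry["eph"]))
assert AESGCM(wk).decrypt(entry["nonce"], entry["ct"], None) == phones["PHONE-1"]
```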
But let's say they manage to build a bootleg version - what would the criminal gain be for them? Reading the data doesn't mean they can impersonate you, as the device wouldn't give you access to private keys used for authentication (let's even say these are deleted), only encryption.
The criminals could brick your phone and read your texts. There are only very niche cases where this would be worth it to them, like being the subject of a highly targeted intelligence-gathering op.
> Reading the data doesn't mean they can impersonate you, as the device wouldn't give you access to private keys used for authentication (let's even say these are deleted), only encryption.
They would gain access to anything that isn't separately protected. Not every application stores its authentication in the separate storage that gets wiped, and if there is such storage, there isn't really anything preventing applications from using it for other purposes, for example storing the encryption key for the application's own data.
Access to files, photos, email and such could be used for blackmail or identity theft, for example.
If politicians, police, intelligence agencies and the military are not ready to entrust their own data to the same "escrow" mechanism, then clearly they don't trust it enough. If it were as perfectly protected from abuse as claimed, they would have no reason not to trust it as well.
> The problem is not only that it's impossible to make cryptography that's only secure when the good guys use it, it's that once cryptography is made insecure, it's insecure for everyone, forever.
Correct.
> Let's say police had a device which they could plug into your phone, which would send a specially signed message - a digital warrant, containing all the info a real warrant would - which would be permanently burned into the ROM of your phone, after which the phone would surrender its encryption keys, and the police could dump your unencrypted disk.
You are now advocating for making phones insecure for everyone, forever. No.