"People rioted when we scanned for CSAM in a privacy-preserving manner but don't give a shit when we do the same thing when it's not privacy preserving so I guess just do that."
How is this a win? Either is bad; who wants them to keep a database of their image hashes? In some ways this is arguably even worse: if they keep this data online, leaks and/or third-party access are almost guaranteed, at the very least by authorities with a perma warrant looking for "CP" or "terrorist" material.
And that's exactly the problem, and why I put CP in quotation marks. With everything we know about these completely unaccountable agencies, what guarantees you that it will be limited to actual crimes against children? "For the children" is the oldest trick in the book. And once we're talking terrorism, it's explicitly political: one woman's freedom fighter is another man's terrorist.
Maybe I'm confused. From the Wired article and other sources, it sounds like they have abandoned the idea of doing any form of hash comparison or client-side scanning. Am I reading that wrong?
If that article is correct, it doesn't sound like they've abandoned the idea at all, only modified it. It's still essentially the same thing: they check your file hashes for "known illegal images or other law enforcement inquiries".
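For anyone unclear on what "checking file hashes" means mechanically, here's a minimal sketch of server-side hash matching as a set lookup. This is a hypothetical illustration using a plain cryptographic hash (SHA-256), not Apple's actual scheme, which used a perceptual hash (NeuralHash) so that near-duplicate images would still match:

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Hex digest of the file's bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of "known" hashes held by the provider.
known_hashes = {sha256_of(b"example-known-image-bytes")}

def is_flagged(file_bytes: bytes) -> bool:
    # Exact-match lookup: flipping a single bit of the file defeats it,
    # which is why real systems use perceptual hashes instead.
    return sha256_of(file_bytes) in known_hashes

print(is_flagged(b"example-known-image-bytes"))  # True
print(is_flagged(b"different-bytes"))            # False
```

The privacy argument upthread hinges on where this lookup runs: on-device matching against a shipped database (Apple's abandoned proposal) vs. the provider hashing your files server-side, which requires them to hold and query that database against your data.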