Not 100% related but not 100% unrelated either: I've got a script that generates variations of the domain names I use the most... all the most common typos/misspellings, all the "1337" variations, everything at Levenshtein edit distance 1, quite a few at distance 2, etc.
For example for "lillybank.com", I'll generate:
llllybank.com
liliybank.com
...
and countless others.
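The distance-1 expansion described above can be sketched roughly like this (a hypothetical helper, not the commenter's actual script; a real version would also filter out invalid hostnames such as labels with leading/trailing hyphens):

```python
import string

# Characters legal in a DNS label (simplified).
ALPHABET = string.ascii_lowercase + string.digits + "-"

def edits1(name: str) -> set[str]:
    """All strings at Levenshtein edit distance 1 from `name`:
    deletions, adjacent transpositions, substitutions, insertions."""
    splits = [(name[:i], name[i:]) for i in range(len(name) + 1)]
    deletes = {l + r[1:] for l, r in splits if r}
    transposes = {l + r[1] + r[0] + r[2:] for l, r in splits if len(r) > 1}
    substitutes = {l + c + r[1:] for l, r in splits if r for c in ALPHABET}
    inserts = {l + c + r for l, r in splits for c in ALPHABET}
    return (deletes | transposes | substitutes | inserts) - {name}

def typo_domains(label: str, tld: str = "com") -> set[str]:
    # Only mutate the second-level label; keep the TLD fixed.
    return {f"{variant}.{tld}" for variant in edits1(label)}

variants = typo_domains("lillybank")
print("liliybank.com" in variants)  # → True
print(len(variants))
```

Distance-2 variants (hence "hundreds of thousands of entries" across a handful of domains) would be the union of `edits1` applied to each distance-1 result.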
Hundreds of thousands of entries. They are then null-routed by my unbound DNS resolver.
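In unbound, null-routing a generated list comes down to one `local-zone` entry per name (file path and zone type here are illustrative; `always_nxdomain` answers NXDOMAIN, while `redirect` plus `local-data` pointing at 0.0.0.0 is the literal null-route variant):

```
# /etc/unbound/typo-blocklist.conf, included from unbound.conf
server:
    local-zone: "llllybank.com." always_nxdomain
    local-zone: "liliybank.com." always_nxdomain
    # ...hundreds of thousands more entries, emitted by the generator script
```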
My browsers are forced into "corporate" settings where they cannot use DoH/DoT: everything between my browsers and my unbound resolver is in the clear.
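For Firefox, as one example, this kind of "corporate" lockdown is done with an enterprise `policies.json` (file location varies by platform); the `DNSOverHTTPS` policy disables DoH and locks the setting against user changes:

```json
{
  "policies": {
    "DNSOverHTTPS": {
      "Enabled": false,
      "Locked": true
    }
  }
}
```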
All DNS UDP traffic that contains any Unicode domain name is blocked by the firewall. No DNS over TCP is allowed (and, no, I don't care).
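Since Unicode domain names travel on the wire as punycode labels prefixed with `xn--`, one blunt way to implement this is a string match on port-53 UDP traffic (iptables shown as a sketch; chain and rule placement are illustrative):

```
# Drop any UDP DNS packet whose payload contains a punycode (IDN) label.
iptables -A OUTPUT -p udp --dport 53 -m string --algo bm --string "xn--" -j DROP
# Refuse DNS over TCP entirely.
iptables -A OUTPUT -p tcp --dport 53 -j REJECT
```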
I also block entire countries' TLDs as well as entire countries' IP blocks.
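Blocking a whole ccTLD is one line per TLD in the same unbound config (placeholder shown, not a real TLD); the per-country IP blocks would typically be handled separately in the firewall from a published prefix list:

```
server:
    # One line per country-code TLD to block; "cc" is a placeholder.
    local-zone: "cc." always_nxdomain
```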
Been running a setup like that (plus many killfiles, plus DNS resolvers known to block all known porn and known malware sites, etc.) for years now. The Internet keeps working fine.
Considering how hammered it must be getting by the "AI" nonsense, it's interesting that crt.sh continues to remain usable, particularly the (limited) direct PostgreSQL db access.
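The direct access referred to is crt.sh's publicly documented read-only guest PostgreSQL endpoint; a minimal session looks like this (availability varies, as this thread notes):

```
psql -h crt.sh -p 5432 -U guest certwatch
```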
To me, this is evidence that even a high-traffic SQL database can be made directly accessible on the public internet.
crt.sh seems to be more accessible at certain times of day. I remember when it had no such accessibility issues.
It's the only website I know of where queries can just randomly fail for no reason, with no automatic retry mechanism. Even the worst enterprise nightmares I've seen weren't this user-unfriendly.
(the site may occasionally fail to load)