I don't see how an AI crawler is different from any other crawler.
The simplest approach is to treat the UA as risky, or to flag repeated 404 errors or HEAD requests and block on that. Those are rules we already have out of the box.
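To make that concrete, here is a minimal sketch of such a rule in Python. This is not tirreno's actual rule API; the function name, thresholds, and five-minute window are all hypothetical, chosen just to show the shape of the check.

    import time
    from collections import defaultdict

    WINDOW_SECONDS = 300   # hypothetical observation window
    MAX_404 = 20           # hypothetical 404 budget per window
    MAX_HEAD = 50          # hypothetical HEAD budget per window

    events = defaultdict(list)   # client_ip -> [(timestamp, kind), ...]
    blocked = set()

    def observe(client_ip, method, status):
        """Record one request; return True if this client should be blocked."""
        if client_ip in blocked:
            return True
        now = time.time()
        if status == 404:
            events[client_ip].append((now, "404"))
        if method.upper() == "HEAD":
            events[client_ip].append((now, "head"))
        # Slide the window: keep only recent events.
        recent = [e for e in events[client_ip] if now - e[0] < WINDOW_SECONDS]
        events[client_ip] = recent
        n404 = sum(1 for _, kind in recent if kind == "404")
        nhead = sum(1 for _, kind in recent if kind == "head")
        if n404 > MAX_404 or nhead > MAX_HEAD:
            blocked.add(client_ip)
            return True
        return False

In practice you'd key on the UA string as well as the IP and persist the counters across workers, but the rule itself stays this simple.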
It's open source, so there's no pain in writing specific rate-limiting rules; hence my question.
Plus, we've developed a dashboard for manually choosing UA blocks by name, but we're still not sure whether that would really be helpful for website operators.
I believe that if something is publicly available, it shouldn't be overprotected in most cases.
However, there are many advanced cases, such as crawlers that collect data for platform impersonation (for scams), custom phishing attacks, or account brute-force attacks. In those cases, I use tirreno to understand traffic across different dimensions.
There is a noprocrast feature in your settings that lets you specify how long a single session can last and how often you can visit HN. Super helpful!
Don’t let me distract from this learning opportunity with my armchair expertise. There are a lot of articles out there on this exact topic, but here’s one that’s pretty good.
> That map projection is the worst choice possible.
For navigation, the Mercator projection is useful, because a straight line on the chart is a rhumb line: the track you follow at a constant bearing. Aerial navigation is waypoint/bearing/waypoint/bearing, so most aviation maps are Mercator.
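To back that up with numbers: on a sphere the Mercator northing is y = ln(tan(pi/4 + lat/2)), and a short numerical walk (my own sketch, spherical-Earth assumption, arbitrary starting point and bearing) shows that a constant-bearing track keeps a constant slope on the chart:

    import math

    def mercator_y(lat):
        """Mercator northing for latitude in radians: ln(tan(pi/4 + lat/2))."""
        return math.log(math.tan(math.pi / 4 + lat / 2))

    bearing = math.radians(60)          # constant course, measured from north
    lat, lon = math.radians(10.0), 0.0  # arbitrary starting point
    dlat = math.radians(0.5)            # latitude step for the walk

    points = []
    for _ in range(20):
        points.append((lon, mercator_y(lat)))
        # On a sphere, a rhumb line obeys d(lon)/d(lat) = tan(bearing)/cos(lat).
        lon += math.tan(bearing) * dlat / math.cos(lat + dlat / 2)
        lat += dlat

    # The slope between consecutive chart points is tan(bearing) every time,
    # so the constant-bearing track plots as a straight line.
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        print(round((x1 - x0) / (y1 - y0), 4))

Every printed slope comes out to 1.7321, i.e. tan(60°): the track is a straight line on the chart, which is exactly why you can lay out a course with a ruler.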
I love how emphasis is given to accessibility for older adults, such as the orange man. But I guess he gets his printouts with few words and big fonts anyway.
The way he writes indicates that he has very little experience with reading in the first place. Weird wording, strange capitalization and punctuation, etc.
Funny how they make this joke about Trump when Biden got caught on camera using cue cards and having reporters' questions and headshots on a cheat sheet...
Can he read? No doubt he can read some. I can't say he's illiterate. But functionally, he's nowhere near the reading and comprehension skills we should expect from a national leader.
Kind of tells you something about how timidly and/or begrudgingly it’s been accepted.
IMO the attitude is warranted. No good comes from having higher-level code in the kernel than necessary. The dispute is framed as whether the kernel needs to be more modern, but it should be about the best tool for the job. Forget the bells and whistles and answer this: does the use of Rust generate a result that is more performant and more efficient than the best result using C?
This isn’t about what people want to use because it’s a nice language for writing applications. The kernel is about making things work with minimum overhead.
By analogy, the Linux kernel has historically been a small shop mentored by a fine woodworker, while Microsoft has historically been a corporation with a get-it-done attitude. Now we're saying Linux should be a "let's see what the group thinks; no, they don't like your old ways, and you don't have the energy anymore to manage this" shop. That's sad, but that's the story everywhere now. This isn't some 20th-century revolution where hippies eating apples and doing drugs create video games and graphical operating systems; it's abandoning the old ways because the newcomers don't like them and think the new ways are good enough, easier to manage, and more inviting to more people. That's Microsoft creep.