

You think Red Hat & friends just all have bad sysadmins? SourceHut, maybe…
I think it’s a bit of both: poorly optimized/antiquated sites and a gigantic spike in unexpected, persistent bot traffic. The typical mitigations don’t work anymore.
Not every site is, and not every site should have to be, optimized for hundreds of thousands of requests per day or more. Just because they can be doesn’t mean it’s worth the time, effort, or cost.
my friends and i can never get past one or two quotas before we’re eliminated. we end up drowning on the way back to the ship or not finding enough scrap in time ;-; still fun though!