deleted by creator
Did you not pay your protection money to CloudFlare?
Man, this current age of AI really sucks.
Hey, could you say how you did that? I’m looking to put a few servers up and I’m worried about this too.
deleted by creator
Thanks, this will help.
From the article …
GNOME sysadmin Bart Piotrowski shared on Mastodon that only about 3.2 percent of requests (2,690 out of 84,056) passed their challenge system, suggesting the vast majority of traffic was automated.
I put a rate limit on my nginx Docker container. No clue if it worked, but my customers are able to use the website now. I get a ton of automated probing and SQL injection requests. Pretty horrible considering I built my app for very minimal traffic and use session data in places rather than pulling from the DB, so the DDoS attacks basically corrupt those sessions.
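For anyone curious what that looks like in practice, here is a minimal sketch of a per-IP nginx rate limit using `limit_req_zone`/`limit_req`. The zone name, rate, burst, and upstream address are placeholders I made up, not the actual settings from the comment above:

```nginx
http {
    # Shared zone keyed by client IP: 10 MB of state,
    # steady rate of 10 requests per second per IP.
    limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

    server {
        listen 80;

        location / {
            # Allow short bursts of 20 extra requests, reject the rest
            # with 429 instead of the default 503.
            limit_req zone=perip burst=20 nodelay;
            limit_req_status 429;
            proxy_pass http://app:3000;  # hypothetical upstream app container
        }
    }
}
```

The key detail is keying the zone on `$binary_remote_addr`, so each client IP gets its own bucket; a single scanner hammering the server gets 429s without affecting other visitors.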
The Internet has always been like that, even before the AI stuff picked up steam. If you expose anything to the public Internet, it takes about five seconds before things start port scanning it and trying WordPress/Drupal exploits.
It’s the old spam problem again. Spammers pass the cost of reaching their customers on to their victims, while AI bots pass the cost of their crawling on to the sites they crawl (without authorization).
I see no easy solution for this.
What if we start throttling them so we make them waste time? Like, we could throttle consecutive requests, so anyone hitting the server aggressively gets slowed down.
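nginx’s `limit_req` can already do a mild version of this: leave off `nodelay` and excess requests get queued and trickled out at the configured rate instead of being rejected, so an aggressive client spends most of its time waiting. A rough sketch, with illustrative numbers rather than tuned values:

```nginx
# Without "nodelay", requests beyond the allowed rate are queued and
# served at 2 r/s instead of being rejected outright, so an aggressive
# client wastes time waiting on each response.
limit_req_zone $binary_remote_addr zone=slowdown:10m rate=2r/s;

server {
    listen 80;

    location / {
        # Up to 50 queued requests are delayed to match the 2 r/s rate;
        # anything past that still gets a 429.
        limit_req zone=slowdown burst=50;
        limit_req_status 429;
        proxy_pass http://app:3000;  # hypothetical upstream
    }
}
```

The trade-off is that delayed requests keep a connection open on your side too, so this slows crawlers down but isn’t entirely free for the server either.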
deleted by creator