• Cosmic Cleric@lemmy.world · 27 points · 9 months ago

    From the article …

    GNOME sysadmin Bart Piotrowski shared on Mastodon that only about 3.2 percent of requests (2,690 out of 84,056) passed their challenge system, suggesting the vast majority of traffic was automated.

  • tomyhaw@lemmy.world · 9 points · 9 months ago

    I put a rate limit on my nginx Docker container. No clue if it worked, but my customers are able to use the website now. I get a ton of automated probing and SQL injection requests. Pretty horrible considering I built my app for very minimal traffic and use session data in places rather than pulling from the DB, and the DDoS traffic basically corrupts those sessions.
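
    For anyone wanting to try the same thing, a minimal sketch of per-IP rate limiting in nginx could look like this (zone name, rate, and burst values are illustrative, not necessarily what was used here):

    ```nginx
    # http {} context: track clients by IP, allow roughly 10 requests/second each
    limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

    server {
        listen 80;

        location / {
            # Allow short bursts of up to 20 extra requests, reject the rest with 429
            limit_req zone=perip burst=20 nodelay;
            limit_req_status 429;

            proxy_pass http://127.0.0.1:3000;  # wherever the app container listens (illustrative)
        }
    }
    ```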

    • tempest@lemmy.ca · 1 point · 9 months ago

      The Internet has always been like that, even before the AI stuff picked up steam. If you expose anything to the public Internet, it takes about 5 seconds for things to start port scanning and, if they can, trying WordPress/Drupal exploits.

  • Cyber Yuki@lemmy.world · 4 points · 9 months ago

    It’s the old spam problem again. Spammers pass the cost of reaching their customers on to their victims, while AI bots pass the cost of their crawling on to the sites they crawl (without authorization).

    I see no easy solution for this.

  • Goun@lemmy.ml · 4 points · 9 months ago

    What if we start throttling them so we make them waste time? Like, we could throttle consecutive requests, so if anyone is hitting the server aggressively they’d get slowed down.
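
    nginx’s limit_req can do roughly that when nodelay is left off: excess requests get queued and released at the configured rate instead of rejected, so an aggressive client just sees its responses trickle out. A rough sketch (zone name and numbers are illustrative):

    ```nginx
    # http {} context: 2 requests/second per IP, with room to queue 50 excess requests
    limit_req_zone $binary_remote_addr zone=slowdown:10m rate=2r/s;

    server {
        listen 80;

        location / {
            # Without "nodelay", queued requests are delayed so they drain at the
            # 2 r/s rate: a burst of rapid requests gets stretched out over time
            # instead of being answered immediately or dropped outright.
            limit_req zone=slowdown burst=50;
        }
    }
    ```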