How to combat large numbers of AI scrapers
from DrunkAnRoot@sh.itjust.works to selfhosted@lemmy.world on 04 Jul 19:19
https://sh.itjust.works/post/41544308

Every time I check my nginx logs it's more scrapers than I can count, and I could not find any good open source solutions

#selfhosted

savvywolf@pawb.social on 04 Jul 19:24

I’ve seen people suggesting and using Anubis, haven’t used it myself though.

Courantdair@jlai.lu on 04 Jul 20:03

I was going to recommend that, it's very easy to set up

Mordikan@kbin.earth on 05 Jul 02:27

I especially love the irony of Anubis using yesterday's hype thing to combat today's.

DrunkAnRoot@sh.itjust.works on 05 Jul 03:16

I tried Anubis and it works great. The only issue is it won't support multiple subdomains.

RedBauble@sh.itjust.works on 05 Jul 08:10

Seconding Anubis. I just finished my setup yesterday: I have it on an Oracle Cloud free tier VPS which, depending on the domain, routes the traffic to services hosted on the VPS itself or to my server at home. Relatively easy to set up, and it blocks most requests with very few false positives (one example: it would aggressively challenge Thunderbird trying to reach my Baikal instance). I set somewhat more aggressive rules than the default (I also block Googlebot and Bingbot, since I received more requests from them than I'd like). In about 10 hours it straight up denied about 5000 requests from the AI-catchall ruleset (mostly Amazonbot) and challenged about 10000, mostly from a block of IPs in Singapore, some of the hosts having the user agent of a Macintosh with PowerPC. They all sure love to explore the public repos on my git server.

I'm in the process of changing servers for an upgrade, with the old one still hosting most services while I set up the new one. The old one now runs audibly quieter. I don't even want to think how much electricity went to waste because of those bots.
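
If you'd rather do something like that AI-catchall at the nginx level (or add a second layer in front of Anubis), a user-agent map is one way. This is only a rough sketch: the regex list is illustrative rather than exhaustive, and bots can lie about their user agent anyway:

    # flag known AI crawler user agents (list is illustrative, not exhaustive)
    map $http_user_agent $ai_bot {
        default        0;
        ~*amazonbot    1;
        ~*gptbot       1;
        ~*claudebot    1;
        ~*bytespider   1;
    }

    server {
        # ...
        if ($ai_bot) { return 403; }   # refuse flagged user agents outright
    }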

VeganCheesecake@lemmy.blahaj.zone on 05 Jul 20:38

You probably don't need me to tell you, but keep good backups. A friend of mine recently had his account nuked without any reason given, and without the possibility of recourse.

[Image: a mail from Oracle, informing about the immediate termination of service and deletion of all data]

WhyJiffie@sh.itjust.works on 05 Jul 22:51

From what I've heard that's pretty common at Oracle, but it's good to spread the word.

FundMECFSResearch@lemmy.blahaj.zone on 05 Jul 12:55

I've had trouble with it when using a VPN and privacy browsers. It often blocks me until I switch to a default browser.

HelloRoot@lemy.lol on 04 Jul 19:42

CrowdSec is arguably not completely open source, but I'm very satisfied with it.

lemonuri@lemmy.ml on 05 Jul 07:50

The scraper blocklist on CrowdSec requires a paid subscription, though. Or did you find another workaround?

HelloRoot@lemy.lol on 05 Jul 12:48

I don’t remember how I set it up a long time ago. But when I look at my server logs I only see myself.

Afaik I just added the biggest lists, but I don't remember.

Bahnd@lemmy.world on 04 Jul 21:29

Weren't there a few AI maze projects in the works? I wonder if running one of those for a bit would get you added to an ignore list; clearly they don't respect your robots.txt file.

slazer2au@lemmy.world on 04 Jul 22:36

Tar pits, I think, is the term for those traps that pollute AI training data.

db0@lemmy.dbzer0.com on 04 Jul 23:09

You need to block the Alibaba subnets primarily. In my experience that's where most of them originate.
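
If nginx is in front, a minimal sketch of that kind of block looks like this. The CIDR ranges below are placeholders only; look up Alibaba Cloud's currently published ranges before using anything like it:

    # hypothetical snippet included in your server block; CIDRs are placeholders
    deny 47.74.0.0/16;    # placeholder, verify against Alibaba's published ranges
    deny 47.76.0.0/16;    # placeholder, verify against Alibaba's published ranges
    allow all;            # everyone not matched above gets through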

grumuk@lemmy.ml on 05 Jul 07:24

I’ve seen people mention Anubis, the other one I heard about in a blog post that’s maybe worth looking into is go-away.

Cyber@feddit.uk on 05 Jul 10:14

If you’re able to, use GeoIP ranges to only allow access from the countries you want.

That immediately cuts out a lot of the traffic.

Then - again if you’re able to - use a block list that covers known scrapers in case they’re in your country.

I use pfBlockerNG on my pfSense firewall for exactly this.
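
If firewall-level blocking isn't an option, the same idea works inside nginx with the third-party geoip2 module and a MaxMind GeoLite2 database. A minimal sketch, assuming the module is installed and the database is downloaded (the allowed countries are just examples):

    # read the country code from the GeoLite2 database (ngx_http_geoip2_module)
    geoip2 /var/lib/GeoIP/GeoLite2-Country.mmdb {
        $geoip2_country_code country iso_code;
    }

    # block by default, then allow only the countries you actually serve
    map $geoip2_country_code $blocked_country {
        default 1;
        US      0;
        DE      0;
    }

    server {
        # ...
        if ($blocked_country) { return 403; }
    }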

Fedditor385@lemmy.world on 05 Jul 10:24

Anubis is the name of the tool. Also, Cloudflare just announced they have something against AI scrapers.

DrunkAnRoot@sh.itjust.works on 06 Jul 03:28

I've been using Anubis; my only issue is I would have to run more than one instance, and I don't like Cloudflare personally.

Fedditor385@lemmy.world on 05 Jul 11:33

I just realized an interesting thing - if I use Gemini, and tell it to do deep research, it actually goes to the websites it knows/finds, and looks up the content to provide up-to-date answers. So, some of those AI crawlers are actually not crawlers, but actual users who just use AI instead of coming directly to the site.

Soo… blocking AI completely could also potentially reduce exposure, especially as more and more people use AI to basically do searches instead of browsing themselves. That would also explain the number of requests daily; it could simply be different users using AI to research some topic.

Point is, you should evaluate whether the AI requests are just proxies for real users, because blocking AI then blocks real users from ever knowing your site exists.

daddycool@lemmy.world on 05 Jul 12:06

“some of those AI crawlers are actually not crawlers, but actual users who just use AI instead of coming directly to the site. Soo… blocking AI completely could also potentially reduce exposure.”

Normally, websites want users to come to their site instead of an AI search engine “stealing” the content and presenting it as its own. Yes, AI search engines are more convenient for the user, but in the end this will discourage website creators and thereby cut off its own “food supply”.

Fedditor385@lemmy.world on 05 Jul 12:17

I understand, but the shift in user behaviour is significant and I think websites are not taking it into account. Users are moving more and more to AI, and since Google introduced AI mode it's only a question of time until it becomes the default; we will see more and more of what we think are AI crawlers and fewer and fewer organic users.

AI seems to be the new middleman between you and the user, and if you block the middleman, you block the user. For hobby websites or established sites it may make sense, because people either already know of them or more exposure isn't a goal, but for everyone else it will be painful.

Noja@sopuli.xyz on 05 Jul 13:28

I honestly don't think most people replace search with AI. It will also slowly solve itself once Google injects ads into the output.

lambalicious@lemmy.sdf.org on 06 Jul 01:29

So, what I’m reading is, if your “users” are bad (or bots), just get better users.

Sounds like a net win.

nfreak@lemmy.ml on 05 Jul 13:09

Yeah I’d consider blocking out both the bots and AI-users a win-win lmao

rumba@lemmy.zip on 05 Jul 14:21

Why not both?

There is no functional difference between them scraping you systematically and them coming to you on behalf of a user. They're coming to scrape you either way; being asked by someone just makes them do it in a smarter fashion.

Also, even if you're not using Gemini, Google.com will search you with it anyway. They badly want these AIs trained; sooner or later almost all searching will be done through AI, and eventually there will be no other option.

You are correct that blocking all AI calls will eventually make your search results not work.

So if you want organic traffic, you have to allow AI scraping eventually. You're just going to get diminishing returns past a point.

jjlinux@lemmy.ml on 05 Jul 21:28

That is absolutely correct. I don't want ANY AI in my servers looking for anything, regardless of whether they are crawlers or it's on behalf of some lazy fuck.

DrunkAnRoot@sh.itjust.works on 05 Jul 22:34

This doesn't really apply in my case, because I run some frontends, so there isn't really any information the AI needs.

Igilq@szmer.info on 05 Jul 11:40

Well, someone had the great idea to use zip bombs. I saw it somewhere but I don't remember where.
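
As far as I understand the trick: you pre-compress a huge file of zeros and serve it with a gzip Content-Encoding header, so a scraper that naively decompresses the response chews through gigabytes of memory. A rough nginx sketch (the path, size, and URL are arbitrary, and keep real users away from it, e.g. by only linking it from pages honest clients never see):

    # generate the payload once, e.g.:
    #   dd if=/dev/zero bs=1M count=10240 | gzip -9 > /var/www/bomb/10G.gz
    location = /bait.html {
        default_type text/html;
        add_header Content-Encoding gzip always;   # claim it is ordinary gzipped HTML
        alias /var/www/bomb/10G.gz;                # ~10 MB on the wire, ~10 GB inflated
    }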

DrunkAnRoot@sh.itjust.works on 05 Jul 22:37

Anubis has this built in: if it detects bots it turns the difficulty to impossible.

fubarx@lemmy.world on 05 Jul 13:08

If you use nginx, here's an open-source blocker/honeypot: github.com/raminf/RoboNope-nginx

If you have it set up to be proxied or hosted by Cloudflare, they have their own solution: blog.cloudflare.com/declaring-your-aindependence-…

ikidd@lemmy.world on 05 Jul 21:15

I wonder why RoboNope doesn't just create a fail2ban entry for anything that accesses a disallowed URL and drop them entirely.

Actually, this looks like it does something similar, then dumps them to fail2ban after they re-access the honeypot page too many times: petermolnar.net/…/anti-ai-nepenthes-fail2ban/
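
The nginx half of that honeypot idea is simple enough. A sketch, assuming /trap is listed as Disallow in robots.txt so only crawlers that ignore robots.txt ever request it; a fail2ban jail would then watch the dedicated log and ban repeat offenders:

    # robots.txt contains "Disallow: /trap", so any client requesting it is misbehaving
    location /trap {
        access_log /var/log/nginx/honeypot.log;   # point a fail2ban jail at this file
        return 403;
    }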

DrunkAnRoot@sh.itjust.works on 06 Jul 03:26

I'll check RoboNope out, it seems promising.

madiator2011@px.madiator.com on 05 Jul 18:51

In my case I use https://www.bunkerweb.io/ as my proxy for that, but there are other tools as well, for example https://github.com/TecharoHQ/anubis

DrunkAnRoot@sh.itjust.works on 05 Jul 22:42

BunkerWeb looks interesting.

gandalf_der_12te@discuss.tchncs.de on 06 Jul 02:39

What’s bothering you?

  • Is it that your content is given out for AI training? I guess you can’t fundamentally protect against this, except by limiting how much content is provided to each address.
  • Or is it the resource strain it causes on your server? In that case I recommend limiting how much a single client / IP address can request in a day; a minimal rate-limit sketch follows this list.
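
A minimal sketch of that second option with nginx's built-in limit_req (the zone name, rate, burst, and backend address are all arbitrary):

    # allow each client IP 2 requests/second on average, with a burst of 20
    limit_req_zone $binary_remote_addr zone=perip:10m rate=2r/s;

    server {
        location / {
            limit_req zone=perip burst=20 nodelay;   # excess requests get HTTP 503
            proxy_pass http://127.0.0.1:3000;        # your actual service
        }
    }
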
DrunkAnRoot@sh.itjust.works on 06 Jul 03:18

It's the strain of it. I mostly run instances and frontends, so the training is not a huge problem.

gandalf_der_12te@discuss.tchncs.de on 06 Jul 03:48

The keyword you need is “DDoS protection”, I guess.

It keeps the server from getting overloaded due to too many requests.

pewgar_seemsimandroid@lemmy.blahaj.zone on 06 Jul 03:00

Does Anubis not work?

DrunkAnRoot@sh.itjust.works on 06 Jul 03:17

I can only get it to protect one container. I have 3 that I need protected and I can't figure out how to run more than one instance of it.
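
From what I've read so far, the usual answer seems to be one Anubis instance per backend, each listening on its own local port and forwarding to its own container, with nginx routing by subdomain. A sketch of the nginx side (the ports and hostnames are made up):

    # one Anubis instance per container, each bound to its own localhost port;
    # 8081-8083 and the hostnames are made up, each Anubis forwards to its backend
    server {
        server_name app1.example.com;
        location / { proxy_pass http://127.0.0.1:8081; }   # Anubis #1 -> container 1
    }
    server {
        server_name app2.example.com;
        location / { proxy_pass http://127.0.0.1:8082; }   # Anubis #2 -> container 2
    }
    server {
        server_name app3.example.com;
        location / { proxy_pass http://127.0.0.1:8083; }   # Anubis #3 -> container 3
    }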