~edwardloveall/scribe

6ccea391ed5e05cdd65b78fc1c7ada40f22fffe3 — Edward Loveall 4 months ago 41b391e
Add a bunch of well-known, LLM scrapers to robots.txt

Unknown if this will actually stop them, but at least I can show my intent. User agents sourced from https://darkvisitors.com/
2 files changed, 56 insertions(+), 4 deletions(-)

M CHANGELOG
M public/robots.txt
M CHANGELOG => CHANGELOG +1 -0
@@ 1,5 1,6 @@
Unreleased

* Add a bunch of well-known, LLM scrapers to robots.txt
* Add command to tag releases
* Modernize nix config
* Added scribe.manasiwibi.com instance

M public/robots.txt => public/robots.txt +55 -4
@@ 1,4 1,55 @@
# Learn more about robots.txt: https://www.robotstxt.org/robotstxt.html
User-agent: *
# 'Disallow' with an empty value allows all paths to be crawled
Disallow:

# ChatGPT-User
User-agent: ChatGPT-User
Disallow: /

# cohere-ai
User-agent: cohere-ai
Disallow: /

# anthropic-ai
User-agent: anthropic-ai
Disallow: /

# Bytespider
User-agent: Bytespider
Disallow: /

# CCBot
User-agent: CCBot
Disallow: /

# FacebookBot
User-agent: FacebookBot
Disallow: /

# Google-Extended
User-agent: Google-Extended
Disallow: /

# GPTBot
User-agent: GPTBot
Disallow: /

# omgili
User-agent: omgili
Disallow: /

# Amazonbot
User-agent: Amazonbot
Disallow: /

# Applebot
User-agent: Applebot
Disallow: /

# PerplexityBot
User-agent: PerplexityBot
Disallow: /

# YouBot
User-agent: YouBot
Disallow: /
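
The rules above rely on two robots.txt behaviors: an empty `Disallow:` under `User-agent: *` allows everything by default, while `Disallow: /` under a named user agent blocks that agent from all paths. A minimal sketch of how a compliant crawler would interpret these rules, using Python's standard-library `urllib.robotparser` against a trimmed subset of the file (the `example.com` URL and the subset of agents are illustrative assumptions):

```python
from urllib import robotparser

# Trimmed subset of the robots.txt from this commit (illustrative).
ROBOTS_TXT = """\
User-agent: *
Disallow:

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A listed LLM scraper is disallowed from every path.
print(rp.can_fetch("GPTBot", "https://example.com/some-article"))

# Any other user agent falls through to the wildcard entry,
# whose empty Disallow permits all paths.
print(rp.can_fetch("Mozilla/5.0", "https://example.com/some-article"))
```

Of course, as the commit message notes, this only restrains crawlers that choose to honor robots.txt; it is a statement of intent, not an enforcement mechanism.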