
Dissuade crawlers from downloading super.wasm#6664

Merged
philrz merged 1 commit into main from robots-txt-book
Feb 25, 2026

Conversation


@philrz philrz commented Feb 25, 2026

What's Changing

This adds a robots.txt at https://superdb.org/robots.txt to dissuade crawlers from downloading the /super.wasm file.
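For reference, a minimal robots.txt that disallows crawling of a single file looks like the following (a sketch of the general pattern, not necessarily the exact contents committed here):

```
User-agent: *
Disallow: /super.wasm
```

Well-behaved crawlers fetch /robots.txt before crawling and skip any path matching a Disallow rule for their user agent.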

Why

  1. The file is 12.8 MB when downloaded, and its contents are not indexable by crawlers anyway.
  2. I've recently learned that a large share of our bandwidth bill has been going toward serving this file to crawlers.

Details

We just received a bill from Netlify that included an overage charge for bandwidth. I dug into it, and it looks like around January 5th a crawler in Singapore started aggressively hitting superdb.org (see this HackerNews link for examples of others dealing with the same). Since /super.wasm is so much larger than the other content on the site, nearly all of the overage bandwidth went toward serving that file repeatedly, so dissuading crawlers from downloading it should squash most of the problem.

There's no guarantee crawlers will obey this, though. I've already tactically deployed the robots.txt, and unfortunately the crawling from Singapore has not yet ceased. However, guidance from Claude AI has been to wait at least 24 hours before drawing any conclusions, since crawlers will often finish flushing an existing queue of work before re-running from the top, at which point they'll hopefully pick up the robots.txt and do the right thing. If they still haven't stopped after 24 hours, I've got other more aggressive ideas for getting them to back off.

@philrz philrz requested a review from a team February 25, 2026 20:25
@philrz philrz self-assigned this Feb 25, 2026
@philrz philrz merged commit 87c8cc5 into main Feb 25, 2026
2 checks passed
@philrz philrz deleted the robots-txt-book branch February 25, 2026 20:58