[Fasterfox] Abusing sites with pre-fetching

Piotr Kubala pik-emwac at sponsor.com.pl
Sat Dec 3 10:29:38 EST 2005

1. It is not true that pre-fetching never loads dynamic content. A page with
an .htm extension can still be dynamically generated, and on our site it is.
2. On our site, one user with pre-fetching generates as much traffic as hundreds
of ordinary users, especially on the full site-map page - and we have to pay for
that bandwidth.
3. A prefetcher behaves like a robot and should follow robot rules - both those
written in robots.txt and those in the <meta name="robots"> tag. In particular,
"nofollow" should stop a prefetcher. Some sites set traps for rude robots:
invisible links to pages denied by robots.txt. An ordinary user cannot see these
links and never visits the traps, but a prefetcher tries to access the forbidden
pages and, as a result, gets the user's access denied.
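A hedged sketch of such a trap (the paths here are hypothetical): the site
disallows a directory in robots.txt and hides a link into it that no human
would ever see or click, so only a crawler - or a prefetcher - requests it:

```
# robots.txt (hypothetical trap directory)
User-agent: *
Disallow: /trap/

<!-- somewhere in a page: a link invisible to human visitors -->
<a href="/trap/ban-me.htm" style="display:none">&nbsp;</a>
```

Any client that fetches /trap/ban-me.htm has ignored robots.txt, and the site
may block its IP address - which, with a prefetcher, is the user's own address.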

I think pre-fetching SHOULD obey robot rules.
I would really appreciate the prefetcher identifying itself as a special
user-agent (e.g. "Prefetcher") and the possibility of creating rules for it:

1. In robots.txt:

User-agent: Prefetcher
Disallow: ...

2. In META-tags:

<META name="prefetcher" content="...">

(index, noindex, follow, nofollow, cache, nocache)
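Taken together, the two proposals might look like this for a site that allows
prefetching of ordinary pages but not of its site map (the paths and directive
values below are hypothetical, following the scheme sketched above):

```
# robots.txt
User-agent: Prefetcher
Disallow: /sitemap.htm
Disallow: /search/

<!-- in the <head> of a page that must never be prefetched -->
<META name="prefetcher" content="noindex, nofollow, nocache">
```

A prefetcher that honored either rule would skip the expensive pages while
still speeding up the rest of the site.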

I mean that there are pages a prefetcher may safely prefetch, but there can
also be pages forbidden to it. You cannot fix this with white- and blacklists
inside the prefetcher. You need a protocol by which webmasters can tell the
prefetcher how not to be abusive.

