[Fasterfox] RE: Abusing sites with pre-fetching
tony.gentilcore at sbcglobal.net
Wed Dec 7 22:20:20 EST 2005
Thanks for the feedback. Several other individuals have reported similar problems.
There is a bug filed for discussion here:
I'm currently looking into a way to resolve this. Right now I'm leaning
towards your robots.txt idea.
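To illustrate what the robots.txt approach could look like in practice, here is a minimal sketch in Python (this is an illustration, not Fasterfox's actual implementation; the "prefetcher" user-agent token is the one proposed in the quoted message below):

```python
from urllib.robotparser import RobotFileParser

def allowed_to_prefetch(robots_txt: str, url: str, agent: str = "prefetcher") -> bool:
    # Parse the site's robots.txt text and ask whether the given
    # user-agent may fetch the URL before queuing it for prefetch.
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, url)

rules = """\
User-agent: prefetcher
Disallow: /sitemap.htm
"""

allowed_to_prefetch(rules, "http://example.com/sitemap.htm")  # False: skip this link
allowed_to_prefetch(rules, "http://example.com/index.htm")    # True: safe to prefetch
```

A prefetcher that runs every candidate link through such a check would never touch pages the webmaster has disallowed.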
From: Piotr Kubala [mailto:pik-emwac at sponsor.com.pl]
Sent: Saturday, December 03, 2005 3:30 AM
To: fasterfox at mozdev.org
Subject: Abusing sites with pre-fetching
1. It is not true that pre-fetching never loads dynamic content. Content served
under an .htm extension can be dynamic too, and, in fact, ours is.
2. On our site, one user with pre-fetching generates as much load as hundreds of
ordinary visitors - especially on the full website map page - and we must pay for this.
3. A prefetcher behaves like a robot and should follow robot rules - both
in robots.txt and in the <meta name="robots"> tag. I mean "nofollow" should stop
the prefetcher. There are traps for rude robots: invisible links to pages
forbidden in robots.txt. An ordinary user cannot see these links and never visits
these pages. The pre-fetcher tries to access the forbidden pages and, in effect,
causes a denial of access for the user.
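A typical robot trap of the kind described above might look like this (a hypothetical sketch; the path /trap.htm is illustrative):

```html
<!-- robots.txt forbids the trap page:
       User-agent: *
       Disallow: /trap.htm
-->

<!-- Hidden link on a normal page: a human never sees or clicks it,
     but a prefetcher that ignores robots.txt will fetch it, and the
     server then blocks the visitor's IP address. -->
<a href="/trap.htm" style="display: none"></a>
```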
I think pre-fetching SHOULD obey robot rules.
I would really appreciate identifying the prefetcher as a special user-agent
(e.g. "prefetcher") and the possibility of creating rules for it:
1. In robots.txt:
2. In META-tags:
<META name="prefetcher" content="...">
(index, noindex, follow, nofollow, cache, nocache)
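Put together, the proposed rules might look like this (a hypothetical sketch of the proposal, not an existing standard; the paths and values are illustrative):

```
# robots.txt - rules addressed to the prefetcher only
User-agent: prefetcher
Disallow: /sitemap.htm
Disallow: /cgi-bin/

<!-- META tag on an individual page -->
<META name="prefetcher" content="noindex, nofollow">
```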
I mean there are pages that the pre-fetcher could prefetch, but there are also
pages forbidden to it. You cannot fix this with a white- and blacklist built into
the prefetcher. You need a protocol by which webmasters can tell the prefetcher how
not to abuse their sites.
More information about the Fasterfox mailing list