[Project_owners] Spiders, Crawlers and Spam Harvesters

Pete Collins pete at mozdev.org
Mon Oct 27 18:21:29 EST 2003


OK, I found a problem. We have many robots.txt files in aliased 
directories like bugs/, lxr/, etc.

However, there was a misconfiguration and *.txt files weren't being 
served. *So*, every spider and crawler under the sun was crawling 
bugzilla, lxr, cvsweb, etc. Ouch!
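
If you want to sanity-check your own area, the quickest test is to fetch 
the headers for the file (the URL below is just an illustration, swap in 
whichever aliased path you care about):

    curl -I http://www.mozdev.org/bugs/robots.txt

A 200 response with a text/plain Content-Type means crawlers can see it; 
a 404 or an HTML error page means they can't, and they'll crawl 
everything.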

I've made the fixes, and *hopefully* these bots will obey the robots.txt 
files and the <meta name="robots" content="noarchive"> tag I just added.
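
For reference (I'm not reproducing the actual files here, and the ones on 
mozdev may differ), a minimal robots.txt that tells well-behaved bots to 
stay out of everything under a given document root looks like this:

    User-agent: *
    Disallow: /

and the noarchive hint sits in each page's <head>:

    <head>
      ...
      <meta name="robots" content="noarchive">
    </head>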


--pete

-- 
Pete Collins
www.mozdev.org
www.mozdevgroup.com
