I swear I’ll stop talking about spiders at some point, because I know nobody gives a shit, but… I’ve been messing around with that trap script a lot this weekend. One of the things that really drives up my bandwidth is when people try to grab hundreds of pages for offline browsing. Well, I think I figured out a way to prevent that. It doesn’t matter what tool you use either — any tool (or browser) that tries to download my site for offline browsing, without following the instructions in my robots.txt, will hopefully be caught by this trap and banned right away. I don’t think this will affect anybody who’s just surfing and doing things the normal way, only the genuine site-grabbers. But I’m fully prepared to eat my words and un-ban anyone who gets locked out by accident. I can’t promise, but I think this is the finishing touch that will allow me to step back and not have to constantly study my traffic the way I have been. I’ll be so glad; I really have better things to do than comb through my own trash!
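(For the curious, the general idea behind this kind of trap can be sketched roughly like this — a minimal Python sketch under my own assumptions, since the actual script isn't shown here; the `SpiderTrap` class, the `/trap/` path, and the method names are all hypothetical. The trap URL is listed as `Disallow:` in robots.txt, so polite tools skip it, while site-grabbers that ignore robots.txt follow every link, hit it, and get banned.)

```python
# Hypothetical sketch of a robots.txt honeypot trap.
# robots.txt would contain something like:
#   User-agent: *
#   Disallow: /trap/

TRAP_PATH = "/trap/"  # hypothetical trap URL, linked invisibly from pages


class SpiderTrap:
    """Bans any client IP that requests the trap URL.

    Well-behaved tools read robots.txt and never touch the Disallowed
    path; bulk site-grabbers that ignore robots.txt follow every link,
    including the trap, and get banned on the spot.
    """

    def __init__(self):
        self.banned = set()

    def handle_request(self, path, ip):
        """Return an HTTP status code for this request."""
        if path.startswith(TRAP_PATH):
            self.banned.add(ip)  # grabber ignored robots.txt -> ban it
            return 403
        if ip in self.banned:
            return 403  # already banned from an earlier trap hit
        return 200  # normal visitor, serve the page


trap = SpiderTrap()
assert trap.handle_request("/index.html", "1.2.3.4") == 200  # normal surfing
trap.handle_request("/trap/page.html", "1.2.3.4")  # grabber hits the trap
assert trap.handle_request("/index.html", "1.2.3.4") == 403  # now banned
```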
Saturday July 26, 2003 – 3:56 am