
Revisiting robots.txt

April 18th, 2007 · 6 Comments

Following on from my experiment to create a WordPress robots.txt file, I’ve noticed that Big G has seriously cut the number of my pages it lists in its index. It was previously sitting at 260+ pages but is now showing only 103 in the main index, with 130+ pages relegated to the supplemental index.

What the Hell went wrong?!? 😯

I could revert to Plan A, which was to have no robots.txt file at all, but instead I’m going to give Plan B a whirl. Plan B is to replicate Everton’s robots.txt file, which he reports increased his Google traffic by 16% in four days. I’ll keep an eye on things through Google Webmaster Tools and let you know how things pan out.

UPDATE: I’ve added in the automated sitemap submission line because I forgot it the first time around 😳
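
For reference, here’s roughly the sort of thing I’m testing. This is only a sketch rather than Everton’s exact file: the Disallow paths are a guess at a typical WordPress setup, and you’d swap in your own domain on the sitemap line:

    User-agent: *
    # Keep spiders out of the WordPress admin and core directories
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    # Avoid duplicate-content URLs that WordPress generates
    Disallow: /trackback/
    Disallow: /feed/

    # Automated sitemap submission (use your own blog's URL here)
    Sitemap: http://www.example.com/sitemap.xml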


Tags: Search Marketing

6 responses so far

  • Mike // May 4, 2007 at 5:29 pm

    Wading through a pile of unread emails, I came across an article from SiteProNews suggesting that creating a blank robots.txt file can help reduce server bandwidth.

    The article cites a website that was getting hammered by spiders to the tune of 8 GB per month, despite having only around 200 daily visitors and fewer than 100 posts. After a blank robots.txt file was put in place, monthly bandwidth dropped from 8 GB to 500 MB.

    Link to article

    Is there nothing that little robots.txt can’t do? 😉

    I’m going to give it a whirl on a couple of my sites and let you know.
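
    For anyone else wanting to try it, a blank robots.txt really is just an empty file at the site root (e.g. example.com/robots.txt) — presumably the saving comes from spiders no longer pulling a full error page every time they ask for a file that isn’t there. If you’d rather be explicit, the allow-all equivalent of a blank file is:

        # An empty Disallow value means nothing is blocked
        User-agent: *
        Disallow: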

Trackbacks

  1. Creating the ultimate WordPress robots.txt file | Twenty Steps
  2. June Is Busting Out All Over
  3. Updated WordPress robots.txt example optimized for Google and SEO
  4. Wordpress Robots.txt | SEO Scout | Suchmaschinenoptimierung
  5. Memanfaatkan Robots.txt Untuk Mengamankan PageRank Blog Review, Mungkinkah? | Fanari dash Id dot Com | My Official & Commercial Blog