Following on from my experiment to create a WordPress robots.txt file, I’ve noticed that Big G has seriously reduced the number of my pages they list in their index. It was previously sitting at 260+ pages, but it’s now showing only 103 in the main index, with 130+ pages relegated to the supplemental index.
What the Hell went wrong?!? 😯
I could revert to Plan A, which was to have no robots.txt file at all, but instead I’m going to give Plan B a whirl. Plan B is to replicate Everton’s robots.txt file, which has increased his Google traffic by 16% in 4 days. I’ll keep an eye on things through Google Webmaster Tools and let you know how things pan out.
UPDATE: I’ve added the automated sitemap submission line cuz I forgot it first time around 😳
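For anyone wondering what a setup like this looks like, here’s a rough sketch of a WordPress-style robots.txt with that sitemap submission line included. To be clear: this isn’t Everton’s actual file, the Disallow rules are just common WordPress examples, and the sitemap URL is a placeholder you’d swap for your own.

```
# Illustrative WordPress robots.txt (not the exact file referenced above)
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

# The "automated sitemap submission" line: points crawlers at your XML sitemap
Sitemap: http://www.example.com/sitemap.xml
```

The Sitemap line is the bit the UPDATE refers to; search engines that support the Sitemaps protocol will pick up the sitemap location from here without you having to submit it manually each time.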