One of the reasons I started seeing an increase in revenue at the tail end of 2006 was thanks to some work I put in around October time. I’d been lazy with the way I’d put together a few of my sites and, as a result, these sites had been confined to Google’s supplemental index.
If you don’t know what the supplemental index is, a full definition can be found at the Google Webmaster Help Center. Basically, if your site ends up in the supplemental index there is little chance of your pages appearing in the search results, and therefore your search engine traffic will be pretty minimal.
I made a few little tweaks on a couple of sites and am now starting to see the benefit. They’re pretty easy fixes and shouldn’t take you too long to implement.
The first thing you need to do is see whether you’ve got a problem with how Google has indexed your site and whether you’ve got a lot of pages in the supplemental index. To do this, open up Google and enter site:www.yourdomain.com as your search. This will return all of the pages from your site in Google’s index.
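So the check looks like this (www.yourdomain.com is just a stand-in; swap in your own domain):

```text
site:www.yourdomain.com
```

At the time of writing, Google flags supplemental pages with a green “Supplemental Result” label next to the listing, so as you page through the results you can see exactly where they start appearing.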
Running that query for this site, the first supplemental result didn’t come until after 180+ results. There are 260 pages indexed for this site in Google, so I’m quite happy with that. However, before I tweaked Twenty Steps and a number of my other sites, I was getting supplementals on the first page of results. That’s bad. The sites had only just come out of the Sandbox and I was still no closer to getting any traffic.
So what did I do?
Firstly, I ensured that I had a unique title tag for each page. I had been lazy previously and made every title something like “Site Name – Page Name”. Google sees that sort of repetition as a bit spammy, so I took the site name out of the title altogether and just made the title something relevant to the content, with one or two keyphrases in there for good measure. As I said in my previous article about tips to improve your search engine ranking, aim for around 65 characters.
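To make that concrete, here’s a before-and-after sketch (the page and wording are invented for illustration):

```html
<!-- before: the same lazy pattern on every page -->
<title>Twenty Steps - Contact</title>

<!-- after: unique, relevant to the content, keyphrase included, under 65 characters -->
<title>Getting Out of Google's Supplemental Index</title>
```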
Next I ensured that each page had a unique meta description. Again, I’d been lazy and used the same one across the whole site. I rewrote every description, making each one laser focused on the content of its page and again working in keyphrases. It might seem like a lot of effort to write unique descriptions for 100 pages but it will certainly pay dividends in the end.
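The idea is one hand-written description per page, along these lines (the copy is invented, not lifted from a real page):

```html
<!-- written specifically for this page, keyphrases included, nothing reused site-wide -->
<meta name="description" content="How I pulled my pages out of Google's supplemental index with unique titles, descriptions and shorter URLs.">
```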
Next were meta keywords. I didn’t spend too much time on these as Google don’t seem to pay much attention to them, but I figured that since I was doing all this hard work I might as well include them: laser-focused keywords relating to the page content and not, as I’d done previously, the site as a whole.
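The before-and-after looks something like this (both keyword lists are made up for illustration):

```html
<!-- before: one site-wide list pasted onto every page -->
<meta name="keywords" content="twenty steps, blog, make money online">

<!-- after: keywords matched to this particular page's content -->
<meta name="keywords" content="supplemental index, google indexing, unique meta tags">
```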
I trimmed down the actual URLs I’d been using to make them more streamlined whilst still retaining keywords. Google seems to see long, heavily hyphenated URLs as spammy, so keep the URL short and punchy without losing your keywords.
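Something along these lines (both URLs are invented to show the shape of the change):

```text
before:  /2006/10/how-to-get-your-pages-out-of-the-google-supplemental-index-fast/
after:   /fix-supplemental-index/
```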
Some other things which can land you in the supplemental results are orphaned pages, duplicate content and deeply buried pages. Personally this wasn’t an issue for me, but it’s something to consider with your own site. In a lot of ways it makes sense to ensure that every page is linked from somewhere and that nothing sits more than two clicks from the home page.
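One simple way to cover both points, if you’re not doing it already, is a plain HTML sitemap page linked from every footer. A minimal sketch (the page names are invented):

```html
<!-- sitemap.html: linked from every page's footer, so no page is orphaned
     and everything sits within two clicks of the home page -->
<ul>
  <li><a href="/fix-supplemental-index/">Fixing the Supplemental Index</a></li>
  <li><a href="/unique-title-tags/">Writing Unique Title Tags</a></li>
  <li><a href="/about/">About</a></li>
</ul>
```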
With this site, and a couple of my other WordPress sites, I took out all of the META tags. This seems to have worked a treat. I also rearranged the way the title tag works so that the page title comes first, followed by the site name. I’m pretty sure there are some WP plugins out there which will help create unique META data. I may well have a look at them and give them a whirl, but in the meanwhile I’ve ditched the lot.
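The reordering boils down to this (the post title is invented; “Twenty Steps” stands in for the site name):

```html
<!-- before: site name first, so every title on the site starts identically -->
<title>Twenty Steps &raquo; Fixing the Supplemental Index</title>

<!-- after: page title first, site name last -->
<title>Fixing the Supplemental Index &raquo; Twenty Steps</title>
```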
It took a lot of hard work to come up with unique descriptions, tags and such like but I’m starting to see the results. What’s the point of sitting in the Sandbox for 6 months only to then sit in the supplemental index?