It may come as a surprise to learn that Google has some great resources on offer to help you improve your rankings within its own search engine. Presenting your main content in text, correctly using your on-page META data and legitimately gathering links to your site are just some of the areas covered in detail – all to help you improve your ranking success.
But Google goes a step further than just providing content for you to read and implement. Hidden away in the dark recesses of all this is a handy collection of mini applications that every website owner should be aware of and use to their advantage. Aptly called ‘Webmaster Tools’, this online collection of features allows you to interface directly with the main Google search index.
You can find Webmaster Tools here: www.google.com/webmasters/tools/. To get going you need a valid Google Account and a website domain you want to work on. Once both are entered, Google will tell you when its robot last visited the site and the total number of pages it found and indexed while it was there.
Beyond this basic detail it also lists further options: Diagnostics, Site Maps, Tools and the goldmine of the lot, Statistics. But before you can start to access all this extra good stuff you need to confirm with Google that you are authorised to act on behalf of the domain you have registered.
You do this by following a process of verification. There are two options available to you: you can either place a unique piece of META data on your home page or upload a uniquely named text file into the website’s root directory, for instance www.example.com/uniquename.htm. Once either is in place, the Google system checks that all is fine before uncovering the complete range of tools and data available.
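As a rough sketch, the META-tag route involves adding a line like the following to your home page’s head section. The meta name shown here follows Google’s verification convention, but the content value is a made-up placeholder – Google generates the real token for your account, so copy the exact string it gives you:

```html
<head>
  <!-- hypothetical token; use the exact value Google provides -->
  <meta name="google-site-verification" content="abc123PLACEHOLDERxyz" />
</head>
```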
The Site Map option is one of the first things to set up once you are verified. Once submitted, your site map will help the Google robot locate and index all the pages on your site. A site map is a single page that lists all your site’s pages in a format that search robots like Google’s can read. Most website content management systems create site maps automatically and in a format that is Google friendly. Otherwise, you will need to create one yourself using one of the many free site map creation tools available online. Soon after you submit your map to Google it gathers the information and advises you if there were any problems processing what was provided.
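If you do end up rolling your own, the format Google reads is a simple XML file following the public sitemaps.org protocol. Here is a minimal sketch in Python that builds one – the page URLs are hypothetical examples, not anything from your own site:

```python
# Minimal sketch: build a sitemaps.org-style XML site map.
# The page URLs below are made-up examples.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap XML string with one <url><loc> entry per page."""
    ET.register_namespace("", SITEMAP_NS)  # emit the default namespace
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for page in urls:
        url = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        loc = ET.SubElement(url, "{%s}loc" % SITEMAP_NS)
        loc.text = page
    return ET.tostring(urlset, encoding="unicode")

pages = ["http://www.example.com/", "http://www.example.com/about.html"]
sitemap_xml = build_sitemap(pages)
print(sitemap_xml)
```

Save the output as something like sitemap.xml in your site’s root directory and submit that URL to Google.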
The Diagnostics section of Webmaster Tools will highlight any other problem areas – specifically to do with how your website content is presented. For instance, the search robot may have found some broken links when it last visited, or there may have been pages that took too long to load. This area also warns you about Title tags and META description content that are too long, too short or are duplicated across pages. All of these content warnings will need fixing to help you improve your search engine rankings and therefore your organic search traffic.
The detailed information made available in the Links section of Webmaster Tools underlines the importance Google places on this area when classifying your website. Not only does this section list the number of inbound links your site receives, where they come from and what pages they go to, but it also reveals the internal links you have between pages of your site. Both of these pieces of information are quite difficult to obtain from Google through other means.
The Tools section includes some features to help you maintain the way your content is shown in the Google Index. For example, if your website set-up allows both http://example.com/index.html and http://www.example.com/index.html, you can use this section to tell Google which of the two formats you prefer. Plus, if you have any content in the index that you want removed then here is the place to make such a request.
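Alongside telling Google your preference, you can also enforce the preferred format at the server itself. A minimal sketch for an Apache .htaccess file, assuming mod_rewrite is enabled (the domain is a placeholder), redirects the bare domain to the www version with a permanent 301:

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The 301 status tells search robots the move is permanent, so the index consolidates on one address rather than splitting your link value across two.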
You can always control the areas of your website that are indexed by Google by controlling where its robot visits through the use of a simple text file aptly called robots.txt. Usually found in the root directory of your site, e.g. www.example.com/robots.txt, this file is frequently used to tell the Google robot NOT to index certain areas of the site. Google advises that it checks for this file each day so changes made here should be actioned reasonably promptly. Again in the Tools section, you can check that this robots.txt file is being picked up and actioned on.
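A robots.txt file is plain text with a simple directive format. The directory names below are made-up examples – the point is the shape, not the paths:

```
# Rules for all robots
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Extra rule just for Google's robot
User-agent: Googlebot
Disallow: /private/
```

An empty Disallow line (or no robots.txt at all) means the whole site is open to indexing.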
Of course I’m leaving the best to last – Statistics. This area uncovers for the first time how your website is performing in Google’s Organic listings. Now, like your Google AdWords reports, you can see not only your site’s top searched queries but also those terms that produced the most clicks. And in each case you can segment this data by search type (image/text), geographic region and time (1 week ago, last month and onwards).
The top search query report reveals the top 20 keywords that Google searchers typed in that resulted in your website being shown in their results. For each keyword, you are shown: its rank (from 1 to 20), the term itself, and where in the search results your site was shown. Slice this data by geographic region and you can start to see the effectiveness of any optimisation you are working on.
While this information is handy it’s the next report that reveals the true gold of Google’s Webmaster Tools.
While it is interesting to see where in the search results your site is appearing, it is even more valuable to see which appearances generate clicks. Of course, in an ideal world there would be an exact match between these two tables – every time your site came up in the results, people clicked on the link to learn more. However, this is never the case. But now you can compare the two – the keywords that were presented with those that generated clicks – to see where the gaps lie.
For instance, you may be surprised to find that certain keywords that scored high on searches may not rank so well when it comes to generating clicks. And remember that while high rankings are good, it’s clicks that bring in new prospects. To dig further into why this anomaly is occurring, run your own searches for these high-ranking, low-clicking terms and see what else is around your result that could be pulling the traffic away.
Your Title tag and META description tag should be doing the heavy lifting of convincing people to click on your search result. Make some changes in these areas if you see a page that is all show and no clicks.
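Concretely, these two tags sit in the head of each page. The copy below is purely illustrative – write yours around the page’s actual content and the searcher’s intent:

```html
<head>
  <!-- illustrative copy only; tailor to the page in question -->
  <title>Blue Widgets with Free UK Delivery | Example Store</title>
  <meta name="description"
        content="Compare our range of blue widgets, read customer reviews and order online with free next-day delivery." />
</head>
```

The title usually becomes the clickable headline in the search result and the description often supplies the snippet beneath it, so between them they are your advert on the results page.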
Webmaster Tools tells us that this storehouse of valuable keyword data is refreshed weekly, but for most websites a monthly review would suffice. That way you can allow enough time to make some changes and see the results come through.
So, there you have it – just a few points to hopefully convince you to spare some time this month to familiarise yourself with Google’s Webmaster Tools. Set up an account and verify your domain and in a few moments all that Google knows will be revealed – you may be in for a pleasant surprise.