Webmaster Tools by Google

Kristarella at „This and that“ wrote an interesting article about Google’s Analytics tool.

Although I use my Analytics account on a regular basis, her article was a good reason to take a closer look at my stats. While doing so, I remembered that I also have an account for Google’s Webmaster Tools (including Sitemaps). So I went there and discovered that I had forgotten to maintain it for quite a while. As I dug through the options and information provided, I discovered a lot of useful features a webmaster should be using.

Google Webmaster Tools

The Overview

The sitemap overview, or dashboard as Google calls it, shows you a list of the domains you added for monitoring.
If you have no domains listed, add one.

When adding one, you will have to verify that you are really the webmaster of that site.
There are two methods to verify:

  • upload a specially named HTML file
  • add a special meta tag

Whichever method you choose, the decision is permanent and you need to keep the file or the meta tag online.
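
Since the file (or tag) has to stay online permanently, it doesn’t hurt to check it from time to time. Here is a small Python sketch for that; the file name below is made up, you would use the one Google generated for your account.

    import urllib.error
    import urllib.request

    # Hypothetical verification file name -- replace it with the one from your account.
    VERIFY_URL = "http://www.example.com/google1234567890abcdef.html"

    try:
        with urllib.request.urlopen(VERIFY_URL, timeout=10) as response:
            print("Verification file reachable, status:", response.status)
    except urllib.error.HTTPError as err:
        print("Verification file missing or blocked, status:", err.code)
    except urllib.error.URLError as err:
        print("Could not reach the server:", err.reason)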

Diagnostic View

After you select a domain you want more information about, you are taken to the Diagnostic View, with the Summary page as its index page.

It provides you with basic information like

  • Was your page crawled by the Googlebot?
  • Is your site listed in the Google Index?
  • Are there any crawl errors?

The Summary is self-explanatory and easy to use. Just click on the „Details“ link if any errors are reported and you will see what is wrong and how to fix it.
The Diagnostic View also offers a few more tools; I will explain them later because I consider them „advanced“ and especially useful.

Besides the Diagnostic View, there are the Statistics, Links and Sitemaps categories.

Statistics

The Statistics View is, in my opinion, a great tool. It gives you information about Crawl Stats, Query Stats, Page Analysis and Index Stats.

(Screenshot: Google Webmaster Tools - Crawl Stats)

The Crawl Stats could use a few upgrades, because at first you don’t understand the PageRank information; at least it took me a while to figure it out. The PageRank chart shows four bars, ordered from high PageRank down to no PageRank.

These bars are grey and, depending on the percentage of pages with that rank, they are filled with green. At first glance you have no real idea how this works, until you discover that the green represents the percentage of pages with PageRank X. However, you neither get an exact percentage nor do you know which pages have which rank.
Hopefully Google will change this soon, because it would make things a bit easier.

Query Stats are like the keywords list in Google Analytics, although less powerful. If you don’t want to use Google Analytics, this may be enough for you, but I highly recommend Analytics.

Page Analysis shows you how the Googlebot sees your website. I haven’t found much use for it so far. If you know how this can be useful, please let me know in a comment 🙂

Index Stats gives you a few links to special Google searches. Not really that interesting, but maybe that is just because some of those links return no results for me…

Links

The External links option behaves like the link referrer tracking in Google Analytics; again, not really useful if you are already using Analytics. Internal links, on the other hand, are somewhat interesting because you can see how your pages cross-link to each other. I just wish Google’s Webmaster Tools would draw some kind of map visualizing the cross-linking.
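
Until Google draws that map, you can get a rough impression yourself. The following Python sketch (the domain is just a placeholder) fetches a single page and lists the internal URLs it links to; run it over several pages and you have the raw data for such a map.

    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    import urllib.request

    SITE = "http://www.example.com/"  # placeholder -- use your own domain

    class InternalLinkCollector(HTMLParser):
        """Collects all links on a page that point to the same host."""

        def __init__(self, base):
            super().__init__()
            self.base = base
            self.internal_links = set()

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            href = dict(attrs).get("href")
            if not href:
                return
            absolute = urljoin(self.base, href)
            # Keep only links that stay on the same host.
            if urlparse(absolute).netloc == urlparse(self.base).netloc:
                self.internal_links.add(absolute)

    with urllib.request.urlopen(SITE, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")

    collector = InternalLinkCollector(SITE)
    collector.feed(html)
    for link in sorted(collector.internal_links):
        print(SITE, "->", link)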

Sitemaps

This one is really useful. You can submit the location of an XML sitemap to Google and make sure that Google uses the sitemap when indexing your page. It also checks your sitemap for errors and tells you what you need to fix to get a valid XML sitemap. This is really handy!
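
For reference, a sitemap is just a small XML file in the sitemaps.org format. This Python sketch writes a minimal one for two placeholder URLs; the Sitemaps section will happily tell you if the result is not valid.

    import xml.etree.ElementTree as ET

    # Standard sitemaps.org namespace; the page URLs are placeholders.
    NAMESPACE = "http://www.sitemaps.org/schemas/sitemap/0.9"
    pages = ["http://www.example.com/", "http://www.example.com/about/"]

    urlset = ET.Element("urlset", xmlns=NAMESPACE)
    for page in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
        ET.SubElement(url, "changefreq").text = "weekly"  # optional hint for crawlers

    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)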

Advanced and especially Useful

(Screenshot: Google Webmaster Tools - Diagnostic View Menu)

I skipped a few tools in the Diagnostic View earlier for a reason. I want to pay more attention to the „robots.txt analysis“, „Crawl rate“, „Preferred domain“, „Enhanced image search“ and „URL Removals“ tools.

robots.txt analysis
This tool allows you to test a robots.txt configuration before you use it on the live site. If you are already using a robots.txt, it will show the current setup.
If a lot of errors are reported, you can use this tool to modify and test the robots.txt until it works the way you want. As soon as it does, copy it into the actual robots.txt file and update it. The next time you get crawled, it should work as intended. That way you reduce the risk of accidentally blocking the indexing of your page, which can result in a worse position in the Google index. Handy, isn’t it?
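
You can run a similar check locally with Python’s urllib.robotparser before you even open the tool; the rules below are just an example configuration and example.com is a placeholder.

    from urllib.robotparser import RobotFileParser

    # Draft robots.txt to test before uploading it to the live site.
    draft = """\
    User-agent: *
    Disallow: /wp-admin/
    Disallow: /drafts/
    """

    parser = RobotFileParser()
    parser.parse(draft.splitlines())

    # Check which URLs the Googlebot would be allowed to fetch under these rules.
    for url in ("http://www.example.com/", "http://www.example.com/wp-admin/post.php"):
        allowed = parser.can_fetch("Googlebot", url)
        print(url, "->", "allowed" if allowed else "blocked")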

Crawl rate
If your server is under heavy load, you can use this to reduce the Googlebot’s crawl rate. It also has a dark side: your site will be crawled less often. I think better server performance is more useful than a high crawl rate.
If you are lucky, you will also have a very nice option unlocked: depending on how often you add new content, you may be able to tell the Googlebot to crawl your page more often. I happened to unlock this option while writing this article and enabled it to see whether it is useful or not. Unfortunately it also has a dark side, because it puts more load on the server. It is your decision whether you have the capacity for some extra load or not.

Preferred domain
Are you using the www prefix for your site? Set it as your preference. Not using the www? Tell Google that you prefer not to use it.
Google doesn’t promise that your preference will show up in the search results, but they treat it as a recommendation. I played with it a while ago and for my pages the results did show my preference. Maybe it was because of my keywords, maybe not. Try it and you will know how it works for you.
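
The preference in Webmaster Tools is only a hint to Google; the usual companion is a permanent (301) redirect on your own server from the non-preferred host to the preferred one. A quick Python check like this one (the host name is a placeholder) shows whether that redirect is actually in place:

    import http.client

    # Ask the non-preferred host for its front page and look at the redirect.
    connection = http.client.HTTPConnection("example.com", timeout=10)
    connection.request("HEAD", "/")
    response = connection.getresponse()

    print("Status:", response.status)                    # ideally 301 (moved permanently)
    print("Location:", response.getheader("Location"))   # ideally the www version
    connection.close()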

Enhanced image search
I don’t know exactly what it does, but I’m using it.
As I understand it, Google crawls the images on your website and, if you opt in to the enhanced image search, it will assign labels to them. These labels will improve the indexing and search quality of those images.
For more details about this option read the corresponding Google Webmaster Help Center entry.

URL Removals
This is possibly the best tool available! You use Google and find pages in the search results that you no longer provide? Send a removal request to Google. Within 48 hours those pages will be removed from the index.
Actually, it would be enough to give the Googlebot a 404 (page not found) or 410 (gone) error, but depending on your server configuration and some other factors, it can happen that the Googlebot sees a 200 (OK) instead, which means it won’t remove the non-existing pages. If that is the case for you, use the URL Removal tool.
I assume you could even use it right after deleting the pages, ensuring that they are gone from the index before your site is crawled the next time.
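
If you are not sure whether your server really answers with a 404/410 or with one of those 200 responses, a quick Python check like this (the URL is a placeholder for a page you deleted) shows what the Googlebot would actually get back:

    import urllib.error
    import urllib.request

    REMOVED_URL = "http://www.example.com/old-page/"  # placeholder for a deleted page

    try:
        with urllib.request.urlopen(REMOVED_URL, timeout=10) as response:
            # A 2xx answer means the "deleted" page still looks alive to crawlers.
            print("Got", response.status, "- consider the URL Removal tool")
    except urllib.error.HTTPError as err:
        if err.code in (404, 410):
            print("Got", err.code, "- Google should drop the page on its own eventually")
        else:
            print("Got", err.code)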

Back to the Dashboard

(Screenshot: Google Webmaster Tools - Dashboard tools)

Last but not least, I want to tell you about three tools located on the dashboard which could be useful for you.

Report spam in our index
You use Google Search and the results you receive look like spam? Tell Google! They can use the information you provide to improve their search algorithms and filters so that such spam gets ignored.

Submit a reconsideration request
If your website violated Google’s webmaster guidelines and you made changes to your site so that it adheres to the guidelines, inform Google by using this tool.
Also, if you suspect that your recently acquired domain may have previously violated Google’s webmaster guidelines, use the tool to tell Google that the owner and the site have changed.

Report paid links
Most of us webmasters know how Search Engine Optimization (SEO) works and are aware that „paid links“ won’t help increase your website’s PageRank. However, there are black sheep who try to use paid links to drive traffic to their pages in an attempt to increase their PageRank.
If you know a website that buys or sells links, inform Google. They always welcome information from their users because they use it to increase the quality of their search results.

Wrap up

I think the Google Webmaster Tools are very useful. However, I don’t rely on them alone. As I said earlier, I’m also using Google Analytics, and in combination with the Webmaster Tools I get a lot of information I can use to optimize my websites.

What about you? Are you using Google Webmaster Tools? Do you think I missed something or should explain something in more detail?

4 thoughts on „Webmaster Tools by Google“

  1. I really love the tools, especially the ability to set the crawl rate. On my blog there is no need to change it, but a forum I ran in the past was totally flooded by Google with about 5,000 crawls a day, making Google eat up about 6GB of bandwidth per month. The forum was running on crappy hosting which offered just a measly 10GB of bandwidth back then, so Google nearly used all of that up by itself.

    A nice article though 🙂
