How to protect your website with Google’s Webmaster Tools
Google Webmaster Tools should be an essential part of your quest for web dominance. OK, that’s probably a little exaggerated, but Google’s Webmaster Tools provides essentially what the name says – a set of tools to help you master the performance of your website.
“Performance” is interpreted by many to mean search engine optimisation in connection with the Webmaster Tools – which is a very good application of the toolset, as the tools provide up-to-date feedback on how your site is being found, which keywords are prominent, what people are searching for and how many of those searches convert to click-throughs. So they are good for SEO and SEM.
However, they are also great for managing performance in the original sense – fine tuning and tweaking your website to give the best experience to your visitors – via the “Diagnostics” section. Here are two examples.
Website Security (Malware)
Complex websites – especially those with any form of login, such as content management systems, WordPress blogs and the like – are more vulnerable to hacking. Hackers may want to steal information about your customers or plant malware to infect unsuspecting visitors. The latter is particularly true for open source systems such as WordPress and phpBB.
The great thing about Google Webmaster Tools is that Google scans your site when the search engine crawler indexes your pages and alerts you (via email) if it finds anything abnormal. If your site is hacked, you get an insight into what it is infected by, and even though your site may be flagged by Google as “harmful” to visitors, once you fix the problem you can use Webmaster Tools to resubmit your site to be crawled (usually within 24 hours) and remove its “harmful” status.
Without Webmaster Tools your site could be flagged as dangerous for some time before you realise, losing you valuable traffic – and it can then take a while for the site to be marked as “safe” again, as you are dependent on ad hoc crawls from Google to pick up the new status.
No More Dead Ends (Page Loading/Access)
Another great feature of the Webmaster Tools is the ability to flag broken links – both inbound and internal (listed under “crawl errors”) – along with an error code telling you why each page is listed.
If you are getting lots of error 500s, it most likely means a coding or database optimisation issue is causing requested pages to fail to load for the spider. The “crawl stats” page also shows how slowly or quickly your site is being indexed. In a current project, I noticed very slow indexing alongside lots of error 500s. I tracked this back to database optimisation issues, and after the changes were implemented indexing is now significantly faster (by a factor of 10 overall), the number of pages indexed has increased significantly and the error 500s have stopped appearing. The end result is a better performing site, both for visitors (they no longer see error 500s, and pages load more quickly) and in terms of what Google can index (which in turn is better for SEO, as more content is visible to the search engine).
Where there are lots of broken links, you can use this list to go back and fix them. If they are inbound links, you can update your .htaccess file (on a LAMP server) to catch and redirect them – typically with a 301 “moved permanently” response rather than a 404. This benefits SEO because when a search engine follows the external link it is politely told that the page has moved and that it should index the new page in place of the old one. You are being helpful to the search engine, and so you get a more cleanly indexed site.
However, the real benefit is to the visitor: they don’t hit a 404 error and decide to go elsewhere, which means more readers for your site.
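As a rough sketch – assuming an Apache server and purely hypothetical paths – the .htaccess rules for this might look something like:

```apache
# Permanently redirect an old URL to its new location.
# The 301 status tells search engines the page has moved
# and that they should index the new page instead.
Redirect 301 /old-page.html /new-page.html

# For links that have no replacement, show visitors a friendly
# custom page instead of the bare default server 404.
ErrorDocument 404 /not-found.html
```

The `Redirect` directive comes from Apache’s mod_alias module (enabled by default on most LAMP setups); if you need to match patterns across many old URLs at once, mod_rewrite’s `RewriteRule` with the `[R=301]` flag does the same job.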