Google Webmaster Tools has been fairly popular for some time, but its popularity is beginning to grow even more quickly. Much of this newfound fame is due to the messages received by a wide range of webmasters (over 700,000 recently), stemming from, or at least appearing to stem from, the changes rolled out in Google's Penguin update. Equally important is the buzz and publicity surrounding the fact that receiving these messages is generally bad news for your site – and that has prompted many webmasters, as well as general site owners, to wonder what these messages mean.
Crawl errors and other server malfunctions can be a major problem: if they last too long, Google may stop visiting and indexing your pages. Many times, webmasters don't even realize these problems are occurring, or don't catch them quickly enough, and lose rankings for pages which fall out of the index because Google can no longer access, read, or inspect them. These errors are also red flags for problems within your hosting environment or programmed pages. With more and more CMS (content management system) platforms in use, programming errors are becoming much more frequent and problematic, as are instances where hackers are able to "break in" – whether to hijack email servers for spam through PHP injections, to host malware on your pages unnoticed, or to take over the server for other forms of misuse.
Search engine crawlers are, for the most part, silent and rarely require much monitoring of how much they're crawling your site – if, when, and how often (unless your site is huge, with millions and millions of pages). General users who regularly post on forums might occasionally see the Google, Yahoo, and Bing bots listed as logged-in users, but otherwise, especially to the layman, crawlers and bots don't stick out much. Developers see them often in their web logs or server statistics, but usually don't know or realize if there is any problem with a search engine's visit or how it's indexing the site. Webmasters see them come and go, but beyond that have little chance to view or examine the actual crawl results – unless their site is running really slowly or pages aren't getting indexed long after they've been published. A quick log check like the one sketched below can surface what the crawlers are actually hitting.
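For developers who do want to spot-check their logs, here is a minimal sketch. It assumes Apache/Nginx-style "combined" access logs at a hypothetical path, and tallies the error responses served to the major search engine crawlers:

import re
from collections import Counter

# Hypothetical path; adjust to wherever your server writes its access log.
LOG_PATH = "/var/log/apache2/access.log"

# Apache/Nginx "combined" log format:
# IP - - [date] "METHOD /path HTTP/1.1" status bytes "referer" "user-agent"
LINE_RE = re.compile(r'"\w+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) .*"(?P<agent>[^"]*)"$')

BOT_NAMES = ("Googlebot", "bingbot", "Slurp")  # Google, Bing, Yahoo crawlers

errors = Counter()
with open(LOG_PATH) as log:
    for line in log:
        m = LINE_RE.search(line)
        if not m:
            continue
        status = int(m.group("status"))
        # Flag 4xx/5xx responses served to search engine crawlers
        if status >= 400 and any(bot in m.group("agent") for bot in BOT_NAMES):
            errors[(status, m.group("path"))] += 1

for (status, path), count in errors.most_common(20):
    print(f"{status}  {count:>4}x  {path}")

If the output shows repeated 404s or 500s on the same paths, those are exactly the kinds of problems Webmaster Tools would eventually report.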
It is important to note that certain server configurations can prevent Googlebot and other search engines from making connections or accessing a "robots.txt" file (the file search engines check for permission to crawl). Naturally, Google also considers downed or overloaded sites to be unreliable – a label you don't want if you're looking for success in search engines. Pages which have been deleted or removed, possibly by accident, or which are reached through broken links, cause 404 errors (the HTTP response meaning "Page Not Found"). These are the sorts of connectivity errors that just about everyone who runs a website experiences at some time or another, but since the goal of search engine optimization is to achieve high rankings, those focused on SEO want to be sure their sites are easily accessible to search engines – at all times. If they aren't, all other efforts could be wasted. With this new feature, any of these server errors will prompt a report in Webmaster Tools, which can be emailed to the account holder or anyone the account holder designates. This way webmasters can not only see and address major site-wide problems, but receive alerts for issues at the time Google finds them.
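If you'd rather not wait for an alert, you can spot-check this kind of accessibility yourself. Below is a minimal sketch (the site and page URLs are placeholders) that reports the HTTP status code returned for a site's robots.txt and a few pages – the same 404s and server errors Webmaster Tools would flag:

import urllib.request
import urllib.error

# Hypothetical site and pages; substitute your own URLs.
SITE = "http://www.example.com"
PAGES = ["/", "/robots.txt", "/some-old-page.html"]

def check(url):
    """Return the HTTP status code a crawler would see for this URL."""
    req = urllib.request.Request(url, headers={"User-Agent": "crawl-check"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status  # 200 means the page is reachable
    except urllib.error.HTTPError as e:
        return e.code  # e.g. 404 (not found) or 500 (server error)
    except urllib.error.URLError as e:
        return f"connection failed: {e.reason}"  # DNS failure, timeout, refused

for page in PAGES:
    print(f"{SITE + page}  ->  {check(SITE + page)}")

A connection failure on robots.txt is particularly worth fixing quickly, since search engines may postpone crawling the whole site until they can read it.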
Websites which have broken pages, begin to run slower than Googlebot can crawl effectively, or redirect to broken pages or errors through poorly configured servers can find themselves suffering in search, sometimes pretty severely. Although Google likes to give pages the benefit of the doubt before de-indexing material, pages which no longer appear will eventually be deleted from its results. This is a particularly serious problem if your site is hacked or broken into: site owners need to know right away that trouble is brewing on their server, or they could wind up with the kiss of death – one of those "This site may harm your computer" messages. Want to see your traffic take a nosedive? Get one of those messages next to your link in Google and watch what happens.
As far as simple crawl errors go, they have nothing to do with Google; the individual websites cause them. A crawl error simply means that Google's crawler isn't able to visit a particular page because of problems with the server hosting it. The search engine is powerless to correct problems on other people's servers. However, when webmasters receive these alerts, they at least have the opportunity to fix them. The first step to fixing a problem is realizing the problem exists.
Google has offered these alerts and messages for some time, but what's new is that Google will now email them to the Webmaster Tools account holder, or even to a contact specified within Webmaster Tools, such as a developer or dedicated consultant. That's the new part: although the alerts existed in the past, many webmasters tend not to check the system often, which means an alert can sit for weeks or months before anyone sees it.
Below is a video from Matt Cutts of Google, from when Webmaster Tools was first released, describing features Google offers to users and webmasters to check robots.txt files, discover crawl errors that Googlebot finds, and see other problems.
Google knows and understands that this doesn't serve its purpose, and that by the time an error is noticed, a site's traffic could very well have fallen off a cliff from an undiscovered problem that went on too long. So communication continues to be something Google works to improve, as it obviously feels more and more responsible for what users find on the web. Either that, or it continues to work toward getting as many users on its tools as possible, because the more you use, the more it knows – like Google Analytics. I bet it's nice for Google to know about the traffic it's sending you. I bet it's even nicer to know about the traffic it doesn't send you.
To make sure you have email set up for your Webmaster Tools account (if you want these alerts), visit: https://www.google.com/webmasters/tools/preferences?hl=en
About The Author: John Colascione is Chief Executive Officer of Internet Marketing Services Inc. He specializes in Website Monetization, is a Google AdWords Certified Professional, authored a "how to" book called "Mastering Your Website", and is a key player in several Internet related businesses through his search engine strategy brand Searchen Networks®.