
First Google Webmaster Tools Update in 2014

When you log in and navigate to “Search Queries” you’ll see a timeline annotation saying: “An improvement to our top search queries data was applied retroactively on 12/31/13”.
Click on “Top Pages” and you’ll notice a vast amount of new and improved keyword data for your URLs:
There are currently noticeable performance issues with Webmaster Tools, particularly when a URL is expanded: loading keywords takes a while and sometimes times out.

Data Comparison

We’ve compared our query data backups for the same website from 1 December 2013 to 31 December 2013 and found that the data before and after the update is indeed different:

Data Accuracy Benefits

The main benefit of increased accuracy in Webmaster Tools data is the ability to predict scenarios and focus your priorities on keywords and pages with greater potential to grow and generate revenue. We’ve built our own version of the phrase potential calculator, available on http://www.phraseresearch.com. The tool takes a raw, unaltered Google Webmaster Tools query export CSV and looks up the position for each keyword. The end result is a table sortable by potential score, or in other words, a list of keywords likely to bring the greatest benefit with the least amount of effort.
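As a rough illustration of the idea, here’s a minimal sketch of such a calculator in Python. The column names match a typical Webmaster Tools query export, and the scoring formula is entirely hypothetical (our actual tool uses its own weighting); it simply rewards phrases with many unclicked impressions sitting just outside the top positions:

```python
import csv

def potential_score(impressions, clicks, position):
    # Hypothetical scoring: count impressions that didn't convert to
    # clicks, weighted so that phrases ranking just off the top spots
    # (positions 4-20) score highest -- improving those usually takes
    # the least effort for the biggest gain.
    missed = impressions - clicks
    if position <= 3:
        weight = 0.2   # already ranking well, little room to grow
    elif position <= 20:
        weight = 1.0   # low-hanging fruit
    else:
        weight = 0.5   # ranks, but a long climb to page one
    return missed * weight

def rank_queries(csv_path):
    # Read a Webmaster Tools query export and return (query, score)
    # pairs sorted with the highest-potential phrases first.
    rows = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            score = potential_score(
                int(row["Impressions"]),
                int(row["Clicks"]),
                float(row["Avg. position"]),
            )
            rows.append((row["Query"], score))
    return sorted(rows, key=lambda r: r[1], reverse=True)
```

Sorting by a score like this surfaces the “almost there” keywords that a plain impressions or clicks sort would bury.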

Random URL Test

We picked a random, low traffic URL which would have been affected by the recent improvements in keyword reporting and compared the data in Google Analytics, Google Webmaster Tools and Raw Server Log File. Here are the results:

Why is data not matching?

I hear many complaints that Google Webmaster Tools data is not reliable enough to be actionable. When assessing the accuracy of the data, many use Google Analytics as a benchmark. Can you see how this could be a problem? Google Analytics has never been about absolute accuracy: many factors affect its numbers, such as data sampling at large volumes, or technical limitations including the presence and reliability of JavaScript on tracked URLs. As usual, if you want absolute accuracy you can always turn to log file analysis. Our tests are yet to show a report in Analytics that matches log file data.
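For readers who want to try log file analysis themselves, here’s a minimal sketch of counting Google organic clicks for a URL from an Apache combined-format access log. The regex and helper name are our own illustration; note that since Google moved search to HTTPS, the query string is usually stripped from the referrer, so only the click count survives:

```python
import re

# Combined Log Format:
# host ident user [date] "GET /path HTTP/1.1" status bytes "referrer" "user-agent"
LOG_RE = re.compile(
    r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+" \d+ \S+ "(?P<referrer>[^"]*)"'
)

def count_google_clicks(log_lines, url_path):
    # Count requests for url_path whose referrer is a Google domain,
    # i.e. visits arriving from a Google search results page.
    clicks = 0
    for line in log_lines:
        m = LOG_RE.search(line)
        if m and m.group("path") == url_path and "google." in m.group("referrer"):
            clicks += 1
    return clicks
```

Running this over the same date range as your Webmaster Tools and Analytics reports gives a third, ground-truth figure to compare against.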

Privacy Considerations

Google Webmaster Tools used to omit queries with less than ten clicks/impressions and last year I was told that one of the reasons for this was to protect user privacy (example: somebody typing in their email, password or other sensitive information and somehow reaching your site).
To protect user privacy, Google doesn’t aggregate all data. For example, we might not track some queries that are made a very small number of times or those that contain personal or sensitive information. Source: Google
Now that the click limit has been lifted, there simply has to be another safeguard in place. Perhaps Google’s systems can detect unusual queries which deviate from the usual pattern on a semantic level and exclude them from Webmaster Tools reports. We’ll try to get an answer from Google and report back.

Phrase Clustering

Another thing which is not obvious straight away is that Google Webmaster Tools data is merged into similar groups. For example, any of the surrounding search terms end up displayed as the search phrase in the centre of the following diagram:
Keyword Merging
At first you may think that some of these phrase variations are missing from your data set. Instead they’re camouflaged behind the “canonical phrase” and contribute towards its totals, including:
  1. Impressions
  2. Clicks
  3. Position
  4. CTR
  5. Change Parameters
To be fair, most of the above queries are likely to trigger the same document in search results anyway and it makes sense to show the cumulative figure instead of inflating the report with search phrase variants.
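Google’s actual grouping is semantic and undocumented, but the merging arithmetic can be sketched in a few lines. In this illustration (our own simplification, not Google’s method) we lowercase and sort the words so that case variants and reorderings collapse into one canonical phrase, then sum clicks and impressions and compute an impression-weighted average position:

```python
from collections import defaultdict

def canonical(phrase):
    # Naive normalisation: lowercase and sort words, so "SEO Sydney"
    # and "sydney seo" collapse into the same canonical phrase.
    return " ".join(sorted(phrase.lower().split()))

def merge_variants(records):
    # records: iterable of (phrase, impressions, clicks, position)
    groups = defaultdict(lambda: [0, 0, 0.0])
    for phrase, impressions, clicks, position in records:
        g = groups[canonical(phrase)]
        g[0] += impressions
        g[1] += clicks
        g[2] += position * impressions  # weight position by impressions
    merged = {}
    for key, (imp, clk, pos_sum) in groups.items():
        merged[key] = {
            "impressions": imp,
            "clicks": clk,
            "position": pos_sum / imp,  # impression-weighted average
            "ctr": clk / imp,
        }
    return merged
```

Note how CTR and position for the canonical phrase are derived from the combined totals rather than averaged naively across variants.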

Data Limit

Finally, Google Webmaster Tools’ top search queries report is limited to 2,000 search phrases:
Specific user queries for which your site appeared in search results. Webmaster Tools shows data for the top 2,000 queries that returned your site at least once or twice in search results in the selected period. This list reflects any filters you’ve set (for example, a search query for [flowers] on google.ca is counted separately from a query for [flowers] on google.com). Source: Google
This means you are likely to recover only a fraction of the lost keyword data if you run a very large site (for example, an online store or a catalogue). Note: we checked the data and it appears the original limit has been lifted, but the question remains: are there any limits currently in place?
More on this subject: http://dejanseo.com.au/tail-chase/

Official Announcement

Google’s Webmaster Central Blog has just covered the update in a new post titled “More detailed search queries in Webmaster Tools”, saying:
“The search queries feature gives insights into the searches that have at least one page from your website shown in the search results. It collects these “impressions” together with the times when users visited your site – the “clicks” – and displays these for the last 90 days.”
According to John Mueller from Google, this update will be fully rolled out over the next few days.
In other news Google is currently also rolling out a separate update that will show better search query data for websites with separate mobile sites.
“If you have a separate site for smartphones (eg m.example.com) then this will make it a bit easier for you to keep track of the queries going there.”
