
First Google Webmaster Tools Update in 2014

When you log in and navigate over to “Search Queries” you’ll see a timeline annotation saying: “An improvement to our top search queries data was applied retroactively on 12/31/13.”
Click on “Top Pages” and you’ll notice a vast amount of new and improved keyword data for your URLs:
There are currently noticeable performance issues with Webmaster Tools, particularly when a URL is expanded: loading keywords can take a while and sometimes times out.

Data Comparison

We’ve compared our query data backups for the same website from 1 December 2013 to 31 December 2013 and found that the data before and after the update is indeed different:
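If you keep your own export backups, a comparison like this is easy to reproduce. Below is a minimal sketch, assuming two CSV exports with “Query”, “Impressions” and “Clicks” columns (the file names are hypothetical):

```python
import csv

def load_queries(path):
    """Read a Webmaster Tools query export into {query: (impressions, clicks)}."""
    with open(path, newline="", encoding="utf-8") as f:
        return {
            # Counts may need cleaning: exports can contain "1,500" or "<10".
            row["Query"]: (
                int(row["Impressions"].replace(",", "").lstrip("<")),
                int(row["Clicks"].replace(",", "").lstrip("<")),
            )
            for row in csv.DictReader(f)
        }

# Hypothetical file names for the pre- and post-update backups.
before = load_queries("queries_2013-12-01.csv")
after = load_queries("queries_2013-12-31.csv")

print("New queries:", sorted(set(after) - set(before)))
print("Dropped queries:", sorted(set(before) - set(after)))

# Queries whose impression/click counts changed between the two exports.
for query in sorted(set(before) & set(after)):
    if before[query] != after[query]:
        print(query, before[query], "->", after[query])
```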

Data Accuracy Benefits

The main benefit of increased accuracy in Webmaster Tools data is the ability to model scenarios and focus your priorities on keywords and pages with the greatest potential to grow and generate revenue. We’ve built our own version of the phrase potential calculator, available at http://www.phraseresearch.com. The tool takes a raw, unaltered Google Webmaster Tools query export CSV and looks up the position of each keyword. The end result is a table sortable by potential score: in other words, a list of keywords likely to bring the greatest benefit for the least effort.
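The exact scoring isn’t described here, so the sketch below is only an illustration of the idea, not the phraseresearch.com implementation. It assumes a CSV export with “Query”, “Impressions”, “Clicks” and “Avg. position” columns (file name hypothetical), and scores each phrase by the clicks it might gain at position 1, with an assumed top-position CTR, discounted by current position as a rough proxy for effort:

```python
import csv

TOP_CTR = 0.31  # assumed click-through rate at position 1; not a Google figure

rows = []
with open("gwt_query_export.csv", newline="", encoding="utf-8") as f:  # hypothetical file
    for row in csv.DictReader(f):
        imps = int(row["Impressions"].replace(",", "").lstrip("<"))
        clicks = int(row["Clicks"].replace(",", "").lstrip("<"))
        pos = float(row["Avg. position"])
        # Potential: clicks we might gain at position 1, discounted by how far
        # the phrase currently sits from the top (a rough proxy for effort).
        score = max(imps * TOP_CTR - clicks, 0) / pos
        rows.append((score, row["Query"], imps, clicks, pos))

# Highest potential first: greatest benefit for the least effort.
for score, query, imps, clicks, pos in sorted(rows, reverse=True)[:20]:
    print(f"{score:8.1f}  {query}  (pos {pos:.1f}, {imps} imps, {clicks} clicks)")
```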

Random URL Test

We picked a random, low-traffic URL that would have been affected by the recent improvements in keyword reporting and compared the data in Google Analytics, Google Webmaster Tools and the raw server log file. Here are the results:

Why doesn’t the data match?

I hear many complaints that Google Webmaster Tools data is not reliable enough to be actionable. When assessing its accuracy, many people use Google Analytics as a benchmark. Can you see how this could be a problem? Google Analytics data is not, and has never been, about absolute accuracy: it is affected by many factors, such as data sampling at large volumes and technical considerations including the presence and reliability of JavaScript on tracked URLs. As usual, if you want absolute accuracy you can always turn to log file analysis. Our tests are yet to show a report in Analytics that matches log file data.
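For reference, counting clicks from Google results pages for a single URL straight out of a server log is straightforward. A minimal sketch, assuming an Apache/Nginx combined log format and hypothetical file and path names:

```python
import re

# Combined log format: ... "GET /path HTTP/1.1" status size "referer" "user-agent"
LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" \d+ \S+ "(?P<referer>[^"]*)"')

TARGET_PATH = "/some-page/"  # hypothetical URL under test

count = 0
with open("access.log", encoding="utf-8", errors="replace") as f:  # hypothetical log file
    for line in f:
        m = LINE.search(line)
        if not m:
            continue
        # Count only hits on the target URL that were referred by a Google page.
        if m.group("path") == TARGET_PATH and "google." in m.group("referer"):
            count += 1

print(f"Google-referred hits for {TARGET_PATH}: {count}")
```

Note that a log can only verify clicks; impressions exist solely on Google’s side, so that metric has no server-side ground truth.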

Privacy Considerations

Google Webmaster Tools used to omit queries with fewer than ten clicks/impressions, and last year I was told that one of the reasons for this was to protect user privacy (for example: somebody typing in their email, password or other sensitive information and somehow reaching your site).
“To protect user privacy, Google doesn’t aggregate all data. For example, we might not track some queries that are made a very small number of times or those that contain personal or sensitive information.” Source: Google
Now that the click limit has been lifted, there simply has to be another safeguard in place. Perhaps Google’s systems can detect unusual queries which deviate from the usual pattern on a semantic level and exclude them from Webmaster Tools reports. We’ll try to get an answer from Google and report back.
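Google’s actual safeguard, whatever it is, remains unpublished, but the kind of filtering described above is easy to picture. The sketch below is purely illustrative, dropping queries that match a few hand-picked patterns for personal information; none of the patterns come from Google:

```python
import re

# Assumed markers of sensitive queries: email addresses, long digit runs
# (phone or card numbers) and obvious credential keywords. Illustrative only.
EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")
LONG_DIGITS = re.compile(r"\d{7,}")
CREDENTIALS = re.compile(r"\b(password|passwd|login)\b", re.IGNORECASE)

def looks_sensitive(query):
    """True if the query matches any of the illustrative privacy patterns."""
    return any(p.search(query) for p in (EMAIL, LONG_DIGITS, CREDENTIALS))

queries = [
    "blue widgets review",
    "jane.doe@example.com password reset",
    "call 0412345678 plumber",
]
print([q for q in queries if not looks_sensitive(q)])  # ['blue widgets review']
```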

Phrase Clustering

Another thing which is not obvious straight away is that Google Webmaster Tools data is merged into similar groups. For example, any of the surrounding search terms end up displayed as the search phrase in the centre of the following diagram:
Keyword Merging
At first you may think that the four phrase variations are missing from your data set. Instead, they’re camouflaged behind the “canonical phrase” and contribute towards its totals, including:
  1. Impressions
  2. Clicks
  3. Position
  4. CTR
  5. Change Parameters
To be fair, most of the above queries are likely to trigger the same document in search results anyway, and it makes sense to show the cumulative figure instead of inflating the report with search phrase variants.
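The mechanics of this merging can be sketched. The snippet below uses a hand-made variant-to-canonical mapping (Google’s actual clustering is not public) and shows how impressions and clicks accumulate under the canonical phrase, with CTR and average position recomputed from the combined totals:

```python
from collections import defaultdict

# Hand-made variant -> canonical mapping; Google's actual clustering is not public.
CANONICAL = {
    "credit cards": "credit card",
    "creditcard": "credit card",
    "credit-card": "credit card",
}

# (query, impressions, clicks, avg_position) rows as they might look pre-merge.
raw_rows = [
    ("credit card", 1000, 50, 3.0),
    ("credit cards", 400, 12, 4.5),
    ("creditcard", 100, 2, 6.0),
]

merged = defaultdict(lambda: {"impressions": 0, "clicks": 0, "pos_weighted": 0.0})
for query, imps, clicks, pos in raw_rows:
    bucket = merged[CANONICAL.get(query, query)]
    bucket["impressions"] += imps
    bucket["clicks"] += clicks
    bucket["pos_weighted"] += pos * imps  # weight position by impression volume

for phrase, b in merged.items():
    ctr = b["clicks"] / b["impressions"]
    avg_pos = b["pos_weighted"] / b["impressions"]
    print(f"{phrase}: {b['impressions']} imps, {b['clicks']} clicks, "
          f"CTR {ctr:.1%}, position {avg_pos:.1f}")
```

With the toy numbers above, all three variants collapse into one row: 1,500 impressions, 64 clicks, a 4.3% CTR and a weighted position of 3.6.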

Data Limit

Finally, Google Webmaster Tools top search queries are limited to 2,000 search phrases:
“Specific user queries for which your site appeared in search results. Webmaster Tools shows data for the top 2,000 queries that returned your site at least once in search results in the selected period. This list reflects any filters you’ve set (for example, a search query for [flowers] on google.ca is counted separately from a query for [flowers] on google.com).” Source: Google
This means that if you run a very large site (for example, an online store or a catalogue), you are likely to recover only a fraction of the lost keyword data: a store ranking for 200,000 distinct queries would see just 1% of them. Note: we checked the data and it appears the original limit has been lifted, but the question remains: are there any limits currently in place?
More on this subject: http://dejanseo.com.au/tail-chase/

Official Announcement

Google’s Webmaster Central Blog has just covered the update in a new post titled “More detailed search queries in Webmaster Tools”, saying:
“The search queries feature gives insights into the searches that have at least one page from your website shown in the search results. It collects these “impressions” together with the times when users visited your site – the “clicks” – and displays these for the last 90 days.”
According to John Mueller from Google, this update will be fully rolled out over the next few days.
In other news, Google is also rolling out a separate update that will show better search query data for websites with separate mobile sites.
“If you have a separate site for smartphones (e.g. m.example.com) then this will make it a bit easier for you to keep track of the queries going there.”
