Saturday, January 11, 2014

First Google Webmaster Tools Update in 2014
Vinay Upadhyay 11:27 AM 0 comments

When you log in and navigate over to “Search Queries” you’ll see a timeline annotation saying: “An improvement to our top search queries data was applied retroactively on 12/31/13”.
Click on “Top Pages” and you’ll notice a vast amount of new and improved keyword data for your URLs:
There are currently noticeable performance issues with Webmaster Tools, particularly when a URL is expanded. Loading keywords seems to take a while, and sometimes it times out.

Data Comparison

We’ve compared our query data backups for the same website from 1 December 2013 to 31 December 2013 and found that the data before and after the update is indeed different:

Data Accuracy Benefits

The main benefit of increased accuracy in Webmaster Tools data is the ability to predict scenarios and focus your priorities on keywords and pages with greater potential to grow and generate revenue. We’ve built our own version of the phrase potential calculator. The tool takes a raw, unaltered Google Webmaster Tools query export CSV and looks up the position for each keyword. The end result is a table sortable by potential score, or in other words, a list of keywords likely to bring the greatest benefit with the least amount of effort.
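The improved data makes this kind of prioritisation scriptable. Below is a minimal sketch of how a potential score might be computed from a query export; the CTR figure and the scoring formula are our own illustrative assumptions, not the actual tool’s internals:

```python
# Assumed CTR at position #1 -- an illustrative figure, not an official Google number.
EXPECTED_CTR_AT_1 = 0.31

def potential_score(impressions, clicks):
    """Estimate extra clicks available if this phrase ranked #1."""
    current_ctr = clicks / impressions if impressions else 0.0
    return impressions * max(EXPECTED_CTR_AT_1 - current_ctr, 0.0)

def rank_by_potential(rows):
    """rows: (query, impressions, clicks) tuples from a query export.
    Returns queries sorted by potential score, highest first."""
    return sorted(rows, key=lambda r: potential_score(r[1], r[2]), reverse=True)

queries = [("blue widgets", 5000, 40),   # many impressions, poor CTR: big headroom
           ("widget shop", 300, 80),     # already converting well
           ("cheap widgets", 1200, 15)]
print(rank_by_potential(queries)[0][0])  # highest-potential phrase first
```

Phrases with many impressions but a weak CTR float to the top, which is exactly the “greatest benefit with the least effort” list described above.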

Random URL Test

We picked a random, low traffic URL which would have been affected by the recent improvements in keyword reporting and compared the data in Google Analytics, Google Webmaster Tools and Raw Server Log File. Here are the results:

Why is data not matching?

I hear many complaints that Google Webmaster Tools data is not reliable enough to be actionable. When assessing the accuracy of the data, many use Google Analytics as a benchmark. Can you see how this could be a problem? Google Analytics data is not, and has never been, about absolute accuracy: it is affected by many factors, such as data sampling at large volumes and technical issues including the presence and reliability of JavaScript on tracked URLs. As usual, if you want absolute accuracy you can always turn to log file analysis. Our tests are yet to show a report in Analytics that matches log file data.
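To show what that comparison involves in practice, here is a minimal sketch of counting Google organic landings straight from a server log. It assumes Apache “combined” log format, and the referrer check is a rough heuristic (secure-search referrers may not expose the query at all):

```python
import re

# Apache "combined" log format: host ident user [time] "request" status bytes "referrer" "agent"
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+) [^"]*" (\d{3}) \S+ "([^"]*)" "[^"]*"'
)

def count_google_organic(lines):
    """Count successful hits per URL whose referrer is a Google results page."""
    hits = {}
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        path, status, referrer = m.groups()
        if status == "200" and "google." in referrer and "/search" in referrer:
            hits[path] = hits.get(path, 0) + 1
    return hits
```

Running this over the same date range as a Webmaster Tools or Analytics report gives a third, independent count to compare against.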

Privacy Considerations

Google Webmaster Tools used to omit queries with less than ten clicks/impressions and last year I was told that one of the reasons for this was to protect user privacy (example: somebody typing in their email, password or other sensitive information and somehow reaching your site).
To protect user privacy, Google doesn’t aggregate all data. For example, we might not track some queries that are made a very small number of times or those that contain personal or sensitive information. Source: Google
Now that the click limit has been lifted, there simply has to be another safeguard in place. Perhaps Google’s systems can detect unusual queries which deviate from the usual pattern on a semantic level and exclude them from Webmaster Tools reports. We’ll try to get an answer from Google and report back.

Phrase Clustering

Another thing which is not obvious straight away is that Google Webmaster Tools data is merged into similar groups. For example, any of the surrounding search terms end up displayed as the search phrase in the centre of the following diagram:
Keyword Merging
At first you may think that the four phrase variations are missing from your data set. Instead, they’re hidden behind the “canonical phrase” and contribute towards its total numbers, including:
  1. Impressions
  2. Clicks
  3. Position
  4. CTR
  5. Change Parameters
To be fair, most of the above queries are likely to trigger the same document in search results anyway and it makes sense to show the cumulative figure instead of inflating the report with search phrase variants.
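Google’s actual grouping logic is not public, but the aggregation step itself is easy to reproduce once a variant-to-canonical mapping is assumed. A sketch:

```python
def merge_variants(rows, canonical_of):
    """Aggregate per-variant stats under a canonical phrase.

    rows: (phrase, impressions, clicks, position) tuples.
    canonical_of: variant -> canonical phrase mapping (assumed to be given;
    how Google actually groups variants is not documented).
    """
    merged = {}
    for phrase, impressions, clicks, position in rows:
        key = canonical_of.get(phrase, phrase)
        imp, clk, pos_weight = merged.get(key, (0, 0, 0.0))
        merged[key] = (imp + impressions, clk + clicks,
                       pos_weight + position * impressions)
    # Cumulative clicks/impressions, impression-weighted average position, derived CTR.
    return {k: {"impressions": imp, "clicks": clk,
                "position": round(pos_weight / imp, 1),
                "ctr": round(clk / imp, 3)}
            for k, (imp, clk, pos_weight) in merged.items()}
```

Note that position is best treated as an impression-weighted average when merging, which is why a variant with few impressions barely moves the canonical phrase’s reported position.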

Data Limit

Finally, Google Webmaster Tools top search queries are limited to 2,000 search phrases:
Specific user queries for which your site appeared in search results. Webmaster Tools shows data for the top 2,000 queries that returned your site at least once or twice in search results in the selected period. This list reflects any filters you’ve set (for example, a search query for [flowers] on is counted separately from a query for [flowers] on Source: Google
This means that you are likely to recover only a fraction of the lost keyword data if you run a very large site (for example an online store or a catalogue). Note: We checked the data and it appears as if the original limit has been lifted, but the question remains, are there any limits currently in place?
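One way to see whether your own export runs into a cap is simply to count the distinct queries in the CSV. A sketch (the “Query” column name is an assumption and may differ by export version or language):

```python
import csv
import io

def distinct_query_count(fileobj):
    """Count distinct queries in a 'Top queries' CSV export."""
    reader = csv.DictReader(fileobj)
    return len({row["Query"] for row in reader})

# Demo on an in-memory export; in practice pass an open() file handle.
sample = "Query,Impressions,Clicks\nflowers,100,10\nred flowers,40,4\nflowers,20,2\n"
print(distinct_query_count(io.StringIO(sample)))  # 2
```

If the count sits exactly at a round number like 2,000, that is a strong hint a limit is still being applied to your site.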

Official Announcement

Google’s Webmaster Central Blog has just covered the update in a new post titled “More detailed search queries in Webmaster Tools”, saying:
“The search queries feature gives insights into the searches that have at least one page from your website shown in the search results. It collects these “impressions” together with the times when users visited your site – the “clicks” – and displays these for the last 90 days.”
According to John Mueller from Google this update will completely roll out in the next few days.
In other news Google is currently also rolling out a separate update that will show better search query data for websites with separate mobile sites.
“If you have a separate site for smartphones (eg then this will make it a bit easier for you to keep track of the queries going there.”

Tuesday, December 31, 2013

The Complete List of Google Ranking Factors Infographic
Vinay Upadhyay 11:23 AM 2 comments

At first glance, the information here might seem a bit overwhelming for some business owners. But there are three factors that probably have the most effect on your site's SEO:
  • Target keywords. It's important to have your target keywords in your site's title tags. Google relies on this tag to determine the topic of your page. Industry studies have shown that pages with their target keyword in the title tag tend to perform better than pages without.
  • Backlinks. Focus on building quality links pointing to your site. Although link quality is important, the total number of links pointing to your site is an important ranking signal.
  • Social signals. Get people to share your content on social media sites. Google is paying more attention to social signals, including retweets, Facebook shares and Google +s.
While it can be worthwhile for business owners to be at least familiar with some of the topics here, this infographic can be a valuable resource to share with those on your team who are managing your site's day-to-day SEO operation.
A Look at Google's 200 Search Ranking Factors (Infographic)

Monday, December 30, 2013

500 High PR DoFollow Forum Backlink Websites List 2013-2014
Vinay Upadhyay 7:14 AM 1 comment

Friends, today I will share a list of 500 high PR DoFollow forum websites for 2013-2014. I assembled this 500 forum backlink websites list a few days ago. Here you can create powerful backlinks for your blog or website. This list should be helpful for every search engine optimizer.

If you find this free list of 500 high PR DoFollow forum backlink websites helpful, please share it with all your social friends.

Sunday, December 29, 2013

High PR Free Dofollow Web 2.0 and Blogs Sites
Vinay Upadhyay 6:08 AM 3 comments

For those of you who want to start blogging, having a blog doesn’t actually require a large outlay, because there are many free websites we can use for blogging, such as free blog and web 2.0 sites. There are hundreds of free blog and web 2.0 sites available on the internet, so you can have as many as you like.

Free blog and web 2.0 sites, in addition to being good for people who are just starting out blogging, are also frequently used as dummy blogs to build strong backlinks, because each free blog or web 2.0 site has a different IP and, from an SEO standpoint, this is very good.

List of Free Dofollow Blog and Web 2.0 Sites

The following list of free dofollow blog and web 2.0 sites comes complete with Google PageRank, nofollow or dofollow links, and other features.
– PR4, dofollow
– PR7, nofollow
– PR6, dofollow, accepts XMLRPC
– PR3
– PR5, accepts XMLRPC, dofollow
– PR6, accepts XMLRPC, dofollow
– PR5, dofollow
– PR8, accepts post by email, dofollow
– PR4, dofollow
– PR4, dofollow
– PR2, dofollow
– PR4, dofollow
– PR3, dofollow
– PR4, dofollow
– PR5, dofollow
– PR6, dofollow
Evood.com – PR2, dofollow
– PR, dofollow, filtered content
– PR6, dofollow, filtered content, accepts XMLRPC
– PR5, dofollow
– PR6, dofollow
– PR5, dofollow
– PR6, nofollow
– PR3, dofollow
– PR4, dofollow
– PR5, dofollow
– PR4, dofollow
– PR8, nofollow, accepts post by email, accepts XMLRPC
– PR5, dofollow
– PR8, nofollow, accepts post by email
– PR7, dofollow, filtered content
– PR4, dofollow
– PR4, dofollow
– PR4, dofollow
– PR5, dofollow
– PR7, dofollow
– PR7, dofollow, accepts post by email
– PR6, dofollow, accepts post by email
– PR6, dofollow
– PR7, dofollow
– PR5, dofollow
– PR7, dofollow
– PR5, dofollow
– PR4, dofollow
– PR3, dofollow
– PR5, dofollow
– PR7, nofollow, filtered content, English only
– PR3, dofollow
– PR5, nofollow
– PR6, dofollow
– PR8, dofollow
– PR8, dofollow, accepts post by email
– PR8, dofollow, accepts post by email
– PR6, dofollow
– PR2, dofollow
– PR3, dofollow
– PR7, dofollow
– PR6, dofollow
– PR8, dofollow
– PR6, nofollow
– PR7, dofollow
– PR7, nofollow
– PR9, dofollow, accepts post by email, accepts XMLRPC
– PR8, dofollow
– PR4, dofollow
– PR6, nofollow
– PR5, dofollow

If we had one blog on each site above, we would have a lot of blogs without having to pay anything to create them. But remember that it is not easy to maintain a lot of blogs, and we would need to make content updates on a regular basis on every blog.
I hope the list of free dofollow blog and web 2.0 sites above is useful for you.
Thank you for reading

Saturday, December 28, 2013

Whitehat SEO Techniques in 2013: Learning From Blackhat SEO
Vinay Upadhyay 9:11 AM 2 comments

The phrases “white hat” and “black hat” are loaded guns, and we only use them because they’re so ubiquitous. The reality is, when you tell yourself you are a “white hat,” you can end up fooling yourself into thinking that your strategy will always work, and that Google will never turn it against you. Worse still, you can close your mind off to insights that dramatically improve business results.

Don’t misunderstand us. Ethics are vital. If you don’t already understand why it’s absolutely vital for SEO to be transparent and ethical in the years ahead, take a look at what we wrote over at Search Engine Journal. (Hint: the algorithm is only a very small part of why ethics matter.)

But there’s a distinction between ethics and restrictive labels, and if you aren’t learning anything from “black hats,” you’re probably missing some key insights, like these:

1. Testing is Always Better than Blind Faith

Before you head straight to the comment section and write a rage-fueled rant, let me point out that these are generalized statements. They don’t apply to every single “white hat” or “black hat” out there. But here we go:

White hats are less likely to test things than black hats.

This is a regrettable reality about our industry. While there are plenty of very good number crunchers on the “inbound” side of SEO, like, say, Dr. Pete, your average white hat SEO is less likely to put things to the test than your average black hat SEO. There are a couple of reasons for this:

·         Black hats can test some theories much more quickly than white hats, because they can use automated programs and create controlled experiments that aren’t practical with white hat methods

·         A large share of white hats are “reformed” black hats who couldn’t stomach the tests that kept getting them penalized, and have decided to simply follow the advice of industry professionals rather than test things for themselves

·         Some mistake white hat SEO for doing precisely what Google suggests, and therefore don’t bother testing anything

Again, I’m not saying these statements are true for all, or even most, white hat SEOs. I’m simply saying that more white hats are guilty of this particular offense than black hats.
Things don’t have to be this way.

As we’ve said before, it’s a bit ironic to put the word “optimizer” in your title if you aren’t doing any genuine testing for optimization. Even mediocre conversion rate optimizers realise this. It’s strange how few SEOs (on either side of the fence) actually test their favourite theories about the algorithm, or run the numbers to see how well their cherished methods and strategies are playing out.

We recently wrote an in-depth guide for KISSmetrics on SEO testing. Here are a few of the takeaways from that post:

·         You can test quirks of the algorithm by tweaking individual variables and measuring how they affect traffic
·         You can put SEO strategies to the test on “real world” sites by running two distinct content strategies at the same time, and measuring which content strategy picks up the most lifetime value (note that lifetime value does not equal number of visits, subscribers, etc.)

·         You can use traditional split testing to find out which types of pages are most likely to pick up natural links, or links from outreach

We are living in the age of big data. There’s just no excuse to leave money on the table by relying on assumptions instead of hard data. Intuition is vital, but it’s most useful when you are also putting it to the test.
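As a concrete example of running the numbers, a simple two-proportion z-test shows whether the difference between two strategies (say, the CTR of two page templates) is statistically significant. This is standard statistics, not an SEO-specific method; the sample figures below are made up:

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Z-statistic comparing two rates (clicks/impressions, conversions/visits)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    # Pooled proportion under the null hypothesis that both rates are equal.
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Template A: 120 clicks from 1000 impressions; template B: 90 from 1000.
z = two_proportion_z(120, 1000, 90, 1000)
print(z > 1.96)  # True -> significant at the 95% level
```

If |z| stays below ~1.96, the gap between your two strategies is indistinguishable from noise and you need more data before declaring a winner.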

2. It’s Okay to Spend Money to Make Money

As we all know, black hat SEOs have no qualms about spending money to make money. They will buy links, pay for inclusion in networks, pay for automated link-building tools, buy multiple IP hosting, and buy sites to set up their own private blog networks.

As all white hat SEOs already understand, these tactics aren’t worth investing in if you care about long-term results. For the black hats who know how to do it, these methods can make a quick buck, but they are very far removed from the brand building that legitimate businesses need to survive. Sites that rank using these types of tactics are short-lived at best, and eventually get struck down by algorithm revisions, manual reviews, or user spam reports.

So, what can we possibly learn from black hats on this topic?

It’s a basic lesson that marketers in every other field understand quite well: it’s alright to pay for results. Marketers buy ad space on TV networks, they pay per click, they hire talent, and they invest. And there absolutely are white hat SEOs who realise just how incredible results can be when you have money to invest.

Regrettably, the entire “don’t buy links” mentality has actually hurt our ability to think of SEO as a “put money in and get money out” field of marketing.

We can even learn direct lessons from some of these black hat tactics:

Buying links – While we can’t outright buy links or even offer “free products and services” in exchange for links, it’s perfectly fine to hire talent from people with influence on the web. The over-emphasis on guest posts and link-begging has led some of us to believe that you just can’t offer money to people when you’re trying to establish an online presence. That’s a terrible way of looking at things. When you hire microcelebrities, influential bloggers, well-known photographers, and so on, you will attract traffic, and you will gain links. You just need to be willing to hire people who regularly earn natural links, no matter what they do. It’s that simple. Not to mention the fact that buying nofollow links for the referral traffic is perfectly fine, and gravely underrated.

Private blog networks – While setting up a private link network of sites that “pretend” not to be affiliated with you is an awful idea if you care about a long-term online presence, we can take a page from the basic approach. It’s perfectly legitimate to buy blogs, redirect them to folders or subdomains on your site, and, when possible, hire the blogger. This allows you to buy not just a link profile, but mindshare. Conglomerates understand the value of acquisitions. Why do so few SEOs?

Pay for inclusion in networks – Joining a link network, particularly a publicly advertised one, is an extremely bad idea for brands. But there’s nothing ethically wrong with buying visibility on networks. Advertorials (not to mention advertisements) are an incredible way to increase exposure, when used properly. What many people don’t realize is that you can actually gain links by buying publicity. Traffic turns into links, and if the content is good, it turns into more, higher-value links. That’s how Google works outside of the most competitive niches, and it’s a fact that you can use to build entirely natural links with ad exposure.

Pay for tools – While fully automated link building tools are an awful idea, tools like Followerwonk can make link building outreach much more efficient and effective. Reporting tools like Advanced Web Ranking make it simpler to track and learn from your campaigns, and tools like KISSmetrics can teach us about our individual customers’ behaviour. It’s very difficult to do any real optimization without tools in your arsenal.

SEO is business. We need to speak the language of ROI, and think about more innovative and productive ways to spend money, if we want to be taken seriously.

3. It’s Worth Taking Advantage of What Works Today

White hat SEOs are playing the long game. They’re invested in strategies that will continue to work for years and years, because they don’t want to throw their clients under the bus and lose their reputation overnight. This is the only intelligent way to run an SEO agency.

And yet, it’s clear that some black hats can make a lot of money very quickly by taking advantage of loopholes in the algorithm. Sites can rank for absurdly competitive terms like “car insurance” in 3 days using links from hacked websites. They can rank for terms with 40k monthly searches in 4 days using private link networks.
And let’s all face facts: everyone likes to make money now, not later. So is there something we can learn from the cheaters?

Long-term strategy is crucial, but it shouldn’t live in isolation.

When there’s an opportunity to make money today, you should take advantage of it, as long as it doesn’t compromise the future of your brand. There is absolutely nothing wrong with taking advantage of the way Google’s algorithm works today, as long as you can defend what you are doing as legitimate marketing, and as long as you are reinvesting the income in strategies that will continue to work for the long haul.


While it can be helpful for SEO agencies to distance themselves from spammers, it can also become dangerous if it limits your thinking. Ethics are crucial for the success of your business, but they shouldn’t be used as an excuse to close your ears and cover your eyes. Open minds are a must if you want to compete in this evolving market.

Wednesday, May 30, 2012

Google Updates "Can Competitors Harm Ranking" Statement
Vinay Upadhyay 6:44 AM 5 comments

May 29, 2012 • 8:44 am

The time stamp on the Google document that answers the question Can competitors harm ranking? was updated on May 22nd. Although I swear it was updated a couple months ago and I thought I covered it (maybe I was dreaming, prophecy ;-) ).
Now it reads:
Google works hard to prevent other webmasters from being able to harm your ranking or have your site removed from our index. If you're concerned about another site linking to yours, we suggest contacting the webmaster of the site in question. Google aggregates and organizes information published on the web; we don't control the content of these pages.
Before it read:
There's almost nothing a competitor can do to harm your ranking or have your site removed from our index. If you're concerned about another site linking to yours, we suggest contacting the webmaster of the site in question. Google aggregates and organizes information published on the web; we don't control the content of these pages.
Google: Can Competitors Harm Ranking
There are several threads saying that this change means that negative SEO is possible and that Google admits it by changing this statement. We have threads at WebmasterWorld, TrafficPlanet and Google Webmaster Help.
In fact, the Google Webmaster Help thread was from April 18th, so I am not sure what was changed on May 22nd, because the content copied and pasted in that thread is exactly what I see there. I am thinking maybe the video above the content was added, since Google published that video last week.
What do you make of this statement change, which again happened at least as far back as April 18th? Update: it was around March 14th, as Hobo-Web had it on their blog then.
Forum discussion at WebmasterWorld, TrafficPlanet and Google Webmaster Help.

Tuesday, May 29, 2012

65% Of SEOs Hurt By Google's Penguin Update
Vinay Upadhyay 2:02 AM 1 comment

May 28, 2012 • 8:35 am

About a month ago, we polled our readers asking how they were impacted by the Google Penguin update.
We have well over 1,000 responses and I wanted to share them with you. Keep in mind, those who were negatively impacted are probably more likely to take the poll. That being said, 65% said they were negatively impacted by Penguin, while only 13% said they were positively impacted.
Penguin Poll Results
This is far more than the Panda update, where only 40% said they were negatively impacted. Again, it depends on who takes the poll, but the differences are huge. Note: Panda was a larger update and should have been felt by more sites on the web. But Penguin was likely targeting specific SEO techniques, because it was originally named the over-optimization penalty.
This post was pre-written and scheduled to be posted today.

Monday, May 28, 2012

Official: Google Penguin 1.1 Now Live
Vinay Upadhyay 1:47 AM 0 comments

May 26, 2012 For the past few weeks, we have been reporting on speculation in the forums that Google released a Penguin refresh. The Penguin 1.1 update is now officially live.
All of those speculative posts were downright wrong, according to Google. But Google's Matt Cutts has tweeted that a Penguin update is now live - the moment many of you have been waiting for.
Minor weather report: We pushed 1st Penguin algo data refresh an hour ago. Affects <0.1% of English searches.
Personally, I think all those times we reported on Penguin updates - I think those were live tests of what we see now. I think Google was testing this, and some webmasters noticed that their sites either recovered or were newly hit. Of course, there are many who were under the false impression that they were hit by Penguin. But for the most part, I believe these were live tests. I can easily be wrong.
Those updates were on or about May 13th, May 15th and last night on May 24th.
So now we have it - I hope those who were hit by the original Penguin update on April 24th have recovered.
Forum discussion continued at WebmasterWorld.

Saturday, May 26, 2012

New Google Penguin Update Rumors: Penguin 1.1
Vinay Upadhyay 1:50 AM 1 comment

May 25, 2012 • 8:26 am
Tedster, the administrator at WebmasterWorld, started a thread at WebmasterWorld asking if others feel that Google pushed out or is pushing out an update to the Penguin algorithm.
He believes they are due to the increase in discussion in the various forum threads of people claiming recoveries and also new people claiming they were hit. That is the same metric we use to detect and report on possible Google updates - but this time, I am citing Tedster.
We thought we saw Penguin updates before, once around May 13th and then again around May 15th but Google said no, it wasn't Penguin, Panda or anything else.
Is this the real Penguin update? Is this version 1.1 of Penguin?
I do not know - did you recover?
Forum discussion at WebmasterWorld.

Friday, May 25, 2012

Google Penalties for “Over Optimization” in the Works
Vinay Upadhyay 6:43 AM 0 comments

Will the next Panda update from Google include a penalty for over optimization? That seemed to be the message from Matt Cutts, the head of Google’s webspam team in a panel discussion at the South by Southwest Conference in Austin, Texas this weekend.
At a panel discussion titled “Dear Google & Bing: Help Me Rank Better!”, Cutts hinted that a penalty for “too much SEO” is in the works and could be implemented within a few weeks or months. The idea behind the penalty is to give sites with great content an edge over sites that are merely good at optimization.
Google’s Oracle
Mention of the new penalties came in response to a question about how mom and pop sites could compete with companies spending thousands of dollar on SEO. Cutts responded by saying that in a perfect world, webmasters wouldn’t need SEO.
He then went on to say that while he doesn’t normally pre-announce changes, Google engineers are indeed working on changes that will detect “too much SEO.”
How much is too much SEO? In true Google fashion, Cutts was sort of vague, but he did specifically mention sites that “use too many keywords” or “exchange way too many links.” What that means in terms of actual numbers is anybody’s guess.
Cutts has addressed the question of over-optimization in the past and has said that optimization is sometimes “a euphemism for ‘kind of spammy.’”
Not Really Surprising
Given the direction of previous Panda updates and other directives from Google, an over-optimization penalty isn’t really surprising. The search engine giant has been sounding a drum beat about quality content for months now.
Google’s ideal web is a place where the top ranked page is the page with the most relevant content. In that regard, Google seems more benevolent than oppressive, but that’s not how the SEO world is likely to see this latest twist from Google.
Still, when Cutts talks, the SEO world listens and when he mentions a penalty for over optimization chances are he’s not just shooting the breeze.
Do you think that a penalty for over optimization is a good idea? Share your opinion with us on our Search Engine Optimization Forum.