Wednesday, May 30, 2012

Google Updates "Can Competitors Harm Ranking" Statement

May 29, 2012 • 8:44 am

The time stamp on the Google document that answers the question "Can competitors harm ranking?" was updated on May 22nd, although I could swear it was updated a couple of months ago and that I covered it then (maybe I was dreaming, or prophesying ;-) ).
Now it reads:
Google works hard to prevent other webmasters from being able to harm your ranking or have your site removed from our index. If you're concerned about another site linking to yours, we suggest contacting the webmaster of the site in question. Google aggregates and organizes information published on the web; we don't control the content of these pages.
Before it read:
There's almost nothing a competitor can do to harm your ranking or have your site removed from our index. If you're concerned about another site linking to yours, we suggest contacting the webmaster of the site in question. Google aggregates and organizes information published on the web; we don't control the content of these pages.
Google: Can Competitors Harm Ranking
There are several threads saying that this change means negative SEO is possible and that Google admitted it by changing this statement. We have threads at WebmasterWorld, TrafficPlanet and Google Webmaster Help.
In fact, the Google Webmaster Help thread was from April 18th, so I am not sure what was changed on May 22nd, because the content copied and pasted in that thread is exactly what I see there. I am thinking maybe the video above the content was added, since Google published that video last week.
What do you make of this statement change, which again happened at least as far back as April 18th? Update: it was around March 14th, as Hobo-Web quoted the new wording on their blog then.
Forum discussion at WebmasterWorld, TrafficPlanet and Google Webmaster Help.


Tuesday, May 29, 2012

65% Of SEOs Hurt By Google's Penguin Update

May 28, 2012 • 8:35 am

About a month ago, we polled our readers asking how they were impacted by the Google Penguin update.
We have well over 1,000 responses and I wanted to share them with you. Keep in mind, those who were negatively impacted are probably more likely to take the poll. That being said, 65% said they were negatively impacted by Penguin, while only 13% said they were positively impacted.
Penguin Poll Results
This is way more than the Panda update, where only 40% said they were negatively impacted. Again, it depends on who takes the poll, but the difference is huge. Note that Panda was a larger update and should have been felt by more sites on the web, whereas Penguin likely targeted specific SEO techniques, since it was originally named the over-optimization penalty.
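For context, those shares are simple arithmetic over the response counts. Here is a minimal Python sketch; the tallies are hypothetical, chosen only to be consistent with the reported percentages, since the exact counts weren't published here:

    # Hypothetical tallies (assumed for illustration): the poll drew
    # "well over 1,000" responses, 65% negative and 13% positive.
    responses = {
        "negatively impacted": 650,
        "positively impacted": 130,
        "no real change / unsure": 220,
    }
    total = sum(responses.values())
    for answer, count in responses.items():
        print(f"{answer}: {count}/{total} = {count / total:.0%}")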
This post was pre-written and scheduled to be posted today.


Monday, May 28, 2012

Official: Google Penguin 1.1 Now Live

May 26, 2012
For the past few weeks, we have been reporting on speculation in the forums that Google released a Penguin refresh. The Penguin 1.1 update is now officially live.
All of those speculative posts were downright wrong, according to Google. But Google's Matt Cutts has tweeted that a Penguin update is now live - the moment many of you have been waiting for.
Minor weather report: We pushed 1st Penguin algo data refresh an hour ago. Affects <0.1% of English searches.
Personally, I think all those times we reported on Penguin updates, we were seeing live tests of what launched now. Google was likely testing this, and some webmasters noticed that their sites either recovered or were newly hit. Of course, many were under the false impression that they were hit by Penguin. But for the most part, I believe these were live tests. I can easily be wrong.
Those updates were on or about May 13th, May 15th and last night on May 24th.
So now we have it - I hope those who were hit by the original Penguin update on April 24th have recovered.
Forum discussion continued at WebmasterWorld.


Saturday, May 26, 2012

New Google Penguin Update Rumors: Penguin 1.1


May 25, 2012 • 8:26 am
Tedster, the administrator at WebmasterWorld, started a thread asking if others feel that Google pushed out, or is pushing out, an update to the Penguin algorithm.
He believes they have, due to the increase in discussion across various forum threads from people claiming recoveries and from new people claiming they were hit. That is the same metric we use to detect and report on possible Google updates - but this time, I am citing Tedster.
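That "chatter spike" metric is easy to formalize. Here is a minimal sketch, assuming you have daily counts of new forum posts reporting ranking swings; the two-standard-deviation threshold is my own assumption, not anyone's published method:

    from statistics import mean, stdev

    # Assumed sample data: new "my rankings changed" forum posts per day.
    daily_posts = [12, 9, 14, 11, 10, 13, 12, 48]

    baseline = daily_posts[:-1]
    threshold = mean(baseline) + 2 * stdev(baseline)
    if daily_posts[-1] > threshold:
        print(f"Chatter spike: {daily_posts[-1]} posts vs. threshold {threshold:.1f}")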
We thought we saw Penguin updates before, once around May 13th and then again around May 15th, but Google said no - it wasn't Penguin, Panda or anything else.
Is this the real Penguin update? Is this version 1.1 of Penguin?
I do not know - did you recover?
Forum discussion at WebmasterWorld.


Friday, May 25, 2012

Google Penalties for “Over Optimization” in the Works


Will the next Panda update from Google include a penalty for over optimization? That seemed to be the message from Matt Cutts, the head of Google’s webspam team, in a panel discussion at the South by Southwest Conference in Austin, Texas this weekend.
At a panel discussion titled “Dear Google & Bing: Help Me Rank Better!”, Cutts hinted that a penalty for “too much SEO” is in the works and could be implemented within a few weeks or months. The idea behind the penalty is to give sites with great content an edge over sites that are merely good at optimization.
Google’s Oracle
Mention of the new penalties came in response to a question about how mom and pop sites could compete with companies spending thousands of dollars on SEO. Cutts responded by saying that in a perfect world, webmasters wouldn’t need SEO.
He then went on to say that while he doesn’t normally pre-announce changes, Google engineers are indeed working on changes that will detect “too much SEO.”
How much is too much SEO? In true Google fashion, he was sort of vague, but he did specifically mention sites that “use too many keywords” or “exchange way too many links.” What that means in terms of actual numbers is anybody’s guess.
Cutts has addressed the question of over-optimization in the past and has said that optimization is sometimes “a euphemism for ‘kind of spammy.’”
Not Really Surprising
Given the direction of previous Panda updates and other directives from Google, an over-optimization penalty isn’t really surprising. The search engine giant has been sounding a drum beat about quality content for months now.
Google’s ideal web is a place where the top ranked page is the page with the most relevant content. In that regard, Google seems more benevolent than oppressive, but that’s not how the SEO world is likely to see this latest twist from Google.
Still, when Cutts talks, the SEO world listens and when he mentions a penalty for over optimization chances are he’s not just shooting the breeze.
Do you think that a penalty for over optimization is a good idea? Share your opinion with us on our Search Engine Optimization Forum.


Friday, May 18, 2012

Google’s Penguin Update Makes The Wall Street Journal


The Google Penguin Update is now mainstream after The Wall Street Journal covered it in a feature story titled “As Google Tweaks Searches, Some Get Lost in the Web.”
The story interviews a few small business owners who were hit hard by the update. One business owner saw his sales drop to $25,000 this month, down from $68,000 the previous month. Another small website owner saw roughly 30% of his traffic disappear overnight. And another lost 20% of their traffic. Most of the article goes through small business owners whose lives have changed for the worse due to this update. But there are some stories that lead me to believe they are not directly related to Penguin.
Again, there were several updates last month – Penguin happened on the 24th, but there were also two Panda refreshes, link network penalties, bugs and many more updates.
Even if all those cases were not Penguin related, Penguin is now mainstream after hitting a major publication like the Wall Street Journal.
You can read the story over here.


Two Weeks In, Google Talks Penguin Update, Ways To Recover & Negative SEO


It’s been about two weeks since Google launched its Penguin Update. Google’s happy the new spam-fighting algorithm is improving things as intended. But some hurt by it are still wondering how to recover, and there remain concerns about “negative SEO” as a threat. I caught up with Matt Cutts, the head of Google’s web spam team, on these and some related questions.

Penguin: “A Success”

The goal of any algorithm update is to improve search results. So how’s Penguin been for Google?
“It’s been a success from our standpoint,” Cutts said.

What About Those Weird Results?

Of course, soon after Penguin was released, people quickly started citing examples of odd results. The official Viagra site wasn’t listed, while hacked sites were. An empty web site was listed for “make money online,” and there were reports of other empty sites ranking well. Scraper sites were reported outranking the sites they scraped.
How could Penguin be a success with these types of things happening?
Cutts said that many of these issues existed before Penguin launched and were not caused by the new spam-fighting algorithm.
Indeed, the Viagra issue, which has now been fixed, was a problem before Penguin hit. Penguin didn’t cause it.

False Positives? A Few Cases

How about false positives, people who feel they’ve been unfairly hit by Penguin when they weren’t doing any spam?
“We’ve seen a few cases where we might want to investigate more, but this change hasn’t had the same impact as Panda or Florida,” Cutts said.
The Panda Update was Google’s big update last year that targeted low-quality content. The Florida Update was a major Google update in 2003 intended to improve its search quality.
I’d agree that both of those seemed to have impacted more sites than Penguin has, based on having watched reactions to all these updates. Not everyone will agree with me, of course. It’s also worth the regular reminder that for any site that “lost” in the rankings, someone gained. You rarely hear from those who gain.
Bottom line, Google seems pretty confident that the Penguin Update is indeed catching people who were spamming, as was intended.

Why Spam Still Gets Through

Certainly when I’ve looked into reports, I’ve often found spam at the core of why someone dropped. But if Penguin is working, why are some sites that are clearly spamming still getting through?
“No algorithm is perfect. While we’d like to achieve perfection, our litmus test is, ‘Do things get better than before?’,” Cutts said.
Cutts also explained that Penguin was designed to be quite precise, acting against pages only when there was extremely high confidence that spam was involved. The downside is that some spam might get through, but the upside is that you have fewer false positives.
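That precision-versus-recall trade-off is easy to illustrate. In this toy Python sketch (the confidence scores and thresholds are invented), raising a classifier's threshold cuts false positives at the cost of letting some spam through:

    # Toy data: (confidence score, actually_spam) pairs for five pages.
    pages = [(0.95, True), (0.90, True), (0.70, True), (0.60, False), (0.30, False)]

    for threshold in (0.5, 0.85):
        flagged = [(score, spam) for score, spam in pages if score >= threshold]
        caught = sum(1 for _, spam in flagged if spam)
        false_positives = sum(1 for _, spam in flagged if not spam)
        print(f"threshold {threshold}: {caught} spam caught, {false_positives} false positives")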

How Can You Recover?

One of the most difficult things with this update is telling people how to recover. Anyone hit by Penguin was deemed to be spamming Google.
In the past, if you spammed Google, you were told to file a reconsideration request. However, Google’s specifically said that reconsideration requests won’t help those hit by Penguin. They’ll recover naturally, Google says, if they clean the spam up.
However, one of the main issues I’ve seen when looking at sites hit by Penguin is bad linking practices. People have used sponsored WordPress themes, engaged in poor-quality reciprocal linking, purchased links or participated in link networks, such as those recently targeted by Google.
How do people pull themselves out of these link networks, if perhaps they don’t have control over those links now?
“It is possible to clean things up,” Cutts said, and he suggested people review two videos he’s done on this topic:
“The bottom line is, try to resolve what you can,” Cutts said.

Waiting On Penguin To Update Again

If you do clean things up, how will you know? Ideally, you’ll see your traffic from Google recover, the next time Penguin is updated.
That leads to another important point. Penguin, like Panda, is a filter that gets refreshed from time to time. Penguin is not constantly running; rather, it is applied periodically to tag things as spam above and beyond Google’s regular spam filtering.
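To make the distinction concrete, here is a toy sketch of a periodically refreshed filter. Every name and rule in it is an assumption for illustration, not Google's actual implementation; the point is only that cleanup between refreshes changes nothing until the next batch run:

    # Toy model: a batch refresh stores spam tags; queries between refreshes
    # see the stored tags, not a live evaluation of the page.
    spam_tags = {}

    def looks_spammy(page):
        # Stand-in rule (assumed): a mostly-paid link profile means spam.
        return page["paid_links"] > 0.8 * page["total_links"]

    def penguin_refresh(pages):
        for page in pages:
            spam_tags[page["url"]] = looks_spammy(page)

    pages = [{"url": "example.com", "paid_links": 90, "total_links": 100}]
    penguin_refresh(pages)
    pages[0]["paid_links"] = 5       # the site cleans up its links...
    print(spam_tags["example.com"])  # ...but stays tagged until the next refresh: True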
Is Penguin a site-wide penalty like Panda or page-specific? Cutts wouldn’t say. But given that Panda has site-wide impacts, I think it’s a fair assumption that Penguin works the same.
What that means is that if some of your site is deemed Penguin-like, all of it may suffer. Again, recovery means cleaning up the spam. If you’ve cleaned and still don’t recover, ultimately, you might need to start all over with a fresh site, Cutts said.

New Concerns Over Negative SEO

Before Penguin, talk of “negative SEO” had been ramping up. Since then, it seems to have gotten worse in some places. I’ve seen post-after-post making it sound as if anyone is now in serious danger that some competitor can harm them.
At the core of these fears seems to be a perfect storm of assumptions. Google recently targeted some linking schemes. That caused some people to lose traffic. Google also sent out warnings about sites with “artificial” or “unnatural” links. That generated further concerns in some quarters. Then the Penguin Update hit, which caused more people to lose traffic as they were either hit for link spam or no longer benefited from link spam that was wiped out.
These things made it ripe for people to assume that pointing bad links at a site can hurt it. But as I wrote before, negative SEO concerns aren’t new. They’ve been around for years. Despite this, we’ve not seen it become a major concern.
Google has said it’s difficult for others to harm a site, and that’s indeed seemed to be the case. In particular, pointing bad links at a good site with many other good signals seems to be like trying to infect it with a disease that it has antibodies to. The good stuff outweighs the bad.
Cutts stressed again that negative SEO is rare and hard. “We have done a huge amount of work to try to make sure one person can’t hurt another person,” he said.
Cutts also stressed again what Google said before. Most of those 700,000 messages to publishers that Google sent out earlier this year were not about bad link networks. Nor were they all suddenly sent on the same day. Rather, many sites have had both manual and algorithmic penalties attached to them over time that were never revealed. Google recently decided to open up about these.

After Negative SEO Campaign, A Link Warning

Of course, new messages do go out, which leads to the case of Dan Thies. His site was targeted by some trying to show that negative SEO works. He received an unnatural link warning after this happened. He also lost some rankings. Is this the proof that negative SEO really works?
Thies told me that his lost rankings were likely due to changes he made himself, when he removed a link across all pages on his site that led back to his home page. After restoring that, he told me, he regained his rankings.
His overall traffic, he said, never got worse. That tends to go against the concerns that negative SEO is a lurking threat, because if it had worked enough to tag his site as part of the Penguin Update, he should have seen a huge drop.
Still, what about the link warning? Thies did believe that came because of the negative SEO attempt. That’s scary stuff. He also said he filed three reconsideration requests, each of which returned messages saying that no spam actions were found. Was he hit with a warning that wasn’t also associated with a penalty?
I asked Cutts about the case, but he declined to comment on Thies’s particular situation. He did say that typically a link warning is a precursor to a ranking drop. If the site fixes the problem and does a reconsideration request quickly enough, that might prevent a drop.

Solving The Concerns

I expect we’ll continue to see discussions of negative SEO, with a strong belief by some that it’s a major concern for anyone. I was involved in one discussion over at SEO Book about this that’s well worth a read.
When it’s cheaper to buy links than ever, it’s easy to see why there are concerns. Stories like what happened to Thies or this person, who got a warning after 24,000 links appeared pointing at his site in one day, are worrisome.
Then again, the person’s warning came after he apparently dropped in rankings because of Penguin. So did these negative SEO links actually cause the drop, or was it something else? As is common, it’s hard to tell, because the actual site isn’t provided.
To further confuse matters, some who lost traffic because of Penguin might not be victims of a penalty at all. Rather, Google may have stopped allowing some links to pass credit, if they were deemed to be part of some attempt to just manipulate rankings. If sites were heavily dependent on these artificial links, they’d see a drop just because the link credit was pulled, not because they were hit with a penalty.
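A toy scoring sketch, with entirely made-up numbers, shows why the two mechanisms look identical from the outside: pulling credit from artificial links drops a dependent site's score even when no penalty is applied at all:

    # Illustrative only: each link contributes credit; a penalty subtracts on top.
    links = [(1.0, False), (1.0, False), (3.0, True), (3.0, True)]  # (value, artificial?)

    def score(links, penalty=0.0):
        return sum(v for v, artificial in links if not artificial) - penalty

    before = sum(v for v, _ in links)  # artificial links still counted: 8.0
    credit_pulled = score(links)       # credit removed, no penalty: 2.0
    penalized = score(links, 1.5)      # explicit penalty on top: 0.5
    print(before, credit_pulled, penalized)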
I’ve seen a number of people now publicly wishing for a way to “disavow” links pointing at them. Google had no comment about adding such a feature at this time, when I asked about this. I certainly wouldn’t wait around for it now, if you know you were hit by Penguin. I’d do what you can to clean things up.
One good suggestion out of the SEO Book discussion was that Google not penalize sites for bad links pointing at them. Ignore the links, don’t let the links pass credit, but don’t penalize the site. That’s an excellent suggestion for defusing negative SEO concerns, I’d say.
I’d also stress again that from what I’ve seen, negative SEO isn’t really what most hit by Penguin should probably be concerned about. It seems far more likely they were hit by spam they were somehow actively involved in, rather than something a competitor did.

Recovering From Penguin

Our Google Penguin Update Recovery Tips & Advice post from two weeks ago gave some initial advice about dealing with Penguin, and that still holds up. In summary, if you know that you were hit by Penguin (because your traffic dropped on April 24):
  • Clean up on-page spam you know you’ve done
  • Clean up bad links you know you’ve been involved with, as best you can
  • Wait for news of a future Penguin Update and see if you recover after it happens
  • If it doesn’t, try further cleaning or consider starting over with a fresh site
  • If you really believe you were a false positive, file a report as explained here
Just in, by the way, a list of WordPress plug-ins that apparently insert hidden links. If you use some of these, and they have inserted hidden links, that could have caused a penalty.
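If you want to audit your own theme or plug-ins for that, one rough check is to scan your rendered pages for invisibly styled anchors. Here is a minimal sketch using BeautifulSoup; note it only catches inline styles, so hiding via CSS classes or JavaScript would need a rendered-DOM check:

    from bs4 import BeautifulSoup

    def find_hidden_links(html):
        """Return hrefs of anchors hidden via inline CSS."""
        soup = BeautifulSoup(html, "html.parser")
        hidden = []
        for a in soup.find_all("a", href=True):
            style = (a.get("style") or "").replace(" ", "").lower()
            if "display:none" in style or "visibility:hidden" in style:
                hidden.append(a["href"])
        return hidden

    sample = '<p>Theme footer<a href="http://spam.example" style="display: none">x</a></p>'
    print(find_hidden_links(sample))  # ['http://spam.example']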
I’d also say again, take a hard look at your own site. When I’ve looked at sites, it’s painfully easy to find bad link networks they’ve been part of. That doesn’t mean that there’s not spam that’s getting past Penguin. But complaining about what wasn’t caught isn’t a solution to improving your own situation, if you were hit.


What White Hat SEOs Can Learn From Black Hats

The phrases “white hat” and “black hat” are loaded guns, and we only use them because they’re so ubiquitous. The reality is, when you tell yourself you are a “white hat,” you can end up fooling yourself into thinking that your strategy will always work and that Google will never turn it against you. Worse still, you can close your brain off to insights that could dramatically improve business results.
Don’t misunderstand us. Ethics are vital. If you don’t currently realize why it’s absolutely vital for SEO to be transparent and ethical in the years going forward, take a look at what we wrote over at Search Engine Journal. (Hint: the algorithm is only a very small part of why ethics matter.)
But there’s a distinction between ethics and restrictive labels, and if you aren’t learning anything from “black hats,” you’re probably missing some key insights, like these:
1. Testing is Always Better than Blind Faith

Before you head directly to the comment section and write a rage-fueled rant, let me point out that these are generalized statements. They don’t apply to every single “white hat” or “black hat” out there. But here we go:

White hats are less likely to test things than black hats.

This is a regrettable reality of our industry. While there are plenty of very good number crunchers on the “inbound” side of SEO, like, say, Dr. Pete, your average white hat SEO is less likely to put things to the test than your average black hat SEO. There are a couple of reasons for this:

  • Black hats can test some theories much, much quicker than white hats, because they can use automated programs and build controlled experiments that aren’t practical with white hat methods
  • A large portion of white hats are “reformed” black hats who couldn’t stomach tests that kept getting them penalized, and have decided to simply follow the advice of industry professionals instead
  • Some mistake white hat SEO for doing precisely what Google suggests, and therefore don’t bother testing anything
Again, I’m not saying these statements are true for all, or even most, white hat SEOs. I’m simply saying that more white hats are guilty of this particular sin than black hats.
Things don’t have to be this way.
As we’ve said before, it’s a bit ironic to put the word “optimizer” in your title if you aren’t doing any genuine testing for optimization. Even the worst conversion rate optimizers realize this. It’s strange how few SEOs (on either side of the fence) actually test their favorite ideas about the algorithm, or run the numbers to see how well their cherished methods and strategies are playing out.
We recently wrote an in-depth guide for KISSmetrics on SEO testing. Here are a few of the takeaways from that post:
  • You can test quirks of the algorithm by tweaking single variables and measuring how they impact traffic
  • You can put SEO strategies to the test on “real world” sites by running two distinct content strategies at the same time and measuring which content group picks up the most lifetime value (note that lifetime value does not equal number of visits, subscribers, etc.)
  • You can use traditional split testing to find out which types of pages are most likely to pick up natural links, or links from outreach
We are living in the age of big data. There’s just no excuse to leave money on the table by relying on assumptions instead of hard data. Intuition is vital, but it’s most useful when you are also putting it to the test.
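As a concrete instance of "running the numbers" from those takeaways, here is a minimal two-proportion z-test in plain Python for comparing how often two page types earn a natural link. The figures are invented, and a real test should also plan sample sizes up front:

    from math import sqrt, erf

    def two_proportion_z(hits_a, n_a, hits_b, n_b):
        """Two-sided z-test for a difference between two proportions."""
        p_a, p_b = hits_a / n_a, hits_b / n_b
        pooled = (hits_a + hits_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_a - p_b) / se
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
        return z, p_value

    # Invented data: 40 of 500 in-depth guides earned a link vs. 18 of 480 news posts.
    z, p = two_proportion_z(40, 500, 18, 480)
    print(f"z = {z:.2f}, p = {p:.4f}")  # significant at the usual 0.05 level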

2. It’s Okay to Spend Money to Make Money

As we all know, black hat SEOs have no qualms about spending money to make money. They will buy links, pay for inclusion in networks, pay for automated link-building tools, buy multiple-IP hosting, and buy sites to set up their own private blog networks.
As all white hat SEOs already understand, these tactics aren’t worth investing in if you care about long-term results. For the black hats who know how to do it, these methods can make a quick buck, but they are a far cry from the brand building that legitimate businesses need to survive. Sites that rank using these types of tactics are short-lived at best, and eventually get struck down by algorithm updates, manual reviews, or user spam reports.
So, what can we possibly learn from black hats on this topic?
It’s a basic lesson that marketers in every other field understand quite well: it’s okay to pay for results. Marketers buy ad space on TV networks, they pay per click, they hire talent, and they invest. And there absolutely are white hat SEOs who realize just how incredible the results can be when you have money to invest.
Unfortunately, the whole “don’t buy links” mentality has actually hurt our ability to think of SEO as a “put money in and get money out” field of marketing.
We can even learn direct lessons from some of these black hat tactics:
Buying links – While we can’t outright buy links, or even offer “free products and services” in exchange for links, it’s perfectly fine to hire talent from people with influence on the web. The over-emphasis on guest posts and link-begging has led some of us to believe that you just can’t offer money to people when you’re trying to establish an online presence. That’s a terrible way of looking at things. When you hire microcelebrities, influential bloggers, well-known photographers, and so on, you will attract traffic, and you will earn links. You just need to be willing to hire people who always earn natural links, no matter what they do. It’s that simple. Not to mention the fact that buying no-follow links for the referral traffic is perfectly fine, and seriously underrated.
Private blog networks – While setting up a private link network of sites that “pretend” not to be affiliated with you is an awful idea if you care about a long-term online presence, we can take a page from the basic approach. It’s perfectly legitimate to buy blogs, redirect them to folders or subdomains on your site, and, when possible, hire the blogger. This allows you to buy not just a link profile, but mindshare. Conglomerates understand the value of acquisitions. Why do so few SEOs?
Pay for inclusion in networks – Joining a link network, particularly a publicly advertised one, is an extremely bad idea for brands. But there’s nothing ethically wrong with buying visibility on networks. Advertorials (not to mention advertisements) are an incredible way to increase exposure when used properly. What many people don’t realize is that you can actually earn links by buying publicity. Traffic turns into links, and if the content is good, it turns into more, higher-value links. That’s how Google works outside of the most competitive niches, and it’s a fact you can use to build entirely natural links with ad exposure.
Pay for tools – While fully automated link-building tools are an awful idea, tools like Followerwonk can make link-building outreach much more efficient and effective. Reporting tools like AdvancedWebRanking make it simpler to track and learn from your campaigns, and tools like KISSmetrics can teach us about our individual customers’ behavior. It’s very difficult to do any real optimization without tools in your arsenal.
SEO is business. We need to speak the language of ROI, and think about more innovative and productive ways to spend money, if we want to be taken seriously.

3. It’s Worth Taking Advantage of What Works Today

White hat SEOs are playing the long game. They’re invested in strategies that will continue to work for years and years, because they don’t want to throw their clients under the bus and lose their reputation effectively overnight. This is the only intelligent way to run an SEO agency.
And yet, it’s clear that some black hats can make a lot of money very quickly by taking advantage of loopholes in the algorithm. Sites can rank for absurdly competitive terms like “car insurance” in 3 days using links from hacked websites. They can rank for terms with 40k monthly searches in 4 days using private link networks.
And let’s all face facts: everyone likes to make money now, not later. So is there something we can learn from the cheaters?
Long-term strategy is crucial, but it shouldn’t live in isolation.
When there’s an opportunity to make money today, you should take advantage of it, as long as it doesn’t compromise the future of your brand. There is absolutely nothing wrong with taking advantage of the way Google’s algorithm works today, as long as you can defend what you are doing as legitimate marketing, and as long as you are reinvesting the income in strategies that will continue to work for the long haul.

Conclusion


While it can be helpful for SEO agencies to distance themselves from spammers, it can also become dangerous if it limits your thinking. Ethics are crucial for the success of your business, but they shouldn’t be used as an excuse to close your ears and cover your eyes. Open minds are a must if you want to compete in this evolving market.


Thursday, May 17, 2012

Google Update Brewing? What Is Going On?


May 16, 2012 • 9:13 am

As of yesterday, the WebmasterWorld thread and some other forums started lighting up again with discussion and SEO chatter around Google's search results starting to shuffle and fluctuate.
These are typically signs of an update happening in Google. Maybe a Panda or Penguin refresh? Maybe a new Google bug? Maybe a new penalty or an old one being run again? Maybe Google is testing an algorithm or update? Or maybe it is something completely different or people are on crack?
I do not know but earlier this week, when I saw similar discussion around this, I asked Google if there was a Penguin refresh and they said no. It wasn't Penguin, Panda or anything else.
So do I go back to Google and bother asking about yesterday's increase in SEO chatter?
Did you notice ranking and traffic changes between Monday and today?
Forum discussion at WebmasterWorld.


Google Penguin 1.1 Update Underway? Google Says No.


May 14, 2012
There are some reports of major shuffling going on in the Google search results as of yesterday. I believe from the reports that it may be a Penguin update - but I do not have confirmation from Google on this yet.
Update: A Google spokesperson told me there was no Penguin refresh or update yet. Google said it was not a Panda refresh either and there is no update as far as they are concerned. Interesting... continuing story below...
There are WebmasterWorld threads and if you scan the Google Webmaster Help forums you will see tons of threads about ranking changes in the past 24 hours or so.
One of the WebmasterWorld threads has one guy claiming he recovered from being hit by Penguin. He said yesterday, "I had around 30 sites hit by Penguin on the 24/4, yesterday the first one resurfaced back to number 2 for it's keywords which is encouraging."
He even explained what he did to "recover":
What did I do - the site was just 15 pages, the inner pages were all thin content boiler plate stuff, so I deleted them all to see what would happen and left the home page which is 500 words of original content.
Links - did i touch incoming links, no I am going to try anchor text dilution on some other sites where I suspect this problem but did not create any more links on the recovered site.
Now, you have to be careful with these recovery reports because who knows if the site was truly hit by Penguin or by some weird penalty. He did say his traffic dropped on the 24th, which is a good sign of it being Penguin.
But there is a huge increase in reports of traffic and ranking drops yesterday on the discussion forums. The question is, is this related to Mother's Day or an algorithm change?
I will email Google for a statement on this.
Forum discussion at WebmasterWorld and Google Webmaster Help.


Tuesday, May 8, 2012

Google Announced 50+ Search Updates, Which Are Penguin Related?


In typical Google fashion, late on Friday, Google released its now-monthly update on the changes made to Google search over the past month. It is really great that Google does this; this time they shared 53 changes for April. Here is last month's update.
Below I grouped and listed the more important changes, at least the ones I find most important.
But let's try to see which items in this list are Penguin related. Can we even figure that out?

Penguin Related?

  • Anchors bug fix
  • Keyword stuffing classifier improvement
  • More authoritative results
  • Improvement in a freshness signal
  • No freshness boost for low-quality content
  • Improvements to how search terms are scored in ranking
If I had to guess, these, and maybe more, are all related to the Penguin update.
Here are some more that I find important but that aren't specifically related to Penguin, Panda or other named updates:

Ranking Changes:

  • Improvement in a freshness signal. [launch codename "citron", project codename "Freshness"] This change is a minor improvement to one of the freshness signals which helps to better identify fresh documents.
  • No freshness boost for low-quality content. [launch codename "NoRot", project codename "Freshness"] We have modified a classifier we use to promote fresh content to exclude fresh content identified as particularly low-quality.
  • Smoother ranking changes for fresh results. [launch codename "sep", project codename "Freshness"] We want to help you find the freshest results, particularly for searches with important new web content, such as breaking news topics. We try to promote content that appears to be fresh. This change applies a more granular classifier, leading to more nuanced changes in ranking based on freshness.
  • Improvements to how search terms are scored in ranking. [launch codename "Bi02sw41"] One of the most fundamental signals used in search is whether and how your search terms appear on the pages you're searching. This change improves the way those terms are scored.
  • Backend improvements in serving. [launch codename "Hedges", project codename "Benson"] We've rolled out some improvements to our serving systems making them less computationally expensive and massively simplifying code.
  • Keyword stuffing classifier improvement. [project codename "Spam"] We have classifiers designed to detect when a website is keyword stuffing. This change made the keyword stuffing classifier better.
  • More authoritative results. We've tweaked a signal we use to surface more authoritative content.

Link Analysis Changes:

  • Anchors bug fix. [launch codename "Organochloride", project codename "Anchors"] This change fixed a bug related to our handling of anchors.

Index Updates:

  • Increase base index size by 15%. [project codename "Indexing"] The base search index is our main index for serving search results and every query that comes into Google is matched against this index. This change increases the number of documents served by that index by 15%. *Note: We're constantly tuning the size of our different indexes and changes may not always appear in these blog posts.
  • New index tier. [launch codename "cantina", project codename "Indexing"] We keep our index in "tiers" where different documents are indexed at different rates depending on how relevant they are likely to be to users. This month we introduced an additional indexing tier to support continued comprehensiveness in search results.

Search Listings:

  • More domain diversity. [launch codename "Horde", project codename "Domain Crowding"] Sometimes search returns too many results from the same domain. This change helps surface content from a more diverse set of domains.
  • Categorize paginated documents. [launch codename "Xirtam3", project codename "CategorizePaginatedDocuments"] Sometimes, search results can be dominated by documents from a paginated series. This change helps surface more diverse results in such cases.
  • Country identification for webpages. [launch codename "sudoku"] Location is an important signal we use to surface content more relevant to a particular country. For a while we've had systems designed to detect when a website, subdomain, or directory is relevant to a set of countries. This change extends the granularity of those systems to the page level for sites that host user generated content, meaning that some pages on a particular site can be considered relevant to France, while others might be considered relevant to Spain.
  • Disable salience in snippets. [launch codename "DSS", project codename "Snippets"] This change updates our system for generating snippets to keep it consistent with other infrastructure improvements. It also simplifies and increases consistency in the snippet generation process.
  • More text from the beginning of the page in snippets. [launch codename "solar", project codename "Snippets"] This change makes it more likely we'll show text from the beginning of a page in snippets when that text is particularly relevant.
  • Tweak to trigger behavior for Instant Previews. This change narrows the trigger area for Instant Previews so that you won't see a preview until you hover and pause over the icon to the right of each search result. In the past the feature would trigger if you moused into a larger button area.
  • Better query interpretation. This launch helps us better interpret the likely intention of your search query as suggested by your last few searches.
  • News universal results serving improvements. [launch codename "inhale"] This change streamlines the serving of news results on Google by shifting to a more unified system architecture.
  • More efficient generation of alternative titles. [launch codename "HalfMarathon"] We use a variety of signals to generate titles in search results. This change makes the process more efficient, saving tremendous CPU resources without degrading quality.
  • More concise and/or informative titles. [launch codename "kebmo"] We look at a number of factors when deciding what to show for the title of a search result. This change means you'll find more informative titles and/or more concise titles with the same information.
  • "Sub-sitelinks" in expanded sitelinks. [launch codename "thanksgiving"] This improvement digs deeper into megasitelinks by showing sub-sitelinks instead of the normal snippet.
  • Better ranking of expanded sitelinks. [project codename "Megasitelinks"] This change improves the ranking of megasitelinks by providing a minimum score for the sitelink based on a score for the same URL used in general ranking.
  • Sitelinks data refresh. [launch codename "Saralee-76"] Sitelinks (the links that appear beneath some search results and link deeper into the site) are generated in part by an offline process that analyzes site structure and other data to determine the most relevant links to show users. We've recently updated the data through our offline process. These updates happen frequently (on the order of weeks).
  • Less snippet duplication in expanded sitelinks. [project codename "Megasitelinks"] We've adopted a new technique to reduce duplication in the snippets of expanded sitelinks.

Local Changes:

  • More local sites from organizations. [project codename "ImpOrgMap2"] This change makes it more likely you'll find an organization website from your country (e.g. mexico.cnn.com for Mexico rather than cnn.com).
  • Improvements to local navigational searches. [launch codename "onebar-l"] For searches that include location terms, e.g. [dunston mint seattle] or [Vaso Azzurro Restaurant 94043], we are more likely to rank the local navigational homepages in the top position, even in cases where the navigational page does not mention the location.
  • More comprehensive predictions for local queries. [project codename "Autocomplete"] This change improves the comprehensiveness of autocomplete predictions by expanding coverage for long-tail U.S. local search queries such as addresses or small businesses.

Images & Videos:

  • Improvements to SafeSearch for videos and images. [project codename "SafeSearch"] We've made improvements to our SafeSearch signals in videos and images mode, making it less likely you'll see adult content when you aren't looking for it.
  • Improved SafeSearch models. [launch codename "Squeezie", project codename "SafeSearch"] This change improves our classifier used to categorize pages for SafeSearch in 40+ languages.
Forum discussion at Google+, HighRankings Forum, DigitalPoint Forums and WebmasterWorld.
