A response posted last week by Google engineer Matt Cutts about Google’s ranking policies opens the door to a whole new question
By Kalman Labovitz, Senior Analyst at RankAbove
Last week Google engineer Matt Cutts issued a response to the question of how Google decides that one iteration of a ranking algorithm delivers better quality results than another. (You can watch the video here.)
“Whenever an engineer is evaluating a new search quality change, and they want to know whether it’s an improvement, one thing that’s useful is we have hundreds of quality raters who have previously rated URLs as good or bad, spam, all these sorts of different things.
“So when you make a change, you can see the flux, you can see what moves up and what moves down…And you can say OK, given the changed search results, take the URLs that moved up. Were those URLs typically higher rated than the URLs that moved down by the search quality raters?”
He adds that the number of clicks one iteration’s search results receive over the other’s is another metric they use, while accounting for spammy sites that succeed in generating lots of clicks.
We’ve gathered that there seem to be three primary stages, or levels, of evaluating an algorithm:
(1) quality rater metrics
(2) live test metrics, e.g. a side-by-side comparison of algorithms
(3) final review by the search quality launch team
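The stage-1 comparison Cutts describes can be sketched roughly as follows. This is purely an illustrative guess at the logic, not Google’s actual implementation; the rankings and rater scores below are invented for the example.

```python
# Illustrative sketch (not Google's code): given an old and a new ranking
# for the same query, plus prior quality-rater scores for each URL,
# check whether the URLs that moved up were rated higher on average
# than the URLs that moved down.

def evaluate_change(old_rank, new_rank, rater_score):
    """Return (mean score of URLs that moved up, mean score of URLs that moved down)."""
    moved_up = [u for u in new_rank if new_rank.index(u) < old_rank.index(u)]
    moved_down = [u for u in new_rank if new_rank.index(u) > old_rank.index(u)]

    def avg(urls):
        return sum(rater_score[u] for u in urls) / len(urls) if urls else None

    return avg(moved_up), avg(moved_down)


# Hypothetical data: two rankings of the same four sites, with rater scores 0-5.
old = ["a.com", "b.com", "c.com", "d.com"]
new = ["b.com", "a.com", "d.com", "c.com"]
scores = {"a.com": 3.0, "b.com": 4.5, "c.com": 2.0, "d.com": 4.0}

up, down = evaluate_change(old, new, scores)
print(up, down)  # if up > down, the change looks like an improvement
```

In this toy example, the URLs that moved up average a higher rater score than the ones that moved down, so the hypothetical change would look like an improvement by this metric.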
These all sound fair, but they hinge on one particularly fateful question:
How often do the quality raters review and update their ratings?
We have yet to hear an answer to this question, which we’ve seen asked several times in response to Cutts’ video. If a website they flagged for spam undergoes serious changes and optimization, perhaps even a transfer of ownership, can it redeem itself? How and when can a website shed a flag assigned by the human raters?
Take this list of sites (under “Who lost in the Farmer update”) hit by Panda 1.0, the update rolled out in February 2011 that had a tremendous impact on websites notorious for low-quality content. How many of those sites recovered?
We found traffic numbers dating back to 2011 for several of the sites; some recovered and some didn’t, even three years later. For example, Hubpages.com recovered its traffic in July of 2011, but americantowns.com never bounced back.
One of the websites was even nice enough to share their story and analytics numbers with us. This was a huge website hit hard by the same update, mainly due to a glut of low quality content. The website underwent a pretty significant overhaul beginning soon after Panda, and yet years later, it’s not reaping the benefits of good SEO and high-quality content that it should be.
In this site’s case, we’re quite sure the quality rater metrics are responsible. And we’re tempted to draw the same conclusion when we come across other sites that haven’t recovered despite great effort and a great deal of time elapsed. Fortunately, we usually witness the opposite scenario; we’ve helped countless websites recover successfully from penalties, which is why stories like these strike a chord with us.
And it’s the stories like these that leave us asking:
Once flagged by the “quality raters,” what are your chances at [timely] atonement? Just how often do those raters re-evaluate previous ratings?
Photo Credit: Shutterstock/ SEO flow chart