
The Penguin Algorithm: An Issue of Ethics

On April 24th, 2012, Matt Cutts, the head of the Webspam team at Google, announced that Google would be releasing the first “Penguin” algorithm.

Although the focus these days is mostly on links, many people forget that Penguin was not released only to target link spam; it was also created to target “…sites that we believe are violating Google’s existing quality guidelines.” So, before it was named the Penguin update, it was referred to as the “over-optimization” algorithm.

(It is also important to note that no one from Google has stated it is no longer targeting over-optimization or quality signals.)

From Google’s announcement:

“To that end we’ve launched Panda changes that successfully returned higher-quality sites in search results. And earlier this year we launched a page layout algorithm that reduces rankings for sites that don’t make much content available ‘above the fold.’

In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines.”

(http://googlewebmastercentral.blogspot.com/2012/04/another-step-to-reward-high-quality.html)

Penguin Devastation.

The first Penguin release had fairly devastating results for many sites. Over 3.1% of all queries were affected, which may not sound like a lot, but given how many queries Google handles in a day it is actually quite significant. Unlike with past updates, sites that attempted to fix their issues did not seem to recover. True or not, it became a generally held belief that sites hit by Penguin 1.0 would never recover. In fact, to this day many sites have not, and the recovery rate from Penguin 1.0 was estimated to be as low as 1%.

Penguin Spreads.

While Penguin 1.0 was devastating to many sites, it initially targeted only the homepages of the sites affected. After Penguin 2.0 was released, all pages of a site could experience serious devaluations.

Here is a summary chart of the dates of Penguin updates and queries affected.

Penguin Releases (and the Impact)

  • Penguin 1.0 > April 24, 2012 (3.1% of queries)
    • Penguin 1.2 > May 26, 2012 (0.1% of queries)
    • Penguin 1.3 > Oct. 5, 2012 (0.3% of queries)
  • Penguin 2.0 > May 22, 2013 (2.3% of queries)
    • Penguin 2.1 > Oct. 4, 2013 (1% of queries)

Disavow Files.

With the release of Penguin came the first disavow files. These files were filled with links that site owners were reporting as “spammy” or bad. Though not confirmed, the common theory was that Google had run out of ways to detect link networks and the like, so it needed the data users could provide. It makes sense: if enough people reported the same sites, the same IP addresses or the same C classes, Google could more easily detect link networks that were otherwise invisible to it or out of reach of its toolset.
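For context, a disavow file is simply a plain text file uploaded through Google’s Disavow Links tool: one URL or “domain:” entry per line, with lines starting with “#” treated as comments. A minimal sketch, using placeholder domains in place of any real network:

    # Links we asked to have removed but could not (example entries only)
    domain:spammy-directory.example
    domain:paid-link-network.example
    http://blog.example/low-quality-guest-post.html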

Google announcements of the demise of yet another paid link source became commonplace.

Spam.

The focus of this update, as with most updates from Google, was of course webspam. Few people, even those who utilized the blackhat methods Google was targeting, could really fault Google for implementing an algorithm that would help with the takedown of link networks.

However, unlike the days before Penguin, when a link takedown just meant you lost link value to your site, Penguin would actually remove most or all of an offending site’s traffic by sending it plunging down the rankings. Number 1 yesterday, nowhere to be found today. That is how fast it happened.

For many this is also where the conflict begins. Were the Penguin updates too harsh?

Didn’t They Deserve It?

When someone criticizes the Penguin update as being too harsh, that thought is often met with the claim that sites that did “bad things” deserved the “bad things” that happened to them. A reap-what-you-sow kind of thinking. While that may or may not be true depending on how you view the relationship between SEO and Google, many of the sites affected had no idea they were doing “bad things” because the person they hired to do the work did the “bad things” without properly informing the client. So while some did know what they were doing and took that risk willingly, it is not as simple as “they deserved it”; many didn’t.

Enter Negative SEO.

In addition to the loss of position by website owners who purchased blackhat services, knowingly or unknowingly, came the issue of Negative SEO.

How Does Negative SEO Work?

Negative SEO is a technique used to take down a competitor’s site by directing bad links at it. When running Penguin updates, Google uses a percentage threshold of good to bad links to evaluate a site’s link profile. If a site has X% bad links, it loses position when the Penguin algorithm comes through; if it has Y% good links, it might even gain position. Since this percentage threshold could be used to effectively damage other sites, owners in competitive markets suddenly saw their sites inundated with bad links from porn, pharma or other spam link networks.
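To make that description concrete, here is a purely illustrative sketch of the ratio-threshold idea described above. The real signals, weights and threshold values Google uses are not public; BAD_LINK_THRESHOLD and the function below are hypothetical, invented only to show the mechanism:

    # Illustrative only: the real thresholds and signals are not public.
    BAD_LINK_THRESHOLD = 0.30  # hypothetical: 30%+ "bad" links triggers a demotion

    def evaluate_link_profile(bad_links, total_links):
        """Classify a link profile by the share of links judged 'bad'."""
        if total_links == 0:
            return "no data"
        bad_ratio = bad_links / total_links
        if bad_ratio >= BAD_LINK_THRESHOLD:
            return "demoted on the next Penguin run"
        return "unaffected (or may even gain position)"

    # e.g. 400 bad links out of 1,000 total crosses the hypothetical threshold
    print(evaluate_link_profile(400, 1000))

Under a model like this, a competitor only needs to push a site’s bad-link ratio past the threshold to trigger a devaluation, which is exactly what makes Negative SEO possible.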

Those Links Don’t Hurt.

At first, Google denied that these “negative” links could hurt your site. However, in-the-wild testing and a multitude of case studies have shown this is simply not the case. While Google would sometimes ignore the inbound Negative SEO spam links, it often would not, and the targeted site would receive a devaluation during the next Penguin update.

Recovery.

So Google released an imperfectly applied punishment for sites whose owners may or may not have known their link profiles were “bad”. Given the issues in how it was applied, as well as the resulting loss of rankings, SEOs and site owners started taking issue with this process. A business losing almost all of its site traffic overnight can have a very negative effect on a bottom line and on people’s lives.

Of course, Google often brushed this issue off with a no-worries attitude: you just have to wait for the next Penguin update to come through and you will get your rankings back, right? Well, first, no. Many sites lose so many links during the recovery process that the link profile needs to be rebuilt. However, there is a much larger issue than link loss at play.

Time Between Updates.

A Penguin update is the ONLY way to recover from a Penguin devaluation. If you have done enough cleanup work, Penguin might reward you with an uplift in traffic and the chance to recover your site. Of course, you might also find the algorithm did not consider your work sufficient, and now you have to wait for a second wave. Except when would it arrive?

No Updates In Sight.

The first Google Penguin release was April 24th, 2012. The next major update (Penguin 2.0) came on May 22nd, 2013. The update following that was Oct. 4th, 2013, with a few quick refreshes following each other in Oct. 2014 and then everflux from Nov. 2014 to Dec. 2014.

We can see a pattern here. The second major update came just over a year after the first. The next gap between major updates was roughly five months. The one after that was twelve to thirteen months. Now, it is almost June 2015 and we have again been waiting six months for a new update. During this time, sites affected by Penguin CANNOT recover.

Six months? Thirteen months? Just how many months does it take to put a company out of business?

“But aren’t there real-time updates now?”

No Real-Time Penguin.

Recently there was some confusion about whether the Penguin updates had become real-time or not. This confusion comes from a Google Webmaster Hangout with John Mueller where he stated that the algorithms were, in fact, running in real-time. This is not actually what was meant, however. Mariya Moeva of Google Russia clarifies this for us in a Google Hangout where she explains John’s statement.

“Without the updated data, even though these algorithms are built into the real time infrastructure, you won’t see any changes to your rankings around those specific algorithms.”

(https://www.seroundtable.com/google-panda-penguin-real-time-data-manual-20283.html)

So we do have real-time crawling, but updates (meaning potential recovery) can only happen when the algorithm is manually run with refreshed data.

Can’t you just start a new site?

Perhaps, but it is not that simple. When Google can tell a new site is the same site by the same owner, it will forward inbound link signals to the new site even when the old one is not redirected (it’s a “courtesy”). Since the first site was penalized for links, you can get a new site up and running only to find that the next time a Penguin update occurs, your new site is in the same position as the old one.

Given the devastation of a Penguin “penalty”, the way it is applied, how it can be used against you and the very long periods of time between updates, it might be time to ask: is this right? And to take it one step further, is it ethical?

The Ethics of Penguin.

Penguin is one of the most devastating algorithms to come out of Google. As mentioned, once affected, sites have to wait many months, maybe a year, for an opportunity to recover their visibility and traffic. During this time companies are often forced to lay off staff or even close their businesses.

This seems like a hefty price to pay for making a mistake, knowingly or not, or for having someone attack your website.

Now we have a point where the very real question of ethics comes into focus.

Evil Google?

The sheer time between updates, the fact that it hits sites whose owners were targeted or simply did not know, combined with how much traffic a site loses, makes it a lethal update.

What makes it worse is that there is really no reason for it. Before Penguin existed, you would simply lose your link value if you had a “spammy” link profile, and for sites Google found particularly egregious, a manual penalty would work nicely.

This practice had the same effect on the site as the Penguin “penalty”, with one very large difference: with a loss of links you could recover by simply building good links, and with a manual penalty you had someone to whom you could appeal. The goal was to remove website spam, but instead many legitimate businesses are losing their livelihoods.

Terminator Algorithms.

It is easy to understand why Google started the Penguin updates, and if Penguin could be run every 30 days like clockwork, most would likely have no issue with its presence. Even if a site could recover from the devaluation that quickly, it would still need to rebuild its link profile, and that would take more than a few months. This would still be a hard slap to the site owner, and Google would still get its disavow data.

However, this is not how it works. The collateral damage is high and often complete.

The Cost.

Of course Google should be able to update its algorithms to manage its search results. Few would argue that point. Google should be able to punish those who violate the guidelines; after all, it is their search engine. Heck, the thought of an impending Google update killing a site on the SEO vine is just the part of the game that those who utilize blackhat tactics love. Did they get away with it? Did Google figure it out? Knowing Google didn’t catch it definitely creates a bit of a rush. However, should Google be allowed to destroy businesses?

Penguin is the terminator of algorithms. Though Google has encouraged people for years to participate in the Internet economy, it keeps releasing site-killing algorithms. Site-killing algorithms from which many companies cannot recover.

My Mom.

Like many people, my mom lived on the money she made from her business. It supplemented her meager bit of Social Security and disability. She was not able to work regular jobs due to severe lung disease, but she managed to make quite a tidy sum selling belt buckles online. She was proud of this business. She deserved to be. Yet, what if she had made a mistake? What if she woke up one morning to no traffic because someone had targeted her business? Or a bad provider had bought links? What would have happened to her? She would not have survived financially. Thankfully we never had to find out, but she was lucky. So many others are not.

For a business owner, a Penguin update is no different than someone burning your business to the ground with no known time frame in which you can rebuild. Does Google need to remember that behind these websites they send to “Penguin Hell” are people like my mom who relied on their businesses to put food on their table, to pay their rent, to pay employees, to live? In its quest for dominance, has Google forgotten there are human faces with families and lives behind these websites that get caught up in the collateral damage of their algorithmic practices?

I think they have forgotten. Maybe they need a reminder.

(Note: I don’t do links.)

Kristine currently resides in Las Vegas, NV and owns SitesWithoutWalls.com and The Vetters, working with Dave Davies of Beanstalk SEO - both are full-service Internet-based consulting firms that focus on bringing the best in the business together, on a project-by-project basis, to help make sites better "By Making Them Work." Kristine has worked for sixteen years in the creation, development, implementation and maintenance of websites in all sectors including government, academia, entertainment and e-commerce, with a focus on usability, architecture, human factors, W3C, Section 508 and WCAG accessibility compliance, as well as additional specializations in SEO, ORM and Social Media. She works regularly on auditing sites for clients and assists them in recovering from traffic degradation, user conversion issues and Google penalties. During her career she has consulted on or implemented websites for entities such as SuperPages.com, USA.gov, AOL, The Department of Homeland Security, Reba McEntire and Ulla Popken, and has traveled to China with IBM and the UN to instruct Chinese province officials on W3C and WCAG standards.