
Google’s Fred Algo is a New Type of Quality Algo

The latest so-called “Fred” update has brought one thing to the forefront – SEOs can’t decide what Fred is and what specific tactic it is targeting.  Is it links?  Is it ads?  Is it ad heaviness?  Is it private blog networks, aka PBNs?

Many people are identifying symptoms of Fred, but I feel Fred is much broader in scope than simply "ad heavy" or "links," although those are clearly thrown into the mix. All signs point to Fred being a next generation quality algo that identifies the various aspects of a page or site that make it low quality, and then demotes it accordingly. In other words, it targets sites that were created to benefit the site owner and built for Google SEO purposes, but not so much the end user who ends up on one of those pages from the Google search results.

Fred Symptoms

Barry Schwartz was definitely on the right track, because he noticed some of the lower quality characteristics these sites had, but that doesn't explain all of the sites that were impacted by Fred. For example, many people have connected the update with links and the use of either paid links or a private blog network (PBN). But others were certain links were not the root of Fred traffic losses and that it was more about the quality side of things, such as bombarding visitors with ads.

Pretty much every algo piece we know about has been mentioned by at least one SEO in conjunction with Fred, including Panda, Penguin, the "above the fold" algo, Pirate, etc. But again, Fred doesn't seem to fit nicely into any of those, at least not when you look beyond a single site. When you look at a larger collection of Fred-impacted URLs, it seemingly makes no sense, at least on the surface.

Fred Through a Broader Lens

I think SEOs need to look at Fred through a broader lens. Seen that way, all of the impacted sites have one thing in common: no matter how well the site owner tries to disguise the site as something other than a vehicle for affiliate/ad revenue or links, they were primarily designed with Google in mind and not the users who might end up on them. In other words, those sites benefit Google and the site owner, but they don't benefit the average Google searcher who sees them in the search results.

Now, Google targeting low quality is nothing new; low quality content is why they created Google Panda. But the Fred update doesn't seem to target content alone. We are also seeing sites impacted because their primary purpose was to pass link juice to "money sites"... again, low quality sites for Google's benefit only, since they were really only used for linking out.

Fred also seems to be targeting the type of sites that would be rated low or lowest according to the Google Quality Rater Guidelines.

Fred & Ads

For those impacted by Fred, sure, the solution could be to remove ads, but that would likely only fix the subset of sites that matched the low quality markers Google targeted; after all, many low quality spammy sites tend to be loaded up with ads. But there are many higher quality sites that also load users up with ads and still rank fine, so simply being ad heavy isn't the root cause of the drop.

If you have a lot of ads and were hit by Fred, look at the site beyond the fact you have ads, and look at things like placement. For example, ads that disrupt content flow (i.e. an ad block every 1-2 paragraphs of content) can be seen as a low quality marker, since the goal is for people to click those ads instead of consuming the content, and this is a point the Google Quality Rater Guidelines make clear.

If your ads are not just obtrusive (such as all ads above the fold) but disruptive (such as ads every 1-2 paragraphs of content), that is definitely a sign of low quality in the guidelines.
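As a rough self-check for that pattern, here is a minimal sketch that estimates how often ad blocks interrupt the main content of a page. It assumes your ad units are identifiable by a CSS class such as "ad-block", which is purely hypothetical; substitute whatever your own templates actually use.

    # Rough self-audit sketch: how often do ad blocks interrupt the main content?
    # Assumes ad units are rendered in elements with a CSS class such as
    # "ad-block" (purely hypothetical - use whatever your own templates use).
    from bs4 import BeautifulSoup

    def ad_interruption_ratio(html, ad_class="ad-block"):
        """Return the number of ad blocks per paragraph of main content."""
        soup = BeautifulSoup(html, "html.parser")
        paragraphs = soup.find_all("p")
        ads = soup.find_all(class_=ad_class)
        if not paragraphs:
            return 0.0
        return len(ads) / len(paragraphs)

    if __name__ == "__main__":
        with open("page.html", encoding="utf-8") as f:
            ratio = ad_interruption_ratio(f.read())
        # A ratio approaching 0.5 means roughly one ad every two paragraphs,
        # the kind of disruptive layout the rater guidelines call out.
        print(f"Ad blocks per paragraph: {ratio:.2f}")

The exact number matters less than the trend: if ads are breaking up the content every paragraph or two, that is the disruption pattern described above.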

Fred & Affiliates

Likewise with affiliate ads. There are many high quality affiliate sites out there, and Google has said on numerous occasions that affiliate sites can rank well. But there are probably significantly more affiliate sites with low quality, duplicated, thin content that provides no value to a searcher who lands there.

So if you do have affiliate content and were hit by Fred, look at the characteristics that made Google see the entire page, beyond the fact it has an affiliate link, as low quality.

Fred & Links

Which brings us to links, another often-cited reason for Fred hits, as many (but not all) impacted sites were used for linking purposes, abused for either incoming or outgoing links. But yes, quality plays a role here too. Were those links for the benefit of the user? Or were they for the benefit of Google and SEO? Was the site strictly used as a link vehicle to point to another, more important money site? If the site and/or links were used for Google/SEO, then this is a problem.

Take a hard look at the links on and to the site and look for low quality patterns that Google can identify, particularly ones that match PBNs, both for the sites powering the money site and for the money site itself.

Fred & Content

Which brings us to content. In a lot of examples, the quality isn't the best. Some of it has clearly been put through a spinner, some better than others, because it reads a bit off. On affiliate-style sites that were hit, it is merely cookie cutter content taken straight from a datafeed, without any attempt to make it unique or to give it any "added value".

Speaking of added value, many of these sites lack something else Google looks for in the Quality Rater Guidelines: supplemental content, the things that aren't part of the main content but still bring value to the page. Many of the examples are clearly lacking what would make a page more valuable than a similar competitor's page.

Fred & SEO

Looking at Fred from an SEO perspective, it isn't clear if overoptimization is included or not, although I did see overoptimization as one of the targets of an earlier update this year. If that update was a precursor to what we are seeing today, then overoptimization could play a role.

And many low quality sites do match recognizable SEO patterns. For example, with PBNs you often see a very similar style of anchor text optimization, since their owners are careful not to use too many links with the same keyword, something that can actually look unnatural when you compare it with how anchor text is distributed on brand sites with many more links.
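To make that concrete, here is a minimal sketch of how you might eyeball your own anchor text distribution. It assumes you have a backlink export as a CSV with an "anchor" column, which is a hypothetical format; adapt the column name to whatever your link tool actually exports.

    # Rough sketch: check how anchor text is distributed across a backlink export.
    # Assumes a CSV with an "anchor" column - a hypothetical format; adapt the
    # column name to whatever your backlink tool actually exports.
    import csv
    from collections import Counter

    def anchor_distribution(path):
        """Return (anchor text, share of all backlinks) pairs, most common first."""
        counts = Counter()
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                counts[row["anchor"].strip().lower()] += 1
        total = sum(counts.values()) or 1
        return [(anchor, n / total) for anchor, n in counts.most_common()]

    if __name__ == "__main__":
        for anchor, share in anchor_distribution("backlinks.csv")[:10]:
            # A handful of exact-match keywords dominating the profile, or almost
            # no branded/URL anchors at all, is the kind of pattern worth reviewing.
            print(f"{share:6.1%}  {anchor}")

None of this tells you what Google's algorithm actually measures; it simply helps you see your own link profile the way an outside reviewer might.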

Low Quality Characteristics

Google seems to be getting smarter at identifying low quality characteristics algorithmically. These are all things that previously often had to be caught manually by Google, through manual actions for things like thin content, pure spam and links.

This also makes it harder for those sites to know what was wrong and how to fix it, since with manual actions the issue is usually fairly clear cut, with a path to be reindexed once the problems are fixed. But algorithmic suppressions are algorithmic, meaning that if Fred caused a site to tank in the search results, there is a much less clear picture of how to return. And a spammy site is still a spammy site if its sole purpose is to serve Google alone and not users.

Fred & Insights from Quality Rater Guidelines

The Google Quality Rater Guidelines are also invaluable for those trying to recover. Read up on the Low and Lowest quality sections, as well as Needs Met, and determine which of the characteristics Google describes match your own site. Be certain you aren't looking at your own site through rose-colored glasses... even high quality sites have room for improvement.

Final Thoughts

Fred seems to be all about site quality in its many forms, and shouldn't be pigeon-holed as just "link related" or "ad-heavy related". Because of that breadth, SEOs haven't had a very clear idea of what Fred is. When you look at it more closely, however, it is clear that this is a next generation quality algo from Google targeting sites that hold no benefit for a searcher landing on them.

The manual action aspect of this is also interesting, since Fred does seem to encompass some of the issues that were often handled by manual actions.

If you were hit by Fred, or are concerned about a similar algo or a Fred refresh (if such a thing is possible), critically analyze who the site benefits most: Google and the site owner, or the user who ends up on the page through a Google search? If it is the former, you likely need to increase the quality to make it an example of "a great content site" or "a great affiliate site".

And don't forget, Google can adjust or turn up the dial on any of these algos. So if you weren't hit this time, but you know your site has some of the low quality characteristics Fred seems to be targeting, taking the time to improve quality now would help Fred-proof (not to mention Panda-proof, Penguin-proof, etc.) your sites going forward.

But the future of Google and search is clear… create content and sites for users, not for Google.


Jennifer Slegg

Founder & Editor at The SEM Post
Jennifer Slegg is a longtime speaker and expert in search engine marketing, working in the industry for almost 20 years. When she isn't sitting at her desk writing and working, she can be found grabbing a latte at her local Starbucks or planning her next trip to Disneyland. She regularly speaks at Pubcon, SMX, State of Search, Brighton SEO and more, and has been presenting at conferences for over a decade.