The SEM Post

Latest News About SEO, SEM, PPC & Search Engines

Some Technical Site Problems Are Not a Ranking Problem in Google

August 12, 2015 at 4:19 am PST By Jennifer Slegg

While many SEOs place great emphasis on having technically flawless sites, the reality is that many website owners run sites with plenty of technical problems and issues. And many believe that a technically perfect site will outrank a competitor’s site with errors. In the latest Google Webmaster Hangout, John Mueller said that, in general, technical problems on a site are not a ranking problem.

The question: Would there be a small ranking benefit if we compared the same site once with a lot of 404s and soft 404s present, and other technical problems such as wrong hreflangs and no canonical tags, in comparison to the same site in “perfect” technical condition?

Here is his response:

Again, there are two aspects here. On the one hand, the crawling and indexing part, and on the other hand, the ranking part.

When we look at the ranking part and we essentially find all of these problems, then in general that’s not going to be a problem. Where you might see some effect is with the hreflang markup, because with the hreflang markup we can show the right pages in the search results. It’s not that those pages would rank better, but you’d have the right pages ranking in the search results.

With regards to 404s and soft 404s, those are all technical issues that any site can have, and that’s not something we would count against a website.

This also confirms what Gary Illyes said on Twitter earlier this week, that 404s do not cause a Google penalty.

On the other hand, for crawling and indexing, if you have all of these problems with your website: you have a complicated URL structure that’s really hard to crawl, it is really hard for us to figure out which version of these URLs we should be indexing, there’s no canonical; all of that kind of adds up and makes it harder for us to crawl and index these pages optimally.

So what might happen is we get stuck and crawl a lot of cruft and then not notice there’s some great new content that we’re missing out on.  So that’s something that could be happening there.

It’s not that we would count technical issues against a site when it comes to ranking but rather that these technical issues can cause technical problems that can result in things not being processed optimally.
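
For readers who want to see what Mueller’s hreflang and canonical references look like in practice, here is a minimal sketch (Python standard library only, with a placeholder URL) that fetches a page and reports its rel=canonical and hreflang alternate links. It is purely illustrative and is not anything Google itself uses.

from html.parser import HTMLParser
from urllib.request import urlopen

class LinkTagParser(HTMLParser):
    # Collects <link rel="canonical"> and <link rel="alternate" hreflang="..."> tags.
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.hreflang = []  # (language code, URL) pairs

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower()
        if rel == "canonical":
            self.canonical = attrs.get("href")
        elif rel == "alternate" and attrs.get("hreflang"):
            self.hreflang.append((attrs["hreflang"], attrs.get("href")))

def check_page(url):
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    parser = LinkTagParser()
    parser.feed(html)
    print("canonical:", parser.canonical or "MISSING")
    for lang, href in parser.hreflang:
        print("hreflang", lang, "->", href)

check_page("https://example.com/")  # placeholder URL; try a page from your own site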

Bottom line: fix your technical issues, because there are definite benefits in helping Google understand the correct content to rank. But technical issues in themselves won’t cause additional ranking problems.
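
As one concrete example of a check worth running, the rough sketch below flags possible soft 404s, meaning pages that return 200 OK but whose content reads like an error page. The error phrases are my own guesses, not Google’s actual detection logic, and the URL is a placeholder.

from urllib.error import HTTPError
from urllib.request import urlopen

# Rough heuristic phrases, not Google's actual soft 404 detection logic.
ERROR_PHRASES = ("page not found", "no longer available", "nothing was found")

def classify(url):
    try:
        body = urlopen(url, timeout=10).read().decode("utf-8", errors="replace").lower()
    except HTTPError as err:
        # A real 404/410 status: per Mueller, a normal thing that is not held against a site.
        return "hard {} (normal error status)".format(err.code)
    if any(phrase in body for phrase in ERROR_PHRASES):
        return "possible soft 404: 200 OK, but the content looks like an error page"
    return "looks fine"

print(classify("https://example.com/some-old-page"))  # placeholder URL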

Added: Obviously, there are a ton of other issues outside of this question that can cause major ranking problems. SEO experts are always discovering new ways webmasters have managed to hurt their rankings, along with some of the usual culprits, whether it is accidentally blocking Googlebot from crawling a site or not bothering to patch known exploits in popular WordPress plugins, which causes the hacked warning to show up in the search results.
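
For the accidentally-blocking-Googlebot case in particular, a quick sanity check against your robots.txt takes only a few lines with Python’s standard robots.txt parser; the URLs below are placeholders for your own site.

from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")  # placeholder robots.txt URL
rp.read()

for path in ("https://example.com/", "https://example.com/blog/"):
    verdict = "allowed" if rp.can_fetch("Googlebot", path) else "BLOCKED"
    print("Googlebot", verdict + ":", path)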

Jennifer Slegg

Founder & Editor at The SEM Post
Jennifer Slegg is a longtime speaker and expert in search engine marketing, working in the industry for almost 20 years. When she isn't sitting at her desk writing and working, she can be found grabbing a latte at her local Starbucks or planning her next trip to Disneyland. She regularly speaks at Pubcon, SMX, State of Search, Brighton SEO and more, and has been presenting at conferences for over a decade.

Filed Under: Google, SEO

Comments

  1. Alan Bleiweiss says

    August 12, 2015 at 5:48 am

    Once again, John Mueller completely fails to communicate factual information clearly enough for the vast majority of site managers and SEOs who need to hear it. I don’t honestly know why he does this – either he is intentionally reckless (I doubt this very much) or he just wants to be nice and answer the questions people ask him, even though he has NO CLUE.

    Technical issues DIRECTLY impact rankings. I’ve been doing SEO for 16 years, and audits since 2007. I’ve specialized in audits for three years.

    If enough technical issues exist, there are countless ways rankings will plummet.

    In just ONE example, if, as John references, there are crawl problems, Google will become massively confused. Except the impact will NOT be limited to his claimed “we might not find new shiny pages” issue.

    Where does he think massive exponential duplicate content comes from? Obviously he knows it’s from horrific site architecture, among other things.

    And I absolutely guarantee you that if a site has exponential duplicate content from critically flawed architecture, that’s going to cripple most sites at least to some degree in rankings.

    There are so many other ways technical issues will inevitably harm rankings it makes my head spin reading this post.

    • Jennifer Slegg says

      August 12, 2015 at 7:08 am

      That’s why I had included the exact question asked, so people had the context behind it. But I did add that there are many other ways webmasters can cause technical issues that do affect rankings as well.

  2. Alan Bleiweiss says

    August 12, 2015 at 5:50 am

    Here’s another:

    Have enough page processing speed flaws? You bet that’s going to directly weaken rankings.

    Even Matt Cutts himself said at his last SMX Advanced appearance that, as a general rule, if pages take longer than 20 seconds to load, rankings will suffer.

    That’s an ENTIRELY technical issue.

  3. Joe Preston says

    August 12, 2015 at 6:16 am

    This is helpful and provides some clarity. What I would expect is that soft 404s would waste some of your crawl “budget” with Googlebot. Illyes’s statements kind of muddied the waters for me, because “crawling and indexing” vs. “ranking” aren’t as important for me to keep in mind as separate processes as they are to a Google engineer. If the content isn’t junk, optimal indexation is a key component of optimizing organic traffic (for a sufficiently large page set).
