In part of today’s Google Webmaster Hangout, an interesting question came up about why Google decided to go to a real-time Penguin: was it to appease webmasters penalized by the long recovery times, or was it simply the way Penguin evolved once it gained the ability to analyze links on the fly? That then transitioned into a discussion about the weight of links in the search algo once the real-time Penguin goes live.
First, on why Google is making the switch to a real-time Penguin. Mueller’s response is pretty interesting for those who follow Penguin.
I think it’s a bit of both. It’s not that awesome that it takes so long for these things to update, that’s definitely one issue there. But on the other hand, if we can make it so that it doesn’t require as much hand holding from our side, then that’s great as well.
He was then asked whether a move to a real-time Penguin means there wouldn’t be as much weight given to links in the algorithm, so there wouldn’t be as much risk if links were accidentally misclassified.
I don’t know. I wouldn’t say it’s specifically like that, I don’t know that we could say we’re putting less weight to links, but sometimes I wish I could just say that in a hangout so that people would stop asking all these link building questions, but essentially, especially within a website we do need those links to understand how the website is structured, how to find all those pages within the website. So to some extent, some of this is something that we can’t discard completely.
I think that the algorithms are always evolving and as we add new factors, then we have to reconsider how we handle old factors and try to find a new balance there. It’s not that we can just say well if you put this special symbol on your pages you’ll get an extra bonus, it’s not that you can rank minus two above everything else, above the whole search results page, we still have to kind of shuffle all of the factors we use into the ranking, essentially.
If someone is searching for something, we try to show the most relevant results, and bring the ones that make sense for the user.
It is interesting that he talks specifically about the use of internal links, which are pretty essential for Google discovering pages within a site, especially for sites that do not use sitemaps. He doesn’t bring up external links specifically, which is the most interesting aspect of the possibility that link values could be rebalanced in the algo.
Will links, especially external links, be slightly devalued once Penguin goes live? It does raise some interesting questions if there is a chance Google might weight links a bit differently going forward. But as we have seen with another search engine’s experiment with not using links as a ranking factor, namely Yandex, which ended its experiment after a year, reducing the role of links too much in the algo can have negative consequences.
Of course, Google could have already made changes to link values – or to other older algo factors as they have added new ones this year, outside of linking – as we have seen many times when we notice fluctuations in the search rankings. So even if there are changes to link values, it might not be tied directly to the upcoming Penguin update in the new year.
That said, perhaps it was just one of those “how the algo works” comments from Mueller, not tied to links specifically but rather to the algo as a whole. But nonetheless, it is interesting to consider.
It is certainly a fascinating discussion, especially since links are valued so highly in the search algo, and one to keep an eye on in the future.
Jennifer Slegg