Disavows and link removals are always a tricky business. Too often, site owners are overly aggressive about removing good quality links, which means they lose not only the rankings that the spam links were artificially boosting, but also the rankings earned by great quality links that got caught in the link machete and removed along with the bad ones.
The question came up in the last Google Webmaster Office Hours with John Mueller, from someone whose SEO agency had disavowed almost all the links to a site in order to recover from an unnatural links penalty; the site owner was now interested in removing some of the “good” links from the disavow file.
Sure. If you disavow too much and you disavow things that are actually normal good links, you can definitely remove those from the disavow file, submit the new file with those links removed, and then we’ll be able to take those into account again. That’s probably not something where you will see a big change immediately happen, but over time as we recrawl those URLs, we’ll be able to take that into account again.
In general, while I recommend taking kind of a rough look at your links when it comes to link penalties, link manual actions, I wouldn’t recommend removing everything from your site, because then you’re really removing a lot of things that might actually be kind of beneficial and normal for your website as well.
There is always a fine balance when doing link removals: you need to remove all the bad links, but inevitably some good links get removed too. Many SEO agencies and consultants prefer the machete approach even if good links get caught in it, since it gives them a better chance of showing their clients a recovery.
But if you are going for a faster recovery, or if you don’t have much practice telling good links from bad ones, then the machete approach may be the best way to start. You can then go back, hand-pick the links that are actually great, and remove them from the disavow file afterward – just don’t remove all the links from the disavow unless you are prepared to be hit by Google again.
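That clean-up step – pulling hand-picked good links back out of the disavow file before resubmitting it – can be done with a simple script. Disavow files are plain text, with one URL or `domain:example.com` rule per line and `#` for comments. The sketch below is purely illustrative; the domain names and whitelist are made-up examples, not recommendations.

```python
# Hypothetical sketch: strip hand-picked "good" domains back out of a
# Google disavow file before resubmitting it in Search Console.
# All domain names below are illustrative placeholders.

GOOD_DOMAINS = {"example-news-site.com", "example-blog.org"}


def prune_disavow(disavow_text: str, good_domains: set) -> str:
    """Return the disavow file text with whitelisted entries removed.

    Disavow files use one entry per line: either a full URL or a
    'domain:example.com' rule; lines starting with '#' are comments.
    """
    kept = []
    for line in disavow_text.splitlines():
        entry = line.strip()
        if not entry or entry.startswith("#"):
            kept.append(line)  # keep comments and blank lines as-is
            continue
        # Extract the host from either rule style.
        if entry.startswith("domain:"):
            host = entry[len("domain:"):]
        elif "://" in entry:
            host = entry.split("/")[2]
        else:
            host = entry
        if host not in good_domains:
            kept.append(line)  # still disavowed
    return "\n".join(kept) + "\n"


disavow = """# disavow file submitted to Google
domain:spammy-links.example
https://example-news-site.com/great-article
domain:example-blog.org
"""
print(prune_disavow(disavow, GOOD_DOMAINS))
```

The output keeps the spam entry and the comment line while dropping the two whitelisted good-link entries, giving you a new file ready to upload so Google can take those links into account again as it recrawls them.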