When disavowing links, many webmasters disavow every link that could possibly be considered invalid in the eyes of Google. For sites that have been the target of massive negative SEO campaigns, or the victim of over-zealous link builders, those lists can get pretty huge. But it now seems you might want to keep that list shorter.
In a Google Webmaster Office Hours session, a question was raised about how long it takes for links in a disavow file to be processed and reflected in an update or refresh. The answer probably wasn’t what many want to hear, especially those attempting to clean up a huge number of links.
Yes, definitely. So this is something where depending on the URL sometimes we crawl them daily, sometimes we crawl them every couple of months. So if you submit a large disavow file or a disavow file that includes a lot of domain entries or just generally includes a lot of different URLs, then that is something that’s going to take quite a bit of time to kind of recrawl all of those URLs naturally and reprocess all of that information.
So I wouldn’t be surprised if you are looking at a time frame of maybe 3 to 6 to 9 months even for disavow files to be completely taken into account.
And that is not something that happens from one day to the next, this is a granular process that happens step by step. So as individual URLs are recrawled and we see them in the disavow file, that will be taken into account. So it’s not that you have to wait this long for them to be reflected, it’s just that for everything to be recrawled and reprocessed it can take a significant amount of time.
Of course, this raises the question of where the happy medium lies between disavowing enough links to lift a penalty (or prevent one) and submitting a file so large that it could take many months to be fully processed. While getting the links removed is always the best route, many webmasters don’t respond to removal requests, or site owners don’t want to pay the “removal fee” to get a link taken down.
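For a sense of scale, a disavow file is just a plain-text list, one entry per line, and Google’s documented format lets a single `domain:` entry stand in for any number of individual URLs from the same site. A minimal sketch (the domains and URLs below are placeholders, not real examples of spam sources):

```text
# Disavow file: one entry per line; lines starting with # are comments.
# Disavow two specific pages only:
http://spam.example.com/stuff/comments.html
http://spam.example.com/stuff/paid-links.html
# Disavow an entire domain with a single entry:
domain:shadyseo.example.com
```

Collapsing long runs of URL entries into `domain:` lines is the usual way to keep a huge file manageable, though Google still has to recrawl pages on those domains before the entries take full effect.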
It also reconfirms that Google doesn’t explicitly recrawl links included in a disavow file – something that has concerned webmasters whose sites might appear in someone’s list – but rather simply continues crawling those sites on its own schedule.
Jonah Stein says
I think you are missing the point here. Just because Google doesn’t finish completely taking the disavow into account doesn’t mean that the value of the disavow is ignored until all the pages are crawled. Nor does it suggest that making the disavow smaller will speed up the benefits.
The only thing it does imply is that you may be better off disavowing domains instead of URLs, since they may be counted faster.
Jennifer Slegg says
There will be a difference between disavow files with 50 URLs and/or domains versus ones with 20,000 (and some disavow files are larger than that). But it definitely raises the question of whether people should slog through attempting to get the links removed (which is ideal) or simply file disavows (which is all many do now).
I probably shouldn’t comment when I haven’t even had my coffee yet, but what the heck. So… why exactly does Google even need to crawl the links in the disavow file? (Not asking you specifically for the answer, just thinking out loud.) It just doesn’t make sense to my caffeine-less brain that a crawl would be necessary. Where’s the logic? Bot looks at disavow file. Bot sees link. Bot says OK, this link will not count. Bot has to crawl the link anyway… why?

Sounds like they are doing this the opposite way around: the bot isn’t actually looking at the disavow file until after it crawls a link during a normal crawl. Bot crawls link. Bot checks to see if the link is in the disavow file. Bot says OK, this link doesn’t count. I guess in that scenario it makes sense, but only if the calculation is happening every second of every day, every time a link is crawled. And maybe that’s exactly the way it works.

I used to pay more attention to how the algo got calculated, but I stopped worrying about that a few years ago. Too many algos, too much to keep up with. In any case, it seems odd that it would work that way. But I’ve probably gotten it all wrong from start to finish of my thought process. Ah well, my brain needs coffee now. Feel sorry for anyone waiting on disavow results, I guess. What a mess.
I sent a file with 543 URLs over a month ago and have had no response.