More questions about this came up at Gary Illyes’ Pubcon keynote Thursday. Jenny Halasz from JLH Marketing referenced the lengthy Twitter discussion from Wednesday evening (started with this tweet, with a more in-depth look here) and asked for more details about the issue, and about what to do if the content is so bad that there is no real choice but to remove it.
While responding, Illyes made an interesting recommendation for those who are removing thin content for Panda reasons. Rather than simply using a 404 or a 410, he strongly recommends that webmasters use noindex on those pages, ensure those pages are listed in the sitemap (or add them to it), and then submit the sitemap to Google.
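The steps Illyes describes (serve the thin pages with a noindex directive, list those same URLs in a sitemap, and submit it) can be sketched roughly as below. The URLs and the helper function are purely illustrative, not part of any Google tooling:

```python
import xml.etree.ElementTree as ET

# Each thin page should keep returning 200 but carry a noindex directive
# in its <head>, e.g.:
#   <meta name="robots" content="noindex">
# Listing those same URLs in a submitted sitemap nudges Google to
# recrawl them sooner and drop them from the index.

def build_sitemap(urls):
    """Build a minimal sitemap.xml string for the given URLs."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical thin-content URLs slated for deindexing:
thin_pages = [
    "https://example.com/thin-page-1",
    "https://example.com/thin-page-2",
]
print(build_sitemap(thin_pages))
```

The resulting file would then be submitted through the Sitemaps report in Search Console; the noindex tag itself still has to be present on each page for the removal to take effect.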
The additional recommended sitemap step definitely makes sense. It would (hopefully) get those URLs removed from the index more quickly, lessening the turnaround time compared with waiting for Google to crawl those pages on its own schedule. And while Illyes didn’t mention the Remove URLs tool in Google Search Console, it is another way to prompt Google to remove URLs, although it isn’t exactly feasible at larger scale, and Google considers this method only temporary unless the pages are completely removed by the time the 90-day removal period is up.
It is also worth noting that Illyes said he recommends improving or adding more quality content over mass removal of perceived low-quality pages from a site impacted by Panda. But if there is no way to improve it, such as really spammy UGC, then removing it is fine… but again, work on adding value in the form of quality, or “thick,” content.
Latest posts by Jennifer Slegg (see all)