Sometimes there is an opportunity to get a great link from a traffic and exposure perspective, but you know that in the eyes of Google it could be seen as spammy. Or you want the link, but for whatever reason it cannot be made a nofollow link. While some webmasters have been working around this, John Mueller from Google recently shared advice on how you can still get the traffic value of a link while protecting yourself from potential Google penalties.
The specific example Mueller addressed was a link from a directory site that had traffic value but could be seen as a paid link, since all of the directory's links were dofollow. The same advice applies to a wide variety of sites that do not use nofollow.
First, use or buy a secondary domain for these types of potentially dodgy links. Block crawlers from that secondary domain with robots.txt, but redirect its visitors to your main domain. Because Google cannot crawl the intermediate site, it will not take any incoming links to that site into account, so nothing can be passed on to your main site.
Mueller explained:

"In general, I wouldn't rely on directories as a way of getting links, so I think you are on the right track in that you are thinking of all of these issues. On the other hand, if this is a directory where lots of people are coming through to your site, then that might be something worth having that link on. That is something where you could see this as advertising, or could see it as you kind of advertising your site there.

"If you really want to prevent the PageRank from being forwarded to your site, one thing you could do is have the link go to an intermediate site that is blocked by robots.txt and, from there, redirect to your final URL. So that's essentially one way that you could block the flow of PageRank to your site, even if you can't control the nofollow or not-nofollow state of the link to your site."
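The setup Mueller describes boils down to two pieces on the intermediate domain: a robots.txt that disallows all crawling, and a server-level redirect that passes human visitors through to the real site. Here is a minimal sketch using nginx; the domain names (`links.example.net`, `www.example.com`) and the document root are placeholders, and the same effect can be achieved with an Apache .htaccess rewrite or any host's redirect feature.

```nginx
# --- File 1: robots.txt served by the intermediate domain ---
# User-agent: *
# Disallow: /

# --- File 2: nginx server block for the intermediate domain ---
server {
    listen 80;
    server_name links.example.net;   # placeholder intermediate domain

    # Serve robots.txt directly so crawlers can actually read the
    # Disallow rule instead of being redirected away from it
    location = /robots.txt {
        root /var/www/intermediate;  # placeholder path holding robots.txt
    }

    # Send all other traffic on to the main site
    location / {
        return 301 https://www.example.com$request_uri;
    }
}
```

Note that robots.txt itself must stay reachable and not be caught by the redirect; if Google cannot fetch the Disallow rule, the crawl block never takes effect.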
Using this technique for links that are strictly for traffic means webmasters can request links from sites without worrying about potential Google ramifications if those links come across as paid or low quality.
By Jennifer Slegg