
Google’s Structured Snippets in Search Results Will Reduce CTR

Google searchers have become accustomed to seeing knowledge graph information mixed into their search results.  Google is now taking this a step further by introducing structured snippets to some listings within the search results.

For some queries, Google will extract specific information from a result’s webpage and include it as part of that result’s snippet on the search results page.

Structured snippets are also available to mobile searchers.

Structured Snippets are a collaboration between the web search team and Google Research.

The WebTables research team has been working to extract and understand tabular data on the Web with the intent to surface particularly relevant data to users. Our data is already used in the Research Tool found in Google Docs and Slides; Structured Snippets is the latest collaboration between Google Research and the Web Search team employing that data to seamlessly provide the most relevant information to the user. We use machine learning techniques to distinguish data tables on the Web from uninteresting tables, e.g., tables used for formatting web pages. We also have additional algorithms to determine quality and relevance that we use to display up to four highly ranked facts from those data tables.

Of course, any time Google extracts information from a webpage and includes it in the search result listing itself, instead of leading the searcher to click on the link to find the information, it will very likely reduce the CTR of those results for the website owners.  While it can be useful for searchers – and that is what Google is aiming for – many webmasters won’t be happy with this change.

It also once again raises the question of where this stops being “fair use” and starts being plagiarism.  After all, Google is extracting the data from a website and then displaying it so users don’t actually need to click through to the webpage – the website loses traffic, ad revenue and any other conversions it may be tracking.  It is similar to the problem webmasters have with Google extracting so much data from a website that no one needs to click through to the result, such as this example for “how to boil an egg”.

There doesn’t appear to be a way for webmasters to opt out of either feature, aside from blocking Googlebot from the site completely.
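For webmasters who decide that losing Google traffic entirely is preferable, the standard way to block Googlebot is a robots.txt rule like the one below (a minimal sketch; it stops Googlebot from crawling any page on the site, which goes far beyond simply suppressing structured snippets):

User-agent: Googlebot
Disallow: /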


Jennifer Slegg

Founder & Editor at The SEM Post
Jennifer Slegg is a longtime speaker and expert in search engine marketing, working in the industry for almost 20 years. When she isn't sitting at her desk writing and working, she can be found grabbing a latte at her local Starbucks or planning her next trip to Disneyland. She regularly speaks at Pubcon, SMX, State of Search, Brighton SEO and more, and has been presenting at conferences for over a decade.