
Internal site search results and Google indexing

So, what should we do? Should we prevent Google from indexing our website's internal search results, and how?
The short answer is "Yes", and either of the following easy options can be used:
• Disallowing crawler access in robots.txt
• Preventing indexing with a meta robots tag

Google has stated that it does not want your internal site search results in its index. If you leave these dynamic pages indexable, Google may even penalize your website. As a matter of fact, this is not at the top of our Technical SEO Audit Checklist, but it is worth mentioning as a way to avoid potential duplicate content issues.

You can block internal search pages from being crawled by adding a statement like one of the following to robots.txt. The exact pattern depends on how your website is implemented.

Disallow: /?s=*
Disallow: /?q=search/
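To see how a crawler that supports Google-style wildcards would interpret such patterns, here is a minimal sketch of the matching logic in Python. The helper name and the example paths are hypothetical, chosen only to illustrate the two Disallow rules above; real crawlers apply additional rules (longest-match precedence, Allow directives) not shown here.

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Google-style robots.txt path matching: '*' matches any
    sequence of characters, '$' anchors the end of the URL,
    and patterns are matched from the start of the path."""
    regex = "".join(
        ".*" if ch == "*" else ("$" if ch == "$" else re.escape(ch))
        for ch in pattern
    )
    return re.match(regex, path) is not None

# The two Disallow patterns from above, against sample search URLs:
print(robots_pattern_matches("/?s=*", "/?s=blue+shoes"))        # True: blocked
print(robots_pattern_matches("/?q=search/", "/?q=search/seo"))  # True: blocked
print(robots_pattern_matches("/?s=*", "/blog/post"))            # False: crawlable
```

Note that plain prefix matching (without `*` or `$`) is the only behavior guaranteed by the original robots.txt convention; wildcard support is an extension honored by Googlebot and most major crawlers.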

However, blocking internal search pages in robots.txt isn't the best way to solve the problem. It is better to let search engines crawl these pages and keep them out of the index with a meta robots tag set to "noindex, follow": the page is excluded from results, but its links still pass signals to the rest of the site.

<meta name="robots" content="noindex, follow" />
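To confirm that a rendered search results page actually carries the tag, a small check can be scripted with Python's standard-library HTML parser. This is a minimal sketch; the class name and the inline sample HTML are assumptions for illustration, and in practice you would feed it the fetched page source.

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Scans a page's markup for a meta robots tag containing 'noindex'."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if (d.get("name", "").lower() == "robots"
                    and "noindex" in d.get("content", "").lower()):
                self.noindex = True

# Hypothetical page source for a search results URL:
html = '<html><head><meta name="robots" content="noindex, follow" /></head></html>'
parser = RobotsMetaParser()
parser.feed(html)
print(parser.noindex)  # True: the page asks search engines not to index it
```

Remember that for this tag to work, the page must remain crawlable; a crawler blocked by robots.txt never sees the tag at all.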

For WordPress websites, the Yoast SEO plugin can be used. More information on how to do this with the plugin can be found at the following link.
