Google has stated that it doesn't want your internal site search results in its index. If you don't disallow this type of dynamic page, Google could penalize your website. Admittedly, this isn't at the top of our Technical SEO Audit Checklist, but it is worth mentioning as something that can prevent potential duplicate content issues.
You can prevent internal search pages from being crawled by adding a statement like one of the following to your robots.txt file. The exact pattern depends on how your website is implemented.
Disallow: /?s=*
Disallow: /?q=search/
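As a quick sanity check, Python's built-in robots.txt parser can show how rules like these apply to URLs. This is only an illustrative sketch: the example.com URLs are hypothetical, and note that `urllib.robotparser` does not support Google's `*` wildcard, so the trailing `*` is dropped here (`Disallow: /?s=` already matches any URL beginning with that prefix):

```python
from urllib import robotparser

# Hypothetical robots.txt rules blocking internal search result URLs.
# The trailing * from the article's example is omitted because Python's
# parser matches by prefix and treats * literally.
rules = """User-agent: *
Disallow: /?s=
Disallow: /?q=search/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Internal search results page: crawling is disallowed
print(rp.can_fetch("*", "https://example.com/?s=shoes"))     # False

# Ordinary content page: crawling is allowed
print(rp.can_fetch("*", "https://example.com/blog/post-1"))  # True
```

The same prefix-matching logic is what keeps every query-string variation of the search page out of the crawl, which is exactly why these dynamic pages are handled with a pattern rather than individual URLs.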
However, blocking internal search pages via robots.txt isn't the best way to solve the problem. It's much better to allow search engines to crawl these pages and prevent them from being indexed by using a meta robots tag with a value of "noindex, follow".
<meta name="robots" content="noindex, follow" />
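To see how a crawler interprets that tag, here is a minimal sketch using Python's standard `html.parser` that flags a page as noindex when the robots meta tag carries the directive. The HTML document in the example is hypothetical:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Detects a noindex directive in a page's robots meta tag."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                # content may hold several comma-separated directives,
                # e.g. "noindex, follow"
                directives = [d.strip().lower()
                              for d in attrs.get("content", "").split(",")]
                if "noindex" in directives:
                    self.noindex = True

# Hypothetical internal search results page
html = ('<html><head>'
        '<meta name="robots" content="noindex, follow" />'
        '</head><body>Search results…</body></html>')

parser = RobotsMetaParser()
parser.feed(html)
print(parser.noindex)  # True
```

A page flagged this way can still be crawled and its links followed ("follow"), but it is kept out of the index ("noindex"), which is precisely the behavior the meta tag approach gives you over a robots.txt block.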
For WordPress websites, the Yoast SEO plugin can be used. More information on how to do that with the plugin can be found at the following link.