Search

The /robots.txt file tells search engines which pages of your website they may crawl and list in their search results. The default setting, which is recommended, allows search engines to crawl your complete website. If you want to keep certain pages out of search engines, you can block them with a Disallow rule, as shown further down.

The recommended default setting is:

User-agent: * 
Allow: /
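
For reference, here is the same default with comment lines added (lines starting with # are ignored by crawlers), spelling out what each directive does:

# These rules apply to every crawler
User-agent: *
# The entire site may be crawled
Allow: /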

If you would like to block certain pages from showing in search engines, you can add a Disallow rule:

User-agent: * 
Allow: / 
Disallow: /news

This tells crawlers not to crawl any URL whose path begins with /news, so those pages will generally not appear in search results.
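
As an illustration, assuming a site with the hypothetical paths below (these are examples, not taken from your site), the rule set above would be applied as follows:

/about            crawlable (only matched by Allow: /)
/news             blocked   (matches Disallow: /news)
/news/article-1   blocked   (matches Disallow: /news)
/newsletter       blocked   (also starts with /news)

Note that Disallow matches by path prefix, so /newsletter is blocked as well. If you only want to block the pages inside the /news/ folder, use Disallow: /news/ instead.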