What the "robots.txt" file does on a website
As explained in the Google developer documentation Introduction to robots.txt: "A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, you should use "noindex" directives, or password-protect your page".
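For reference, "robots.txt" is a plain-text file served at the root of a domain (for example, at a placeholder address like https://www.example.com/robots.txt) and consists of simple directives, such as:

    User-agent: *
    Disallow: /private/

This illustrative snippet asks all crawlers not to request URLs under the hypothetical /private/ path.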
Odoo versions prior to 14.0 do not have built-in tools for editing and managing the robots.txt file, so changing it can be tricky without custom development. For these reasons, we have developed a module that lets you manage the content of the "robots.txt" file in a simple way.
How the module works
Watch our video tutorial and learn how to work with the "robots.txt" file in Odoo using our module.
Module configuration
To configure the file, go to the Website - Configuration - Websites menu and choose a website.
Note that you need to activate the developer mode to see this menu item.
In the robots.txt section, select the file content mode.
You can use one of the module's predefined templates for the "robots.txt" content (illustrative samples are shown after the list):
Allow All to allow search engines (bots) to crawl all your pages, media, and resource files.
Custom to use our template that allows crawling of image, style, and script files and blocks the portal users' personal pages and internal content.
Block All to block access for all crawlers if you don't want the website to be crawled.
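As an illustration of what these modes typically produce (the exact content generated by the module may differ), the variants usually look like this:

Allow All:

    User-agent: *
    Disallow:

Block All:

    User-agent: *
    Disallow: /

Custom (a simplified sketch with placeholder Odoo paths; the module's own template defines the actual rules):

    User-agent: *
    Disallow: /my/
    Disallow: /web/login
    Allow: /web/image/
    Allow: /*.css$
    Allow: /*.js$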
You can modify the content sample from any template to fit your purposes and your SEO specialist's advice.
Additional information on how to create the "robots.txt" file is available in this documentation from Google.
The Sitemap URL mode defines how the sitemap URL is added: in the Auto mode it is appended automatically to the end of the file, while in the In Text mode you add it manually wherever you prefer.
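When you add it manually (In Text mode), the sitemap reference is a single line that can be placed anywhere in the file, for example (with a placeholder domain):

    Sitemap: https://www.example.com/sitemap.xml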
Testing the "robots.txt" file
After creating the "robots.txt" file, you can test it with the Google robots testing tool.
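Besides Google's tool, a quick local check is possible with Python's built-in urllib.robotparser module. The following sketch (using a placeholder domain and sample URLs) downloads the published file and reports whether a generic crawler may fetch the given pages:

    from urllib.robotparser import RobotFileParser

    # Placeholder domain: replace with your own website address.
    ROBOTS_URL = "https://www.example.com/robots.txt"

    parser = RobotFileParser()
    parser.set_url(ROBOTS_URL)
    parser.read()  # download and parse the live robots.txt file

    # Check whether a generic crawler ("*") may fetch a few sample URLs.
    for url in ("https://www.example.com/", "https://www.example.com/my/"):
        allowed = parser.can_fetch("*", url)
        print(url, "->", "allowed" if allowed else "blocked")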