A site must be indexed by search engines such as Google or Bing before it can rank, which makes search engine robots a powerful tool in Magento 2 SEO. Web crawlers and bots are responsible for indexing a site, and the Magento configuration includes settings that produce and control the instructions they follow. These instructions are saved in the robots.txt file in the root of the Magento installation, and most search engines recognize and obey them.
By default, Magento generates a robots.txt file with instructions that tell web crawlers not to index certain parts of the site, such as files the system uses internally. You are free to keep the default instructions or define your own, either for all search engines or for specific ones.
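For illustration, instructions in a robots.txt file look like the sketch below. The paths shown are examples of the kind of internal locations a store might exclude, not Magento's verbatim defaults:

```
# Rules for all crawlers
User-agent: *
# Example exclusions for internal or low-value paths
Disallow: /app/
Disallow: /lib/
Disallow: /checkout/
```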
To configure robots.txt, follow these steps:
1. Click Stores on the Admin sidebar. Select Configuration under Settings.
2. In the panel on the left, select Design under General.
3. Expand the Search Engine Robots section, then complete the following:
a. Set Default Robots to one of the following:
- INDEX, FOLLOW: instructs web crawlers to index the site and to check back later for changes.
- NOINDEX, FOLLOW: instructs web crawlers not to index the site, but to check back later for changes.
- INDEX, NOFOLLOW: instructs web crawlers to index the site once, but not to check back later for changes.
- NOINDEX, NOFOLLOW: instructs web crawlers not to index the site and not to check back later for changes.
b. If needed, enter custom instructions in the Edit custom instruction of robots.txt file box.
c. If you want to restore the default instructions, click Reset to Default.
4. When you finish, click Save Config.
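As an example of step 3b, custom instructions entered in the box use standard robots.txt syntax. The sketch below assumes a store that wants to keep crawlers out of on-site search result pages and to advertise a sitemap; the domain is a placeholder, not a real store:

```
# Rules for all crawlers
User-agent: *
# Example: exclude on-site search result pages from indexing
Disallow: /catalogsearch/
# Example: point crawlers at the XML sitemap (placeholder domain)
Sitemap: https://www.example.com/sitemap.xml
```

Note that the Default Robots setting from step 3a is typically rendered as a robots meta tag in the HTML head of store pages (for example, `<meta name="robots" content="INDEX,FOLLOW"/>`), while the custom instructions box controls the contents of the robots.txt file itself.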