Most sites have a robots.txt file that tells search engines which parts of a website they may explore and index, and which parts they should stay away from.
4SEO includes a simple editor that lets you change the content of your robots.txt file, as well as an optimizer that can make small changes to the default Joomla robots.txt file to improve SEO results.
The robots.txt file is also used to signal to search engines that you have an XML sitemap. This is done by adding a Sitemap: line to the robots.txt file, and 4SEO handles it automatically if you enable XML sitemap handling.
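For illustration, a minimal robots.txt that references a sitemap looks like this (the domain and sitemap path are placeholders; 4SEO writes the actual URL of your site when sitemap handling is enabled):

```
User-agent: *
Disallow: /administrator/

Sitemap: https://www.example.com/sitemap.xml
```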
Editing the robots.txt file
The Robots.txt file page shows the content of your robots.txt file, if you have one (if you don't, the editor is empty and you can simply type in your desired content).
After making any change, press the
Save button in the toolbar. The toolbar itself has only 2 actions: Optimize for SEO and a second button that reverts the changes 4SEO made (see below).
Optimize for SEO
Pressing this button causes 4SEO to modify the robots.txt file content for better SEO results. Specifically, it allows search engines to crawl more folders than the default Joomla robots.txt does.
4SEO does not automatically save the changes it makes, so that you can validate them first. Be sure to press the
Save button after optimizing.
Any change made to the file content by 4SEO is clearly marked as such with comments.
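As an illustration only (the exact folders touched and the comment wording may differ between 4SEO versions), an optimized file could look like this:

```
User-agent: *
Disallow: /administrator/
Disallow: /cache/
# The two lines below are a hypothetical example of a 4SEO modification:
# assets such as images and templates are opened up so search engines
# can fully render your pages.
Allow: /images/
Allow: /templates/
```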
Clicking the second toolbar button causes 4SEO to revert any change it made, restoring the original content.