Most sites have a robots.txt file that tells search engines which parts of a website are OK to explore and index, and which parts they should stay away from.
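For illustration, a minimal robots.txt is just a list of directives per user agent; the entries below are generic examples, not the exact contents of any particular Joomla file:

```
User-agent: *
Disallow: /administrator/
Disallow: /tmp/
```

Here `User-agent: *` means the rules apply to all crawlers, and each `Disallow` line names a path prefix crawlers should not fetch.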

4SEO includes a simple editor to let you change the content of your robots.txt file, as well as an optimizer which can make (small) changes to the default Joomla robots.txt file to improve SEO results.

The robots.txt file is also used to signal to search engines that you have an XML sitemap. This is done by adding a specific line to the robots.txt file and is handled automatically by 4SEO if you enable XML sitemap handling.
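The sitemap reference is a single directive pointing at the sitemap's absolute URL; the address below is a placeholder for illustration:

```
Sitemap: https://www.example.com/sitemap.xml
```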

Editing the robots.txt file

Visiting the Robots.txt page will show you the content of your robots.txt file, if you have one (if you don't, the editor will be empty and you can just type in your desired content).

View of 4SEO robots.txt file editor

After making any change, press the Save button in the toolbar. The toolbar itself has only two actions: Optimize for SEO and Remove optimizations.

Optimize for SEO

Pressing this button will cause 4SEO to modify the robots.txt file content for better SEO results. Namely, it will allow search engines to crawl more folders than what the default Joomla robots.txt allows.

This is extremely important because it is quite common for CSS and JavaScript files to sit in folders blocked by Joomla. Without access to your site's JavaScript and CSS, search engines cannot determine whether it is, for instance, mobile-compatible, nor properly evaluate its speed, understand its content, and more.
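To see why a blocked folder matters, here is a small sketch using Python's standard urllib.robotparser. The folder and file names are hypothetical examples chosen for illustration, not 4SEO's actual output: blocking a whole folder also blocks the scripts inside it, while an Allow line for a subfolder re-opens just that part.

```python
from urllib import robotparser

def can_fetch(rules: str, url: str) -> bool:
    """Check whether a generic crawler may fetch `url` under `rules`."""
    rp = robotparser.RobotFileParser()
    rp.parse(rules.splitlines())
    return rp.can_fetch("*", url)

# Hypothetical rules resembling an unoptimized file: the whole /media/
# folder is blocked, so the JavaScript inside it is unreachable too.
blocked = """User-agent: *
Disallow: /media/
"""

# After adding an Allow line for the scripts subfolder, crawlers can
# read the JavaScript while the rest of /media/ stays blocked.
optimized = """User-agent: *
Allow: /media/js/
Disallow: /media/
"""

print(can_fetch(blocked, "https://example.com/media/js/app.js"))    # False
print(can_fetch(optimized, "https://example.com/media/js/app.js"))  # True
```

This is only a way to sanity-check rules locally; search engines apply their own (broadly similar) matching logic.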

4SEO will not automatically save the changes it makes, to let you validate them first. Be sure to press the Save button after optimizing.

Any change made to the file content by 4SEO is clearly marked as so with comments.

Remove optimizations

Clicking this button will cause 4SEO to revert any change it made to the original content.