#5868 – 404 error for robots.txt

Posted in ‘sh404SEF’
This is a public ticket. Everybody will be able to see its contents. Do not include usernames, passwords or any other sensitive information.
Tuesday, 25 June 2019 23:37 UTC
mark.l
In Components > sh404sef > 404 requests, robots.txt shows up as a 404 error.

However, our site has robots.txt at the root, so I am confused.

The robots.txt file is at:

https://www.xxxx.com/robots.txt

It also works at:

http://www.xxxx.com/robots.txt
https://xxxx.com/robots.txt
http://xxxx.com/robots.txt

Any suggestions are appreciated.

Mark
Wednesday, 26 June 2019 14:00 UTC
wb_weeblr
Hi

Can't really comment on that. sh404SEF just records the 404s as they happen; it does not play any role in causing them. This is indeed a bit odd, considering the user agents making those requests look like those of well-known bots from SEO-related services (Majestic, Ahrefs, etc.).

The most important point is that this happens at the server level: the .htaccess file on your server first checks whether the requested file actually exists on disk. If it does, the server returns the file immediately, without ever invoking Joomla (and therefore without sh404SEF, which only runs inside Joomla).

If the file does not exist on disk, the request is passed on to Joomla, and that is when sh404SEF gets a chance to record it - which is what happens in your case.
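For reference, the relevant part of a standard Joomla .htaccess looks roughly like this (a sketch, not your actual file - the exact rules vary between sites and Joomla versions):

    # only hand the request to Joomla's index.php when it does not match
    # an existing file (-f) or directory (-d) on disk
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule .* index.php [L]

If robots.txt exists and is readable, the first condition fails, the rewrite to index.php never happens, and sh404SEF never sees the request.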

So again, I can't comment on why your web server is not returning the file if the request is legitimate and the file actually exists. I would suggest you check your web server's raw access logs at the time of one or more of those requests and see whether it was indeed /robots.txt that was requested.
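For example, on a typical Apache setup (the log path below is a guess - adjust it to wherever your host stores its access logs), something like this lists every robots.txt request together with its status code:

    grep "robots.txt" /var/log/apache2/access.log

In the usual combined log format, the number right after the quoted request line is the status code, so you can see directly whether the server answered those hits with a 200 or a 404.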

Best regards
 
Thursday, 11 July 2019 05:34 UTC
system
This ticket has been automatically closed. All tickets which have been inactive for a long time are automatically closed. If you believe that this ticket was closed in error, please contact us.
This ticket is closed, therefore read-only. You can no longer reply to it. If you need to provide more information, please open a new ticket and mention this ticket's number.