
The helpdesk is open Monday through Friday (CET).

Please create a (free) account to post a question in the support area.
Please check the development versions area and look at the changelog; your specific problem may already have been resolved!
All tickets are private and cannot be viewed by other users. We have made public only a few tickets that we found helpful, after removing private information from them.

#4289 – Not urgent, a wish list item

Posted in ‘sh404SEF’
This is a public ticket. Everybody will be able to see its contents. Do not include usernames, passwords or any other sensitive information.
Tuesday, 07 November 2017 16:16 UTC
mikem22
 Hi

Couldn't see where else to post this, so I posted it here; obviously you can delete this once you've read it.

It would be very useful to be able to search for, or set a filter on, 404 errors that were triggered by Googlebot, Bingbot, etc. Often I have to go through lists of hundreds of 404 errors, most of which are for internal library and admin folders (is there a way to stop admin URLs from being listed?), plus of course the ones where people type wrong URLs. Being able to filter by search engine bot would save some time when going through the lists of 404 errors.

thank you

Mike
Tuesday, 07 November 2017 16:32 UTC
wb_weeblr
Hi

Thanks for your suggestion. We do not yet store whether a request came from a search bot, though we have plans to use that information.

Not really for 404s though, as it does not really matter to search engines whether you have 404s or not. What really matters is whether those broken links are on your own site (i.e. your visitors hit a 404 when clicking a link on your site), because that is where you can fix things. This is why we instead try to identify the referrer, and make highly visible any 404 that comes from your own site.

Again, we will store bot requests, but probably not for that purpose: instead, we want to help identify which parts of the site, if any, have not been crawled or discovered. This is a required step for creating a useful sitemap, for instance (and not the usual kind where all URLs are just dumped into the files, or worse, only the ones from the menu items).

Rgds
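[Editor's note] The referrer-based approach described in this reply can be sketched as follows. This is a minimal illustration, not sh404SEF code: the function name, the hit dictionaries, and the host value are all hypothetical.

```python
from urllib.parse import urlparse

# Hypothetical sketch: flag 404 hits whose referrer points back to your
# own site, since those are the broken links you can actually fix.
# The data shape and names are illustrative, not sh404SEF's schema.
def internal_404s(hits, site_host):
    """Return the 404 hits whose referrer is on site_host."""
    internal = []
    for hit in hits:  # each hit: {"url": ..., "referrer": ...}
        ref = hit.get("referrer") or ""
        if urlparse(ref).netloc == site_host:
            internal.append(hit)
    return internal

hits = [
    {"url": "/old-page", "referrer": "https://example.com/blog/post-1"},
    {"url": "/typo-url", "referrer": "https://www.bing.com/search?q=x"},
    {"url": "/admin/x",  "referrer": ""},
]
print(internal_404s(hits, "example.com"))  # only the first hit survives
```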
 
Sunday, 12 November 2017 17:31 UTC
mikem22
Hi there ..

In sh404SEF, in the 404 requests view, clicking the + sign in the Details column shows where the request came from and the referring URL.
The user agent field contains the bot name, e.g. bingbot, Googlebot.

So being able to search/filter on the user agent would give such a list.

Mike
Monday, 13 November 2017 10:33 UTC
wb_weeblr
Hi

Not exactly. First of all, the additional data is optional, i.e. it's a user setting and not all users store it. This additional data is stored in a separate db table, so to filter on it we would need a much more complex SQL query, which would slow down the display considerably as soon as there is a significant number of 404s.

As this information has no real practical use, I don't see myself going through with this.

Rgds
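[Editor's note] The cost described in this reply comes from having to join two tables to filter on user agent. A minimal sketch using sqlite3; the table and column names are made up for illustration and are not sh404SEF's real schema.

```python
import sqlite3

# Two tables stand in for the 404 log and the optional extra data
# (referrer, user agent) that sh404SEF keeps separately. All names
# here are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE url_log  (id INTEGER PRIMARY KEY, url TEXT);
    CREATE TABLE url_meta (log_id INTEGER, user_agent TEXT);
    INSERT INTO url_log  VALUES (1, '/old-page'), (2, '/typo-url');
    INSERT INTO url_meta VALUES (1, 'Mozilla/5.0 (compatible; bingbot/2.0)');
""")

# Filtering by bot requires a JOIN plus a LIKE scan over the user agent
# string; without an index this degrades as the 404 log grows, which is
# the slowdown the reply above refers to.
rows = con.execute("""
    SELECT l.url
    FROM url_log AS l
    JOIN url_meta AS m ON m.log_id = l.id
    WHERE m.user_agent LIKE '%bingbot%'
""").fetchall()
print(rows)  # [('/old-page',)]
```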
 
Monday, 13 November 2017 11:20 UTC
mikem22
Ok,

That's fair enough.

thanks for the explanation.

Mike
This ticket is closed, therefore read-only. You can no longer reply to it. If you need to provide more information, please open a new ticket and mention this ticket's number.