

#8866 – Suggestion for 4SEO

Posted in ‘4SEO’
This is a public ticket. Everybody will be able to see its contents. Do not include usernames, passwords or any other sensitive information.
Sunday, 26 June 2022 05:22 UTC
Alex

Hello. Another suggestion for 4SEO.
It would be nice to be able to create SEO rules for specific user agents (for example, one set of rules for the Google search robot and another for the Yandex search robot).

Similar to the rules in robots.txt, where you can indicate which robot certain rules are intended for.
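
For example, robots.txt already lets rules be grouped per robot like this (the paths here are just placeholders):

    User-agent: Googlebot
    Disallow: /blocked-for-google/

    User-agent: Yandex
    Disallow: /blocked-for-yandex/

    User-agent: *
    Disallow: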

 
Monday, 27 June 2022 08:01 UTC
wb_weeblr

Hi

Yes, that's not too hard. What would be the use case? What would you change based on whether the visitor is a regular user, Google or Yandex?

Best regards

Yannick Gaultier

weeblr.com / @weeblr

 

 
Monday, 27 June 2022 09:17 UTC
Alex

I have AJAX comments on my site. Google indexes them perfectly and understands escaped_fragment. Yandex does not understand such addresses and considers them duplicates. It would be nice to disable indexing of addresses with escaped_fragment only for Yandex.

 
Monday, 27 June 2022 09:47 UTC
wb_weeblr

Hi

Can't you exclude them from your robots.txt? These look like a pretty specific set of URLs that could be excluded there.
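
For example, something along these lines should keep Yandex away from those addresses while leaving Google untouched (the exact pattern depends on how your comment component builds the escaped_fragment URLs):

    User-agent: Yandex
    Disallow: /*_escaped_fragment_=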

Best regards

Yannick Gaultier

weeblr.com / @weeblr

 

 
Monday, 27 June 2022 09:48 UTC
Alex

Yes, that's exactly what I've been doing for many years already.

 
Monday, 27 June 2022 10:14 UTC
wb_weeblr

Hi

OK, so I'm not too keen on adding features inside 4SEO that can be handled another way, unless the feature is very common and complicated to do otherwise. I consider using robots.txt "not complicated" for someone who knows SEO well enough to realize that "Yandex considers it a duplicate".

Also, wouldn't it be time to consider loading comments in another way? I know it's been a while, but Google clearly announced they won't support fragments forever, so I think they will drop them at some point.

Best regards

Yannick Gaultier

weeblr.com / @weeblr

 

 
Monday, 27 June 2022 10:32 UTC
Alex

"OK, so I'm not too keen on adding features inside of 4SEO that can be handled in another way, unless that feature is very common and complicated to do otherwise." - yes, sure. I understand it. I wrote this proposal so that you know that someone has such a need. Maybe over time, someone else will want to create rules for different robots. And then someone else and someone else, and so on. This way you can find out what is interesting to users.

"Also, wouldn't it be time to consider loading comments in another way? I know it's been a while but Google clearly announced they won't support fragments forever so at some point they will drop it I think." - I think the same. But I will need to use a different comment component. My current component I've been using for about 10 years. You probably know that getting started with new software is not always easy.

 
Monday, 27 June 2022 10:48 UTC
wb_weeblr

Hi

 Maybe over time, someone else will want to create rules for different robots

There are some use cases, for instance the Paywall structured data type: this is to make protected content accessible to Google but not to others. But for that, you need more than user-agent checking, you must also check IP addresses. I can already detect search engines reliably (in 4SEO, for instance, we don't inject Core Web Vitals measurement code if the visitor is a search engine), but I have not yet found enough proper uses of that to add it to the rules engine.
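
As an illustration only (this is not 4SEO's actual code), here is a minimal Python sketch of that kind of verification: the claimed crawler's IP is resolved with a reverse DNS lookup, the hostname has to belong to the search engine, and a forward lookup has to point back to the same IP.

    # Illustration only, not 4SEO's actual code: the user-agent string can be
    # spoofed, so a claimed crawler IP is verified with reverse + forward DNS.
    import socket

    VERIFIED_SUFFIXES = {
        "googlebot": (".googlebot.com", ".google.com"),
        "yandex": (".yandex.ru", ".yandex.net", ".yandex.com"),
    }

    def is_verified_bot(ip, bot):
        suffixes = VERIFIED_SUFFIXES.get(bot, ())
        try:
            host = socket.gethostbyaddr(ip)[0]              # reverse DNS lookup
            forward_ips = socket.gethostbyname_ex(host)[2]  # forward DNS lookup
        except OSError:
            return False
        # hostname must belong to the engine and resolve back to the same IP
        return host.endswith(suffixes) and ip in forward_ips

    # e.g. is_verified_bot("66.249.66.1", "googlebot")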


Thanks for the suggestion, it's one of those things where the code is there, but the use case is not clear-cut yet.

Best regards

Yannick Gaultier

weeblr.com / @weeblr

 

 
Monday, 27 June 2022 10:54 UTC
Alex

My pleasure. Good luck to you!

 
This ticket is closed, therefore read-only. You can no longer reply to it. If you need to provide more information, please open a new ticket and mention this ticket's number.