Hi
Maybe over time, someone else will want to create rules for different robots.
There are some use cases, for instance the Paywall structured data type: it lets protected content be accessible to Google but not to others. For that, though, you need more than user-agent checking: you must also verify IP addresses, because user agents are trivially spoofed. I can already detect search engines reliably (in 4SEO, for instance, we don't inject the CoreWebVitals measurement code when the visitor is a search engine), but I have not yet found enough proper uses of that to add it to the rules engine.
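For the curious, here is a minimal sketch of that verification idea in Python (not 4SEO's actual code, which is PHP; the helper name is made up for illustration). It follows the reverse-then-forward DNS check Google documents for confirming that a visitor claiming to be Googlebot really is:

```python
import socket

def is_verified_googlebot(ip: str) -> bool:
    """Return True only if `ip` passes the documented
    double DNS check for Googlebot."""
    try:
        # Step 1: reverse DNS - the IP must map to a Google hostname
        host = socket.gethostbyaddr(ip)[0]
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Step 2: forward DNS - the hostname must map back to the
        # same IP, otherwise the reverse record could be forged
        return ip in socket.gethostbyname_ex(host)[2]
    except (socket.herror, socket.gaierror):
        # No reverse record, or the forward lookup failed
        return False

# Only trust the user-agent claim after the DNS check passes
# (user_agent and remote_ip would come from the incoming request):
# if "Googlebot" in user_agent and is_verified_googlebot(remote_ip):
#     serve_full_content()
```

The same pattern works for other major crawlers that publish their reverse-DNS domains (Bingbot resolves under search.msn.com, for example), which is what makes reliable detection possible in the first place.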
Thanks for the suggestion. It's one of those things where the code is there, but the use case is not clear-cut yet.
Best regards
Yannick Gaultier
weeblr.com / @weeblr