htaccess – Safe way to block unwanted bots without impacting good ones

Currently, I have blocked several bots in .htaccess (Apache 2.4) like this (the bot names in the example below are made up):

SetEnvIfNoCase User-Agent .*abcbot.* bad_bot
SetEnvIfNoCase User-Agent .*xyzbot.* bad_bot
# ...and so on for the other bots

<RequireAll>
     Require all granted
     Require not env bad_bot
</RequireAll>

As you can see, I am checking for user agents that contain a certain string at any position (start, middle, or end). But could such an approach unintentionally block good bots as well? For example, if a search engine like Google were to include one of my blocked strings in some random part of its user-agent in the future, it could hurt my website's prospects. Is there some kind of regulatory authority that oversees user-agent strings and prevents overlap between different bots? Or would you recommend matching the full, or at least more specific, user-agent strings?
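To illustrate what I mean by "more specific", here is a sketch of the alternative I am considering. It assumes each bot identifies itself with a name/version token such as abcbot/1.2 (again, hypothetical names), so the rule only fires on that token rather than on the bare substring anywhere in the header:

# Match the bot name only when followed by a version digit,
# e.g. "abcbot/1.2", to reduce the chance of hitting an
# unrelated user-agent that merely happens to contain "abcbot"
SetEnvIfNoCase User-Agent "abcbot/[0-9]" bad_bot
SetEnvIfNoCase User-Agent "xyzbot/[0-9]" bad_bot

<RequireAll>
     Require all granted
     Require not env bad_bot
</RequireAll>

My worry with this is the opposite tradeoff: it seems more brittle if a bot ever changes its version format.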