The thing is, you don't need a CAPTCHA. Just three if statements on the server will do it (a sketch follows the list):
1. If the user agent claims to be Chrome but didn't send a "Sec-CH-UA" header: Send garbage.
2. If the user agent is a known scraper ("GPTBot", etc.): Send garbage.
3. If the URL is one we generated: Send garbage.
4. Otherwise, serve the page.
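For concreteness, here's a minimal sketch of those three checks as a Flask before-request hook. The framework choice, the scraper list beyond GPTBot, and the `garbage_page()` / `is_generated_url()` helpers (filled in further down) are my assumptions, not a description of the original setup:

```python
from flask import Flask, request

app = Flask(__name__)

# GPTBot is from the rule above; the rest are other well-known crawler UAs.
KNOWN_SCRAPERS = ("GPTBot", "CCBot", "Bytespider", "ClaudeBot")

@app.before_request
def scraper_check():
    ua = request.headers.get("User-Agent", "")

    # 1. Real Chrome sends the Sec-CH-UA client hint (over HTTPS);
    #    a Chrome UA without it is someone faking the user agent.
    if "Chrome/" in ua and not request.headers.get("Sec-CH-UA"):
        return garbage_page()

    # 2. Self-identified scrapers.
    if any(bot in ua for bot in KNOWN_SCRAPERS):
        return garbage_page()

    # 3. URLs that only ever appear inside garbage pages.
    if is_generated_url(request.path):
        return garbage_page()

    # 4. Otherwise, let the request fall through to the normal routes.
    return None
```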
The trick is that instead of blocking scrapers, you serve them randomly generated garbage pages.
Each of these pages includes links that themselves always return garbage. Once those links get into a bot's crawl queue, its requests are identifiable no matter how well it disguises itself.
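The post doesn't say how the generated URLs are recognized, so the scheme below is just one possibility: sign a random slug with an HMAC, so any URL carrying a valid signature can only have come from an earlier garbage page, with no state to keep. It also fills in the `garbage_page()` and `is_generated_url()` helpers assumed above:

```python
import hashlib
import hmac
import random
import secrets

from flask import Response

SECRET = b"rotate-me-occasionally"   # hypothetical signing key
WORDS = ["quantum", "artisanal", "synergy", "bespoke", "ferret", "blockchain"]

def _sign(slug: str) -> str:
    return hmac.new(SECRET, slug.encode(), hashlib.sha256).hexdigest()[:16]

def generated_link() -> str:
    # A link that nothing on the real site ever points to.
    slug = secrets.token_hex(8)
    return f"/{slug}-{_sign(slug)}.html"

def is_generated_url(path: str) -> bool:
    # Rule 3: does the path carry a valid signature we produced?
    name = path.lstrip("/").removesuffix(".html")
    slug, _, sig = name.rpartition("-")
    return bool(slug) and hmac.compare_digest(_sign(slug).encode(), sig.encode())

def garbage_page() -> Response:
    # Cheap to generate: random filler text plus links that will
    # themselves hit rule 3 and return more garbage.
    text = " ".join(random.choices(WORDS, k=300))
    links = " ".join(f'<a href="{generated_link()}">read more</a>' for _ in range(10))
    return Response(f"<html><body><p>{text}</p><p>{links}</p></body></html>",
                    mimetype="text/html")
```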
I use this on my site: after a few months, it's 100% effective. Every single scraper request gets caught by one of the rules. At this point, I could rate-limit the generated URLs, but I enjoy sending them unhinged junk. (... and it's actually cheaper than serving static files!)
This won't do anything about vuln scanners and other non-crawler bots, but those are easy enough to filter out anyway. (URL starts with /wp/?)
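For the non-crawler noise, a prefix check in the same hook style is enough; the prefix list here is just an illustration:

```python
SCANNER_PREFIXES = ("/wp/", "/wp-admin/", "/wp-login.php", "/.env", "/phpmyadmin")

@app.before_request
def scanner_check():
    # Vuln scanners probe well-known paths this site never serves.
    if request.path.startswith(SCANNER_PREFIXES):
        return Response("Not Found", status=404)
```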