@vendelan
The idea is not that they can't, it's that they won't.
If you're a human visiting a website, evaluating some JS costs you at worst a few seconds. If you're a scraper bot trying to fetch millions of pages a second, that cost adds up fast.
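(For the curious: a minimal sketch of the kind of cost asymmetry being described, assuming a hashcash-style proof-of-work challenge. The function name `solveChallenge` and the seed/difficulty parameters are my own illustration, not any particular site's implementation.)

```ts
import { createHash } from "node:crypto";

// Hypothetical challenge: find a nonce so that sha256(seed + nonce)
// starts with `difficulty` zero hex digits. A browser pays this once
// per page; a bot pays it once per page times millions of pages.
function solveChallenge(seed: string, difficulty: number): number {
  const target = "0".repeat(difficulty);
  for (let nonce = 0; ; nonce++) {
    const digest = createHash("sha256")
      .update(seed + String(nonce))
      .digest("hex");
    if (digest.startsWith(target)) {
      return nonce; // proof the client actually did the work
    }
  }
}

// One page load: a fraction of a second for a human's machine.
console.log(solveChallenge("example-seed", 4));
```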
Nacho
@nachof@mastodon.uy
Web design in the early 2000s: Every 100ms of latency on page load costs you visitors.