Web design in the early 2000s: Every 100ms of latency on page load costs visitors.
Web design in the late 2020s: Let's add a 10-second delay while Cloudflare checks that you are capable of ticking a checkbox in front of every page load.
@david_chisnall Cloudflare is a protection racket. It's disgusting.
-
@david_chisnall The late 2020s and LLMs brought us choice: if Cloudflare isn't to your liking, Anubis is happy to add some delay to page load.
-
@david_chisnall Also along those lines, I miss the days when page contents included the most important information instead of being loaded later via JavaScript.
-
@the_wub They check user-agent and challenge anything that claims to be Mozilla (because that's what the majority of bots masquerade as).
Also, weird that Seamonkey can't pass it – I just tried with Servo, and it had no problems.
@jernej__s @the_wub Every graphical web browser claims to be Mozilla.
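For context on why that check matches everything: for historical compatibility reasons, effectively every graphical browser's User-Agent string begins with "Mozilla/5.0". A minimal sketch (the strings below are illustrative examples in the standard format, not captured from any particular install):

```python
# Typical User-Agent strings in the format Chrome, Firefox, and Safari
# actually send -- every one of them "claims to be Mozilla", so a filter
# that challenges anything identifying as Mozilla challenges all browsers.
user_agents = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:120.0) "
    "Gecko/20100101 Firefox/120.0",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 "
    "(KHTML, like Gecko) Version/17.0 Safari/605.1.15",
]

assert all(ua.startswith("Mozilla/5.0") for ua in user_agents)
```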
-
@david_chisnall @MeiLin 400–500 separate data-tracking recipients on each page...
@david_chisnall @drwho
And the worst part is that the majority of effects and stuff you can do with pure CSS and HTML. But because everyone jumped on the React bandwagon the moment JavaScript infected servers in addition to browsers, it was over. In addition to tracking cookies and the like.
I am still wondering what 'legitimate interest' is, aside from a figleaf to still try and track me...
-
@david_chisnall
So what will you do? Nobody gets fired for buying Cloudflare.
-
@david_chisnall It's funny: every time I try to access a website that uses Cloudflare, I have to use something else or disable my VPN and my DNS resolver.
So if they can have my data, they let me use them. So don't tell me it is about protection against bots.
It's about gathering data - or am I just paranoid af?
-
@hex0x93 I know nothing about Cloudflare's data practices. But I do know a lot of sites have been forced to go with Cloudflare because so many AI bots are incessantly scraping their site that the site goes down and humans can't access it - essentially AI is doing a DDOS, and when that's sustained for weeks/months/more then the Cloudflare-type system seems to be the only way to have the site actually available to humans.
I hate it but those f---ing AI bots, seriously, they are ruining the net.
@david_chisnall @zeborah I know, and it probably isn't about data and stuff. But for me it is annoying that it deems me a bot just because of some settings I enabled in my browser and system... ^^
-
@david_chisnall There's the self-hosting option of sticking Anubis in front of your service so that it can throttle visitors by making their browser do a bunch of work.
There's also the bouncing around between various services and proxies in order to get logged in... something I'm currently struggling to figure out because apparently I'm a dumbass who can't figure out Traefik or how to properly set environment variables or something.
-
@david_chisnall I remember optimizing thumbnail-images to within kilobytes of their lives...
...and now apparently nobody thinks twice about requiring many MB of JS code per page-load.
(TLDR: this current nonsense is nonsense.)
-
@woozle I'll just be happy if people stop serving images that should be jpegs or webp in png format.
-
JPG screenshots 🔥
-
@david_chisnall Don't forget the oh so lovely "select all the squares with cars in them" with a picture of one single car and you're not quite sure if you should select the squares in the bottom middle or not.
I'm given to understand that those actually do jack squat for stopping bots... They just use tokens to straight up bypass or something.
I personally don't mind the delay so much, but I do hate having to deal with that crap — especially when it fails and declares that I'm supposedly not human. (How the F was I supposed to know that the blurry red blob in the bottom right was supposed to be a lion? It wasn't even the right color!) The one with the catgirl and a loading bar is fine, I guess. But all that other crap can take a flying leap.
-
@autiomaa So the bots have an option to bypass the captchas meant to catch bots but the humans don't. That tracks. 😩 @mark @david_chisnall
@internic That's not a bug, that's a feature!
I guess...
-
@mark @david_chisnall I don't think that's actually the case, at least not entirely. The main issue is that the Internet is currently being inundated with LLM content crawlers to the point that it overwhelms websites or scrapes content some sites don't want sucked into AI training data. It has caused a massive number of sites to serve those bot-detection pages to everyone. So it's not quite an issue of too many visitors but actually "too many non-human visitors"
@danherbert @mark @david_chisnall Sadly, that is our reality. One siteʼs traffic was 75–80 per cent scraper (even back in 2023) so up went the Cloudflare blocks and challenges. (Before anyone @s me about this, Iʼm not a computer whiz so this is the only thing I know how to use.) And itʼs finally worked after figuring out which ASNs and IP addresses are the worst, with traffic on that site back to pre-2023 levels (which I know means an overall drop in ranking).
-
@woozle @david_chisnall I still do! Old habits.
-
@zeborah @hex0x93 @david_chisnall This pretty much describes us. Scrapers as well as brute-force hackers multiple times per hour (even literally per second). One siteʼs traffic was 75–80 per cent scraper.
-
@david_chisnall "Please wait while we check that your browser is safe" while my laptop goes into full load for a minute or two and runs screaming hot.
Perhaps ending in "We are sorry, but we could not verify you are an actual human; your machine shows suspect behaviour. Send an e-mail to the admin to get access."
@Laberpferd @david_chisnall proof of work is such a bad CAPTCHA. Like, who thought bots couldn't evaluate JS
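The proof-of-work challenge being criticised here works roughly like this (a minimal sketch of the general idea, not Anubis's or Cloudflare's actual scheme; the seed string and difficulty below are made up): the server hands out a challenge, and the client burns CPU until it finds a nonce whose hash meets a difficulty target. Note that nothing in it distinguishes a human's browser from a bot, which is exactly the complaint:

```python
import hashlib

def solve_challenge(seed: str, difficulty: int) -> int:
    """Find a nonce so that sha256(seed + nonce) starts with
    `difficulty` zero hex digits -- the kind of busywork a
    proof-of-work interstitial makes the visitor's machine do."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{seed}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Any client that can run this loop passes -- a headless scraper
# evaluates it just as happily as a browser, only the electricity differs.
nonce = solve_challenge("example-challenge", 4)
assert hashlib.sha256(f"example-challenge{nonce}".encode()).hexdigest().startswith("0000")
```

The server-side check is a single hash, so verification is cheap; the cost is deliberately dumped on every visitor, human or not.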
-
@jackyan @zeborah @david_chisnall And it is totally understandable to protect yourself against that. It is just super annoying for people like me, who value and protect their privacy.
And I am no web scraper, nor am I a hacker...
-
@hex0x93 @zeborah @david_chisnall I hear you as I get annoyed, too. I believe ours is the one with the tick box, so no stupid 'Choose the bicycles' or rejection because you use a VPN.