Web design in the early 2000s: Every 100ms of latency on page load costs visitors.
Web design in the late 2020s: Let's add a 10-second delay while Cloudflare checks that you are capable of ticking a checkbox in front of every page load.
@david_chisnall BRB, going to use my landline phone to call my local lunch spot to place an order that I will walk to go get.
-
@david_chisnall Just had a bunch of these whilst trying to do a reverse lookup on a number used to call me this evening.
I think that the internet was effectively at its fastest in the early 1990s. Dial-up was slow, but pages were static HTML with no JavaScript/font/whatever-else calls to other sites hosting the resources.
Each search on AltaVista would produce a first page full of genuinely useful websites, one of which would be guaranteed to answer your question.
This is NOT just nostalgia.
-
@david_chisnall I don't even care about Cloudflare (and Anubis) checks – those at least rarely last more than a few seconds. What I loathe are the throbbing placeholders that seem to be everywhere now, causing simple text pages to load slower than similarly-looking pages (once the content renders) loaded on dial-up.
@jernej__s @david_chisnall Don't get me started on Anubis.
I was browsing with SeaMonkey and wanted to find out if it was possible to customise/edit a search engine in SeaMonkey. So I followed a link to the mozillazine forums.
Using SeaMonkey I could NOT get past the Anubis check, it just hung and never completed.
Maybe these systems could also check the browser strings and be clever enough to realise that a SeaMonkey user might have a genuine reason to visit the mozillazine website?
-
@the_wub They check the user-agent and challenge anything that claims to be Mozilla (because that's what the majority of bots masquerade as).
Also, weird that SeaMonkey can't pass it – I just tried with Servo, and it had no problems.
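A rough sketch of the heuristic being described, purely for illustration (this is not Anubis's or Cloudflare's actual code, and the allow-list below is made up): practically every graphical browser's User-Agent string begins with "Mozilla/5.0", so a filter that challenges anything claiming to be Mozilla ends up challenging real browsers like SeaMonkey and Servo right along with the bots imitating them.

```python
# Illustrative sketch only -- not the real Anubis logic.
# Idea: most scrapers masquerade as browsers, and every mainstream graphical
# browser identifies itself as "Mozilla/5.0 (...)", so "claims to be Mozilla"
# is used as a cheap trigger for serving a challenge instead of the content.

HONEST_TOOLS = {"curl", "wget", "lynx"}  # hypothetical allow-list of non-browser agents

def should_challenge(user_agent: str) -> bool:
    ua = user_agent.lower()
    if any(tool in ua for tool in HONEST_TOOLS):
        return False                      # honest non-Mozilla agents pass through
    return ua.startswith("mozilla")       # everything browser-shaped gets challenged

# Firefox, Chrome, Safari, SeaMonkey and Servo all send "Mozilla/5.0 ...",
# so they all land on the challenge page, along with the bots imitating them.
print(should_challenge("Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0"))  # True
print(should_challenge("curl/8.5.0"))  # False
```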
-
@internic There is such a payment model on Cloudflare for the LLM companies (giving them much faster download speeds for third-party content scraping), but not for regular consumers.
@autiomaa So the bots have an option to bypass the captchas meant to catch bots but the humans don't. That tracks. 😩 @mark @david_chisnall
-
@david_chisnall Cloudflare is a protection racket. It's disgusting.
-
@david_chisnall The late 2020s and LLMs brought us choice: if Cloudflare isn't to your liking, Anubis is happy to add some delay to page load instead.
-
@david_chisnall Also along those lines, I miss the days when page contents included the most important information instead of being loaded later via JavaScript.
-
@jernej__s @the_wub Every graphical web browser claims to be Mozilla.
-
@david_chisnall @MeiLin 400-500 separate data-tracking recipients on each page.
@david_chisnall @drwho
And the worst part is that the majority of effects and stuff can be done with pure CSS and HTML. But because everyone jumped on the React bandwagon the moment JavaScript infected servers in addition to browsers, it was over.
In addition to tracking cookies and the like, I am still wondering what 'legitimate interest' is, aside from a fig leaf to still try and track me...
-
@david_chisnall
So what will you do? Nobody gets fired for buying Cloudflare.
-
@david_chisnall it's funny, every time I try to access a website that uses Cloudflare, I have to use something else or disable my VPN and my DNS resolver.
So if they can have my data, they let me use them. So don't tell me it is about protection against bots.
It's about gathering data - or am I just paranoid af?
@hex0x93 I know nothing about Cloudflare's data practices. But I do know a lot of sites have been forced to go with Cloudflare because so many AI bots are incessantly scraping their site that the site goes down and humans can't access it - essentially AI is doing a DDoS, and when that's sustained for weeks/months/more, then the Cloudflare-type system seems to be the only way to have the site actually available to humans.
I hate it but those f---ing AI bots, seriously, they are ruining the net.
-
@david_chisnall @zeborah I know, and it probably isn't about data and stuff. But for me it is annoying that it deems me a bot just because of some settings I enabled in my browser and system... ^^
-
@david_chisnall There's the self-hosting option of sticking Anubis in front of your service so that it can throttle visitors by making their browser do a bunch of work.
There's also the bouncing around between various services and proxies in order to get logged in... something I'm currently struggling to figure out because apparently I'm a dumbass who can't figure out traefik or how to properly set environment variables or something.
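For anyone wondering what "making their browser do a bunch of work" looks like under the hood: gates like Anubis hand the browser a proof-of-work puzzle – keep hashing until you find a nonce whose digest has enough leading zeroes – and only let the request through once a valid answer comes back. A minimal sketch of that general idea, with a made-up difficulty and no claim to match Anubis's actual protocol (the real search runs as JavaScript/WebAssembly in the visitor's browser):

```python
import hashlib
import secrets

def make_challenge() -> str:
    # Server side: a random challenge string tied to the visitor/session.
    return secrets.token_hex(16)

def solve(challenge: str, difficulty: int = 4) -> int:
    # "Client" side: brute-force a nonce until the SHA-256 digest starts with
    # `difficulty` zero hex digits. Cheap for one human visitor, expensive for
    # a crawler hitting thousands of pages -- that's the throttling.
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int, difficulty: int = 4) -> bool:
    # Server side: a single hash to confirm the browser really did the work.
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

challenge = make_challenge()
answer = solve(challenge)
assert verify(challenge, answer)
```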
-
@david_chisnall I remember optimizing thumbnail-images to within kilobytes of their lives...
...and now apparently nobody thinks twice about requiring many MB of JS code per page-load.
(TLDR: this current nonsense is nonsense.)
-
@woozle I'll just be happy if people stop serving images that should be jpegs or webp in png format.
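For what it's worth, the conversion is usually a one-liner at build or upload time. A hedged sketch with Pillow (file names are placeholders, and the trade-off cuts both ways – photos shrink dramatically as lossy WebP/JPEG, while flat-colour screenshots are often better left as PNG or lossless WebP, which is the complaint in the next reply):

```python
# Sketch only: re-encode a photographic PNG as lossy WebP and JPEG.
# File names are placeholders; quality values are typical, not prescriptive.
import os
from PIL import Image

img = Image.open("photo.png").convert("RGB")   # drop alpha, JPEG has none
img.save("photo.webp", "WEBP", quality=80)     # lossy WebP, usually far smaller
img.save("photo.jpg", "JPEG", quality=85)      # JPEG fallback for older clients

for name in ("photo.png", "photo.webp", "photo.jpg"):
    print(name, os.path.getsize(name), "bytes")
```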
-
JPG screenshots 🔥
-
@david_chisnall Don't forget the oh so lovely "select all the squares with cars in them" with a picture of one single car and you're not quite sure if you should select the squares in the bottom middle or not.
I'm given to understand that those actually do jack squat for stopping bots... They just use tokens to straight up bypass or something.
I personally don't mind the delay so much, but I do hate having to deal with that crap — especially when it fails and declares that I'm supposedly not human. (How the F was I supposed to know that the blurry red blob in the bottom right was supposed to be a lion? It wasn't even the right color!) The one with the catgirl and a loading bar is fine, I guess. But all that other crap can take a flying leap.
-
@internic That's not a bug, that's a feature!
I guess...
-
@mark @david_chisnall I don't think that's actually the case, at least not entirely. The main issue is that the Internet is currently being inundated with LLM content crawlers, to the point that they overwhelm websites or scrape content some sites don't want sucked into AI training data. It has caused a massive number of sites to serve those bot-detection pages to everyone. So it's not quite an issue of too many visitors but actually "too many non-human visitors".
@danherbert @mark @david_chisnall Sadly, that is our reality. One siteʼs traffic was 75–80 per cent scrapers (even back in 2023), so up went the Cloudflare blocks and challenges. (Before anyone @s me about this, Iʼm not a computer whiz, so this is the only thing I know how to use.) And itʼs finally worked after figuring out which ASNs and IP addresses are the worst, with traffic on that site back to pre-2023 levels (which I know means an overall drop in ranking).
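For anyone in the same boat: the "figure out which IPs are the worst" step doesn't need anything fancy – a quick tally over the access log gets you a shortlist before you ever touch the firewall rules. A rough sketch, assuming a standard combined-format log where the client IP is the first field (mapping IPs to ASNs needs an extra lookup against a GeoIP/ASN database, which is left out here):

```python
# Sketch: count requests per client IP in an nginx/Apache combined-format
# access log and print the heaviest hitters. The log path is a placeholder.
from collections import Counter

counts = Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        ip = line.split(" ", 1)[0]       # combined format: client IP is field 1
        counts[ip] += 1

total = sum(counts.values())
for ip, n in counts.most_common(20):
    print(f"{ip:>15}  {n:>8}  {100 * n / total:5.1f}%")
```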