Web design in the early 2000s: Every 100ms of latency on page load costs visitors.
Web design in the late 2020s: Let's add a 10-second delay while Cloudflare checks that you are capable of ticking a checkbox in front of every page load.
-
@mark @david_chisnall Instead of fixing broken code with proper logging and code performance observability, let's stop all the effort and expect Cloudflare to care about actual humans (and not just about their PaaS billing). 😓
@autiomaa @mark @david_chisnall Honestly I'm kind of surprised there isn't a "pay Cloudflare for X connections without a challenge/captcha", because it would be another revenue stream for them.
-
@david_chisnall This was when the tech bros realized that it is all in comparison to everything else.
If you just make EVERYTHING worse then it doesn't matter that you're bad.
The real story of computing (and perhaps all consumer goods)
-
@david_chisnall It's funny: every time I try to access a website that uses Cloudflare, I have to use something else or disable my VPN && my DNS resolver.
So if they can have my data, they let me use them. So don't tell me it is about protection against bots.
It's about gathering data - or am I just paranoid af?
-
@david_chisnall I don't even care about Cloudflare (and Anubis) checks – those at least rarely last more than a few seconds. What I loathe are the throbbing placeholders that seem to be everywhere now, causing simple text pages to load slower (once the content actually renders) than similar-looking pages did on dial-up.
-
@david_chisnall Oh, and also this:
RE: https://infosec.exchange/@jernej__s/116028286564917007
-
@internic There is such a payment model on Cloudflare for LLM companies (giving them much faster download speeds for third-party content scraping), but not for regular consumers.
-
@mark @david_chisnall I don't think that's actually the case, at least not entirely. The main issue is that the Internet is currently being inundated with LLM content crawlers, to the point that they overwhelm websites or scrape content that some sites don't want sucked into AI training data. That has caused a massive number of sites to serve those bot-detection pages to everyone. So it's not quite an issue of too many visitors, but of "too many non-human visitors".
@danherbert @david_chisnall I wasn't limiting "visitors" to humans.
-
This morning, Cloudflare decided that a company I wanted to place an order with shouldn't trust me, so I went to one of their competitors.
@david_chisnall There is a hilarious possible future where the government fails to do anything about monopolies, but Cloudflare has a de facto competition-increasing effect, because it makes it so onerous for everyone to use one site that people start self-selecting to use other sites.
-
@david_chisnall I'd like to automate the process of responding to Cloudflare's checks
@jackeric that's exactly what their code is designed to prevent
It's still possible, but... not without some fighting.
-
@david_chisnall It's also the tens of megabytes of frameworks and JavaScript and ad services that have to be loaded every single time.
-
@david_chisnall @MeiLin 400-500 separate data tracking recipients on each page.
-
@david_chisnall BRB, going to use my landline phone to call my local lunch spot to place an order that I will walk to go get.
-
@david_chisnall Just had a bunch of these whilst trying to do a reverse lookup on a number used to call me this evening.
I think that peak internet speed was in the early 1990s. Dial-up was slow, but pages were static HTML with no JavaScript/font/whatever-else calls to other sites hosting the resources.
Each search on AltaVista would produce a first page full of genuinely useful websites, one of which would be guaranteed to answer your question.
This is NOT just nostalgia.
-
@jernej__s @david_chisnall Don't get me started on Anubis.
I was browsing with SeaMonkey and wanted to find out if it was possible to customise/edit a search engine in SeaMonkey. So I followed a link to the mozillazine forums.
Using SeaMonkey I could NOT get past the Anubis check, it just hung and never completed.
Maybe these systems could also check the browser strings and be clever enough to realise that a SeaMonkey user might have a genuine reason to visit the mozillazine website?
-
@the_wub They check user-agent and challenge anything that claims to be Mozilla (because that's what the majority of bots masquerade as).
Also, weird that SeaMonkey can't pass it – I just tried with Servo, and it had no problems.
-
@autiomaa So the bots have an option to bypass the captchas meant to catch bots but the humans don't. That tracks. 😩 @mark @david_chisnall
-
@david_chisnall Cloudflare is a protection racket. It's disgusting.
-
@david_chisnall The late 2020s and LLMs brought us choice: if Cloudflare isn't to your liking, Anubis is happy to add some delay to page load.
-
@david_chisnall Also along those lines, I miss the days when page contents included the most important information instead of being loaded later via JavaScript.
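[A minimal sketch of that contrast, as a toy Go server with hypothetical routes and content: the first handler ships the text in the initial HTML, while the second ships a placeholder shell and makes the reader wait for a second round trip plus a script run before anything readable appears.]

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	// Early-2000s style: the important information arrives in the document itself.
	http.HandleFunc("/static", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprint(w, `<html><body><h1>Opening hours</h1><p>Mon-Fri 9:00-17:00</p></body></html>`)
	})

	// Late-2020s style: an empty shell with a placeholder; the actual text
	// needs a second round trip and a JavaScript runtime before it renders.
	http.HandleFunc("/shell", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprint(w, `<html><body><div id="app">loading…</div>
<script>
fetch('/api/hours').then(r => r.text()).then(t => {
  document.getElementById('app').textContent = t;
});
</script></body></html>`)
	})
	http.HandleFunc("/api/hours", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprint(w, "Opening hours: Mon-Fri 9:00-17:00")
	})

	log.Fatal(http.ListenAndServe(":8080", nil))
}
```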
-
@jernej__s @the_wub Every graphical web browser claims to be Mozilla.
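[A minimal sketch of the kind of user-agent gate being described (hypothetical handler names, not Anubis's actual code, though Anubis itself is written in Go). Because every graphical browser's User-Agent string starts with "Mozilla/5.0", a rule like this challenges essentially all human visitors, SeaMonkey included.]

```go
package main

import (
	"fmt"
	"log"
	"net/http"
	"strings"
)

// challengeMozilla sketches the gating rule described above: any client
// whose User-Agent contains "Mozilla" is sent to a challenge instead of
// the content. SeaMonkey, Firefox, Chrome, and Safari all send
// "Mozilla/5.0 ...", so they are all challenged; plain curl is not.
// (Hypothetical names; not Anubis's actual implementation.)
func challengeMozilla(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if strings.Contains(r.Header.Get("User-Agent"), "Mozilla") {
			// A real system would serve a proof-of-work or checkbox page here.
			http.Error(w, "challenge required", http.StatusForbidden)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	content := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "actual page content")
	})
	log.Fatal(http.ListenAndServe(":8080", challengeMozilla(content)))
}
```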
-
@david_chisnall @drwho
And the worst part is that you can do the majority of effects and stuff with pure CSS and HTML. But because everyone started to jump on the React bandwagon the moment JavaScript infected servers in addition to browsers, it was over. In addition to tracking cookies and the like.
I am still wondering what 'legitimate interest' is, aside from a fig leaf to still try and track me...