@glyph Did you quote post something?
-
@rojun @glyph There are definitely people who care about the size of the binary. The problem is that there are a lot more people who don't know or care what architecture their CPU has. Making people choose an arch-specific binary is optimizing for the uncommon case.
As I said elsewhere in the thread, I'm fine with providing arch-specific binaries for people who want them. The download page should have one big "DOWNLOAD" button that provides a universal binary, though. Put the other versions behind a link or something, so people who want them can get them, but the average user has an obvious button to click.
@jalefkowit @rojun if you actually have this problem, then you should also have a launcher executable that can manage your enormous updates. If you already have a 100MB+ *binary*, then your total bundle size is probably up near a gigabyte once you take resources into account, and you're past the point where users should be revisiting a "download" page. That launcher can then be a small universal binary that bootstraps into an architecture-specific download that is transparent to the user.
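A minimal sketch of that bootstrap idea in Python, assuming hypothetical download URLs and filenames; a real launcher would also verify a signature or checksum on what it fetches and account for Rosetta-translated processes:

```python
import platform
import urllib.request

# Hypothetical download endpoints (placeholders, not a real project's URLs).
DOWNLOADS = {
    "arm64": "https://example.com/MyApp-arm64.dmg",
    "x86_64": "https://example.com/MyApp-x86_64.dmg",
}

def fetch_native_build(dest="MyApp-native.dmg"):
    """Download the build that matches the host CPU architecture."""
    # platform.machine() reports "arm64" on Apple Silicon and "x86_64" on Intel;
    # note that a launcher running under Rosetta would see "x86_64" here.
    arch = platform.machine()
    url = DOWNLOADS.get(arch)
    if url is None:
        raise RuntimeError(f"no build published for architecture {arch!r}")
    urllib.request.urlretrieve(url, dest)
    return dest

if __name__ == "__main__":
    print("downloaded", fetch_native_build())
```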
-
@jalefkowit @rojun But also, the vast majority of apps I see doing this absolutely don't have this level of overhead problem. I personally ship _tremendously_ wasteful universal2 binaries including oodles of Python infrastructure I don't use, and my total bundle size is still in the tens of megabytes, with binaries being a fraction of that.
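For context on where the universal2 overhead actually lives: a fat Mach-O is just an x86_64 slice and an arm64 slice stored back to back behind a small header. A rough sketch that reads that header and prints the slices, assuming a hypothetical binary path:

```python
import struct
import sys

FAT_MAGIC = 0xCAFEBABE  # universal ("fat") Mach-O header magic, big-endian on disk
CPU_NAMES = {0x01000007: "x86_64", 0x0100000C: "arm64"}

def list_slices(path):
    """Print each per-architecture slice in a universal binary and its size."""
    with open(path, "rb") as f:
        magic, nfat = struct.unpack(">II", f.read(8))
        if magic != FAT_MAGIC:
            # Thin (single-arch) binaries start with a different magic; this
            # sketch also ignores the rarer 64-bit fat header (0xCAFEBABF).
            print(f"{path}: not a universal binary with a 32-bit fat header")
            return
        for _ in range(nfat):
            # struct fat_arch: cputype, cpusubtype, offset, size, align (big-endian)
            cputype, _sub, offset, size, _align = struct.unpack(">iiIII", f.read(20))
            name = CPU_NAMES.get(cputype, hex(cputype))
            print(f"  {name}: {size / 1024:.0f} KiB at offset {offset}")

if __name__ == "__main__":
    list_slices(sys.argv[1])  # e.g. MyApp.app/Contents/MacOS/MyApp (hypothetical path)
```

On macOS, `lipo -info <binary>` reports the same architecture list from the command line.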
-
@glyph @jalefkowit What's in those bundles?
Niche disclaimer: I deal with copy protected stuff that creates graphics programmatically. There's little cruft in terms of, for example, assets. The copy protection creates oodles of stuff to obfuscate the binary. This results in "large" size bundles that consist mostly of actual binaries (afaik).
-
@amethyst oh yeah, I did catch that. and an MBA too, in case we didn't realize how on-the-nose the whole thing was. I guess it does make sense that would resurrect interest here
@glyph yeah at this point moz management is just laughing at their users
-
@rojun @jalefkowit "binaries" are a fraction because the actual app in my case is mostly Python bytecode which is of course architecture neutral. But also config files, icons, 3rd-party framework resources like translations are bigger than the actual app :)
-
@rojun @jalefkowit "binaries" are a fraction because the actual app in my case is mostly Python bytecode which is of course architecture neutral. But also config files, icons, 3rd-party framework resources like translations are bigger than the actual app :)
@rojun @jalefkowit (these bundles are not particularly well optimized, and looking at them, yeesh, I should probably do another pass to elide some of this stuff)
-
@glyph I have no evidence to back this up, but I have a feeling that at least one of the reasons so many people still do this is that a lot of people who consider themselves Very Technical think a binary exceeding some arbitrary size is a sign of "bloat".
-
@glyph Today's hammers are the worst hairbrushes you'll ever have to use.
(Never mind that no amount of hammer progress will make a better hairbrush, that hammers aren't actually progressing, and that hairbrushes are a mostly solved problem that does not generally involve hammers. Oh, and never mind that this analogy sucks in that hammers are still useful for something, unlike GPT-whatever-version.)
-
@glyph I am so vicariously embarrassed for the people who believe the "eventually these will become self-improving and will grow exponentially" lie
if only I had some bridges to sell
-
@glyph *offering solidarity in shared posting through it about our inane decade of inane bullshit*
-
@glyph I would call it a child's understanding of how technology develops, but this would be insulting to children
it's a billionaire's understanding
-
@psistarpsiii @jamesh @glyph That's kinda where the analogy fails, I suppose... really, if the slopbros had their way we'd all have nails for hair, but the nails would be about as stiff as wet noodles.
-
@glyph
Even worse is the number of influencers who parroted the talking point.
https://youtu.be/0Plo-zT8W9w
Inexcusable for those content creators who do actually understand how neural networks are really trained.
-
@glyph
Gonna be challenging to improve the "intelligence" of the AI chatbots, especially in light of what Meta has been doing to the training-data company that used to supply all the AI tech companies with human-curated answers to questions that their LLMs would learn to regurgitate.
https://www.msn.com/en-us/money/companies/inside-the-fallout-from-mark-zuckerbergs-chaotic-14-billion-ai-deal/ar-AA1S3XmA
-
@SnoopJ @glyph
You don't need to have a bridge to sell. Just a pitch deck that mentions AI a number of times. You can even AI-generate it.
You can sell a lifeless rock with just the promise that the real AI software upgrade is in the pipeline.
https://youtu.be/j31dmodZ-5c
-
@glyph just wait until they ship the ones with screens in them