what if ram buyout is part of a greater attack on personal computing
any other indicators?
@lritter Qualcomm taking over Arduino, Raspberry Pi being dependent on Broadcom (though they started building their own chips with the Pico microcontrollers).
-
All the other crap like firmware signing and bootloader signing isn't really addressing the actual problem space.
SecureBoot doesn't solve "protect the owner of a machine from the machine running code the owner doesn't want it to run"; instead it solves "let the machine only run code that the maker of the machine allows it to run", which is antithetical to free markets and ownership.
@lritter That's not a new fight though. The whole struggle for the ownership of our devices has been going on for over 20 years. In the noughties they called it TCPA.
-
@lritter IMHO this 'war on personal computing' has been going on in the background for decades with varying intensity, similar to how DDoS attacks happen all the time without bringing the whole internet down (mostly, at least). Personal computing might go back to being the niche/hobbyist activity it was in the 70s to 90s, but it won't entirely disappear. I guess it might become an expensive hobby again, though :/
-
@floooh it begs the question what we wrote all this infrastructure for. certainly not so a bunch of locusts can run their datacenters on it, and nothing else.
-
@lritter hmm yeah... AFAIK the home computer revolution was mostly feeding on an overcapacity of slightly outdated 8-bit chips which suddenly became very cheap at the start of the 80s, and hobbyists and enthusiasts started to build all sorts of awesome things with this 'junk'.
Maybe we'll see a similar Cambrian Explosion after the AI bubble pops and suddenly there's a shitton of just slightly outdated GPUs and memory flooding the market waiting to be used for actually interesting stuff :)
-
@floooh indeed. i speculated as much myself.
-
@tomtrottel you see a self-sufficient people, i see future addicts i mean customers! customers!
@lritter it's way worse than "just" creating a "drug" syndicate. it's colonization. it's creating slaves and kills the rest. a drug dealer does not want to kill their users.
-
SecureBoot is indeed one of those pretend-solutions that don't really increase the security of the machine or protect the customer, but only benefit the vendors.
There's only one good part in the SecureBoot chain, and that is measured boot, i.e. checksum-chaining each subsequent piece of software before it's executed. Combine this with verity-checked read-only filesystems that are tied to the boot measurements via an owner-supplied key (= passphrase) and you're good.
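The checksum-chaining works roughly like a TPM PCR extend: each boot stage hashes the next component and folds that digest into a running measurement before handing over control. A minimal sketch in Python (the stage names and the choice of SHA-256 are illustrative, not what any particular firmware actually uses):

```python
import hashlib

def extend(pcr: bytes, component: bytes) -> bytes:
    """Measured-boot style extend: new PCR = H(old PCR || H(component))."""
    return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

# Start from an all-zero register, then measure each boot stage in order.
pcr = bytes(32)
for stage in [b"firmware", b"bootloader", b"kernel", b"initrd"]:
    pcr = extend(pcr, stage)

# Changing any stage, or the order of stages, yields a different final
# value, so the result can be compared against an owner-recorded reference.
print(pcr.hex())
```

The point of the chaining is that the final value commits to every stage and their order; the owner only has to record one known-good digest.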
@datenwolf @lritter and that isn't worth #Microsoft being in charge of what can boot on all PCs
-
@lritter yes, I was referring to my colonization analogy. you need to see the bigger picture, 'cause this will have devastating consequences combined with what they can do with absolutely enslaved computer users and the authoritarian tendencies that come with colonization. but never mind me, I hope I am wrong and it is just a happy "let's turn them into addicts" :)
-
CPU microcode "security patches" - a discovered "fundamental flaw" requiring always-online validation to boot. turns every processor into a subscription service. precedent already exists with Intel Management Engine and AMD PSP.
ethernet/network chips - if networking hardware became scarce or required licensing/authentication.
discrete GPU extinction - already happening organically, but accelerated scarcity would eliminate any serious computing, gaming, or AI work on personal machines.
@lritter your new year's resolutions are very ambitious D:
-
@IngaLovinde @floooh @lritter
If AI investor funding to build and run new data centres dries up, the cloud compute power will have to be sold off cheaply. Every hacker will be making services that run in the cloud. The fediverse might benefit from this. Unviable data centres will have to be powered down and parts auctioned off. This will be an opportunity for those resourceful few who can figure out how to repurpose server rack hardware for home use.
https://hackaday.com/2022/01/24/domesticating-old-server-hardware-in-the-age-of-shortages/
-
i sure would hate it if open source were the eli sunday whose milkshake will be drunk
@lritter i think the term which resonates with your doom scenario is 'learned helplessness'. I'm also curious whether the upcoming 'challenge-based learning' methodology (which China also seems to be adopting for schools) will combat it, and whether AI will further stigmatize 'deep' knowledge ("Since AI can write code for you, humans don't need to learn to code anymore, and can just use its output. Would you like me to run the app you described here?"). An engineer-based society might help?
-
@floooh @lritter
I think there will first be a race to the bottom along the enshittification curve. AI companies cannot convince enough free-tier users to start paying, so they will have to embed ads and sell promoted answers. Businesses will pay OpenAI/Google to train their chatbot to give a certain answer for specific topic areas, like when a user asks "where should we go out for a meal tonite?", "what is the best medication for a headache?", "what are the reviewers saying about that movie?"
-
@bornach @floooh @lritter your link is for general-purpose servers.
The "AI" servers are an entirely different thing, from what I understand. Sure, you can salvage a motherboard and CPU and RAM sticks from them, but that's not a lot (like, the entire "AI" industry will only give you several hundred thousand relatively weak PCs, if you discard the GPUs).
Most of their hardware, and most of their computing power, is in these overpowered GPUs, which are completely useless outside of "AI" loads in hyperscale data centers. Even if you got such a GPU for free, it would still not be worth even the electricity it consumes while running, not with the kinds of loads you might be able to run on it. There is no "repurposing" these GPUs, afaik.
-
@IngaLovinde @bornach @floooh does the A100 have no load scaling?
-
@domi @bornach @floooh @lritter but even if they have things useful for video encoding, I doubt they're more powerful than regular mid-range consumer GPUs in that regard? We do know that their main/only selling point is that they have a lot of FP4 performance.
Maybe, just maybe, enthusiasts will then be able to reflash them with new firmware disabling all the FP4 things, turning these extremely expensive, powerful, large, and hot GPUs into the equivalent of regular mid-range consumer GPUs, unit for unit. But if the market were suddenly to get a million of these devices, even free, that can with effort be turned into, idk, an RTX 5050, that's definitely not going to cause any kind of "Cambrian explosion". Gamers already have 100x that number of similar consumer GPUs in their gaming PCs.
My point is: when measured in units, there is not _that_ much "AI" hardware out there. And when measured in computing power, there is a lot, but 99% of it is useless.