@glyph Did you quote post something?
-
@glyph I am very, very tired of Next Big Things anyway.
-
@miss_rodent @glyph I'd even go beyond theoretically possible, as someone who did a lot of quantum computing work. I think the set of things that would have to be true for quantum computers to be fundamentally impossible or even inherently impractical would be pretty surprising.
At the same time, I don't have any particular evidence that we're especially close, nor that there are any easy wins in terms of Next Big Things once one is built.
-
@miss_rodent @glyph Chemistry is about as close as we have to a field where quantum computers are probably going to be commercially useful, which could unlock several dozen exciting new materials science things, but that's pretty far from (as to @glyph's point) what VCs seem to expect, where every company will need to buy up QC time. It might even be a bigger impact in the long run, but it sure as hell isn't VC's Next Big Thing.
-
@miss_rodent @glyph (Sorry... I tend to infodump on demand about quantum computing.)
-
@xgranade @glyph I tend to infodump too, so very much no need to apologize, and I appreciate the input. It's a field I've looked into a little bit, though not very deeply or very recently, as something tangential to the residual math/physics nerdery that lingered after I dropped out of Uni.
And yeah, it seems unlikely to be a thing corporate and finance types would be that interested in, and even if it is, it doesn't seem like that will happen anytime soon.
-
@miss_rodent @glyph I guess the last thing I should add is that there's a lot of problems where we know a priori that there *isn't* a quantum advantage — if someone spends most of their CPU time now on one of those problems, then they're not likely to be a huge QC customer, making it harder for VCs to sell quantum computing as the Next Big Thing.
-
@miss_rodent @glyph I mean, there have been at least two or three big waves where corporate and finance types were very *very* interested in quantum computing. I think, and again to @glyph's point, there's a large degree to which that can be explained by zero or negative interest rates.
Each time, there's a concrete event that sets off the hype wave, and it dies down when people realize that QC is *always* five years away.
-
@bitprophet you're the first one to catch that (I guarantee it hurt more to write than it does to read :))
@glyph the rest of the post is good too btw!
-
@miss_rodent @glyph Like, one time wound up being a coincidence between the timing of a complicated bit of tax... er... well, not *fraud* exactly, as it was all perfectly legal, so I'll call it tax shenanigans... around the F-35 project and the Volkswagen emissions fraud that indirectly created a gigantic hype wave. That was weird.
Another time, it was the failure of an NLP-based "AI" product that created a hype wave.
-
@miss_rodent @glyph I'm being intentionally vague here in public, but I promise those are both fully accurate descriptions even if a bit cryptic.
-
@miss_rodent @glyph Yeah, no, fair. It's a weird one in that I think there's a lot more there there to QC than AI, by a long shot, but also it's not the line-go-up kind of thing VCs seem to hope it will be, and definitely not on the timescales they seem to think that it will happen on.
-
@ireneista @cthos @xgranade this isn't exactly a "cheerful" thought, but it's also not horribly grim: I think we already saw it break down in 2020, and we saw both its brittleness (nobody had enough slack in their supply chain to actually weather the disruption without exposing catastrophic delays to customers) and its resilience (customers were super mad, but alternate pathways DID come online in less than a year)
-
@glyph I've aspired to "lunch-scale projects" for several years now. Intentionally meaningless—semantic choice in opposition to the differently meaningless "Web scale"—but generally:
- not for crowds (used by one person at a time, maybe a small group)
- bite-sized projects with a smaller scope
- the kind of work where I could fix the average bug before lunchtime
These ideals *have* made it hard to be persuasive in dev job interviews, alas.
All this to say I agree with the thoughts and CTA.
@randomgeek Clearly I have similar aspirations :). Thanks!
-
@xgranade @glyph Yeah, I think there is a lot more potential to QC than there is to LLMs, and likely to AI, overall, though machine learning and stuff, more generally, have some very actually-useful applications in various fields. Trying to make a chatbot into a new work-eliminating machine god seems like a pretty obviously hopeless endeavor though.
I don't think the useful parts of either are going to earn the trillions that could salvage this sort of datacenter money-burning craze though.
-
@miss_rodent @glyph I spent about 10 years on classical ML tools for building QCs and another few on using QCs for doing ML, with a bonus serving of "using small QCs and novel ML algorithms to build larger QCs." Needless to say, I agree about there being some useful stuff under the moniker of machine learning.
@xgranade @miss_rodent love the genre of mastodon post which is "as [literally the most qualified person in the world on this specific topic], 'yes'"
-
@glyph I wonder if the industrial revolution was like this--a few years of wild continuous change and then settling down into a stable form.
-
@astraluma I don't think any of it narrativizes quite that precisely and neatly, and it depends exactly where you were and what you were doing at the time, but, in general, I think, "yes"