[2018] DUDES: We made a website where you can look up a charity and see what % of donations it spends on admin overhead ME: Hey that rules DUDES: It's called effective altruism [2025] OTHER DUDES: So those EA dudes want to pave all farmland on earth for the benefit of hypothetical robots 10,000,000,000,000 years in the future ME: Wha [DOES THIS STORY HAVE A MORAL? I CAN'T TELL.] 
- 
@mcc i think its "never take anything to its logical extreme" 
- 
@mcc or, i guess in this case, illogical, bc im not sure logic applies to that particular philosophical leap 
- 
Robert Anton Wilson, 1979: Any false premise, sufficiently extended, provides a reasonable approximation of insanity 2025: Any false premise, sufficiently extended, turns out to be an already-existing thread on something called "lesswrong dot com" and it turns out a cofounder of Paypal has already given it 10 million dollars 
- 
@ireneista @emaytch @mcc It's the exact same fallacy as Pascal's Wager. Once you assign infinite cost or reward to a decision, any "empirical" or "rational" framework applied to that loss function absolutely shits the bed. The resolution, of course, is "why in the hell do you think that's a possible outcome? what is your basis for that belief?" 
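A minimal sketch, with hypothetical numbers rather than anything from the thread, of the arithmetic being criticized here: once one outcome is assigned an effectively unbounded payoff, a naive expected-value calculation is dominated by it no matter how small its probability is assumed to be.

```python
# A minimal sketch (hypothetical numbers, not from the thread) of how an
# effectively unbounded payoff swamps naive expected-value reasoning.

def expected_value(probability, payoff):
    """Naive expected value of a single outcome."""
    return probability * payoff

# Mundane, well-evidenced intervention: high probability, modest payoff.
nets_now = expected_value(0.9, 1_000)          # 900.0

# Speculative scenario: vanishingly small probability, but the payoff is
# declared astronomically large (10**30 stands in for "effectively infinite").
robots_later = expected_value(1e-12, 10**30)   # 1e+18

# The speculative option "wins" by many orders of magnitude no matter how
# sceptical the probability estimate is -- Pascal's Wager in a spreadsheet.
print(nets_now, robots_later)
```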
- 
@xgranade @ireneista @emaytch @mcc "computer means number go up forever" 
- 
@emaytch @mcc at the risk of almost assuredly being That Guy, actually maybe the hints were there in the original website …something something… disrespect for glue work …rambling segue to my own REAL pet peeve… how this timeline could have been so much better if we'd tracked % of donations sent back in the mail asking for more donations instead ∎ 
- 
@jplebreton @xgranade @emaytch @mcc the singularity is the rapture for people who find computers easier to believe in than old men 
- 
@mcc I once saw an EA person do a presentation and their shiny formula for the expected value of the future of humanity included terms for the star density both in the milky way as well as in the local virgo supercluster as a whole. Absolute clown show of a movement. 
- 
@natevw @emaytch so there's a "soft problematic" version of EA where they get really really focused on dollars that go directly to services, and this winds up over-funding things that accidentally game that number and de-funding important community work where, due to the structural nature of the work, a slightly higher percentage gets spent on facilities or outreach.
your city gets a lot of mosquito nets but no arts funding, in blunt terms 
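A minimal sketch, with purely hypothetical figures, of how ranking organizations on "program spend %" alone penalizes work that structurally needs facilities or outreach staff, regardless of its impact.

```python
# Hypothetical figures only: how a single "program spend %" metric rewards
# low-overhead logistics and penalizes facility-heavy community work.

def program_ratio(program, facilities, outreach):
    """Share of total spending that counts as going 'directly to services'."""
    total = program + facilities + outreach
    return program / total

# Org A: ships mosquito nets, almost no fixed costs.
nets = program_ratio(program=950_000, facilities=20_000, outreach=30_000)    # 0.95

# Org B: community arts space whose mission requires a building and outreach staff.
arts = program_ratio(program=600_000, facilities=250_000, outreach=150_000)  # 0.60

# Ranked on this number alone, B always loses, whatever its actual impact.
print(f"nets: {nets:.2f}  arts: {arts:.2f}")
```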
- 
@jplebreton@mastodon.social @xgranade@wandering.shop @ireneista@adhd.irenes.space @emaytch@mastodon.social @mcc@mastodon.social I suddenly see a connection to one of Zeno's Paradoxes, as well, where when you build a seemingly logical construct of reality (to reach a location, you must first reach halfway between there and your starting point) and take it to the limit (infinitely many halved distances between you and the destination) you end up with results that seem sound (surely you can never complete infinitely many steps!) and yet are easily disproven just by looking at reality. 
The scientific method applied here would suggest, "oh, your model is bad". But it's clear no one applied that here.
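A minimal numerical sketch, not from the thread, of why Zeno's construction also fails on its own terms: the halved distances form a geometric series whose partial sums converge to 1 rather than growing without bound.

```python
# The halved distances 1/2 + 1/4 + 1/8 + ... form a geometric series;
# its partial sums approach 1, so infinitely many steps cover a finite distance.
from fractions import Fraction

total = Fraction(0)
for n in range(1, 31):
    total += Fraction(1, 2 ** n)

print(total)         # 1073741823/1073741824 -- just shy of 1
print(float(total))  # 0.9999999990686774; the full infinite sum is exactly 1
```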
- 
@aud @mcc @jplebreton @emaytch @ireneista There was a great article I read a while ago about how you can understand science as the transition from rationalism to empiricism. That is, that science is the idea that you need to actually check your logic against the real world. There are many logically consistent worlds which are not ours, so it doesn't matter what you derive in your own brain if you don't have a connection out to empirical observation. Techbros could stand to take note. 
- 
@xgranade @aud @mcc @jplebreton @emaytch and it's worth remembering that this radical position, that observation of reality is irrelevant and unimportant, is the thing that today's rationalist movement is actually named for. it's not actually about the practice of science, though we think many of its adherents don't fully appreciate that. 
- 
@xgranade @aud @mcc @jplebreton @emaytch (the thread is about effective altruism but in practice there's heavy overlap in those communities) 
- 
@ireneista @aud @mcc @jplebreton @emaytch Yep. It's radical and also *deeply* reactionary. Notably, it's also a huge break from even the reactionary forms of New Atheism. At one point, Sam Harris of all people was arguing that empiricism was necessary to ethics, not just philosophy. Now the whole reactionary movement seems to have left even that behind. 
- 
@xgranade @ireneista @emaytch @mcc ...but then you try to apply it to something like intercepting killer asteroids and they suddenly get all "well, let's be reasonable and practical." It's all just different forms of confirmation bias. 
- 
@ireneista @aud @mcc @jplebreton @emaytch Yeah, absolutely. I disagree about some of the specifics, but "TESCREAL" as a term to point to the intersection of all of those related but distinct philosophical schools is useful nonetheless. 
- 
@TomF @ireneista @emaytch @mcc I love how in e/acc and xrisk language, climate change is not "existential," only these pretend billion-year-future things. 