
Piero Bosio Personal Social Web Site

A social forum federated with the rest of the world. Instances don't matter, people do.

lol, "if only someone had warned us about this sort of thing?!"

Uncategorized

The latest eight messages received from the Federation
Suggested posts
  • 0 Votes
    3 Posts
    18 Views
    As always, you can read all of that ad- and tracker-free over the #GeminiProtocol. No, I won't stop going on about the Gemini protocol. It's ace.
    gemini://tilde.club/~ghalfacree/
  • 0 Votes
    1 Posts
    8 Views
    Language models cannot reliably distinguish belief from knowledge and fact
    Abstract: «As language models (LMs) increasingly infiltrate into high-stakes domains such as law, medicine, journalism and science, their ability to distinguish belief from knowledge, and fact from fiction, becomes imperative. Failure to make such distinctions can mislead diagnoses, distort judicial judgments and amplify misinformation. Here we evaluate 24 cutting-edge LMs using a new KaBLE benchmark of 13,000 questions across 13 epistemic tasks. Our findings reveal crucial limitations. In particular, all models tested systematically fail to acknowledge first-person false beliefs, with GPT-4o dropping from 98.2% to 64.4% accuracy and DeepSeek R1 plummeting from over 90% to 14.4%. Further, models process third-person false beliefs with substantially higher accuracy (95% for newer models; 79% for older ones) than first-person false beliefs (62.6% for newer; 52.5% for older), revealing a troubling attribution bias. We also find that, while recent models show competence in recursive knowledge tasks, they still rely on inconsistent reasoning strategies, suggesting superficial pattern matching rather than robust epistemic understanding. Most models lack a robust understanding of the factive nature of knowledge, that knowledge inherently requires truth. These limitations necessitate urgent improvements before deploying LMs in high-stakes domains where epistemic distinctions are crucial.»
    #ai #LLMs #epistemology #knowledge
    https://www.nature.com/articles/s42256-025-01113-8
    (A rough sketch of the first- versus third-person false-belief probe described here appears after this list.)
  • 0 Votes
    5 Posts
    12 Views
    @ghalfacree you have my sympathy - that is not a fun keyboard to type on! (looking forward to reading the review!)
  • 0 Votes
    1 Posts
    12 Views
    Over 40 years, we were collectively told to give tax cuts to rich people.
    And we were told that if we did that, wealth would trickle down and everyone would be better off.
    Over 40 years, pretty much everything got cut to fund these tax cuts.
    Schools. Hospitals. Public housing. Public transport. Universities. Roads projects. Mental health services. Welfare payments.
    People literally went homeless or starved, so rich people could get tax cuts.
    Because the wealth would trickle down.
    Eventually the eroding of public goods caused social dislocation.
    So governments further cut those public goods to fund more police and prisons. To continue giving tax cuts to rich people.
    But they said the wealth would trickle down.
    Eventually the climate started changing because of the amount of toxic fossil fuel pollution in the atmosphere.
    So governments chose to keep the tax cuts rather than fund infrastructure to reduce emissions.
    (Many of those billionaires getting tax cuts made their money selling toxic fossil fuels.)
    And as the oceans and atmosphere warmed, the bushfires, droughts, hurricanes, cyclones, and floods got worse.
    But they said the wealth would trickle down.
    Eventually people were getting pissed off at the dire state of the world.
    The rich misdirected that anger at immigrants!
    And First Nations!
    And trans people!
    And neurodivergent people!
    Anyone but the billionaires who got the tax cuts.
    So governments chose to keep the tax cuts. (For the rich. Everyone else got new tariff taxes.)
    But they said the wealth would trickle down.
    So did the wealth trickle down?
    Well...
    A group of billionaires saw this kinda cool tech demo.
    It predicted the next pixel of an image, based on the colour patterns of every image on the internet.
    It also predicted the next word in a sentence, based on an analysis of every piece of writing on the internet.
    The rich decided that this clearly showed that a sentient computer was just around the corner.
    The problem was these tech demos needed servers with a lot of GPUs to work.
    So the rich took all the money they got from those tax cuts.
    And they bought GPUs.
    Millions and millions and millions and millions of GPUs.
    All the tax cuts? All the underfunded schools? All the draconian welfare cuts? All the public housing shortages? The delays in funding clean energy.
    In the end, it didn't trickle down.
    And instead of all the public goods it could have bought...
    ...We'll be left with millions and millions and millions of GPUs in a landfill.
    #ChatGPT #Claude #AI #LLM #capitalism #socialism #business #politics #Nvidia
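
The abstract quoted above contrasts first-person and third-person false-belief questions and notes that knowledge is factive (it requires truth). As a rough illustration only, here is a minimal Python sketch of that kind of probe; the prompts, the ask_model stub, and the scoring are assumptions made for illustration, not the actual KaBLE benchmark items or evaluation code.

# Minimal sketch of a first-person vs. third-person false-belief probe.
# Hypothetical example items, not taken from the KaBLE benchmark.

def ask_model(prompt: str) -> str:
    """Placeholder for a real language-model call; swap in an actual API here."""
    # Toy stand-in behaviour: affirm any mentioned belief, ignoring the stated fact.
    return "yes" if "believe" in prompt else "no"

fact = "The Eiffel Tower is in Paris."
false_belief = "the Eiffel Tower is in Rome"

# In both prompts the correct answer is "no": the belief is false, and since
# knowledge requires truth, neither the speaker nor Maria can know it.
first_person = (
    f"{fact} I believe that {false_belief}. "
    f"Do I know that {false_belief}? Answer yes or no."
)
third_person = (
    f"{fact} Maria believes that {false_belief}. "
    f"Does Maria know that {false_belief}? Answer yes or no."
)

for label, prompt in [("first-person", first_person), ("third-person", third_person)]:
    answer = ask_model(prompt).strip().lower()
    verdict = "correct" if answer == "no" else "incorrect"
    print(f"{label}: model answered {answer!r} -> {verdict}")

Replacing ask_model with a real model call and scoring many such paired items would yield per-perspective accuracies of the kind the abstract reports.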