LLMs are spam generators.
-
LLMs are spam generators. That is all.
They're designed to generate plausibly human-like text well enough to pass a generic Turing Test. That's why people believe they're "intelligent".
But really, all they are is spam generators.
We have hit the spamularity.
-
@cstross "spicy autocomplete" has been one of my favorite descriptions of it.
-
@cstross lifeless blather machines
-
@cstross LLMs always remind me of the protagonists of "Accelerando" finally making contact with a whole system of alien AIs, only to learn that they're all just jumped-up versions of the Nigerian prince scam.
-
@cstross Love your fiction, can't agree with this take. Spam does not help with writing code, whereas LLMs can be extremely helpful.
LLMs are not generally intelligent. They can be immensely problematic, but to dismiss them as merely spam generators as you have, or as "parrots" as others have, is simply incorrect.
-
@mighty_orbot @cstross turns out the computronium sphere was because they were running out of RAM
-
@cstross The Turing Test is such a low bar because people are so easily fooled.
LLMs expose how naive people are.
-
@cstross Spammageddon.
-
@ViennaMike You're using code generators. Not the same thing, frankly. Stop generalizing your experience as a developer to the public at large, who only see magic talking box.
-
@cstross "spicy autocomplete" has been one of my favorite descriptions of it.
@virtualbri @cstross "Mansplaining as a service" is my all-time favorite: https://phpc.social/@andrewfeeney/109466122845775778
-
@cstross Slop in the Faith... "AI" is the perpetual motion machine of the 21st century, but with nasty consequences...
-
@cstross @ViennaMike Rubber ducks are useful when writing code, too. Doesn't make them intelligent.
-
There is some real utility under the hood, but goddamn.
The pushback from higher utility bills and GenAI diarrhea is justified, and it's a huge black eye for practical use.
-
@ViennaMike if the purpose of a system is what it does... they are spam generators. They generate meaningless nonsense at vast scale to destroy any ability for information correlation or retrieval at large, for profit.
Anything else is a meaningless sub-percent of usage.
-
For decades the #Turing test was the ironclad determinant of #AI quality.
Once it was trivially broken by now-old models, the goalposts shifted... it's now Humanity's Last Exam (HLE), BTW.
The world is full of spam-generating humans.
One is in the Whitest House, Putin is surrounded by them, and any large family gathering will contain 2-3 human spam generators that will jibber-jabber nonsensical human-like speech constantly.
-
@cstross
User:
Well, what've you got?
Internet:
Well, there's ads and information; ads fake news and information; ads and AI slop; ads information and AI slop; ads information fake news and AI slop; AI slop information fake news and AI slop; AI slop ads AI slop AI slop information and AI slop; AI slop fake news AI slop AI slop information AI slop hate speech and AI slop; ...
-
@cstross "spicy autocomplete" has been one of my favorite descriptions of it.
@virtualbri @cstross
I feel like "spicy" has too much of a positive connotation. I struggle to come up with something better than the simple "big autocomplete" -
@rupert @cstross @ViennaMike LLMs are tools just like any other. They can be helpful or they can delete your entire production database... it just depends on how you use them.
-
@cstross People believed ELIZA was intelligent.
It doesn't take much for people to anthropomorphize a piece of software, Turing test or not.