@etchedpixels @liw Programmers who think they need LLM coding assistants really just need better languages and libraries.
@mason @etchedpixels @liw both, often enough, are not the choice of the coder...
-
@etchedpixels @liw Formal proofs? You mean, something that requires writing a clear, well specified definition of what you want a system to do?
An LLM (the equivalent of a pub conversation on requirements) is always going to be more attractive to most.
@mbpaz @liw Most of your proofs are implied before the project - like not scribbling on things. For a lot of other stuff you then have a standard interface definition so that guides most of the rest of it.
Imagine a Linux driver of a given class. If there's a formal description of that interface, then a formal-methods-based tool can verify that you meet the interface spec, that you meet the general rules for the kernel, and that you meet the language's don't-scribble rules.
and it's way cheaper than debugging!
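The "don't scribble" rule above can be illustrated with a toy random property check in the QuickCheck style. This is a hedged sketch in Python, not kernel code or real formal-methods tooling; `copy_into` and its contract are invented for the example:

```python
import random

def copy_into(buf, data, offset):
    """Hypothetical driver-style helper: copy `data` into `buf` at `offset`.
    The invented spec: a write must stay entirely inside the buffer
    (the "don't scribble" rule), or be rejected without side effects."""
    if offset < 0 or offset + len(data) > len(buf):
        raise ValueError("write would scribble outside the buffer")
    buf[offset:offset + len(data)] = data
    return buf

def check_no_scribble(trials=1000):
    """Random-testing check of the spec: for each trial, either the call
    is rejected and the buffer is untouched, or it succeeds and every
    byte outside the written range is unchanged."""
    for _ in range(trials):
        size = random.randint(1, 32)
        buf = bytearray(random.getrandbits(8) for _ in range(size))
        before = bytes(buf)
        data = bytes(random.getrandbits(8) for _ in range(random.randint(0, 8)))
        offset = random.randint(-4, size + 4)  # deliberately allows bad offsets
        try:
            copy_into(buf, data, offset)
        except ValueError:
            assert bytes(buf) == before  # rejected calls must not mutate
            continue
        # accepted calls: bytes outside [offset, offset + len(data)) unchanged
        assert bytes(buf[:offset]) == before[:offset]
        assert bytes(buf[offset + len(data):]) == before[offset + len(data):]
    return True
```

A real tool would check this against a machine-readable interface spec rather than random inputs, but the shape of the property (state a rule once, check it mechanically) is the same.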
-
@liw I don't really think it was like that. Maybe way before that, when computers were still mostly analog: digital computers emerged together with programming languages, and those made the older analog computers and their operators obsolete quite quickly. But after that there was no strong claim of anything that would make programmers and developers obsolete (until now, with AI, that is). I'd say it was rather the opposite for a long time. Everyone was strongly encouraged to learn some coding skills, because that was supposed to be a necessity in most future jobs. Younger generations and educational programs especially leaned that way for a long time: from well before 2010 until after covid, at least right up until the AI hype exploded a couple of years ago, with ChatGPT passing some "high level knowledge" exams formerly out of reach for any computer program.
@raulinbonn @liw it's a fact that most people don't have the kind of thinking required to write code, let alone good code.
All the LLM BS is doing is taking away entry-level positions, and when we are dead and gone there just will not be a replacement at the same level; this is going to be a cumulative process.
-
@datarama @liw I feel the difference is about the power structures that keep pushing this "new paradigm". We're talking about some of the worst and the most powerful people on this planet, it ultimately doesn't matter if they make the world worse for everyone while doing it - they have an incentive to do so and they will profit from it.
@jarizleifr @datarama @liw when did we run out of rope? Can't remember how to build a guillotine?
We all know how to deal with those people. Hint: January 1793, France.
We just need to start doing it.
-
I first learned how to program in 1984 at 14. The tech press said I'd be obsolete by 25, due to age.
About 1990 tech press said the Japanese were building fifth generation computers to make me obsolete.
In 2000, the dot com bubble bursting was said to make me obsolete.
There's been neural networks, no-code, and more, since then, to make me obsolete.
Now it's LLMs.
Excuse me while I sit here and don't panic.
@liw
I'd say it's not panic per se; it's an anxiety: "I, personally, won't have a job and means to live because a)... b)... c)", where LLMs are just a cover or an accelerator for some of these a, b, or c.
-
I saw very expensive CASE tools become "shelfware" because the prevailing business culture was to wing it. Also known as the Why the Hell Isn't Somebody Coding Yet? (WHISCY) methodology.
-
@raulinbonn @liw I remember the adverts in the computer press saying that companies wouldn't need programmers to write applications - that was probably the late 80s/early 90s, whilst I was still at school.
-
@raulinbonn @liw I also remember having to learn COBOL in the late 90s!
-
@liw oh i'm not worried about us still being needed
i'm worried about the state the world will be left in when we come back after being fired
-
@datarama @liw I feel the difference is about the power structures that keep pushing this "new paradigm". We're talking about some of the worst and the most powerful people on this planet, it ultimately doesn't matter if they make the world worse for everyone while doing it - they have an incentive to do so and they will profit from it.
This.
Companies are getting away with being more and more rapacious, and their managers have become lazier, more ignorant, and more reckless.
American business disregards legitimate issues unless they come in the form of a court order.
-
@liw I'm not going to lie; to me this time *feels* different.
I learned to program at about the same time you did, though I was younger then. And it might not *be* different; it might just be easier for me to worry now.
@datarama @liw I'm of three minds myself: maybe LLMs will take over (whether they're any better or not, the history of programming is full of such mistakes); maybe they'll fall by the wayside; maybe they'll become a useful tool, in effect the next step in the evolution of programming languages, but with skilled programmers still needed.
I'm also at the point where I can retire whenever I want, and perhaps that is just as well.
-
@liw All that advancement absolutely made obsolete my mad skills in 8086 assembler and CDC Compass. Yippee ki yay!
-
@etchedpixels @liw
I spent a healthy time in a university dept surrounded by maths types whose version of "vi vs emacs" was "Coq vs QuickCheck" (aka theorem proving vs model checking), but both meant, you know, studying things and writing arcane spells. Effort, time. The unthinkable.
-
@datarama @liw Yeah, were I 20 years younger I'd be worried too.
(But I'm an unusual case--no kids, no mortgage, high savings rate. A dozen years ago I was concerned with the stability of my job, checked with my financial advisor, and discovered I'd have been OK even had I stopped working for keeps then. Would that everyone were in that position.)
-
@liw
Same. And at our age we have the luxury that, if this AI BS doesn't calm down in the next 3-5 years, we can chuck it all in the sea and retire.
-
@mason @etchedpixels @liw
So true. I saw this on here a couple of weeks ago.
-
I started programming around 1974 (BASIC/Jean/Algol) and remained stubbornly not obsolete until I retired a couple of years ago.
These days I do occasional programming to scratch my own itches. I couldn't imagine using a stochastic parrot - it would spoil all the fun.
-
@liw There were things that pulled stuff away from programmers though. Much of it at the time was hidden by the growth in demand.
Excel, BASIC, some expert systems, HyperCard, dBase and friends all enabled an army of not-really-programmer people to get real work done without having to become programming experts of any kind. None of them hallucinated or ate entire data centres for lunch. Their output was predictable, if slow, and in general they kept working across upgrades.
BASIC was around long before that. The others were designed and written by programmers.