I work with IT and the AI stuff is...
-
@leah Motivating me to learn a trade and GTFO of the IT! They deserve to run slopshops with no humans besides the chief sloperator.
-
@leah It's pissing off a number of my colleagues. I've decided to retire - twelve days to go!
-
@leah@chaos.social Demotivating. It's not that I think AI will replace me anytime soon. (In fact it won't. I've played with it enough to know that current-gen AI is at best Legacy-Code-as-a-Service. It's not usable for anything productive. At least not in the long term.)
But the issue is that a lot of people - even within the software industry - think it will be able to replace us. It's demotivating to realize that a lot of people don't understand the worth of an experienced software engineer...
-
@leah it doesn't bother me right now, because I don't have to use it. But every time a colleague talks about it, or the company makes decisions to incorporate it more, I groan.
-
@leah @sigmasternchen I'm on the exact same page
-
@sigmasternchen @leah I feel different. For me it is a tool to craft software. Being experienced helps me shape the output. Like every tool, it has a learning curve. But in many cases it helps rather than hinders. I would compare it to the use of an IDE (as opposed to a text editor).
-
@leah motivating me to learn and do my work out of pure rage so that I can tell AI bros where they can shove it
-
@leah AI is actually amazing for single-project maintainers and really helps a lot with offloading things you have to do
-
@leah I might be the only one down as "motivating" here.
I do sadly catch "I don't wanna write this code when the bot can maybe do it" energy (even when the bot manifestly cannot, and it proves to be a waste of time to ask it to try and a cogitohazard to wade through what it spits out).
But "The bot is more diligent about writing and running the tests than I am" (and "The bot will lie and submit code-shaped nonsense instead of code") is definitely "motivating" me to actually be test-driven.
-
@leah And it turns out writing out a couple paragraphs of science fiction about how something ought to work before setting out to make it is a good idea for humans too.
-
@leah AI excitement among my colleagues demotivates me from reviewing their code and collaborating, but on the other hand it motivates me to volunteer and contribute more to free software (to projects which do not use AI to produce the code).
-
@leah In general, figuring out that mechanical processes find your codebase confusing (where are the tests? How does someone even begin to build this? What are the essential pieces of domain knowledge that it is impossible to write useful code without accounting for?) will also reveal problems that make it inaccessible to new humans.
The less clever the automated system, the better it is at tripping over problems that shouldn't be there anyway and inspiring you to have better/any docs.
-
@morl99@hessen.social @leah@chaos.social I used to think that too.
But the output quality even of something like Claude 4.5 Opus is so incredibly bad that I spend more time explaining to the AI that no, "assert status_code in [200, 500]" is, in fact, not a sensible test. That comments that parrot the code do more harm than good. Or that changing the scope of the feature midway through implementing it might be bad. I'm sure some of this can be fixed with precise prompting, but at some point it's just faster to write it by hand - that's what programming languages are for, after all: they are way more precise than natural language for telling the computer what to do.
And granted, maybe that's just my experience because I tend to work on problems with non-trivial solutions, where there is just not as much related training data for the AI. But it still makes them unusable for me and for the work I do. And it feels validating to me that other programmers that I respect also warn about the consequences for code quality - and by extension maintainability - and security.
And this is just regarding the output, right. There's loads of other problems, like deskilling, the unrealistic pricing model, dependency on US companies, the ecological problems, and of course also the moral issues.
Bottom line (for me): current-gen AIs cannot do my job as well as I can. They slow me down and frustrate me when they are "trying to help me". And I think (for various reasons) that we should probably not become dependent on them.
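The "assert status_code in [200, 500]" complaint is concrete enough to unpack: an assertion like that passes whether the request succeeds or the server crashes, so it verifies nothing. A minimal sketch with hypothetical handlers (not code from the thread):

```python
# Two hypothetical handlers: one works, one always fails internally.
def working_handler():
    return 200  # success

def broken_handler():
    return 500  # internal server error

# The vacuous "test" from the post: BOTH handlers pass it,
# so it can never catch the broken one.
for handler in (working_handler, broken_handler):
    assert handler() in [200, 500]

# A test that actually pins down the expected behaviour:
assert working_handler() == 200
```

The point is that a useful test must be able to fail; a set that includes the error case admits the very outcome the test should reject.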
-
@leah But I luckily don't have a bunch of coworkers slinging reams of slop at me that they decline to be accountable for. If that happened I would probably die immediately of rage.
-
@leah It's not really AI that demotivates me. I see AI as a tool that can help you if you use it correctly. Just like a craftsman uses a hammer. If you use it correctly, it makes it easier to hammer a nail into a piece of wood. What demotivates me is the constant influx of people saying, “AI will replace you all,” especially those in the company whose job it is to motivate people. When even your CTO has been telling you for two years that “AI will replace you,” you lose all joy in your work.
-
@leah I hate the tons of bad practices and tech debt in the PRs, and it seems nobody cares.
I cannot review all the PRs, and tons of these bad practices get merged. Companies have started hiring people without experience as leads or as seniors, and they bring their AI shit to work.
I feel that something I love so much - #SoftwareEngineering - is no longer respected, just because AI is the new god and companies want to save money at the expense of software quality.
-
@leah motivating me to invent something much better.
-
@leah I hate when the justification for code is "...because ChatGPT told me", or when people can't answer a simple question about their own fucking code.
-
@leah Something that keeps my mind calm is #openSource #freeSoftware and my personal projects -
a little space where good code still matters.
-
@sigmasternchen @leah I have had problems where the AI was of no (substantial) help to me. I cannot say if this was due to my lack of context-engineering skill at that time. But I really like the AI as a sparring partner, given a predefined workflow and some "fixed" requirements.
But in no way do I see this as a replacement for my job; the AI is nothing without me doing the context engineering. As for the other problems, they surely await new solution strategies.