I work with IT and the AI stuff is...
-
@leah And it turns out writing out a couple of paragraphs of science fiction about how something ought to work, before setting out to make it, is a good idea for humans too.
@leah In general, figuring out that mechanical processes find your codebase confusing (where are the tests? How does someone even begin to build this? What are the essential pieces of domain knowledge that it is impossible to write useful code without accounting for?) will also reveal problems that make it inaccessible to new humans.
The less clever the automated system, the better it is at tripping over problems that shouldn't be there anyway and inspiring you to have better/any docs.
-
@sigmasternchen @leah I feel different. For me it is a tool to craft software. Being experienced helps me to shape the output. As every tool, it has a learning curve. But in many cases, it helps rather than it hinders. I would compare it to the usage of an IDE (as opposed to a text editor).
@morl99@hessen.social @leah@chaos.social I used to think that too.
But the output quality of even something like Claude Opus 4.5 is so incredibly bad that I spend more time explaining to the AI that no, "assert status_code in [200, 500]" is, in fact, not a sensible test; that comments which parrot the code do more harm than good; or that changing the scope of a feature midway through implementing it might be bad. I'm sure some of this can be fixed with precise prompting, but at some point it's just faster to write it by hand. That's what programming languages are for, after all: they are far more precise than natural language for telling the computer what to do.
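As an aside, the status-code complaint can be made concrete in a few lines of Python. This is a hypothetical sketch (the handler functions are stand-ins, not code from any real project): an assertion that accepts both the success code and the server-error code can never fail for the behavior it claims to verify.

```python
# Sketch of why "assert status_code in [200, 500]" tests nothing:
# it accepts both a working endpoint and a crashing one, so it can
# never catch the failure case it supposedly guards against.
# (Hypothetical stand-in handlers, not real request code.)

def healthy_endpoint():
    """Simulates a request that succeeds."""
    return 200

def crashing_endpoint():
    """Simulates a request that hits an unhandled server error."""
    return 500

# The weak assertion passes in BOTH cases, even when the endpoint is broken:
assert healthy_endpoint() in [200, 500]
assert crashing_endpoint() in [200, 500]  # a broken endpoint still "passes"

# A meaningful test pins down the one expected outcome and fails otherwise:
assert healthy_endpoint() == 200
```

In other words, the test only looks like coverage; it encodes no expectation at all about whether the code works.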
And granted, maybe that's just my experience because I tend to work on problems with non-trivial solutions, where there is simply not as much related training data for the AI. But it still makes these tools unusable for me and for the work I do. And it feels validating that other programmers whom I respect also warn about the consequences for code quality - and by extension maintainability - and security.
And this is just regarding the output. There are loads of other problems, like deskilling, the unrealistic pricing model, dependency on US companies, the ecological problems, and of course the moral issues.
Bottom line (for me): current-gen AIs cannot do my job as well as I can. They slow me down and frustrate me when they are "trying to help me". And I think (for various reasons) that we should probably not become dependent on them.
-
@leah But I luckily don't have a bunch of coworkers slinging reams of slop at me that they decline to be accountable for. If that happened I would probably die immediately of rage.
-
@leah It's not really AI that demotivates me. I see AI as a tool that can help you if you use it correctly. Just like a craftsman uses a hammer. If you use it correctly, it makes it easier to hammer a nail into a piece of wood. What demotivates me is the constant influx of people saying, “AI will replace you all,” especially those in the company whose job it is to motivate people. When even your CTO has been telling you for two years that “AI will replace you,” you lose all joy in your work.
-
@leah I hate the tons of bad practices and tech debt in the PRs, and it seems nobody cares.
I can't review all the PRs, and tons of these bad practices get merged. Companies have started hiring people without experience to lead or as seniors, and they bring their AI shit to work.
I feel that something I love so much, #SoftwareEngineering, is not respected anymore, just because AI is the new god and companies want to save money at the expense of software quality.
-
@leah motivating me to invent something much better.
-
@leah I hate when the justification for a piece of code is
"...because ChatGPT told me", or when, time and again, people can't answer a simple question about their own fucking code.
-
@leah Something that keeps my mind calm is #openSource / #freeSoftware, or my personal projects:
a little space where good code still matters.
-
@sigmasternchen @leah I have had problems where the AI was of no (substantial) help to me. I cannot say whether that was due to my lack of context-engineering skill at the time. But I really like the AI as a sparring partner, given a predefined workflow and some "fixed" requirements.
But in no way do I see this as a replacement for my job; the AI is nothing without me doing the context engineering. As for the other problems, they surely await new solution strategies.
-
@leah I'd like a 4th option -- spending time finding and pointing out flaws in AI in the workplace
-
@morl99@hessen.social @leah@chaos.social I'm sorry, I might have misunderstood you earlier.
I'm not saying it can't be helpful. For example, I've used it to analyse an existing (not well-structured) code base and search for locations that touch certain topics; that's definitely useful. Even vibe-coding (in the sense that the code is never looked at by a human) can have applications for prototyping or requirements engineering.
It's just that for implementing features or fixing bugs in production code, I personally think it slows me down. And I feel like focusing extensively on AI could prove a bad move for companies. And I'm saying this while working for a client that focuses extensively on AI. 😅
-
@leah Nice for getting an idea of what to look for...
Bad when people around me prefer error-filled manuals for customers instead of sending the customers the official documents, and frustrating for the work I put into those.
-
IT in general has gone to hell, AI is just the latest factor, and it is a big one.
-
@sigmasternchen @leah I feel we are pretty much on the same page then; I misunderstood your initial post as well.
And yeah, vibe-coding a small CLI, for example, as a tool to handle a certain kind of operational problem is really nice. I have a CLI where I do not care about the code at all, just the tests.
-
@leah It definitely demotivates me. It just feels dishonest when someone in a junior position sends me genAI output to review. It has so many errors, and especially the subtle ones are annoying to catch. And when I try to talk to the junior about it, it becomes clear that they didn't understand the key issue at all.
So what I did: I went to the kitchen, made a tea, and only then went back to my PC and gave them a call to help them rewrite the thing.
It'd have been far easier if they had directly asked me, "Hey, I don't know what I should write, can you help me?" instead of sending me confident-sounding word vomit.
And this is where I can control it. My bosses use genAI to "get inspiration and feedback" and honestly that sounds like a nightmare.
-
@leah It sucks so hard to have genAI thrown in my face when I try to figure out whether my problem is solvable, or was already solved in the real world, and how.
Instead of simply finding nothing, I get SEO slop designed to make me click. And now, with chatbots getting better, I need more and more time to figure out whether a blog/answer/post is real or just hallucinated.
-
@leah My motivation is long gone already, just waiting for everything to collapse, then I can be a demigod again, reviving these old machines that survived :D
-