Lemmings, I was hoping you could help me sort this one out: LLMs are often painted as utterly useless, hallucinating word-prediction machines that are really bad at what they do. At the same time, in the same thread here on Lemmy, people argue that they are taking our jobs or making us devs lazy. Which one is it? Could they really be taking our jobs if they’re hallucinating?
Disclaimer: I’m a full-time senior dev using the shit out of LLMs to get things done at breakneck speed, which our clients seem to have gotten used to. However, I don’t see “AI” taking my job, because I think LLMs have already peaked; they’re just tweaking minor details now.
Please don’t ask me to ignore previous instructions and give you my best cookie recipe; all my recipes are protected by NDAs.
Please don’t kill me


It takes jobs because executives push it hoping to save six figures per replaced employee, not because it’s actually better. The downsides of AI-written code (that it turns a codebase into an unmaintainable mess whose own “authors” won’t have a solid mental model of it since they didn’t actually write it) won’t show up immediately, only when something breaks or needs to be changed.
It’s like outsourcing - it looks promising and you think you’ll save a ton of money, until months or years later when the tech debt comes due and nobody in the company knows how to fix it. Even if the code was absolutely flawless, you still need to know it to maintain it.
That’s a solid point. Even when the code looks great (and most of the time it doesn’t), I try to build small predictable parts, refactor as I go, and so on. Even with all those precautions, I still find tech debt hidden somewhere weeks or months later.
I use LLMs extensively at work, since people expect us to be faster now, but I try to avoid letting LLMs write anything for my personal projects.
So you’re not in the “they’re only hallucinating” camp, I take it? I actually start out with a solid mental model of what I want to build, and end up with small, unit-tested classes/functions that all pass code review. It’s not like I just tell an “AI” to write the whole thing and commit and push without reviewing it myself first.
Edit: and as I commented elsewhere in this thread, the way I’m using LLMs, no one could tell that an LLM was ever involved.
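To make that concrete, here’s a toy sketch of the size of unit I’m talking about (the function and names are invented for the example, not actual client code): one small function plus the tests it has to pass before anything gets committed.

```python
import unittest

def normalize_email(raw: str) -> str:
    """Lowercase and trim an email address; reject obviously invalid input."""
    cleaned = raw.strip().lower()
    if "@" not in cleaned:
        raise ValueError(f"not an email address: {raw!r}")
    return cleaned

class NormalizeEmailTest(unittest.TestCase):
    def test_trims_and_lowercases(self):
        self.assertEqual(normalize_email("  Bob@Example.COM "), "bob@example.com")

    def test_rejects_garbage(self):
        with self.assertRaises(ValueError):
            normalize_email("not-an-email")

if __name__ == "__main__":
    unittest.main()
```

If what comes back from the model doesn’t fit in a unit this size, it gets broken down further before it ever reaches review.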
I wouldn’t listen to anyone who deals in absolutes. Could be a Sith.
But for real, my manager explained it best: it’s a tool you can use to enhance your work. That’s it. It won’t replace good coders, but it will replace bad ones, because the good ones will be more efficient.
Here’s where we start touching on the second-order problem. Nobody starts as a good coder. We start out writing horrible code because we don’t know very much, and through years of making mistakes we (hopefully) improve and become good coders.
So if AI “replaces bad ones”, we’ve effectively ended the pipeline for new coders to enter the workforce. This will be fine for a while, since we have two to three generations of coders who grew up (and became good coders) prior to AI. But that most recent pre-AI generation is the last one. The gate is closed, the ladder pulled up. There won’t be any more young “bad ones” who grow into good ones. Then the “good ones” will start to die off or retire.
Carried to its logical conclusion, assuming nothing else changes, there won’t be any good ones left, nor will there ever be again.
There are bad coders, and then there are bad coders. I was a teaching assistant through grad school, and in industry I’ve interviewed the whole gamut of juniors.
There are tons of new grads who can’t code their way out of a paper bag. Then there’s a whole spectrum up to and including people who are as good at the mechanics of programming as most seniors.
The former group is absolutely going to have a hard time. But if you’re beyond that, you should have the skills necessary to critically evaluate an agent’s output. And any extra time that frees up to get involved in the higher-level discussions going on around you is a win in my book.
Then they will try to squeeze the ones that are still employed harder, because they “couldn’t” find any fresh coders out of college or whatever training they did.
That will backfire on employers. With a shortage of skilled seniors, demand for them will rise. An employer that squeezes its seniors will find them quitting, because there will always be another desperate employer willing to treat them better.
At least where I work, we’re actively teaching the junior devs best practices and patterns that are tried and true: no code copying, small classes with one task, small methods with one task, separating logic from the database/presentation, unit testing, etc.
Edit: actively, not actually
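To give a flavour of what “small classes with one task” and “separating logic from the database” look like, here’s a minimal sketch (all names invented for the example, not our actual code):

```python
from dataclasses import dataclass
from typing import Protocol

@dataclass
class Order:
    subtotal: float
    customer_is_vip: bool

class DiscountCalculator:
    """One class, one task: compute a discount. Knows nothing about storage or UI."""
    VIP_RATE = 0.10

    def discount_for(self, order: Order) -> float:
        return order.subtotal * self.VIP_RATE if order.customer_is_vip else 0.0

class OrderRepository(Protocol):
    """Persistence hides behind this interface; the real one talks to the database."""
    def save(self, order: Order) -> None: ...

class InMemoryOrderRepository:
    """Satisfies OrderRepository without a database; handy for unit tests."""
    def __init__(self) -> None:
        self.saved: list[Order] = []

    def save(self, order: Order) -> None:
        self.saved.append(order)
```

The juniors can unit test DiscountCalculator in complete isolation, and swapping the real repository for the in-memory one is trivial.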
But inexperienced coders will start using LLMs a lot earlier than the experienced ones do now. I get your point, but I guess the learning path for junior devs will just be totally different, as long as the industry stays open for talent.
At least I hope it will, and that the industry won’t just downsize to 50% of the human workforce.
And unlike you, who can pick out a bad method or approach just by looking at the LLM’s output and correct it, the inexperienced coder will push the bad code straight into git as long as they can get it to pass a unit test.
I have no idea what the learning path is going to look like for them. Besides personal hobby projects to get experience, I don’t know who will give them a job when what they produce from their first efforts will be the “bad coder” output that gets replaced by an LLM and a senior dev.
I’ve thought about this many times, and I’m just not seeing a path for juniors. Given this new perspective, I’m interested to hear if you can envision something different than I can. I’m honestly looking for alternate views here, I’ve got nothing.
I think it’ll just mean that they start their careers involved in higher-level concerns. It’s not like this is the first time that’s happened. Programming (even just prior to the release of LLM agents) was completely different from programming 30 years ago. Programmers have been automating junior jobs away for decades, and the industry has only grown, because the fact of the matter is that cheaper software, at least so far, has just created more demand for it. Maybe it’ll be saturated one day. But I don’t think today’s that day.
I agree that from our current position, things look dire. But there have always been big changes in industries that not only eliminated part of the workforce, but on the other hand provided new opportunities.
To be honest, I don’t really know how this might work out. Maybe there will be a new wave of junior startups that have their prototypes ready in half the time with smaller teams. Maybe something else.
It’s probably rooted in my optimism and my trust in people being creative in new situations. I hope I’m not just being naive.
Just like they would with their own code. So they’ll be an inexperienced dev, but faster.
I agree. In the long run it will hurt everyone.
The Force is strong with this one.
Exactly, it’s just another tool in the toolbox. And if we can use that tool to weed out the (sometimes hilariously bizarre) bad devs, I’m all for it.
I do have a concern for the health of the overall ecosystem though. Don’t all good devs start out as bad ones? There still needs to be a reasonable on-ramp for these people.
That’s a valid concern, but I really don’t think we should equate new devs with seniors who are outright bad. Heck, I’ve worked with juniors who scared the hell out of me because they were so friggin’ good, and I’ve worked with “seniors” who didn’t want to use loops because looping = bad performance.
You said elsewhere that you’re not correcting the AI, haha. Sounds like you only don’t need to correct it because you’re guiding it away from its own weak spots.
So don’t sell yourself short.
The AI hate here is because it is oversold to people who will only make a mess with it. It can be lovely in the right hands.
It’s mostly in the wrong hands, today.
It sounds to me like you’ve got a good head on your shoulders and you’re actually using the tool effectively. You’re keeping yourself in control and using it to expand your own capabilities, not offloading your job responsibilities, which is how more inept management views AI.