The only people who are going to put in the time are people who care enough to. The problem is you have people who didn't care before who were equipped with a garden hose. Now that they have a fully pressurized fire hose, they can make more of a mess faster.
As an author of fine literature, these million monkeys on typewriters simply upset my sense of dignity. And to imagine the impoverished prose so many readers shall forthwith be perusing!
Maybe. But it depends on the metric. It seems like orgs are focused on PR count and token usage. Issues caused by poor code are often lagging indicators, so it's asymmetrical in that respect.
Write lots of code now and statistically look great, while the impact won't be felt until much later.
With the job search and whatnot then yeah, caring becomes a lot more important. That’s true.
Hard disagree. LLMs are fantastic for fixing bad architecture that's been around for a decade because nobody was willing to touch it. I can have it write tons and tons of sanity checks and then have it rewrite functionality piece by piece with far more verification than what I'd get from most engineers.
It's not immediate, it still takes weeks if you want to actually do QA and roll out to prod, but it's definitely better than the pre-LLM alternatives.
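The "tons of sanity checks, then rewrite piece by piece" approach is essentially characterization testing: pin down what the legacy code actually does before any rewrite, then verify the replacement against it. A minimal sketch in Python, where `legacy_discount` and `new_discount` are hypothetical stand-ins for the old and rewritten pieces:

```python
# Characterization ("sanity check") testing: record the legacy behavior
# first, then verify the rewritten piece reproduces it exactly.

def legacy_discount(total):
    # Hypothetical decade-old function nobody was willing to touch.
    if total > 100:
        return total * 0.9
    return total

def new_discount(total):
    # Hypothetical rewritten replacement under verification.
    return total * 0.9 if total > 100 else total

def check_equivalence(cases):
    """Compare old and new implementations on the same inputs."""
    for total in cases:
        old, new = legacy_discount(total), new_discount(total)
        assert old == new, f"divergence at {total}: {old} != {new}"
    return True

# Include boundary values around the branch point, not just happy paths.
check_equivalence([0, 50, 100, 100.01, 250])
```

Generating a large battery of cases like this is cheap with an LLM, which is the asymmetry the comment is pointing at: the verification harness costs almost nothing, and the rollout can still take weeks of QA on top of it.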
Because there is a certain point where barriers to entry prevent meaningful competition once winner-take-all power laws start kicking in, and stability has hitherto been predicated on having a plurality of non-interrelated competitors, ensuring that no one person's quirks drive too much of society's theoretical output.
AI will make this dynamic worse, and it carries the extra danger that the default, banal way of applying the technology in fact encourages its application to that end.
I don't really see it that way, because most software companies overestimate the importance of fantastic software versus merely adequate software, and most of the time good sales development, support, and negotiation skills are what actually sell.
I also don't think that the commodification of programming is a substitute for things like understanding your customers, having good taste in design, and designing software in a way that is easy to iterate on.
But we aren't cooking with gas. We are cooking with a more controlled burner than ever, one that can download a clean-code Claude skill and commit better code than you or I could write.
What would normally be considered overengineered gold plating is "free" now.
Like with a lot of things in this space, it depends where you invest your effort. If you care about quality design and good code, you can definitely get there - but that doesn't happen by default.
With the right investment, we could certainly have tooling that creates and maintains very good designs out of the box. My bet is that we'll continue chasing quick and hacky code, mostly because that's the majority of the code that it was trained on, and because the majority of people seem to be interested in a quick result vs a long-term maintainable one.
Forget what AI can and cannot do. What can you do?
If you are only doing data entry into an LLM without understanding how any of this actually works then what do I need you for? I can just promote the janitor at half the cost to do your job.
Second, LLM code can be less of a hot mess than human written code if you put in the time to train/prompt/verify/review.
Generating perfectly patterned, SOLID, unit-tested code with no warnings or anti-patterns has never been easier.
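For concreteness, the kind of "well-patterned SOLID" output being described usually means things like dependency inversion plus a unit test with a test double instead of real I/O. A toy sketch in Python, with every name (`MessageSender`, `Notifier`, `FakeSender`) hypothetical:

```python
from abc import ABC, abstractmethod

# Dependency inversion: Notifier depends on an abstraction,
# not on a concrete email/SMS implementation.
class MessageSender(ABC):
    @abstractmethod
    def send(self, to: str, body: str) -> None: ...

class Notifier:
    def __init__(self, sender: MessageSender):
        self.sender = sender

    def notify(self, user: str) -> None:
        self.sender.send(user, "Your build finished.")

# Unit test with a hand-rolled fake: no network, no real mail server.
class FakeSender(MessageSender):
    def __init__(self):
        self.sent = []

    def send(self, to: str, body: str) -> None:
        self.sent.append((to, body))

fake = FakeSender()
Notifier(fake).notify("alice")
assert fake.sent == [("alice", "Your build finished.")]
```

Whether this counts as necessary structure or "free gold plating," as an earlier comment put it, depends entirely on the project, but producing it is no longer the expensive part; reviewing it still is.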