The craftsman is dead, long live the craftsman
Someone I respect recently posted on X that “software craftsmanship is over, the industrial age of programming is here.” At first I was horrified and wanted to push back, but I think they might have a point.
If you look up “craftsman” in the dictionary, there are two definitions. The first is a person who practices a trade or handicraft. The second is a person who is skilled in a craft.
And I think the “skilled” part is the key. Our industry is full of “definition 1” people. We’ve all been in teams with people who are just coasting, whose contributions are marginal, and sometimes even net negative. Others have focused on gatekeeping. Blocking PRs over personal style preferences, establishing “best practices” that really meant “do it like I would.” Lots of non-actionable feedback, lots of power games, lots of focus on the meta-work instead of the real work. And because the industry had what felt like infinite money, we just tolerated it. That was never craft. Maybe we needed AI to finally expose it. Maybe our industry earned this.
The problem is that the overcorrection is punishing the wrong people. The good are paying for the sins of the bad.
Understanding the fundamentals, having good judgment, being able to debug the thing that breaks at 3am? That’s needed more than ever. When anyone can generate a thousand lines of code in minutes, the craft isn’t in the production anymore. It’s in knowing what to build, and how to develop systems that can grow over time without falling apart.
But the AI conversation instead has always been about more. More output, more speed. And the pace at which it’s escalating is frightening. We’ve gone in a few months from code completion, to agentic coding, to managing fleets of agents. Run ten. Run a hundred. Don’t even look at the code anymore, just look at the results. The best engineers, they say, are orchestrating swarms of agents from above, never touching a line of code. And if you’re not doing that, you’re falling behind. Or that’s what they’d have us believe.
On top of that, there’s this resurgence of spec-driven development. Write detailed specs upfront, let agents run for 40 minutes, an hour, can you do 10 hours? Figure everything out before you start. We learned decades ago that this doesn’t work. You make decisions with incomplete information and then you’re locked into them for however long the agents run. This is old waterfall wearing an AI costume. Iterative development exists for a reason. Problems reveal themselves through building, not through planning.
Look, I’m not advocating for working without a plan either. You need to know which direction you’re going. But how you get there? That’s the part where you iterate. Because there’s always a gap between the plan and the result. The design in your head (or the agent’s plan mode) and the end result are never the same thing. You only see where they diverged by reviewing what comes out. And that gap is where the craft lives.
Then there’s also the meta-work trap. You add agents to speed things up. Now you need orchestration because the agents create complexity. Then you need monitoring for the orchestrators. Suddenly all your effort goes into managing the machinery instead of solving the actual problem. Layers of abstraction solving problems created by the previous layer. The original question was “how do I build this feature?” and now it’s “how do I manage this sprawling system of agents?” You traded one kind of work for worse work that’s further from the thing you’re actually trying to build.
This narrative isn’t just technically misguided. It’s doing real damage. If you’re a solid engineer who thinks carefully about your work, who reviews code, who cares about design, the current discourse makes you feel like a loser. Like you’re not good enough. Like somehow these people running agent swarms have figured out something you haven’t. You internalize the doubt instead of questioning the premise.
And so the people who would genuinely benefit from AI as a tool are tuning out entirely. They don’t want to be part of that culture. They don’t want to become those guys. The hype is narrowing who feels welcome in the conversation, and that’s a genuine loss.
All of this makes me think of the fashion world (really?). There are fast-fashion brands like Shein: you buy some cheap pants for twenty bucks, they get the job done, but they fall apart in three months. There’s haute couture, beautiful and exclusive but not exactly practical for everyday life. And then there’s Levi’s. Not disposable, not luxury. Just well made. A pair of jeans that lasts, that you can actually afford, that you want to keep wearing. The industry right now is acting like Shein is the only option. Ship fast, ship more, who cares if it falls apart. It’s erased the middle from the conversation entirely.
Similarly with programming, you shouldn’t obsess over every single line of generated code. That’s cargo cult rigor. Instead focus on what actually matters. What are we testing? What’s the shape of the code? How is it organized? Are the names consistent? Are the APIs clean? Is my data layer well designed? Are the boundaries between components right? Those are the design decisions that cascade through everything you build after.
I’m currently building an application for learning to play drums with MIDI feedback. There’s a lot of hard work going into it, a lot of math combined with audiovisual programming. That’s knowledge I don’t have, and it would take a crazy amount of time to learn and get right from scratch. So I use AI to handle a lot of the complexity. That’s where it genuinely shines, and I’m not going to pretend otherwise.
But here’s what I actually focus on. Not the sync algorithm. The architecture. The first prototype had all the logic living inside React components. The audio engine was tangled with the display engine. It worked, but it was a dead end. I had to pull out proper domain models, and make React use them without owning the logic. That’s not a decision AI makes for you. That’s the kind of thing you only see when you’re going slow enough to look at how the pieces fit together. Get that wrong and every feature you add after fights against the coupling.
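To make the separation concrete, here’s a rough sketch of the shape I’m describing. This is a hypothetical illustration, not the app’s actual code: the names (`HitJudge`, `Judgment`) and the timing windows are invented for the example. The point is only that the domain logic is a plain class with no React imports, so components can render its output without owning it.

```typescript
// Illustrative domain model: judges how close a MIDI hit landed
// to the expected beat. No React, no audio APIs — pure logic.
type Judgment = "perfect" | "good" | "miss";

class HitJudge {
  // Tolerance windows in milliseconds (invented values for the sketch).
  constructor(
    private perfectMs: number = 30,
    private goodMs: number = 80,
  ) {}

  judge(expectedMs: number, actualMs: number): Judgment {
    const delta = Math.abs(actualMs - expectedMs);
    if (delta <= this.perfectMs) return "perfect";
    if (delta <= this.goodMs) return "good";
    return "miss";
  }
}

// A React component would only *render* these judgments, e.g.
//   const judge = useMemo(() => new HitJudge(), []);
// The domain object owns the logic; React owns the display.
const judge = new HitJudge();
console.log(judge.judge(1000, 1015)); // 15 ms off → "perfect"
console.log(judge.judge(1000, 1060)); // 60 ms off → "good"
console.log(judge.judge(1000, 1200)); // 200 ms off → "miss"
```

Because the class has no framework dependencies, it can be unit-tested in isolation and reused by both the audio engine and the display layer without coupling them to each other.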
If I’d just accepted whatever AI produced without reviewing it, one agent or five, doesn’t matter, I’d have a working app full of invisible structural problems that would bite me weeks later. Instead, AI handles the complexity while I maintain a clear mental model of the system. Not writing the algorithm myself, but making sure it lives in the right place and doesn’t create hidden dependencies.
And here’s where it compounds. When the codebase is clean, when the design is solid, everything you add next builds on top of something stable. Features grow naturally from the existing structure instead of fighting against it. And when you notice a design problem, and you will, AI makes refactoring dramatically cheaper than before. You can restructure things in one focused effort. That’s a refactoring you simply wouldn’t have done five years ago because the cost was too high. Now you can afford to be picky about design. AI enables craftsmanship instead of destroying it.
Bug fixes work the same way. When the code is coherent, when functions have good names, when the patterns are clear, the AI finds the problem fast. One iteration, done. The clean codebase isn’t just for you, it’s for the agent too. When you work on a spaghetti codebase, figuring out where problems are is a challenge, and we’ve all been there, right? You try to fix a minor bug, it turns into a major task requiring weeks of work, so the bug doesn’t get fixed and lands on a JIRA board with 100 other similarly fated issues. Someone promises they’ll look into it one day, but that day never comes. It doesn’t matter who wrote the code; the challenge is the same for humans and AI alike.
Slowing down, being deliberate, thinking about architecture, that actually makes AI more useful, not less. You’re directing it precisely where it matters. Funny enough, I even have to tell the AI itself to slow down sometimes. Maybe it learned that from us, or maybe it was trained that way. Either way, the irony isn’t lost on me.
I have no doubt that things will keep changing. The tools will evolve and the level of abstraction at which we work will keep rising. But we can’t give up our judgment, our role in ensuring the quality of what we produce. We’ll have to keep honing our skills for sure, but that’s part of the craft.
The craftsman is dead, long live the craftsman.
Cheers!