The teams succeeding with AI coding tools aren't the ones with the best setups. They're the ones that changed how they work.
Since posting 100 PRs and my workflow, I've had the chance to sit down with a small sample of engineering organizations adopting AI. It's not enough to develop a rubric just yet; for now I'm documenting the clearest patterns I'm seeing.
To be clear -- I'm not talking about greenfield vibecoding. These are established organizations operating at scale: multiple engineering teams, existing customers, functioning software, and real process[1].
Bimodal adoption
There's a genuine split. Some teams are thriving, others are drowning[2].
On one side, teams that treated AI as a catalyst for rethinking how they build software. They've restructured codebases, changed review processes, rebuilt deployment pipelines, and invested heavily in shared learning. These teams are getting genuine lift.
On the other side, teams that dropped AI into their existing workflow and expected acceleration. What they got was chaos. PRs of wildly varying quality flooding a review process that was already a bottleneck[3]. Engineers learning on the fly with no shared playbook. Senior engineers -- often learning these new tools on the fly themselves -- trying to hold it all together.
The difference isn't talent or tooling -- it's whether leadership treated this as a process transformation or a tool rollout.
We're all "Product Engineers" now
This is the observation I think is least discussed. You can already see the shift in job descriptions around "Product Engineer", "Founding Engineer", and "AI Engineer".
One team described it well. Their best engineer -- extremely technical, competent with AI tools -- was bottlenecked on getting new work both deployed and specified. Meanwhile a less senior engineer was talking directly to customers, identifying pain points, and fleshing out potential features. The senior engineer was faster. The other engineer was more productive. That's the shift and it was a source of friction.
This isn't the lane for a lot of engineers. Many became engineers precisely because they wanted to solve well-defined technical problems, not sit in ambiguity deciding what the customer needs. And the teams pushing "product engineering" are finding exactly that split -- some engineers thrive in it, others are genuinely struggling. It's not a training issue. It's a fundamental change in what the job asks of you.
I don't have the answer here, but I think it's the question that matters most for how engineering teams evolve over the next few years.
Tooling over process
Engineers are spending serious time turning Claude Code into their own customized hotrod. Custom CLI/MCP servers, elaborate prompt chains, multi-step agent orchestrations. I get the appeal -- but for established codebases you get most of the benefit from a (relatively) simple setup.
One team had spent weeks on an elaborate agent pipeline and was still shipping fewer PRs than before they started. In my case, I default to a simple single-prompt approach more often than not.
Part of the problem is where we're drawing "best practice" from. Social media is full of solo developers or small teams using supercharged setups to smash out greenfield projects. Impressive -- but it does not translate to established codebases, teams, and processes. In those environments, supercharging just breaks things.
The review bottleneck
The same pattern shows up with process. Instead of asking "how might we work differently?" teams are shoe-horning AI into their existing workflow unchanged.
Take PR reviews. When I started my career, PRs didn't exist. In many organizations we committed to trunk and relied on code review meetings, pair programming, and honestly, trust. The PR-for-everything orthodoxy emerged over the last fifteen years and it served us well when the bottleneck was code quality and awareness. But it was always a trade-off against throughput, and the dynamics of that trade-off have changed dramatically[4].
Do you need a senior engineer to review every AI-generated change? I keep hearing "hell yes." I don't buy it for a raft of reasons. A copy change, a dependency bump, a refactor with full test coverage -- these don't need the same scrutiny as a new authentication flow. Risk-tiered reviews aren't a new idea, but they become essential when you're generating code at five or ten times the previous rate.
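To make risk-tiered reviews concrete, the tier can often be derived mechanically from what a PR touches. A minimal sketch, where the paths, suffixes, and tier names are all hypothetical rather than a recommendation for any real repo:

```python
# Hypothetical sketch: route a PR to a review tier based on the files it
# changes. Real rules would live in CI config and be tuned per codebase.

HIGH_RISK_PREFIXES = ("src/auth/", "src/billing/", "migrations/")
LOW_RISK_SUFFIXES = (".md", ".txt")

def review_tier(changed_files: list[str]) -> str:
    """Return 'senior-review', 'peer-review', or 'auto-merge'."""
    if any(f.startswith(HIGH_RISK_PREFIXES) for f in changed_files):
        return "senior-review"   # auth flows, billing, schema changes
    if all(f.endswith(LOW_RISK_SUFFIXES) for f in changed_files):
        return "auto-merge"      # copy changes, docs
    return "peer-review"         # everything else gets a normal review
```

The point isn't this particular rule set; it's that the routing decision is cheap to automate, so scrutiny can be spent where the risk actually is.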
There are tools to make higher throughput safe. Feature flags. Progressive rollouts. Automated test coverage that gives you confidence. At Pyn we have Customer Success shipping PRs[5]. The time going into supercharged-code-tooling should be going into unlocking these process bottlenecks.
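Progressive rollouts in particular are simpler than they sound. A minimal sketch of deterministic percentage bucketing (the function and flag names are illustrative, not any particular feature-flag product's API):

```python
import hashlib

def rollout_enabled(flag: str, user_id: str, percent: int) -> bool:
    """Deterministically bucket a user into a progressive rollout.

    Hash flag + user into a 0-99 bucket and compare against the rollout
    percentage, so a given user's result is stable across requests.
    """
    digest = hashlib.sha256(f"{flag}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < percent
```

Ramp `percent` from 5 to 50 to 100 as confidence grows; a bad change hits a fraction of users instead of all of them, which is what makes higher PR throughput survivable.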
It's not just the engineering teams. Equally, you may have Customer Success teams stressed because features are shipping faster than they can communicate or document them. That's another area where process improvements and adjusted expectations are essential. These changes do not exist in a silo; in a software organization they touch almost every single employee.
Not committing to the change
Several teams are running hackathons or "AI weeks." Usage spikes, people get excited, and then it fades. Within a month they're back to their old workflow.
Part of the problem is that hackathon side projects on greenfield ideas don't build skills for the day-to-day of an established codebase. And if the hackathon is on the day-to-day, it can make things worse. It's easy to make a mess of an existing codebase. You need to imagine that Claude is a competent software engineer that has just joined your team -- it needs onboarding to be successful[6].
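That onboarding can start with a project memory file checked into the repo. A minimal sketch of what one might contain -- the contents here are illustrative, not a template for any particular codebase:

```
# CLAUDE.md (illustrative)

## Architecture
- Monolith in app/, background jobs in workers/, shared types in lib/

## Conventions
- Run the test suite before proposing a change
- Never edit generated files; regenerate them instead
- New endpoints ship behind a feature flag

## Gotchas
- billing/ is legacy code with thin test coverage: flag before refactoring
```

Exactly like a new hire's onboarding doc: cheap to write, and it prevents the "competent engineer making confident messes" failure mode.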
Once the hackathon is over, "urgent get this done today" tickets come charging back. Under pressure people revert to what they know and what they can rely on.
Engineers driving AI adoption should be spending half their time right now on upskilling and improving process. That sounds like a lot. It isn't. The productivity gap between someone who's internalized these tools and someone still fighting them is enormous. And it compounds[7].
The other missing piece is systemic sharing. The person who figured out a great prompting pattern last Tuesday? Nobody else on the team knows about it. Individual experiments stay individual. Without a structure for sharing what works, adoption stalls[8].
The teams where leadership is mandating AI use but not carving out time for learning? Those are the stressed ones.
The craft question
I do not pretend to have a good answer on this, but I do know that ignoring it has the potential to be toxic.
In almost every interview, someone brings up the craft of coding[9]. Sometimes it's engineers themselves, sometimes a lead describing resistance. The framing is always some version of: "my engineers love coding and they feel like this is taking that away." This is the biggest adoption barrier I'm seeing. Leave it unresolved and you get a dysfunctional team -- some leaning in, others quietly resisting, some outright protesting.
The surface concern -- that AI replaces the satisfying parts of the job -- is the easier one to address. The teams doing well started with the tedious stuff. Dependency upgrades, boilerplate, test data generation. Not the features engineers care about. One lead told me resistance dropped significantly with that reframing alone.
The deeper concern is harder: that leaning on AI makes you shallow. That you generate solutions without truly understanding the system. This is legitimate and dismissing it is a mistake. I have views on how this can be framed, but there are no right answers. Whilst I'm a big proponent of these tools, this is a risk that is almost certain to materialize in the coming years. For now it depends on what timeframe you are optimizing for.
No surprise. The thriving teams all had high alignment on this -- not by accident, but by having the conversation explicitly.
Where this lands
I know it's a small sample so far, but the pattern from these interviews is consistent. The teams pulling ahead invested in process before tooling, created structures for shared learning, and gave their engineers time to adapt. The teams that skipped those steps bought the license and are still waiting for the magic.
The gap between these two groups will be wider in a year, not narrower. The compounding has barely started.
Get in touch if you're interested in being interviewed.