This is the eighth and final article in "Towards an agentic design system".
Our head of data measured it: our development team is producing at 3x to 4x per developer compared to 2023. That's not a projection or an industry estimate. That's measured output from an engineering team that has embedded AI deeply into their workflow.
I'm the sole designer on that team. A year ago, I had breathing room. I could explore ideas for the next quarter, prototype things that weren't urgent. Today I'm the bottleneck. Everyone is waiting on me.
And I'm someone who has been building AI into my own practice for over a year. I developed an agentic design system that encodes, executes, and audits design decisions. I built the infrastructure I've spent seven articles documenting. If I'm the constraint, the problem isn't effort or fluency. It's structural.
A 1:10 design-to-dev ratio was sustainable at the old speed. At 3x-4x, it breaks. And this isn't just our problem. It's coming for every team that accelerates development without rethinking what wraps around it.
Speed scales debt, not output
The instinct most companies have is obvious: if devs are going faster, add more devs. Ship more. Scale output. But what actually happens when you do that? You scale code reviews into huge PRs that nobody has time to review properly. You scale QA into sessions that get squeezed or skipped. You scale the gap between what was designed and what shipped.
Speed without the support structures to match it doesn't produce more output. It produces more debt. Faster.
The bottleneck didn't disappear. It moved. It moved from development speed to everything that wraps around development: design decisions, quality assurance, product clarity, implementation review. The constraints are in the support layer now. And you can't fix that by accelerating the thing that's already fast.
The ratio is broken. So is the role definition.
I'm going to say something that might sound counterintuitive given the "AI replaces jobs" narrative: the companies that get this right will hire more designers, more PMs, more QA specialists. Not fewer.
When dev velocity jumps 3-4x, you need the surrounding roles to match, or you burn out your developers on review work they don't enjoy and will eventually skip. The ratio between builders and support profiles was calibrated for a different speed. That calibration is broken.
But "more designers" doesn't mean more of the same role. The work I'm describing, encoding decisions into systems that execute and self-audit, doesn't map onto a traditional product designer job description. It doesn't map onto a design systems role either. It sits in a space that requires design judgment, systems thinking, and enough technical fluency to work the interface between design and code.
I experienced this gap firsthand. I went through a design challenge at a company I respect, a company that's forward-thinking by most standards. I had eight hours and I made a bet: instead of going deep on a product solution, I used the time to showcase this shift. The infrastructure, the governance layer, the encoded accountability. How the role itself changes when you can close the loop between decision and execution.
They said no. They expected more time on finding a proper solution. They were filling a role shaped by their current team structure. I was proposing a restructure of what the role means.
I don't regret the decision. They couldn't see the shift yet. And I realized I need to be in a place that's ready to move with me, not a place where I'd need to champion the idea before I could practice it.
This is the timeline gap. What's already happening in high-fluency environments hasn't reached the planning horizon of most organizations. They're hiring for roles that matched last year's speed. The job descriptions, the evaluation criteria, the team structures: they're calibrated for a world that already passed.
AI fluency is the variable that explains the gap
Anthropic published an AI Fluency Index built around four dimensions: Delegation, Description, Discernment, and Diligence. In plain terms: how well you can delegate to AI, describe what you need, discern quality in what it produces, and maintain rigor in how you use it. I've been building on that framework, developing a tool to measure AI fluency and design training strategies around the results.
This maps directly to what I see on my team. The developers who are most effective aren't the ones who know the most syntax. They're the ones who can describe the behavior they want, recognize when the output is right, and maintain the discipline to follow the system's conventions.
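A minimal sketch of how such a measurement could be structured. The dimension names come from the framework; the 1-5 scale, the equal weighting, and the function names are illustrative assumptions, not the actual assessment tool:

```python
# Illustrative sketch: scoring AI fluency across the four dimensions.
# Dimension names follow the framework; the 1-5 scale and equal weighting
# are assumptions for illustration, not the real tool's methodology.

DIMENSIONS = ("delegation", "description", "discernment", "diligence")

def fluency_score(ratings: dict[str, float]) -> float:
    """Average a 1-5 rating across the four dimensions."""
    missing = [d for d in DIMENSIONS if d not in ratings]
    if missing:
        raise ValueError(f"missing ratings for: {missing}")
    return sum(ratings[d] for d in DIMENSIONS) / len(DIMENSIONS)

def weakest_dimension(ratings: dict[str, float]) -> str:
    """Point training where it helps most: the lowest-rated dimension."""
    return min(DIMENSIONS, key=lambda d: ratings[d])

profile = {"delegation": 4, "description": 5, "discernment": 3, "diligence": 4}
print(fluency_score(profile))      # 4.0
print(weakest_dimension(profile))  # discernment
```

The point of splitting the score per dimension is the second function: an aggregate number tells you who is fluent, but the weakest dimension tells you what to train.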
I watched this play out from a completely different angle. Our Head of Operations has a background that spans marketing, product, and operations. She carries domain knowledge about clinical hiring that nobody else in the company has. I taught her how to build Claude skills. Then how to generate artifacts. Then I helped her set up a development environment on her laptop. She took her spreadsheets, iterated on them, and built a web application for clinical hiring projections.
She's not a developer. She's not becoming one. She's a domain expert whose background gave her the ability to describe what she needed, and AI fluency gave her the ability to encode that knowledge into a tool. That capacity didn't exist in her role definition a year ago.
For most companies, this is months away on their timeline. For us, it's Tuesday. The difference is fluency, and the organizational willingness to let fluency reshape who can build what.
This isn't a green field
I need to be honest about what I'm describing and what I'm not.
I built this system in conditions most teams don't have. A sole designer with direct engineering access, in a small company where AI adoption starts at the CEO and moves fast because there are no committees to convince. We run into problems that larger teams won't face for months, because adoption moves at a different speed when there are no barriers or politics to navigate.
That's my advantage. It's also why I can see what's coming.
I don't have a playbook for how a 15-person design system team restructures around this. I don't know what the politics of measuring AI fluency look like in an organization where some people will score low and feel threatened. I don't know how you convince a hiring committee to evaluate candidates against role definitions that don't exist yet.
I can point you to someone with a better answer here: "How to get buy-in for an AI-ready design system" by Grace Han is a great read.
What I do know: teams need to sit down and articulate the rules they've been following implicitly. Surface errors when they catch them. Agree on what's correct and what's an exception. If they don't, the result is predictable. AI scales whatever it finds. Including errors. Including drift. Including the gap between what was intended and what was built.
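What "articulating the rules" can look like once encoded: a hypothetical sketch of one design rule carrying provenance metadata, so an audit can trace a violation back to the decision rather than the implementation. The field names and the example rule are my own illustration, not the actual metadata format from this series:

```python
# Hypothetical sketch: an explicitly encoded design rule with provenance
# metadata. Field names and values are illustrative, not the series' format.

rule = {
    "id": "color-contrast-body-text",
    "statement": "Body text must meet WCAG AA contrast (4.5:1) on its background.",
    "status": "correct",          # vs. "exception", with a documented reason
    "decided_by": "design",
    "decided_on": "2025-03-12",
    "rationale": "Accessibility baseline agreed in the token review.",
}

def audit_finding(finding: str, rule: dict) -> str:
    """Report a violation against the decision, not the implementation."""
    return (f"{finding} violates '{rule['id']}' "
            f"(decided by {rule['decided_by']} on {rule['decided_on']}: "
            f"{rule['rationale']})")

print(audit_finding("Card caption renders at 3.1:1 contrast", rule))
```

The structure is the point, not the fields: once a rule carries who decided it, when, and why, "AI scales whatever it finds" becomes an asset instead of a liability, because what it finds is accountable.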
The infrastructure I've documented in this series is reproducible. The metadata format, the indexing approach, the audit skills, the governance layer. None of that requires a team of one. The organizational shift, getting a team to encode its knowledge and accept accountability for what gets encoded, that's harder than any of the technical work.
Where this goes
Companies that treat AI fluency as a strategic asset, something to measure, develop, and hire for, will reach the steady state faster. They'll rebalance their ratios. They'll recognize that the boundaries between roles were drawn around tool limitations that are dissolving fast. They'll hire for the competencies the new speed demands instead of scaling yesterday's org chart.
The ones that don't will keep adding developers, wondering why velocity isn't translating into quality, and burning out the support profiles that were never staffed for this pace.
I can show you how it looks when the pieces come together. A designer who owns execution. An operations lead who builds her own tools. A CEO prototyping his vision. An engineering team moving at 3-4x with a system that maintains coherence. A token auditor that traces errors back to the decision, not the implementation.
I can't tell you how to get there from where you are. Every team's path will look different. But the destination is the same: a steady state where speed and quality aren't in tension, because the support structures match the velocity.
The teams that figure this out first will have a competitive advantage. The ones that wait will be hiring for it later, at a higher cost, playing catch-up against organizations that already moved.