· 6 min read

Teaching AI to use your Design System

AI-ready metadata is the semantic layer that helps AI models reason about your design system.

Cristian Morales
Product Designer


Why AI struggles with existing design systems

Every time you ask an LLM to “create a login page” or “build a dashboard”, it generates something new. It doesn’t reuse your Button, your Card, or your TextInput — even though you’ve spent months perfecting them.

AI can already "design".
It can also code.
But it still doesn’t know your design system.

AI doesn’t understand the context of your components.
It doesn’t know what each one does, how they’re composed, or which variants are appropriate.

When you provide code, it only sees syntax.
When you provide documentation, it only reads language.
What it needs is a bridge between the two: structured, machine-readable context that explains when and how to use components correctly.

That’s what AI-ready metadata is for.

Component-Level Metadata for Intelligent UI Generation

AI-ready metadata is the semantic layer that helps models like Claude reason about your design system.

It’s not visual design.
It’s not documentation.
It’s an intermediate language that describes the purpose, behavior, accessibility, and usage of each component, so AI can make informed decisions instead of guessing.

For example, when you ask Claude to “create a form with submit and cancel buttons”, metadata allows it to:

  • Pick the Button component from your system (not generate new code)
  • Apply the correct variants (solid_primary, outline_default)
  • Follow accessibility rules
  • Avoid anti-patterns like “multiple primary buttons”

That’s intelligent use of your design system, not generation.
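To make the selection step concrete, here’s a toy sketch of how a tool could match a prompt against each component’s aiHints keywords. The registry and scoring below are illustrative assumptions, not how Claude works internally:

```typescript
// Hypothetical registry mapping component names to their aiHints keywords.
const registry: Record<string, { keywords: string[] }> = {
  Button: { keywords: ["submit", "confirm", "action"] },
  Chip: { keywords: ["filter", "tag", "selection"] },
  Switch: { keywords: ["toggle", "on/off", "setting"] },
};

// Pick the component whose keywords best overlap with the prompt.
function pickComponent(prompt: string): string {
  const words = prompt.toLowerCase();
  let best = "";
  let bestScore = 0;
  for (const [name, meta] of Object.entries(registry)) {
    const score = meta.keywords.filter((k) => words.includes(k)).length;
    if (score > bestScore) {
      best = name;
      bestScore = score;
    }
  }
  return best;
}
```

With that in place, “create a form with submit and cancel buttons” resolves to Button through its “submit” keyword, instead of prompting the model to invent a new one.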

Designing the schema

To make this possible, I designed a base schema that captures how a component behaves in code, documentation, and design.

It looks like this (simplified):

{
  "component": { "name": "Button", "category": "atoms", "type": "interactive" },
  "usage": {
    "useCases": ["form submission", "dialog confirmations"],
    "commonPatterns": [{ "name": "primary-action", "composition": "<Button>...</Button>" }],
    "antiPatterns": [{ "scenario": "multiple-primary-buttons" }]
  },
  "composition": {
    "slots": { "Icon": { "required": false } },
    "nestedComponents": ["Button.Text", "Button.Icon"]
  },
  "behavior": {
    "states": ["default", "hover", "disabled"],
    "interactions": { "click": "Triggers an action" }
  },
  "accessibility": {
    "role": "button",
    "keyboardSupport": "Enter/Space activates",
    "screenReader": "Announces label and state"
  },
  "aiHints": {
    "keywords": ["submit", "confirm", "action"],
    "context": "Use for primary user actions"
  }
}

This is enough for an AI to understand the what, when, and how of your component, and to use it confidently in context.
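For teams that prefer types over raw JSON, the same schema can be expressed as a TypeScript interface and colocated with the component. This is a sketch; the field shapes are my assumptions based on the simplified JSON above:

```typescript
// Hypothetical TypeScript mirror of the metadata schema above.
interface ComponentMetadata {
  component: { name: string; category: string; type: string };
  usage: {
    useCases: string[];
    commonPatterns: { name: string; composition: string }[];
    antiPatterns: { scenario: string }[];
  };
  composition: {
    slots: Record<string, { required: boolean }>;
    nestedComponents: string[];
  };
  behavior: {
    states: string[];
    interactions: Record<string, string>;
  };
  accessibility: { role: string; keyboardSupport: string; screenReader: string };
  aiHints: { keywords: string[]; context: string };
}

// The Button example from above, now type-checked.
const buttonMetadata: ComponentMetadata = {
  component: { name: "Button", category: "atoms", type: "interactive" },
  usage: {
    useCases: ["form submission", "dialog confirmations"],
    commonPatterns: [{ name: "primary-action", composition: "<Button>...</Button>" }],
    antiPatterns: [{ scenario: "multiple-primary-buttons" }],
  },
  composition: {
    slots: { Icon: { required: false } },
    nestedComponents: ["Button.Text", "Button.Icon"],
  },
  behavior: {
    states: ["default", "hover", "disabled"],
    interactions: { click: "Triggers an action" },
  },
  accessibility: {
    role: "button",
    keyboardSupport: "Enter/Space activates",
    screenReader: "Announces label and state",
  },
  aiHints: {
    keywords: ["submit", "confirm", "action"],
    context: "Use for primary user actions",
  },
};
```

The benefit of the typed version is that the compiler catches incomplete metadata before the AI ever sees it.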

Using AI to fill the gaps

I created a Claude skill for this, but any LLM with enough context will do the job.
You provide your component file, and the model analyzes it to extract structure, composition, and behavior.

  1. Read the component: identify props, slots, states, and ARIA roles.
  2. Infer purpose: from naming, documentation, and code context.
  3. Generate metadata: following the schema above.
  4. Output: a JSON or TypeScript export ready to live next to your component.
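Step 4’s output can also be sanity-checked before it lands in a PR. A minimal sketch, assuming the six top-level keys from the schema above (missingKeys is a hypothetical helper, not part of any existing tool):

```typescript
// Top-level keys every metadata file is expected to have,
// mirroring the schema shown earlier.
const REQUIRED_KEYS = [
  "component",
  "usage",
  "composition",
  "behavior",
  "accessibility",
  "aiHints",
] as const;

// Return the list of missing top-level keys; empty when the file is valid.
function missingKeys(metadata: Record<string, unknown>): string[] {
  return REQUIRED_KEYS.filter((key) => !(key in metadata));
}
```

A check this small won’t catch wrong content, but it does catch a model that silently dropped a section.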

Designers can use the Claude, Gemini, or ChatGPT desktop apps to build metadata visually, while technical designers can use a CLI version or an IDE integration to generate and commit it directly in PRs.

The workflow is simple:

AI generates → Human validates → Component gets smarter.

Examples

I started with some of our simplest components.

Button: great for defining clear patterns and anti-patterns.
Chip: useful for exploring variants (selected, disabled, with icons).
Switch: perfect for accessibility, since metadata includes states and ARIA roles.

Even though some examples are not 100% accurate yet, they already prove that metadata can teach the AI to reuse instead of recreate.

The goal isn’t perfection.
The goal is consistency through understanding.

System sync: keeping code, docs and metadata aligned

It’s critical to keep your codebase, documentation, and design tool in sync for this to work correctly.
Metadata is only valuable if it reflects the current codebase.
Each component should have its own version control and changelog.

Think of metadata as another layer that evolves with your design system:

  • When code changes → metadata updates.
  • When documentation changes → metadata adapts.
  • When AI learns → metadata expands.

That’s system sync in action.
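One way to enforce that sync in CI is to store a hash of the component source inside (or next to) its metadata and fail the build when they drift. A sketch, assuming Node and a sha256 source hash field; the staleness check is my own convention, not a standard:

```typescript
import { createHash } from "node:crypto";

// Hypothetical guard: metadata stores a hash of the component source
// it was generated from; if the source has changed, the metadata is stale.
function isStale(componentSource: string, metadataSourceHash: string): boolean {
  const current = createHash("sha256").update(componentSource).digest("hex");
  return current !== metadataSourceHash;
}

const source = "export const Button = () => {};";
const storedHash = createHash("sha256").update(source).digest("hex");

isStale(source, storedHash); // false: metadata still matches the code
isStale(source + " // edited", storedHash); // true: code changed, regenerate metadata
```

When the check fires, the fix is the same workflow as before: AI regenerates, a human validates.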

Human validation and AI testing

The metadata is generated by AI but validated by humans, just like accessibility, QA, or documentation reviews.

We can validate components in a tool like Storybook.
We check that the metadata accurately represents the real component and its use cases.

This opens the door to AI-driven testing too:
if metadata defines behavior and accessibility, AI can automatically verify that a component meets those expectations, for example by checking focus rings, keyboard navigation, or ARIA roles.
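As a sketch of what metadata-driven checks could look like, here’s a toy function that derives assertions from the accessibility block. The RenderedElement shape is a stand-in for whatever your testing tool (Storybook, Testing Library, etc.) actually returns:

```typescript
// Minimal stand-in for a rendered element; a real setup would query the DOM.
interface RenderedElement {
  role: string;
  focusable: boolean;
}

interface AccessibilityMetadata {
  role: string;
  keyboardSupport: string;
}

// Derive simple checks from the metadata instead of hand-writing them per component.
function checkAccessibility(
  meta: AccessibilityMetadata,
  el: RenderedElement
): string[] {
  const failures: string[] = [];
  if (el.role !== meta.role) {
    failures.push(`expected role "${meta.role}", got "${el.role}"`);
  }
  if (meta.keyboardSupport.length > 0 && !el.focusable) {
    failures.push("component declares keyboard support but is not focusable");
  }
  return failures;
}
```

The same metadata that teaches the AI what a component is also tells the test suite what it must do.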

Defining success

At this stage, success doesn’t mean a perfect dataset or a complete schema.

It means:

  1. The AI is using your existing components instead of creating new ones.
  2. The AI’s suggestions are actually useful to your team.

If those two happen, the skill is working.

Because for a human, it’s easy to fix a prompt or tweak a prop.
But for AI, learning to reuse instead of recreate is a breakthrough.

Next steps: AI-assisted prototyping

Once AI understands your components, you can move faster.

Imagine prototyping flows directly from text:

“Create a settings screen with toggles for notifications and dark mode.”

The AI reads your metadata, grabs the right Switch component, and builds a layout using your Grid and Text tokens. Exactly as your system defines them.

You still guide the direction, but the AI now speaks your design language.

This is just the beginning of AI-assisted prototyping, where design systems stop being static documentation and become living instruction sets for machines and humans alike.

If you’ve built your own design system and want to make it AI-ready,
start by adding metadata to one component.
Then teach your AI to use it.

It’s like giving your design system a second brain.

Learn More

Download my ai-component-metadata skill for Claude.

Follow TJ Pitre on LinkedIn; he’s spearheading work at the intersection of AI and design.

This thought was based on my own experimentation, but something also clicked while reading Diana Wolosin’s articles: