
Design system analytics: measure component usage and impact

Most design teams know their components exist. Few know which ones are actually being used, where they break in the wild, and what to retire. This is what design system analytics solves.

Omar
May 7, 2026 · 7 min read

You shipped a design system. The team adopted it, mostly. Six months in, you're not sure which components are pulling their weight, which ones are getting hacked around in code, and which ones nobody touches. The Figma library says one thing. The codebase says another. The actual product, where users live, says something else again.

Design system analytics is what closes that gap. It tells you not just what was built, but what gets used, where it breaks, and what to invest in next.

This is for design system leads, principal designers, and product designers who want their systems to keep earning their keep, not just keep existing.

What design system analytics actually means

There are three different layers most teams measure, and each tells you something different.

Design-phase usage. Which components designers are pulling into Figma files. How often. By which teams. This is the "are people designing with the system" layer. Figma's library analytics tools handle this layer well, surfacing component insertions, team contributions, and trends.

Code-phase usage. Which components engineers are actually using in production. How they're imported, which props they're passed, how often they're overridden. This is the "is the system shipping" layer. It's where most reality checks happen.

User-facing impact. Which components users actually interact with, where they cause friction, and how that friction shows up in journeys. This is the "does it work for the people we built it for" layer. It's the layer most teams skip.

The teams that get the most from their design systems are the ones measuring all three. Skip any of them and you'll be solving the wrong problem.

Why most design teams stop at design-phase metrics

Library analytics are easy. The numbers are right there in Figma. You can show a graph of insertions per week and call it adoption.

The trouble is that insertions in Figma don't equal components in production. A designer can drag in a button 30 times and an engineer can ignore the design and ship a custom one. The Figma graph still goes up.

This is why teams that depend purely on design-phase metrics often have the unpleasant experience of "high adoption" on paper and a fragmented product in reality. Figma's own Design Systems 104 guide on metrics makes this point: design-phase data is necessary but not sufficient.

What code-phase metrics actually tell you

The simplest version of code-phase tracking is parsing your codebase for component references. You count how many times <Button /> appears across the product, compare to the number of bespoke buttons that exist, and you have a real adoption number.
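
Here's roughly what that crude count looks like as a script. A minimal sketch in TypeScript, assuming a JSX codebase under src/; the component names are illustrative:

  import { readdirSync, readFileSync, statSync } from "node:fs";
  import { join } from "node:path";

  const COMPONENTS = ["Button", "Modal", "Card"]; // illustrative system exports

  // Recursively collect .tsx/.jsx files under a directory
  function walk(dir: string, files: string[] = []): string[] {
    for (const entry of readdirSync(dir)) {
      const path = join(dir, entry);
      if (statSync(path).isDirectory()) walk(path, files);
      else if (/\.(tsx|jsx)$/.test(entry)) files.push(path);
    }
    return files;
  }

  const counts: Record<string, number> = {};
  for (const name of COMPONENTS) counts[name] = 0;

  for (const file of walk("src")) {
    const source = readFileSync(file, "utf8");
    for (const name of COMPONENTS) {
      // Count JSX opening tags: <Button ...>, <Button>, <Button />
      const matches = source.match(new RegExp(`<${name}[\\s/>]`, "g"));
      counts[name] += matches?.length ?? 0;
    }
  }
  console.log(counts);

A regex scan misses renamed imports and dynamic usage; an AST-based scan is the sturdier version. But this gets you a first number in an afternoon.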

The more useful version goes deeper.

Override frequency. How often the standard component is being passed custom CSS or className overrides. High override rates mean the component isn't doing the job. The fix is usually to update the component, not police the overrides.

Prop usage distribution. Which prop combinations are most common. If 80% of usages of your Modal component pass the same three props, those three props are the real API. The other props are noise.

Bespoke vs system ratio. The honest test of adoption. Across your product, what percentage of buttons (or modals, or cards) are coming from the system vs. being one-off implementations? This number is the one design system leads should care about most.
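
None of this needs heavy tooling. Here's a sketch of the arithmetic behind all three numbers, assuming call sites have already been extracted into a simple shape. The Usage type is hypothetical; in practice a scanner like react-scanner or a custom AST visitor produces something close:

  type Usage = {
    component: string;   // e.g. "Modal"
    fromSystem: boolean; // imported from the design system package?
    props: string[];     // prop names passed at the call site
  };

  function report(usages: Usage[], component: string) {
    const all = usages.filter((u) => u.component === component);
    const system = all.filter((u) => u.fromSystem);

    // Override frequency: how often callers reach for style escapes
    const overridden = system.filter((u) =>
      u.props.includes("className") || u.props.includes("style")
    );

    // Prop usage distribution: which props show up most often
    const propCounts = new Map<string, number>();
    for (const u of system)
      for (const p of u.props)
        propCounts.set(p, (propCounts.get(p) ?? 0) + 1);

    return {
      systemShare: system.length / Math.max(all.length, 1), // bespoke vs system
      overrideRate: overridden.length / Math.max(system.length, 1),
      topProps: [...propCounts.entries()].sort((a, b) => b[1] - a[1]).slice(0, 5),
    };
  }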

These aren't hard to track. The tooling has matured a lot. The harder part is convincing the team to care about the numbers consistently.

Where user-facing impact comes in

This is where most teams have a blind spot. You've measured what designers do, you've measured what engineers ship. You haven't measured what happens when real users meet the components.

The questions that matter at this layer:

  • Which components do users interact with most often? Are they the ones you invested most in?
  • Where are users abandoning flows that depend on a specific component?
  • Which components are causing rage clicks, hesitation, or repeated attempts?
  • Which components are accessible in the lab but failing in the wild?

You can't answer these from Figma. You can't answer them from code parsing. You need product intelligence on top of the design system.
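
One practical bridge, if you own the component source: stamp every system component with a stable data attribute, so whatever product analytics you run can slice clicks, hesitation, and abandonment by component. A sketch; the attribute names are an assumption, not a standard:

  import React from "react";

  type ButtonProps = React.ButtonHTMLAttributes<HTMLButtonElement> & {
    variant?: "primary" | "secondary";
  };

  export function Button({ variant = "primary", ...rest }: ButtonProps) {
    return (
      <button
        {...rest}
        // Stable hooks for downstream analytics queries
        data-ds-component="Button"
        data-ds-variant={variant}
      />
    );
  }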

This is where the worlds connect. Adora's Product Design Library is built around exactly this problem: linking the components in your system to the user behaviour that touches them. When you can see, for a single component, who used it, where they got stuck, and how the journey continued, you stop guessing about impact.

The measurement loop that actually works

A useful design system analytics practice runs on a quarterly loop with a weekly check.

Quarterly: Set the bets. Pick three components to invest in this quarter. Pick one to retire or merge. The selection is data-informed, not data-driven: the data tells you what's broken or unused, and the team decides what to do about it.

Weekly: Read the signals. Spend ten minutes scanning the friction signals across the system. Where are users hesitating, abandoning, rage-clicking near a system component? Adora's AI Insights is designed for this kind of regular scan: the system flags what's worth your attention, you decide what to do.

Per release: Verify the change. When you ship a component update, check it. Is usage going up? Did the friction reduce? Did engineers adopt the new version, or are they still on the old one? Without verification, every release is a guess.
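
Verification can be as plain as diffing two usage snapshots, one per release tag. A sketch, reusing the kind of counts the earlier script produces; the numbers are made up:

  // Diff two component-count snapshots taken at different release tags
  function diffAdoption(
    before: Record<string, number>,
    after: Record<string, number>
  ): Record<string, number> {
    const delta: Record<string, number> = {};
    for (const name of new Set([...Object.keys(before), ...Object.keys(after)])) {
      delta[name] = (after[name] ?? 0) - (before[name] ?? 0);
    }
    return delta;
  }

  // Did teams actually move off the legacy modal after the v2 release?
  console.log(diffAdoption(
    { Modal: 120, ModalLegacy: 80 },
    { Modal: 155, ModalLegacy: 62 }
  )); // -> { Modal: 35, ModalLegacy: -18 }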

This loop is much smaller than most "metrics dashboards" promise. That's the point. You don't need 50 metrics; you need the seven you can pull cleanly and the discipline to actually look at them.

Mistakes design teams make with system analytics

Three patterns to avoid.

Mistake 1: Measuring activity instead of impact. Number of components shipped is activity. Percentage of UI built with the system is impact. Confusing the two leads to teams that build a lot and influence little.

Mistake 2: Counting Figma insertions as adoption. Insertions tell you about design intent, not product reality. Always pair the design metric with a code-phase or impact-phase metric.

Mistake 3: Treating retirement as failure. Retiring a component is a healthy signal, not a problem. If a component isn't being used, killing it is the right move. Smaller, cleaner systems beat sprawling, half-used ones.

The teams that get the most from their systems are the ones willing to remove things, not just add them.

How AI is changing design system analytics

Two things are starting to shift.

Faster pattern detection across the system. Instead of hand-built dashboards, AI tools can flag patterns automatically. "Component X is being overridden by 60% of teams" or "users hesitate on form Y for 4x longer than the median." You see the pattern when it matters, not at the next quarterly review.
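
Real AI tooling learns its baselines; even a dumb threshold scan captures the shape of the output. A sketch, with illustrative thresholds and a hypothetical stats shape:

  type ComponentStats = {
    name: string;
    overrideRate: number;        // share of call sites passing style escapes
    medianHesitationMs: number;  // from product analytics
  };

  function flagAnomalies(stats: ComponentStats[], globalMedianMs: number): string[] {
    const flags: string[] = [];
    for (const s of stats) {
      if (s.overrideRate > 0.5)
        flags.push(`${s.name} is overridden at ${Math.round(s.overrideRate * 100)}% of call sites`);
      if (s.medianHesitationMs > 4 * globalMedianMs)
        flags.push(`users hesitate on ${s.name} ${(s.medianHesitationMs / globalMedianMs).toFixed(1)}x longer than median`);
    }
    return flags;
  }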

Connecting components to journey-level outcomes. This is the deeper shift. Your buttons aren't important on their own; they're important because they sit in flows that lead to outcomes. AI analytics that understands both the component and the journey can tell you which components are blocking real outcomes, not just which ones are getting clicked.

McKinsey's 2024 State of AI survey found that AI high performers are pulling ahead by embedding AI into existing workflows. Design system analytics is a clear case: bolt AI onto the data you already have, and the work changes character. You stop measuring after the fact and start spotting issues live.

Where to start

If your design system is a few quarters old and you don't have a clear answer to "which components matter most," do one thing this week.

Pick five components you suspect are pulling weight. For each, write down two numbers: how often it appears in code, and how often users interact with it without friction. If you don't have the numbers, that's your next plumbing job. If you do have them, you've found your top investments for the next quarter.

Either way, you'll know more on Friday than you do today. That's enough to start.