Setting the Framework for AI-Powered UX
For years, design teams have built systems to help products scale — codifying patterns, components, and tokens so teams could work faster and more consistently across platforms. These systems brought alignment, efficiency, and clarity to growing organizations.
And it worked — up to a point. But we’re now on the edge of something new. With the rise of AI, growing behavioral data, and increasingly fluid user expectations, the experience layer is beginning to shift. Interfaces that were once fixed outputs may soon adapt dynamically — tuning layouts, content, and interactions in real time based on individual context. The Nielsen Norman Group refers to this shift as a move toward “generative UIs” — interfaces capable of assembling themselves based on behavioral and contextual inputs, rather than static design templates.

To meet that potential, design systems must evolve. They can no longer function solely as scalable sets of components. They need to become adaptive frameworks: systems infused with design logic, ethical boundaries, and clear intent that guide how AI should behave. Not to manipulate, but to personalize — to serve the specific needs of the user, not just the statistical average.
As I explored in Designing for Everyone, Not Just the Average, the future of UX lies in designing experiences that flex — not based on assumptions, but on behavior, context, and intent. That future won’t be handcrafted at every touchpoint. It will be guided by design systems, and orchestrated by AI — within the rules and boundaries we define.
From Scaling Sites to Scaling Experiences
Design systems have brought remarkable structure to how we build products — defining reusable components, aligning teams, and streamlining collaboration. But as user journeys become more dynamic, these systems are hitting their limits.
While today’s design systems excel at scaling production and consistency across teams, most are not equipped to adapt experiences in real time based on user behavior or context. They govern visual cohesion and enforce standards, but they rarely account for the fluid, situational needs of actual users.
This is the gap we’re staring down: we have systems that scale what we build, but not how those experiences respond once they’re in the wild. An interface can be pixel-perfect and on-brand — and still fail to resonate with a specific user in a specific moment.
To bridge that gap, we need to reimagine design systems not just as tools for building, but as strategies for flexing — infused with logic, variation rules, and decision-making structures that allow experiences to shift based on who’s using them, and how.
I encountered this tension during a large-scale banner test at Walmart.com. We launched a multivariate experiment with several layout, copy, and CTA variations. One version clearly “won” on overall engagement rate, but different user segments had responded better to other combinations.
The problem wasn’t with the test — it was with the system. Our design infrastructure wasn’t built to support multiple high-performing experiences, only to crown a single winner and ship it universally. There was no mechanism for adapting the experience in real time based on what we’d learned about different users. Instead of serving each group more intentionally, we were forced to pick a one-size-fits-most design and discard the rest.
That moment underscored a gap in the system: we had the data to personalize, and even the content to vary — but the layout, the structure, the rules of the design system were fixed. It wasn’t built to adapt — it was built to standardize.
AI Is the Conductor, Not the Composer
As design systems evolve to support more adaptive experiences, a new collaborator enters the picture: AI. With its ability to surface content, adjust layouts, and respond to behavior in real time, it’s tempting to view AI as the new “designer.” But that’s a misunderstanding of its role.
AI is not here to replace design — it’s here to interpret and execute within the systems we define. Think of it like a conductor in an orchestra: it adjusts the tempo, brings in different instruments, shifts the energy depending on the audience and mood. But the sheet music — the structure, progression, and emotional cues — that’s all been written in advance.
Designers provide that sheet music. We define the components, interaction patterns, behavioral rules, and ethical boundaries that guide what AI can and can’t do. Without this scaffolding, AI-led UX risks becoming chaotic, incoherent — or worse, manipulative.
As Brad Frost notes in his article on AI and design systems, AI can supercharge consistency and scalability — but only when designers set the parameters and intentionally govern how components behave. Our job isn’t to micromanage every variation. It’s to build frameworks that adapt with integrity — systems that empower AI to personalize responsibly while staying true to our design vision.
Designing for Adaptability, Not Just Reuse
If AI is going to adjust the experience in real time, the building blocks of that experience need to be designed with flexibility in mind. That starts with thinking modularly — not just in terms of visual hierarchy, but in terms of interchangeable elements, components, and sections that can respond to user needs, behaviors, or contexts.
Atomic Design, a methodology introduced by Brad Frost, offers a structured approach to building design systems. It breaks down interfaces into five hierarchical levels: atoms (basic HTML elements like buttons and inputs), molecules (combinations of atoms forming functional components), organisms (complex UI sections), templates (page-level structures), and pages (specific instances of templates with real content). This approach promotes consistency and scalability while allowing for flexibility in design.
In an adaptive system, these pieces become more than reusable — they become configurable. A card might expand with more detail for an engaged user, collapse to just a headline for someone in a hurry, or highlight different CTAs depending on prior behavior. The component doesn’t change visually for variety’s sake — it shifts because the system understands who it’s serving and what’s most helpful in that moment.
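To make this concrete, here is a minimal sketch of what “configurable, not just reusable” could look like in code. All names here (`UserContext`, `selectCardVariant`, the engagement levels) are illustrative assumptions, not part of any real design system:

```typescript
// Hypothetical behavioral context the system might know about a user.
type EngagementLevel = "new" | "browsing" | "engaged";

interface UserContext {
  engagement: EngagementLevel;
  lastCtaClicked?: string; // prior behavior, if any
}

// The same card component, rendered at different levels of detail.
type CardVariant = "headline-only" | "summary" | "expanded";

// A pure selection rule: the component library still owns the rendering;
// this logic only decides which configuration serves this user best.
function selectCardVariant(ctx: UserContext): CardVariant {
  switch (ctx.engagement) {
    case "engaged":
      return "expanded"; // more detail for an invested user
    case "browsing":
      return "summary";
    default:
      return "headline-only"; // minimal footprint for someone in a hurry
  }
}
```

Keeping the selection rule pure and separate from rendering is the point: designers can review, test, and change the adaptation logic without touching the component itself.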
Sometimes, it’s not about the components themselves — it’s about how they’re arranged. Product detail pages (PDPs), for example, often include all possible content for every user, even though most people only engage with a fraction of it. The experience could be dramatically improved by reordering or hiding elements based on behavior.
Take Amazon’s PDP. It’s personalized in content, but the structure never changes. I’ve never once clicked “Buy Now,” nor have I bought from other sellers, yet those elements remain ever-present. After years of data, the layout hasn’t adapted. That’s a missed opportunity — not because they lack the components, but because there’s no logic guiding how those components respond to behavior.
These changes aren’t just visual variations. They’re intent-driven adjustments, governed by rulesets we define in advance:
- Who is this component for?
- What are the conditions under which it adapts?
- What should never change, even when personalization is in play?
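One way to answer those three questions is to encode them as data rather than scatter them through component code. The sketch below is an assumption about how such a ruleset could be structured; the rule names, signals, and variants are all hypothetical:

```typescript
// Behavioral signals the system might observe (illustrative).
interface Signals {
  dwellTimeMs: number;
  returnVisitor: boolean;
}

// One adaptation rule: who it serves, when it applies, what it shows.
interface AdaptationRule {
  component: string;
  audience: string; // who this variation is for (documented intent)
  condition: (s: Signals) => boolean; // the conditions under which it adapts
  variant: string;
}

const rules: AdaptationRule[] = [
  {
    component: "hero-banner",
    audience: "returning shoppers",
    condition: (s) => s.returnVisitor,
    variant: "recently-viewed",
  },
  {
    component: "hero-banner",
    audience: "first-time visitors",
    condition: (s) => !s.returnVisitor && s.dwellTimeMs < 5000,
    variant: "value-proposition",
  },
];

// Resolve the variant for a component; unmatched cases fall back to the
// standard design, so adaptation never replaces the baseline experience.
function resolveVariant(component: string, signals: Signals): string {
  const match = rules.find(
    (r) => r.component === component && r.condition(signals)
  );
  return match ? match.variant : "default";
}
```

Because each rule carries its audience and condition explicitly, the ruleset doubles as documentation: anyone on the team can audit who sees what, and why.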
That’s the real evolution: not just creating components that scale across screens, but components that flex intelligently within the experience — without breaking consistency or trust.
Guardrails, Not Free-For-Alls: Governing Adaptive Systems
Flexibility without governance is chaos. As design systems evolve to support dynamic, behavior-aware experiences, they must also embed structure, constraints, and ethical guidelines. Otherwise, we risk creating interfaces that are inconsistent, manipulative, or impossible to maintain.
Smashing Magazine reminds us that personalization without clear data boundaries risks alienating users and breaching their trust. Designing responsibly means introducing privacy and consent into the process from the start.
Design systems already govern visual consistency and accessibility — but an adaptive system must also govern when and how experiences change. That means introducing decision-making frameworks into the design layer, not just the product strategy doc.
To govern adaptive systems well, we need to define and document:
- What can change? Content density, module order, visual emphasis, or whether something appears at all.
- When should it change — and why? Based on user behavior (clicks, scrolls, dwell time), engagement level, device or time-of-day context, or journey stage.
- What must remain consistent? Brand voice, accessibility standards, legal disclaimers, and user protections.
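Those three questions can be enforced mechanically as a guardrail: every AI-proposed adaptation is checked against an allowlist of adaptable targets and a set of protected elements before it ships. This is a sketch under assumed names (`ProposedChange`, the target and element strings are invented for illustration):

```typescript
// A change an AI-driven system proposes to make to the experience.
interface ProposedChange {
  target: string; // e.g. "module-order", "visual-emphasis"
  removesElement?: string; // an element the adaptation would hide
}

// What CAN change — the documented allowlist.
const adaptableTargets = new Set([
  "content-density",
  "module-order",
  "visual-emphasis",
]);

// What must remain consistent — protections personalization may never touch.
const protectedElements = new Set([
  "legal-disclaimer",
  "accessibility-controls",
]);

// The governance check: reject anything off the allowlist, and reject
// any adaptation that would hide a protected element.
function isAllowed(change: ProposedChange): boolean {
  if (!adaptableTargets.has(change.target)) return false;
  if (change.removesElement && protectedElements.has(change.removesElement)) {
    return false;
  }
  return true;
}
```

The design choice here is that governance lives in one place: adding a new adaptable surface or a new protection is a deliberate, reviewable edit to these sets, not a side effect buried in component logic.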
NN/g also found that users often don’t understand machine-driven personalization. If they can’t tell why an interface has changed, it erodes trust. Making these systems legible and explainable is part of adaptive design’s ethical responsibility.
These aren’t decisions design makes alone — they’re design-led decisions informed by data, strategy, and collaboration. As designers, we’re not just crafting visuals — we’re helping define how a system responds: establishing flexible logic, fallback and alternate states, and ethical boundaries that guide AI-powered experiences.
But this work only succeeds within a broader product ecosystem. Tagging, behavior triggers, and structural adaptability all require buy-in and input from engineering, data, and product partners. The systems we imagine must be technically feasible, testable, and maintainable at scale.
When designers, PMs, and engineers align on what adapts, when it adapts, and why, we move beyond scalable design — we create experiences that are flexible, trustworthy, and intentional.
This isn’t about slowing down innovation. It’s about ensuring that adaptability doesn’t come at the cost of coherence, brand trust, or user autonomy.
The Future of Design Systems Is Adaptive by Design
The next generation of design systems won’t just help us build faster — they’ll help us build smarter, more responsively, and more responsibly. These systems will no longer exist solely to enforce consistency across platforms — they’ll serve as frameworks for real-time, context-aware experiences that flex to meet users where they are.
As the Design Systems Collective notes, some organizations are already exploring “autonomous UI” approaches — systems that evolve in real time without human intervention, based on context, AI insights, and behavior.
As AI becomes more embedded in how we shape and serve digital experiences, our role as designers becomes even more critical. We’re not handing over the wheel — we’re designing the road map, the guardrails, and the rules of the road.
By architecting systems that are modular, behavior-aware, and ethically governed, we ensure that personalization works for the user, not just for the business.
This is the moment to evolve our design systems into living frameworks — ones that scale, adapt, and serve real people with intention and integrity.
