Design is evolving. We’re moving from tools that users control to smart agents that act on their behalf, based on trust, shared values, and intent.

Over on our Helio channel, we featured a great visual from Menno Cramer. He makes a strong case that the future of UX isn’t just about screens anymore; it’s about building smart, responsive relationships between people and machines. Check it out: https://lnkd.in/gZZw6AS3

To keep up, design needs to think about personalization in new ways: not just changing what people see, but understanding how systems should behave, respond, and grow alongside each user. Cognitive, emotional, and contextual intelligence all matter more now. Systems thinking is becoming essential.

Here’s how I see Menno’s UX map from the user perspective:

→ Yesterday: Usability for everyone
Designers focused on creating one experience that worked for the majority. The main goal was to make things usable and remove friction. Personalization was minimal, and most interactions were standardized.
“Can I do what I came here to do?”
This era was about universal access: clean layouts, simple flows, and clear buttons designed to work for most people.

→ Today: Adapting to users
Designers now listen, learn, and build systems that adjust to each user’s behavior. The focus is on understanding intent and making the experience feel smarter and more relevant. Personalization is moderate, based on user preferences, habits, and patterns.
“Does this system understand what I mean and need?”
This era is about responsive experiences: designs that shift and evolve as users interact with them.

→ Tomorrow: Working with users
Designers are starting to create agents that collaborate with users. These agents aren’t just helpful; they reflect the user’s goals, values, and emotional context. Personalization is deep, and relationships are dynamic and built on trust.
“Is this agent aligned with my goals and values?”
This next era is about trusted digital partners: agents that think, speak, and act with the user’s best interest in mind.

We’re excited about where design is headed. At Helio, we’ve built UX metrics to help designers track what matters, with simple, standardized data they can share across their team. That includes insights from both users and intelligent agents.

What do you think about how design is evolving?

#productdesign #uxmetrics #productdiscovery #uxresearch
Interaction Design Evolution
Summary
Interaction design evolution refers to the ongoing transformation in how people and technology connect, shifting from simple user interfaces to intelligent systems and AI agents that collaborate and adapt in real time. This new era is not just about making screens easier to use—it’s about building experiences where digital systems understand and respond to human needs, behaviors, and emotions.
- Embrace dynamic systems: Start thinking beyond fixed screens and workflows, and focus on designing experiences that adjust and learn alongside users.
- Prioritize human context: Incorporate emotional, cognitive, and situational factors into your designs so technology can better relate to and support individuals.
- Think collaboratively: Work closely with engineers, data scientists, and users to architect intelligent systems that consider ethics, trust, and evolving user behaviors.
From GenAI to GenUI

We’re witnessing a shift as significant as the leap from MS-DOS to graphical user interfaces. The AI era marks our latest upgrade in how we interact with technology.

For decades, we designed for workflows and specific actions. Everything was deterministic. Behind every interface sat a flowchart, with logic carefully coded. The backend made decisions, and the frontend rendered them. This model worked because we could predict every path a user might take.

With agents, this paradigm breaks down. Text alone isn’t sufficient anymore. Chat works for conversation, but interaction demands something more. We need to engage with agents, not just talk to them. Reasoning state and intent become critical factors in the exchange.

LLMs can now generate UI, and this capability feels like a natural progression. Model Context Protocol enables mini-apps to emerge on the fly, no longer bound by deterministic rules. This opens the door to genuine hyper-personalization. We’ve moved from designing screens to designing for outcomes. Agents now dynamically assemble workflows based on intent, available data, and accessible tools. The fact that agents can create interfaces without traditional designers and developers is revolutionary. We can finally shift from UI-centric thinking to truly user-centered experience design.

This fundamentally transforms the designer’s role. We’re no longer pixel pushers or interface assemblers. The work of arranging buttons, spacing elements, and crafting individual screens can now be handled by agents. Instead, designers become architects of experience, defining the principles, guardrails, and intent that shape how agents respond. We set the boundaries of possibility, orchestrate the logic of interaction, and ensure coherence across dynamic, personalized experiences. Our canvas expands from static screens to adaptive systems.

We design the intelligence behind the interface, the relationships between user needs and agent capabilities, and the quality standards that govern generated UIs. We curate outcomes rather than outputs. The ability to adapt, reorganize, and respond to both user intent and application context is transformative. With reasoning and action combined, agents can generate dynamic artifacts that enable interaction, not merely conversation.

What a time to be alive as a designer! 🫶 #ai
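The core mechanical idea in the post above, an agent returning a structured UI spec instead of prose, which the client then renders, can be sketched minimally. Everything here (the spec shape, the `agent_respond` and `render` helpers) is a hypothetical illustration of the pattern, not any specific product’s or protocol’s API:

```python
# Hypothetical sketch: the "model" maps user intent to a declarative
# UI spec (a tree of components), and the client renders that tree.

def agent_respond(intent: str) -> dict:
    """Stand-in for an LLM call that returns a UI spec, not text."""
    if "trend" in intent.lower():
        return {
            "type": "dashboard",
            "children": [
                {"type": "chart", "title": "Monthly spend"},
                {"type": "table", "title": "Totals by month"},
            ],
        }
    # Fallback: plain conversational response.
    return {"type": "text", "value": "How can I help?"}

def render(spec: dict, depth: int = 0) -> str:
    """Naive text renderer: one indented line per component."""
    pad = "  " * depth
    label = spec.get("title", spec.get("value", ""))
    lines = [f"{pad}<{spec['type']}> {label}".rstrip()]
    for child in spec.get("children", []):
        lines.append(render(child, depth + 1))
    return "\n".join(lines)

print(render(agent_respond("show me my spending trends")))
```

In a real system the spec would map to actual components (charts, forms, maps) and the designer’s job shifts to defining which component types exist and what the agent is allowed to assemble, i.e., the guardrails rather than the screens.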
-
Over the past few months, I’ve noticed a pattern in our system design conversations: they increasingly orbit around audio and video, how we capture them, process them, and extract meaning from them. This isn’t just a technical curiosity. It signals a tectonic shift in interface design.

For decades, our interaction models have been built on clickstreams: tapping, typing, selecting from dropdowns, navigating menus. Interfaces were essentially structured bottlenecks, forcing human intent into machine-readable clicks and keystrokes. But multimodal AI removes that bottleneck. Machines can now parse voice, gesture, gaze, or even the messy richness of a video feed. That means the “atomic unit” of interaction may be moving away from clicks and text inputs toward speech, motion, and visual context.

Imagine a world where the UI is stripped to its essence: a microphone and a camera. Everything else, navigation, search, configuration, flows from natural human expression. Instead of learning the logic of software, software learns the logic of people.

If this plays out, the implications are profound:
- UX shifts from layouts to behaviors: designers move from arranging buttons to choreographing multimodal dialogues.
- Accessibility and inclusion take center stage: voice and vision can open doors, but also risk excluding unless designed with empathy.
- Trust and control must be redefined: a camera-first interface is powerful, but also deeply personal. How do we make it feel safe, not invasive?

We may be on the cusp of the first truly post-GUI era, where screens become less about control surfaces and more about feedback canvases, reflecting back what the system has understood from us.
-
I’ve been designing + building products for 20 years. One AI project changed everything I thought I knew.

It was 5 years ago. The brief: an AI assistant for financial advisors. "Easy" I thought. I brought the playbook - understand users, map needs, prototype, iterate. Within weeks, every method had failed.

User-centred design has given us incredible tools: journeys, personas, usability testing. It created a shared language for innovation and put users at the centre of product development. But it also gave us something dangerous: the illusion that good process guarantees good outcomes.

Where design methods break:
🔴 They treat all problems as design problems. Not every challenge needs a workshop. Some need engineering breakthroughs. Some need business model innovation. Some need regulatory change. When your only tool is empathy, everything looks like a user experience problem.
🔴 They assume user needs reveal future possibilities. Advisors thought they wanted better dashboards. Not "AI that predicts my clients' needs and anxiety levels". Revolutionary products create needs people didn't know they had.
🔴 They confuse good process with good results. Following the method perfectly doesn't guarantee you're solving the right problem. Great design comes from insight, not adherence to frameworks.

What building AI systems has taught me:
🤔 The old tools need rethinking. User research couldn't predict interactions with something that evolves. Journey maps couldn't map AI that creates new paths. Prototypes couldn't capture systems that learn and change.
🤔 The real design challenge isn't the interface - it's the intelligence architecture. Should the system interrupt or wait? Learn from the user or protect their privacy? Optimise for efficiency or explainability? These aren't UX decisions. They're ethical and technical decisions that determine trust, dependency, and agency.
🤔 And critically: AI systems create feedback loops that change user behaviour over time. Traditional design assumes static user needs. AI design requires predicting how your solution will reshape the problem space.

We're designing systems that could shape human behaviour for generations. User research and workshops aren't enough anymore. We need a new playbook.

What I've learnt:
🟢 Ask "should we?" before "how might we". Consider consequences, not just possibilities. What data does this use? How does it learn? What could break?
🟢 Develop systems thinking. Your decisions ripple through complex networks of technology, behaviour, and culture.
🟢 Design for responsibility, not just iteration. Every design choice becomes a values statement when scaled through AI.
🟢 Question the AI narrative. Not every problem needs an AI solution. Some need better human processes.
🟢 Partner deeply with engineers and data scientists. The best AI experiences emerge from true collaboration, not handoffs.

The craft evolves. The responsibility remains the same. Let’s write new rules. Who’s in?
-
UX is evolving. And it's not just about the user anymore. 🤖

Enter AX (Agent Experience). AX expands the design focus beyond just humans to include AI agents, humans, and digital coworkers. In the agentic AI world, all of them are interacting with systems to help get things done.

𝗧𝗿𝗮𝗱𝗶𝘁𝗶𝗼𝗻𝗮𝗹 𝗨𝗫 𝗶𝘀 𝘀𝗶𝗺𝗽𝗹𝗲 𝗮𝗻𝗱 𝗱𝗲𝘁𝗲𝗿𝗺𝗶𝗻𝗶𝘀𝘁𝗶𝗰. You tap a button. Something happens in the product. Job done.

𝗕𝘂𝘁 𝗶𝗻 𝘁𝗵𝗲 𝗔𝗫 𝘄𝗼𝗿𝗹𝗱 𝘁𝗵𝗶𝗻𝗴𝘀 𝗮𝗿𝗲 𝘄𝗮𝘆 𝗺𝗼𝗿𝗲 𝗱𝘆𝗻𝗮𝗺𝗶𝗰:
- The agent tracks ongoing goals, nudges next steps, improves over time.
- The system plans its own path - it senses, infers, chooses actions the designer didn't script.
- Context is learned, not asked. Patterns, preferences, even team dynamics are remembered and reused.
- And success is no longer just task completion. It's also things like earned trust, retention, and long-term value.

𝗪𝗲'𝗿𝗲 𝗻𝗼𝘁 𝗷𝘂𝘀𝘁 𝗯𝘂𝗶𝗹𝗱𝗶𝗻𝗴 𝗶𝗻𝘁𝗲𝗿𝗳𝗮𝗰𝗲𝘀 𝗮𝗻𝘆𝗺𝗼𝗿𝗲. We're designing incentives and interactions across humans and AI agents.

𝗗𝗲𝘀𝗶𝗴𝗻 𝗽𝗿𝗶𝗼𝗿𝗶𝘁𝗶𝗲𝘀 𝘄𝗶𝗹𝗹 𝗻𝗼𝘄 𝗶𝗻𝗰𝗹𝘂𝗱𝗲:
→ How do these AI agents learn and collaborate?
→ How do we ensure they align with human goals?
→ How do we build systems that evolve, not just react?

The future of experience design is agentic. And this is a huge change in how we design, collaborate, and operate in increasingly AI-integrated systems. And the AX conversation is just beginning.

🔔 Share this with someone who needs to be prepared for the AX future.
👉 Know any new innovative tools or companies powering the AX revolution? Let me know!

#AgenticAI #AgentExperience #futureofwork #design
-
Observations on the evolution of design at IDEO: the difficulty of producing really thoughtful, complex prototypes has plummeted.

Early in our AI explorations, we used these tools for specific points in the design process, like research synthesis and ideation support. Now, IDEO designers are combining multiple emerging tech tools to make amazing things, like these Star Trek rocks created by my work spouse Jenna Fizel: 3D-printed stones containing RFID chips that link to a web app. Tell it what you're worried about, and it recommends a Star Trek episode to support you.

Prototypes like this used to require extensive technical support and timelines that often made them impractical for early exploration. Now designers are able to make things that would've felt super complex, really quickly. AI and emerging tech at their best are not just about process optimization; they're about advancing our capabilities and allowing us to do cooler, more rigorous, more ambitious things ❤️🔥
-
For years, design leadership has been rooted in a familiar rhythm: refine the interface, elevate the craft, ship the next iteration. But the ground is shifting under our feet.

Alan Dye leaving Apple to lead Meta’s new AI-focused design studio is more than a career move. It’s a signal of where design leadership is headed next — from polished pixels to the orchestration of intelligent systems. And I think this shift has been building for a while.

Design leaders are no longer stewards of screens alone. We’re becoming stewards of behavior. We’re shaping how intelligence shows up for real people in real workflows. That means expanding our craft:
- Understanding how AI perceives, interprets, and responds.
- Designing workflows where the interface becomes the smallest piece of the experience.
- Thinking about failure modes, trust, escalation paths, and recovery states.

It’s a new kind of design practice. One where interaction models stretch beyond UI and into environment, context, and intent.

AI isn’t replacing design leadership. It’s redefining the battlefield. And the leaders who thrive will be the ones who think beyond surfaces and start designing the systems of intelligence that sit beneath them. https://lnkd.in/g35fcFiU

#ProductDesign #DesignLeadership #AIXDesign #FutureOfWork
-
Being a designer in SF right now feels like standing at the edge of a new craft. We spent 20 years designing screens. Now we have to learn how to design intent.

At an AI design meetup last night, Jimmy Shan from Microsoft AI shared a line that stuck with me: “Classic UI is a body covered in tattoos you can click. AI is the first time we can talk to our tools.”

When users can express intent instead of clicking through flows… designers stop being “layout artists.” We become:
• decision-makers
• taste-makers
• people who shape how systems behave
• people responsible for asking: what should this thing actually do?

Alexandre Pinot from Dust summed it up perfectly:
Old design = design for predefined flows
New design = design for intent

It’s exciting and scary at the same time. We have a rare moment right now to shape a completely new interaction paradigm. And we better do it with good intent - and great taste.
-
AI is finding its interface. The chatbox was always temporary. We kind of knew it. That text input field was good for conversation, slow for real work. The chat promised infinite possibility but delivered linear constraint.

But there have been gradual shifts in how AI interacts with us.
- Perplexity Labs started building complete dashboards. Ask it to analyze financial trends, and it generates interactive charts and structured data tables.
- Claude Artifacts evolved from showing code snippets to becoming a full workspace. Now it's an app builder where you describe what you need, and minutes later you're using a functional tool.
- And recently, ChatGPT apps. Ask it to find apartments, and Zillow renders an interactive map inside your chat. Request a playlist, Spotify appears with controls. Instacart Smart Shop doesn't just recommend groceries. It reshapes your entire store interface based on your dietary patterns in real time.

Here's the pattern: adaptive interfaces over static ones. AI builds the right tool for your specific task. Writing needs an editor workspace. Code needs inline review. Data analysis needs interactive dashboards. Shopping needs personalized aisles. One interface can't serve every intent.

For product and design teams, this changes everything. You're not designing for "the user" anymore. You're architecting systems that generate the right interface for this user, this task, this moment. The UI becomes the output, not just the input mechanism. We're moving from conversational AI to generative/adaptive UI. The interface doesn't just listen but shapes itself to match what you're actually trying to do.

At Figr, we're exploring what this means for design itself. When your design agent understands not just what to build, but how the interface should adapt to different contexts. That's a real unlock.
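The "right tool per task" pattern the post describes can be sketched as a tiny dispatch step: classify the intent, then pick an interface template for it. The task labels, templates, and `classify` stand-in below are all hypothetical illustrations; in practice the classification would be a model call and the templates real component layouts:

```python
# Hypothetical sketch of adaptive-interface dispatch:
# infer the task type from the request, then select the interface
# template that serves that task (the mapping from the post).

INTERFACE_FOR_TASK = {
    "write": "editor_workspace",
    "code": "inline_review",
    "analyze": "interactive_dashboard",
    "shop": "personalized_aisle",
}

def classify(request: str) -> str:
    """Stand-in for a model call that infers the task type."""
    keywords = [("chart", "analyze"), ("trend", "analyze"),
                ("bug", "code"), ("groceries", "shop"),
                ("essay", "write")]
    text = request.lower()
    for keyword, label in keywords:
        if keyword in text:
            return label
    return "write"  # default task type

def interface_for(request: str) -> str:
    return INTERFACE_FOR_TASK[classify(request)]

print(interface_for("Plot a chart of Q3 revenue"))    # interactive_dashboard
print(interface_for("Order groceries for the week"))  # personalized_aisle
```

The design question then shifts from "what does the screen look like?" to "what is the space of interfaces the system may generate, and when does each apply?"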
-
In software, progress has always been a story of abstraction. We moved from machine code to high-level languages, and then to APIs, each layer hiding complexity. We are now on the cusp of the next, and perhaps final, layer: abstracting away the UI itself.

It’s a full-circle moment. The first computer interface was the terminal—a direct, conversational layer. We spent 40 years building GUIs to hide its complexity. Now, with Generative AI, we are returning to a terminal-like experience, but one where the "command line" is natural language and the response is a fully-formed, dynamic UI.

This inverts the traditional model. The interface is no longer the static contract the user must learn; the user's intent is the contract. In this new paradigm, the UI is a dynamic, real-time output generated in response to a goal. We are moving from designing interfaces to designing interface generators (e.g. Gemini CLI, Claude Code). The system understands the objective ("plan a multi-city trip for the lowest cost") and generates the necessary UI—maps, calendars, forms—in the moment.

This elevates our interaction from a tactical level ("click here, then here") to a strategic one ("achieve this"). APIs become the tools, but the Generative UI becomes the intelligent workshop that arranges those tools for us.