
Amazon Alexa
2025
Amazon — AI Ecosystem Vision
Led experience design for a large-scale, agent-driven AI initiative at Amazon, working within a cross-functional team to explore how AI-powered systems could operate across multiple device surfaces and interaction modalities.
AI & LLM Product Design
Cross-Functional Collaboration
Customer Journey Mapping
The Challenge
When AI acts autonomously on a user's behalf — making decisions, taking actions, managing tasks — how does the experience communicate what it's doing and why, and how does the user stay in control? This is a fundamentally different design problem from traditional interfaces, where every user action is explicit and intentional.
Trust Progression Frameworks
Defined patterns for how users build confidence in an AI system over time, and how the design scaffolds that journey from initial scepticism through to comfortable delegation. This included models for surfacing AI intent before action, and clear patterns for interrupting and correcting the system when it gets something wrong.
Human-AI Collaboration Across Surfaces
Designed for AI that operates across voice, screen, and other device surfaces without a single primary interface — requiring a rethink of conventional interaction assumptions. When there is no screen, how does the system communicate state? When there are multiple surfaces, how does context transfer between them? This work produced a set of shared interaction principles for ambient, non-deterministic AI behaviour.
Cross-Device Workflow Design
Translated high-level strategic product vision into concrete, testable interaction concepts and narrative prototypes, which were used to pressure-test assumptions with product strategy and engineering partners early in the process.
About This Work
This project was exploratory and vision-level, intended to inform product direction. Output included UX frameworks, interaction principles, and prototype narratives rather than shipped consumer features. Details are kept general in line with confidentiality obligations.