
Optimizing the Athleta Shopping Experience
Gap Inc., 2024–2025
Driving Growth Through Agile UX Testing
Overview
As the primary UX designer on the Athleta POD, I worked in a fast-paced, agile team to optimize the browse, bag, and checkout experiences. My role focused on identifying high-impact tests, designing rapid iterations, and balancing data-driven experimentation with brand consistency. Through an iterative, hypothesis-driven approach, we launched 40 tests in a fiscal year, leading to a $40M demand lift with a 25% win rate.

The Challenge: Balancing Experimentation & Brand Consistency
Athleta’s POD focused on removing customer friction points and optimizing the purchase journey to drive conversion and lifetime value. However, one of the biggest challenges was navigating the tension between rapid UX experimentation and broader brand foundation work. While the POD was tasked with running high-velocity tests to maximize sales, the brand team prioritized long-term consistency, clarity, and customer trust. Balancing these sometimes contradictory forces was a key learning experience.
My Role & Process
1. Strategic Kickoff & Hypothesis Framing
At the start of each sprint, I participated in kickoff meetings to:
• Understand and challenge test hypotheses against UX best practices.
• Identify high-value tests with the potential to drive measurable impact.
• Align cross-functional teams on test priorities, feasibility, and expected outcomes.
2. Rapid UX Design & Iteration
As part of an agile team with short turnaround times, I created:
• Quick design variations optimized for immediate A/B testing with Optimizely (a simplified sketch of how such a variation gets wired into a test follows this list).
• Multiple test options to explore different user behaviors.
• Designs that balanced conversion-focused optimizations with brand alignment.
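To make that workflow concrete, here is a simplified sketch (in TypeScript) of how a design variation might be wired into an Optimizely A/B test on the front end, using the classic activate/track calls from the Optimizely JavaScript SDK. It is an illustration only, not Athleta's actual implementation: the SDK key, experiment key, event key, visitor ID, and render functions are all hypothetical placeholders.

    // Simplified sketch only: keys, IDs, and render functions are hypothetical placeholders.
    import * as optimizelySdk from '@optimizely/optimizely-sdk';

    const client = optimizelySdk.createInstance({ sdkKey: 'YOUR_SDK_KEY' });
    if (!client) {
      throw new Error('Optimizely client failed to initialize');
    }

    // Placeholder renderers for the control and test experiences.
    function renderControlBag(): void { /* existing bag layout */ }
    function renderStreamlinedBag(): void { /* variation with fewer steps to checkout */ }

    const visitorId = 'anonymous-visitor-123'; // In practice, a stable visitor ID.

    client.onReady().then(() => {
      // Bucket the visitor into a variation of the (hypothetical) experiment.
      const variation = client.activate('bag_streamline_test', visitorId);
      if (variation === 'streamlined_bag') {
        renderStreamlinedBag();
      } else {
        renderControlBag(); // Control, or visitor excluded from the experiment.
      }
    });

    // Called from the checkout flow so Optimizely can attribute the conversion
    // to whichever variation the visitor saw.
    export function recordCheckoutConversion(): void {
      client.track('checkout_complete', visitorId);
    }

In practice, the developers owned this wiring; my role was to supply the variation designs and the hypotheses they were meant to test.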
3. Cross-Functional Collaboration
I worked closely with:
• Data Engineering & Growth Teams to ensure tests were measurable and insightful.
• Developers to implement and refine tests efficiently.
• Brand Leadership to ensure tests aligned with Athleta’s voice and customer experience guidelines.
4. Testing & Measuring Impact
• Measured every test rigorously before recommending implementation (an illustrative significance check follows this list).
• If a test was unsuccessful or showed potential for greater impact, I iterated and re-tested.
• Tests ranged from small UI tweaks to major changes that resulted in multi-million dollar sales increases.
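For a sense of what measuring a test means in practice, the sketch below runs a standard two-proportion z-test on conversion counts for a control and a variant. The traffic numbers are made up, and this is a generic statistical check rather than the analytics pipeline our data engineering and growth partners actually used.

    // Illustrative sketch: evaluating an A/B result with a two-proportion z-test.
    // Visitor and conversion counts below are made-up example numbers.

    interface VariantResult {
      visitors: number;
      conversions: number;
    }

    // Standard normal CDF via the Abramowitz & Stegun erf approximation.
    function normalCdf(z: number): number {
      const x = Math.abs(z) / Math.SQRT2;
      const t = 1 / (1 + 0.3275911 * x);
      const poly =
        ((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t - 0.284496736) * t + 0.254829592) * t;
      const erf = 1 - poly * Math.exp(-x * x);
      return z >= 0 ? 0.5 * (1 + erf) : 0.5 * (1 - erf);
    }

    // Relative lift and two-sided p-value for the difference in conversion rate.
    function twoProportionZTest(control: VariantResult, variant: VariantResult): { lift: number; pValue: number } {
      const p1 = control.conversions / control.visitors;
      const p2 = variant.conversions / variant.visitors;
      const pooled = (control.conversions + variant.conversions) / (control.visitors + variant.visitors);
      const se = Math.sqrt(pooled * (1 - pooled) * (1 / control.visitors + 1 / variant.visitors));
      const z = (p2 - p1) / se;
      return {
        lift: (p2 - p1) / p1,                      // Relative conversion lift of the variant.
        pValue: 2 * (1 - normalCdf(Math.abs(z))),  // Two-sided p-value.
      };
    }

    // Hypothetical traffic: ~4.0% vs ~4.4% conversion.
    const result = twoProportionZTest(
      { visitors: 50000, conversions: 2000 },
      { visitors: 50000, conversions: 2200 },
    );
    console.log(`Lift: ${(result.lift * 100).toFixed(1)}%, p-value: ${result.pValue.toFixed(4)}`);

Results that cleared a bar like this one informed whether we recommended rollout, iterated on the design, or retired the idea.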
Key Achievements & Metrics
✔ $40M Demand Lift (Annualized incremental revenue from tests launched).
✔ 40 Tests Launched in the fiscal year.
✔ 25% Win Rate (Percentage of successful tests).
Key Learnings & Takeaways
• Balancing Data & Brand: Learned how to navigate the tension between rapid testing and brand consistency, ensuring that short-term gains aligned with long-term brand trust.
• Agile UX in Growth Teams: Developed expertise in fast iteration and hypothesis-driven design, where speed and adaptability were critical.
• Testing at Scale: Gained experience in high-stakes UX experimentation, where even minor changes could have multi-million dollar revenue impacts.


Conclusion
This experience sharpened my ability to move quickly, challenge assumptions, and balance data-driven decisions with brand integrity. Working in a fast-moving, high-impact UX growth environment, I saw firsthand how strategic design experimentation can directly translate to business success.