Optimizing E-Commerce Experiences Through Upstream Cycle Insights
In product development, the Agile principles of iterating, testing, learning, and adapting are crucial for mitigating risk and fostering innovation, particularly in dynamic markets such as fashion. This case study examines how research conducted during the Upstream cycle shaped our product roadmap, spanning the double-diamond process from exploration to delivery. It underscores the critical role of collaboration, testing, and shared knowledge in driving meaningful outcomes.
Youcom is a brand specializing in fashion and youth lifestyle. The focus of the Product Tribe is to evolve our digital channels to ensure a delightful omnichannel experience across e-commerce, the app, and physical stores. At the time, the squad I worked in was responsible for the online e-commerce experience prior to conversion, encompassing the exploration and consideration phases of the customer journey. This initiative took place primarily in the Upstream cycle.
Work context
Role: Product Designer
Team Upstream: 2 Product Designers, UX Writing, Product Owner
Team Downstream: Me, the Product Owner, and developers
Tools used: Figma, Miro, Teams, Forms
01
Problem Statement & Goals
The Product Details Page (PDP) is a crucial part of the consumer journey, housing detailed product information, photo galleries, color and size selections, and call-to-action buttons. However, our PDPs were not performing optimally. The initiative to improve this page was aligned with the quarter's OKR.
Despite having access to various sources of quantitative data, our primary challenge was the lack of qualitative insights to understand the underlying reasons behind user behaviors.
OKR
Improve the findability of products and the fluidity of the experience to increase digital channel conversion
KPIs
Increase the volume of products added to the cart
Higher conversion rates on desktop and mobile versions
Reduce abandonment rates
02
Exploration
First, we built a collaborative board on Miro so that the designers and Product Owner could collaborate throughout the process and the content could be easily accessible to other team members. Because our objectives were too broad at first and we were dealing with a lot of uncertainty, we recognized that a CSD Matrix (Certainties, Suppositions, and Doubts) would be the best starting point for understanding where we were and where we wanted to go next, allowing us to narrow down the subjects to investigate. The Certainty column was primarily fed by the PO, while the rest of the matrix was filled with data gathered during the study cycle. During this phase, we worked asynchronously and scheduled alignment meetings for significant checkpoints.
The designers were responsible for desk research and benchmarking. We analyzed recordings collected by Hotjar to understand user behavior, interactions, call-to-action clicks, rage clicks, and hesitation, and derived insights from them. We also scrutinized the heatmap for clicks and scrolling, which gave us a clear view of where the most important information should be placed and what information was being overlooked, as depicted in the image.
The red area corresponds to the average fold and is viewed by ~75% of the audience. It encompasses the photo gallery for both versions, quick information, and CTA for desktop. The product details are viewed by ~50% of the audience on the desktop version (orange-greenish area) and ~35% on the mobile version (green-cyan area). Cross-selling showcases reached only ~35-15% of desktop users and ~25-10% of mobile users.
CSD Matrix connected to some of our Kanban Cards of initiatives. We started with a lot of post-its in the Doubt column.
Heatmap collected on Hotjar
03
User research
From the research, we developed an affinity map to identify the most important questions we wanted to answer, which were then converted into questions for the user interview script. The questions ranged from open-ended inquiries about online purchase behavior to more specific ones about the purchase experience with the brand and perceptions of the PDP, to be asked after usability testing of the current version.
We interviewed 8 customers over a period of 2 weeks through video calls. After the initial part of the interview, we asked each participant to perform a usability test of the current solution. They were instructed to search for a Youcom product as if they were actually buying it, with participants divided equally to explore desktop and mobile versions. We allowed users to freely navigate the website to understand their paths and preferred interactions. This observation revealed both different and similar behaviors, providing us with valuable insights into the customer journey on our e-commerce platform.
Affinity Map of the CSD post-its converted into a script for an upcoming user interview
Another UX Designer and I conducting remote user interviews
04
Debrief & Insights
We recreated the individual journeys of the clients using screenshots and post-its with comments and relevant reactions to represent their interactions with each part of the page. Each participant was coded with a different color, and red post-its were reserved to highlight critical moments.
Next, we elaborated user stories using the “Given-When-Then” model, highlighting those addressing more post-its with a fire stamp to indicate a high level of value. We also created another affinity map with insights organized by themes and a separate one for quick wins. The debrief file was crucial for prioritizing and constructing the backlog, a task later performed by the PO.
This comprehensive research informed the product roadmap for the next quarters. From this research, we mapped out three hypotheses to be explored, with two of them being addressed in the short term.
To summarize related themes and build a global view, we used a single screenshot of the PDP, divided it by sections, and placed all the related post-its in their respective areas. We then translated this information into insights for initiatives to be worked on. To rank the most promising insights, we used red dots to indicate how many users faced the problem or would benefit from the idea, and blue dots to represent our bets as the design team.
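Dot-vote counts like these can be turned into a ranked list with a simple weighted score. The sketch below is purely illustrative: the insight names, counts, and the heavier weight on user evidence are assumptions for the example, not the team's actual data or formula.

```python
# Hypothetical sketch: ranking insights by dot votes.
# red = users who faced the problem, blue = design-team bets.
# The 1.5x weight on user evidence is an assumption, not the real criterion.
insights = [
    {"name": "photo gallery",      "red": 6, "blue": 3},
    {"name": "rating stars",       "red": 4, "blue": 1},
    {"name": "description blocks", "red": 5, "blue": 2},
]

def priority(insight, user_weight=1.5, team_weight=1.0):
    """Weighted score favoring observed user pain over internal bets."""
    return user_weight * insight["red"] + team_weight * insight["blue"]

for item in sorted(insights, key=priority, reverse=True):
    print(f'{item["name"]}: {priority(item):.1f}')
```

A scheme like this keeps the ranking transparent: anyone can trace a backlog position back to the dots on the board.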
Different user journeys on different devices
PDP page divided by sections to organize content taken from the user journeys
Key insights mapped per area with dots indicating frequency, translated into user stories
05
Iterate, test, learn, adapt
Hypothesis #1 - Quick win
During the interviews, clients stated that comments about the products are an important factor in the purchase decision. As depicted in the heatmap above, the comment area, positioned at the end of the page, was poorly viewed by users, and analytics showed low interaction with the rating stars component, which worked as an anchor to the comments section. Therefore, as a quick win, we believed that moving the product rating stars closer to the top (after the title and before the short description) would make them more prominent, encouraging interaction.
We decided to A/B test this hypothesis, but the results for the new position were slightly negative, so the original version (with the stars after the short description) was maintained.
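A common way to sanity-check an A/B result like this is a two-proportion z-test on the conversion rates of each variant. The sketch below uses only the Python standard library; the visitor and conversion counts are invented for illustration, not the real test data.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)      # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))   # two-sided p-value
    return z, p_value

# Hypothetical numbers: variant B (stars moved up) converts slightly worse.
z, p = two_proportion_z(conv_a=540, n_a=10_000, conv_b=500, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With a pre-set significance level, a test like this distinguishes a genuinely negative result from ordinary noise before a variant is discarded.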
Hypothesis #2 - High Value, High Effort
Research shows that visuals and interactivity are particularly appealing to young audiences, influencing their purchase decisions. User interviews reinforced this, highlighting the potential of improving our photo gallery to positively affect conversion. This opportunity, identified as high impact but requiring significant effort, necessitated strong alignment with stakeholders.
Guided by market trends and user feedback, we focused on evolving the photo gallery design. The other Product Designer and I generated ideas independently, sketching low to medium fidelity screens, using elements from our design system and creating new ones as needed. After a design critique with other squads, we refined the best ideas into high-fidelity layouts and presented them to the PO and PM for business-oriented feedback.
The chosen design was an open gallery, displaying all photos on the page to reduce effort and cognitive load. Before proceeding, we discussed technical requirements and constraints with developers. I then took responsibility for the task flow, including final design, user stories, and writing acceptance criteria for the devs.
To minimize risk, we conducted an A/B test comparing the current version with the new design. The initial results indicated improved page engagement and a positive impact on conversion rates, leading us to adopt the new design definitively. The new components created were added to our Design System along with the guidelines.
Hypothesis #3 - Further Research Needed
We hypothesized that breaking the product description into sections rather than a block of text would address user complaints about finding relevant information quickly. Users indicated that product composition and model size/measurements were most important. However, this change required further analysis due to its technical complexity. The Project Manager prioritized it for another Upstream cycle.
Months later, we revisited this hypothesis with additional user research. We conducted an asynchronous online card sorting with users to organize and name PDP sections. Concurrently, a quick Hotjar survey asked users to rank the most relevant information on a PDP. Results from both studies provided ideas for grouping and UX Writing based on similarities and frequency.
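Open card-sorting results like these are often summarized as a co-occurrence count: how many participants placed each pair of cards in the same group. The card labels and groupings below are invented for illustration, not the study's actual data.

```python
from itertools import combinations
from collections import Counter

# Hypothetical sorts: each participant freely grouped PDP info cards.
sorts = [
    [{"composition", "care"}, {"model size", "measurements"}, {"shipping"}],
    [{"composition", "care", "shipping"}, {"model size", "measurements"}],
    [{"composition"}, {"care", "shipping"}, {"model size", "measurements"}],
]

# Count how often each pair of cards lands in the same group.
pair_counts = Counter()
for participant in sorts:
    for group in participant:
        for pair in combinations(sorted(group), 2):
            pair_counts[pair] += 1

# Pairs grouped together most often suggest candidate PDP sections.
for pair, count in pair_counts.most_common(3):
    print(pair, f"{count}/{len(sorts)} participants")
```

Pairs with high co-occurrence become candidates for the same section, and the frequency counts give the UX Writing work a data-backed starting point.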
Version A - the winner
Wireframes for different propositions
Version B - the new proposition, discarded
High fidelity prototype for the new PDP page - the winner of the A/B test
Example of the grouping done by one participant of the card sorting
The next stage involved a survey with a larger audience, presenting screenshots of the sections with naming alternatives for voting. Another squad tested this hypothesis with our app's customer base. The compiled results showed little impact on the KPIs, so we decided not to extend the test to the e-commerce base. Recognizing the need for further exploration, we documented the learnings in our research repository for future initiatives.
Hotjar survey results
Prototype of the proposed new information sections
Clipped presentation of results, showcasing the selected alternatives for section names
06
Learnings
Iterative Research and Prioritization
Research insights can reveal multiple opportunities, often requiring further exploration. In the Upstream cycle, it is crucial to align our discoveries with the current business goals. Strategic prioritization was key, allowing us to focus on high-impact changes while documenting less critical issues for future investigation.
Importance of Testing and Embracing Failure
Rapid testing is essential in product development as it minimizes risks and conserves resources. Even when a hypothesis is disproven or has a lower impact than expected, it marks a positive step in our product’s evolution. Each test provides insights that enhance our understanding of users, enabling us to make more precise adjustments.
Collaboration and Reassessment
Collaboration was integral at every stage, from initial research to final design. The ability to step back, reassess, and restart when necessary ensured alignment with user needs and business goals. Regular design critiques and discussions with developers helped us navigate technical constraints and feasibility issues.
Shared Knowledge
This case highlights the importance of a research repository accessible to all team members. Information gathered in one study can enrich another, and hypotheses that are deprioritized can be revisited by other teams, accelerating processes and avoiding redundant work. It also ensures that opportunities are addressed and not forgotten.
By integrating these learnings into our workflow, we have paved the way for more accurate and informed evolutions in our product development process. Each step, whether a success or a learning opportunity, has contributed to a more robust, user-centric product roadmap.