Understanding AWS Users’ Primary Use Cases for Synthetics’ Console

Timeframe

Planned & executed end-to-end research in 4 weeks 

My Role 

1st & solo UXR for Synthetics

Research Method 

Secondary research

Contextual interview

Usability testing 

Stakeholders 

Sr. product manager, Principal designer,

Sr. engineering manager

SUMMARY

Problem

The team disagreed on the design direction of a new feature that was part of an upcoming launch projected to generate $251M in gross revenue, so the Product Lead requested UX research to determine design direction.

Approach

First, I identified the research scope through secondary research and stakeholder interviews. Second, I conducted contextual interviews to uncover use cases and usability testing to evaluate prototypes.

Finding

Research findings revealed the use case that the new feature would serve and opportunities to enhance it. Moreover, they pointed to a more effective solution that would address the primary use case.

Impact

I presented findings and got leadership's buy-in to change priorities to focus on the more effective solution that would benefit both the users and the business.

PROJECT BACKGROUND 


Product introduction

Synthetics is a monitoring product in AWS that helps website owners, many of them developers, ensure that their websites work properly for consumers. Synthetics users can generate automated tests that simulate real consumer behaviors 24/7. If any test fails, Synthetics immediately notifies site owners so they can quickly fix issues before real consumers experience them.

Research request

Synthetics' console had 2 views to display test results, but they were quite complex, so the team decided to build a 3rd view to enhance users' experience.

View 1

View 2


New feature: View 3

However, in a design review with leadership, 2 conflicting design directions for the new feature arose. Since this was a high-profile feature, the Sr. PM requested evaluative research on the prototypes. Upon receiving the request, I knew I had to unpack the problem to decide whether that was the best approach.


RESEARCH TIMELINE

I want to highlight that the Analysis & Synthesis and Sharing & Getting Buy-in phases overlapped with Contextual Interview & Usability Testing: my initial scoping revealed a lack of alignment on the team, so I shared my findings early and often to ensure everyone was on board.


SCOPING & PLANNING

Changing the research scope

After conducting stakeholder interviews, I had 2 main learnings that guided scoping:

  1. The conflicting design directions were actually aiming to solve 2 different use cases, so the cause of the disagreement was which use case the new feature should solve.

  2. Neither use case was supported by research; both were assumptions made by the team.

Initially, I was only asked to conduct evaluative research on the 2 prototypes. However, I saw the importance of helping the team align on which problem to solve by understanding the use cases. Therefore, I pivoted and expanded the research scope to include both generative & evaluative research questions.


Designing the study to get faster results

I considered a two-study design, with a generative study followed by an evaluative one. However, due to the tight deadline, I chose to combine generative and evaluative components into a single study to deliver results faster.

Secondary research informed recruitment criteria

My team hadn’t done foundational research to create personas before, so I used the org-wide personas that AWS’s centralized UXR team created to align on participant criteria.

My team wanted to gain a broad understanding of existing developer users' needs. Therefore, I recruited both internal & external users with a mix of engineering roles (e.g. SDEs, front-end engineers, and DevOps engineers).

CONTEXTUAL INTERVIEW & USABILITY TESTING

Contextual interview

13 participants: 8 internal & 5 external users, with a mixture of engineering roles

60-minute session: 30 min contextual interview + 30 min usability testing. 

  • Contextual interview

    • Goal: to understand participants’ current use cases through observation

    • How & Why: I asked participants to share their screen and show me their most common use cases with the product, from beginning to end. Observing their behavior allowed me to capture details they couldn't recall, articulate, or weren't aware of.

  • Usability testing

    • Goal: to understand whether the 2 designs could solve any problems, and to gather feedback for improvement on aspects such as discoverability and understandability

    • How & Why: I shared the Figma link with participants and asked them to think aloud while I observed. I also asked clarifying questions to make sure their comments connected back to the use cases they had mentioned earlier.

ANALYSIS & SYNTHESIS

Analysis & Stakeholder buy-in 

I took the approach of (1) iterative analysis throughout the execution process, and (2) early & frequent sharing, so I could get more engagement from my busy stakeholders and make sure the findings answered their most important questions.

  • 1st round: I categorized each participant's data by interview question, and highlighted important quotes & observations

  • 2nd round: I compared the similarities and differences across participants and clustered the quotes into themes using a spreadsheet. I also drafted memos of preliminary insights.

  • 3rd round: I further clustered data into larger themes or created frameworks (e.g. a user journey). This was an iterative process in which I tried different ways of clustering, triangulated with stakeholders' interpretations, and generated more insight statements.

  • 4th round: I summarized the key insights and the corresponding product implications as I collected more input from stakeholders and the wider team.


SHARE FINDING & GET BUY-IN

Research findings contradicted the team's assumptions

  1. Participants didn’t find the new feature beneficial for either of the 2 use cases that the team had made assumptions about. Instead, it would be useful for a newly identified 3rd use case.

  2. The primary use case was underserved, but the new feature couldn't address it. Fortunately, research findings revealed a simpler solution by redesigning Synthetics' existing console views.


Approach to getting buy-in

The findings completely contradicted what many stakeholders believed, so below are some of the approaches I used to get buy-in:

  1. Invited stakeholders to attend sessions and debrief together

  2. Shared a summary after each session through Slack

  3. Showed video snippets for stakeholders to see for themselves

Because I had kept my stakeholders engaged throughout the study, everyone accepted the findings at the final presentation. Afterwards, I facilitated a discussion in which key stakeholders and leadership resolved their disagreements about the design solution and decided which use case to prioritize for product strategy.

IMPACT


Strategic Impact

  • Leadership decided to re-prioritize which features to build after learning about users' primary use case

Product Impact

  • Research findings translated into roadmap items for the team to redesign Synthetics' existing console

  • The team learned about how the new feature could add value for a 3rd use case and how to improve the prototype

Team Impact

  • Resolved disagreements and helped the team make decisions based on research data

  • Introduced "observation" as a research approach to the team and generated deep insights about use cases that the team used to determine strategies

REFLECTION

  • I learned the importance of tailoring my sharing approach to get buy-in for research findings. For instance, one stakeholder didn't have time to attend any of the sessions and doubted the research findings when I shared written summaries in Slack. So I put together the most relevant video snippets and shared them with him. After watching those curated snippets, he changed his perspective.

  • I learned how to be more agile and learn more from the same number of participants by incorporating both generative and evaluative components into a single study when appropriate.
