Understand AWS Users’ Primary JTBD Using CloudWatch Synthetics
Time: August 2022
Method: Contextual interview & concept testing
Role: The first and only UXR embedded in one of AWS's product teams. I led the end-to-end research process and completed it in 4 weeks.
Immediate Stakeholders: Principal designer, software development manager, product manager, senior PM
Amazon Web Services (AWS) is the most popular vendor in the cloud infrastructure service market. In Q1 2022, it held 33% of the market, followed by Microsoft Azure at 21% and Google Cloud at 8%.
CloudWatch Synthetics is one of the monitoring products in AWS. Instead of monitoring usage data coming from real consumers, Synthetics allows users to create tests that generate synthetic data simulating consumers’ actions on a website. Users rely on Synthetics to verify that the website is up and running, so they can be notified of problems and fix them before consumers encounter them.
Synthetics' console holds hundreds or even thousands of test results. There were 2 views for displaying the results, but they were complicated, and the team wasn't sure they were the right approach (see images below).
Console View 1
Console View 2
Therefore, the team was planning to build a 3rd view to improve the experience of using Synthetics' console. However, when the designer shared the latest prototype with leadership, 2 different design directions emerged, and the team couldn’t agree on which to pursue.
Since this was a high-profile feature, the senior PM asked me to conduct evaluative research on the prototype. Upon receiving the request, I knew I had to unpack the problem to understand whether that was the best approach.
As my internship was ending, I had 4 weeks to complete the end-to-end research process, from scoping and recruiting through execution, analysis, and presentation.
PLANNING & SCOPING
Getting the context
I started with secondary research to learn as much as possible and to avoid repeating what had already been done.
Afterward, I conducted stakeholder interviews with the product team to determine exactly what we should research and with whom. The key stakeholders were the principal designer, the PM, the SDM, and the senior PM who had requested the research.
Broadening the research scope
Initially, I was asked only to conduct evaluative research on 2 prototypes of the new Dashboard to resolve disagreements about the design direction.
I investigated the cause of the disagreement and found that the 2 design directions were actually driven by 2 conflicting assumptions about user needs. Because the team lacked an understanding of users’ jobs-to-be-done (JTBD) for Synthetics’ console, I realized it was important to step back and identify users’ primary JTBD so we could solve the most important user problem.
Therefore, I broadened the research scope to include both an exploratory research question and an evaluative one.
In each 1-hour session, I began with a contextual interview to understand participants’ current JTBD through observation.
Afterward, I conducted concept testing, showing participants the new prototypes to understand whether the design could fit into their current workflows.
Research findings contradicted the team's assumptions
Contrary to what my stakeholders believed, participants didn’t find the new feature beneficial for either of the 2 JTBDs the team had made assumptions about.
Furthermore, the whole team was overlooking a more important user problem, one that designing for either assumption alone wouldn’t solve.
Getting buy-in from stakeholders
Since my initial scoping work had shown how little alignment there was on the team, and the findings contradicted what many stakeholders believed, I wanted to make sure the team would buy into the research findings rather than feel taken by surprise.
My approaches included inviting stakeholders to attend sessions, debriefing together, sharing a summary on Slack after each session, and showing video snippets so stakeholders could see the evidence for themselves.
Because I had kept my stakeholders engaged throughout the study, everybody was on board with the findings by the final presentation. I then facilitated a discussion where the team resolved their disagreements on the design solution and focused on which JTBD to prioritize as part of the product strategy.
Reduced disagreement and helped the team make decisions based on research data
Introduced "observation" as a research approach to the team and demonstrated its impact
The team decided to re-prioritize which features to build after learning about users' primary JTBD
Research findings provided direction for the team to redesign Synthetics' existing console
The team learned how the new feature could add value for a 3rd JTBD and how to improve the prototype
I learned the importance of tailoring my sharing approach to get buy-in for research findings. For instance, one stakeholder didn't have time to attend any of the sessions and doubted the findings when I shared written summaries on Slack. So I put together the most relevant video snippets and shared them with him; after watching the curated clips, he changed his perspective.
I also learned how to be more agile and get the most out of the same number of participants by including both exploratory and evaluative research questions in the same study when appropriate.