Improving Conversion Rate

Why do users drop off? And what can we do to fix it?

Overview

Some drop-off during onboarding is expected for any multistep process, and every added friction point increases the risk of losing more users. What made this research different is that many users across all markets abandoned onboarding before even starting the sign-up process: more than 50% of the drop-off happened between downloading the app and reaching the end of the splash screens.
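To make the funnel numbers concrete, here is a minimal sketch of how stage-by-stage drop-off can be quantified. The event names and counts are hypothetical placeholders for illustration, not the actual analytics data from this project.

```python
# Sketch: quantify where users are lost in the onboarding funnel.
# Stage names and counts below are hypothetical, not real product data.
funnel = {
    "app_downloaded": 10_000,
    "splash_screens_completed": 4_600,
    "sign_up_started": 3_900,
    "sign_up_completed": 2_700,
}

stages = list(funnel.items())
total_lost = stages[0][1] - stages[-1][1]

for (prev_stage, prev_count), (stage, count) in zip(stages, stages[1:]):
    lost = prev_count - count
    print(
        f"{prev_stage} -> {stage}: {lost} users lost "
        f"({lost / prev_count:.0%} of stage, {lost / total_lost:.0%} of total drop-off)"
    )
```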

Research Approach

When it comes to user research, there is no one-size-fits-all approach. Sometimes the nature of the research question calls for something more nuanced and multifaceted.

After I understood what the business hoped to achieve and how research was intended to support it, it became evident while creating the research plan that this was one of those questions.

To achieve the goal of this research, I had to break the research activity into several stages, each using a different method, to arrive at a comprehensive understanding of the problem and our users. This allowed me to gain deeper insight into our users' specific needs, motivations, and pain points, and therefore to provide clear recommendations to designers and other stakeholders.

Discovery - Going through the data

A good starting point for any research activity is looking for existing data: we can't know what we don't know until we know what we do know. In this case, the existing data included reports from the customer support team, their frequently asked questions, and customer feedback.

Desk research was also part of this stage: a competitive analysis was conducted to provide strategic insight into the functions, flows, and feelings evoked by our competitors' design solutions.

Discovery Findings

  • Users are unaware of the benefits of opening an account - the value proposition.

  • Users are not clear on the account opening requirements.

Hypothesis & Concept Design

Formulating Hypothesis:

The findings from the discovery, alongside themes from the competitive analysis, were shared with the design team. The design team formulated two hypotheses to test:

  1. If we use a tone of voice that resonates with the customers, we can convey the value proposition and benefits more effectively.

  2. If we present the account opening requirements at the points where customers expect them, in an easy-to-grasp format, we can alleviate a pain point.

Concept Design:

  • Testing two tones of voice for the splash screens and the value proposition screen

  • Testing two different versions of the value proposition screen

  • Testing the placement of requirements at different interaction points

Changes I would have made - in retrospect

Prioritize the hypotheses and select one of them to address.

Limit the validation to one flow/screen.

Original Screens - where the changes are made

Moderated A/B Testing - Internal Discussion

Background

I met with the customer support team to increase confidence in the proposed solution.

They were asked to share their thoughts out loud in a semi-structured discussion.

The goal of this meeting was to align before unmoderated A/B testing with customers.

Discussion Guide

Introduction: Background | Instructions | Permission to record

Tone of voice: side-by-side comparison

Value proposition redesign: side-by-side comparison

Account opening requirements: demo

Findings

As expected from a panel discussion, the findings were not conclusive. However, the discussion among participants, along with their feedback and remarks about the screens, provided valuable insight once I analyzed the themes and paid attention to their sentiment and reactivity. The findings were shared as follows.

Report Format

Findings at this stage of the research were shared with the design team. When findings are not conclusive, a meeting does a better job of fostering shared understanding, so I booked a meeting with the team and linked the report in the invitation for anyone interested.

Each research activity provides a chance to revise and improve our processes and ways of working. It’s a hidden nugget worth looking for.

In this case, it was the need to include Heuristic Evaluations before conducting any usability testing.

Including a heuristic evaluation has the potential to improve the quality of the research findings: if test participants encounter a design element that violates a heuristic, they are less likely to focus on the task at hand.

Unmoderated A/B Testing with Real Customers

Before the Test:

Based on the findings from the moderated test and discussion, the test structure was slightly modified before going live with the A/B test.

To narrow the scope, changes to the value proposition screen were prioritized.

User Segments: The first segment was recruited in collaboration with UserTesting. The second segment consisted of users who downloaded the app but did not complete registration; their data was requested from the Data Team.
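As a rough illustration of how the second segment could be identified from an event export, here is a minimal sketch. The event names and the tiny example dataset are assumptions for illustration only, not the team's actual schema or data.

```python
# Sketch: find users who downloaded the app but never completed registration.
# Event names and the example rows are hypothetical.
import pandas as pd

events = pd.DataFrame(
    {
        "user_id": [1, 1, 2, 3, 3, 4],
        "event": [
            "app_downloaded", "registration_completed",
            "app_downloaded",
            "app_downloaded", "registration_completed",
            "app_downloaded",
        ],
    }
)

downloaded = set(events.loc[events["event"] == "app_downloaded", "user_id"])
registered = set(events.loc[events["event"] == "registration_completed", "user_id"])

segment_2 = sorted(downloaded - registered)  # candidates for the unmoderated test
print(f"Segment 2 candidates: {segment_2}")  # -> [2, 4]
```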

Test Design: Participants were shown both variants and asked to score each on a scale across dimensions such as understanding and trust.

Test Execution: UserTesting was used to build the test. In addition to the scale questions, participants were prompted to share their overall thoughts in an open-ended field.

The final test and the user segments were shared with the marketing team, who distributed the test as a survey.

Duration: Two weeks
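To illustrate how the paired scores from such a test could be compared per dimension, here is a minimal sketch assuming 1-to-5 ratings where each participant rated both variants. The numbers and the choice of a Wilcoxon signed-rank test are illustrative assumptions, not the actual analysis performed through UserTesting.

```python
# Sketch: compare paired ratings of variant A vs. variant B on one dimension.
# Ratings are hypothetical 1-5 scores; each participant rated both variants.
import numpy as np
from scipy.stats import wilcoxon

scores_a = np.array([3, 4, 2, 5, 3, 4, 3, 2, 4, 3])  # variant A, "understanding"
scores_b = np.array([4, 5, 3, 4, 4, 5, 4, 3, 5, 4])  # variant B, "understanding"

stat, p_value = wilcoxon(scores_a, scores_b)  # paired, non-parametric comparison
print(f"mean A = {scores_a.mean():.2f}, mean B = {scores_b.mean():.2f}, p = {p_value:.3f}")
```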

Sharing the Findings

Provided due diligence has gone into the test design, unmoderated A/B test results are direct, as they are shared straight from UserTesting. The findings report was formatted as follows.