Improving Lingvist's Onboarding

Lingvist is an AI-powered adaptive language-learning platform.

Redesigned Lingvist's onboarding to align with user confidence levels, resulting in improved retention.

Timeline
March 2018 – May 2018
Role
Product Designer
Platforms

Android

iOS

Web

Scale
Startup

Overview

Around 30% of users were dropping off immediately after signing up, without even experiencing the product.
Even before joining Lingvist, I was eager to explore the platform, despite foreign language learning not being my strongest skill. As I interacted with the product, the pain points became apparent. Watching friends and colleagues try it out confirmed my observations: users often struggled to understand how to interact with the first card and questioned how they were supposed to learn.
This initial confusion signalled a significant issue in the onboarding experience, so I set out to address it.

Process

When I joined Lingvist, onboarding wasn’t a top priority. However, after some gentle persuasion, I secured a budget to conduct usability testing with new users.
To ensure the insights reached a broader audience, I invited colleagues to observe and take notes during the sessions. Using Lookback, I recorded every session and created a highlights reel to share with the leadership team and at our All Hands meeting.
The testing revealed several critical usability issues, particularly in the moments immediately after signup. Alarmingly, all five participants struggled to complete the first card. In one case, a participant became visibly frustrated before I stepped in to resolve the issue.
While it was tough to watch people struggle with our product, the sessions served as a wake-up call for the company, highlighting the urgent need to improve the onboarding experience.

User interviews

To deepen my understanding of our users, I conducted interviews with existing customers. I aimed to uncover who they were, how they learned using Lingvist, their pain points, and the moments when they experienced the elusive "Aha!" moment.
With a mix of qualitative insights from interviews and quantitative data, I created a comprehensive Learner Journey Map. This map analysed the current onboarding flow, identified measurable goals, examined messaging outside the product, and highlighted pain points throughout the user experience.
The journey map became a foundational tool, helping to align the team and prioritise areas for improvement in the onboarding process.

Tackling drop-off

We began with the most critical issue—drop-off immediately after signup. It became clear that building around users’ expectations and mental models was far more effective than forcing them to adapt to the system we’d built.
This was a tricky internal challenge. Our algorithm was designed to make human learning faster by adapting to each user’s needs—teaching and testing simultaneously. While powerful, this approach was difficult to communicate effectively, and many users struggled to engage with the initial onboarding process.
To address this, I explored other products and experimented with several ideas, but none achieved the desired impact. I returned to the drawing board, focusing on different user scenarios for entering our product for the first time. I identified three distinct personas:
  • "I know it": Users confident in their knowledge.
  • "I think I know it": Users uncertain and seeking reassurance.
  • "I don’t know it": Users starting from scratch.
This shift in perspective helped us rethink how to guide users through their first interactions with the product.

Solution

This new perspective led to a simple yet effective approach: start by asking the user if they know the word.
  • If they respond "Yes", the interface immediately shows them where to type it.
  • If they respond "No", the system guides them to reveal the answer first before typing it.
By aligning the interaction with the user’s level of confidence and knowledge, this solution provided a more intuitive and supportive entry point into the learning experience.

"I know it" flow

"I don't know it" flow

Outcome

We implemented the new onboarding flow in our web app and conducted another round of usability testing. This time, all five participants successfully completed the first card without any issues, marking a significant improvement.
When we released the new design as an A/B test, the results were largely positive:

iOS

  • Day 2 retention saw a 5% increase.
  • Payment metrics showed substantial gains: conversion up 29%, ARPU (average revenue per user) up 55%, and ARPPU (average revenue per paying user) up 22%.

Android

  • Day 2 retention improved by 10%, along with a 10% increase in trial activations.
  • However, the rise in trial activations did not translate to improvements in payment metrics.
These results highlighted the success of the new onboarding flow in improving early user retention and monetisation, particularly on iOS, while also revealing areas for further exploration on Android.