
October 1, 2019

How A/B Testing Helps to Improve Mobile App User Retention

Imagine that you have a seemingly great app, but for some strange reason it doesn’t bring enough customers. How to improve it wisely? In this article, we share our experience with A/B testing, which is a great instrument to make data-driven decisions on your product’s new features and maintain user retention.

Dasha Rizoy

Head of Business Development

To succeed in the highly competitive world of mobile apps, you have to keep bringing innovations to your product. But that raises a question: how do you make sure new solutions retain your users rather than drive them away? This is where A/B testing on mobile apps comes in: a perfect instrument for data-driven decisions.

What is mobile app A/B testing?

Product updates are a constant item on every mobile app team's agenda, and sometimes the product team holds different, even conflicting, opinions about upcoming changes. For example, you might argue about changing the layout, navigation items, colors, or images. In this situation, you should not go with your gut to predict user behavior; rely on data-driven insights instead.

First of all, it's important to formulate a proper testable hypothesis for each update, for example: if we change this navigation item, users will engage more with the app and conversion rates will go up. Then you run an experiment to either confirm or reject the hypothesis. Only after that do you decide whether to launch the update.

Basically, A/B testing (also called “two-sample hypothesis testing”) is a comparison of two variants of a product update that differ in a single element, such as a menu, a button, or a piece of copy. A/B testing helps you find out which variant users like the most.

For example, if you want to change the color of a button in your app, you compare two variants: one with the existing button and one with the new button. You show each version to half of your audience, and the test tells you which color scored better in terms of user conversion or whatever other criterion you chose.
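Under the hood, the 50/50 split is usually deterministic: the user's ID is hashed so that the same user always lands in the same variant across sessions. Below is a minimal Kotlin sketch of that idea; the function names and hashing scheme are illustrative assumptions, not any particular tool's implementation.

```kotlin
// Illustrative sketch: deterministically split users into two variants.
// Hashing the user ID together with the experiment name guarantees that
// a given user sees the same variant on every launch.
enum class Variant { A, B }

fun assignVariant(userId: String, experiment: String): Variant {
    // floorMod keeps the bucket in 0..99 even for negative hash codes
    val bucket = Math.floorMod((userId + experiment).hashCode(), 100)
    return if (bucket < 50) Variant.A else Variant.B // 50/50 split
}

fun main() {
    val variant = assignVariant(userId = "user-42", experiment = "button-color")
    val buttonColor = when (variant) {
        Variant.A -> "#2A7DE1" // existing color (control)
        Variant.B -> "#E14B2A" // proposed new color
    }
    println("Variant $variant gets button color $buttonColor")
}
```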


How can A/B testing improve user retention?

User retention remains one of the biggest problems for the vast majority of mobile apps. It's a common story: users download an app and open it just a couple of times. What's worse, people may download an app and delete it immediately because it doesn't meet their expectations. To increase user retention, you should make sure people clearly understand what your app is about and how to use it properly.

While A/B testing is definitely not a panacea for poor user retention, it can still help a lot. With each new test, you might discover surprising things about your app's features: items you thought worked perfectly may in reality just distract users.

Proper app A/B testing should cover every stage of a user's interaction with your mobile application, from the first encounter to tapping the “purchase” button. We suggest conducting the following basic types of A/B tests to improve app user retention:

Buttons, images, text

Such tests will help you understand whether the app matches users’ expectations as well as how people interact with it.

App’s content (usually, all sorts of tutorials)

Content tests show you how users react to textual information. While some users might be happy to read a lengthy tutorial, others will get distracted and frustrated.

Sign up options

This is a sensitive area for A/B tests. While some users prefer signing up with Facebook, others would rather use a personal email or another social media account.

Sign up timing

The main question here is: when is the best time to ask users to sign up? Should you put up a sign-up wall right from the start or allow users to explore the app's features without an account? Obviously, that depends on the specifics of the application. Nevertheless, sorting out the sign-up timing can significantly improve user retention rates.
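As a concrete illustration, here is a hedged Kotlin sketch of a sign-up timing experiment, reusing the Variant enum from the earlier sketch. The control shows the sign-up wall immediately, while the alternative defers it; the three-session threshold is a made-up example value.

```kotlin
// Illustrative sketch: put the sign-up wall behind an A/B variant.
// Variant A asks users to sign up right from the start; variant B lets
// them explore first and only shows the wall from the third session on.
data class SessionState(val sessionCount: Int, val isSignedUp: Boolean)

fun shouldShowSignUpWall(variant: Variant, state: SessionState): Boolean {
    if (state.isSignedUp) return false
    return when (variant) {
        Variant.A -> true                    // wall right from the start
        Variant.B -> state.sessionCount >= 3 // explore first, ask later
    }
}
```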

Once you increase retention, the dividends follow immediately. For example, even a modest 15-20% growth in retention rates makes your mobile ads more efficient, allowing you to target more users with the same marketing budget.


What are the tools for mobile app A/B testing?

At present, there are many tools for mobile app A/B testing that support multiple platforms. Here, we provide a brief overview of the most popular ones:

Apptimize

Apptimize is a cross-platform A/B testing tool that helps track every single user’s journey across all channels, including mobile, web, mobile web, and even in the app store.

Optimizely

Optimizely is one of the top A/B testing tools and lets you change your app's behavior in real time. You can quickly remove features that don't work and roll out the ones that do.

Taplytics

Taplytics is our favorite A/B testing tool. It allows you to run tests inside the app itself: we use it to test buttons, images, menus, navigation items, and more. In addition, we can track the impact of newly launched features with its Launch Control flagging tool.
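For a rough idea of what a code-based Taplytics experiment looks like on Android, here is a Kotlin sketch following the SDK's dynamic-variable pattern as we understand it. Exact class and method names may differ between SDK versions, so treat this as an assumption-laden sketch rather than a verified integration.

```kotlin
import android.app.Application
import com.taplytics.sdk.Taplytics
import com.taplytics.sdk.TaplyticsVar

// Sketch of a code-based Taplytics experiment on Android; exact
// signatures may vary between SDK versions.
class App : Application() {
    override fun onCreate() {
        super.onCreate()
        Taplytics.startTaplytics(this, "YOUR_TAPLYTICS_API_KEY")
    }
}

// The value served per variant is configured in the Taplytics dashboard;
// the second argument is the default (control) value.
fun checkoutButtonText(): String {
    val buttonText = TaplyticsVar("checkout_button_text", "Add to cart")
    return buttonText.get() ?: "Add to cart"
}
```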

SplitForce

SplitForce is another leading multi-platform A/B testing tool. It offers great options for segmenting users based on a range of criteria.

Vanity

Vanity is the simplest of these tools, and it's free, which makes it a good fit for beginners. It provides very basic user behavior reports.


How to conduct A/B testing?

Based on our experience, we suggest the following basic steps for conducting proper mobile app A/B testing:

1. Focus on just one element for each test

For example, test just one particular icon, or a call-to-action button, or a tutorial text.

2. Set up the test goal

The goal should be clear and precise, for example: increase the number of new users acquired through particular channels.

3. Set up the hypothesis

The hypothesis should be just as precise, for example: the app's conversion rate will increase if we change the button's text from “Add to cart” to “Buy now”.

4. Create the variant with the proposed change

Usually, variant A stands for the current version, and variant B goes for the proposed change.

5. Select the A/B testing tool

Choose a tool based on your specific needs, whether that's Optimizely, Taplytics, or Vanity. This is also where you can build an automated testing strategy.

6. Set up the test’s parameters within the selected testing tool

For example, how long the test should run, which channels to collect the data from, and so on. A minimal configuration might look like the hypothetical sketch below.
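Every field name and value in this Kotlin sketch is an illustrative assumption, not any tool's actual settings format.

```kotlin
import kotlin.time.Duration
import kotlin.time.Duration.Companion.days

// Hypothetical experiment parameters for illustration only.
data class ExperimentConfig(
    val name: String,
    val duration: Duration,          // how long the test runs
    val channels: Set<String>,       // where to collect the data from
    val trafficSplit: Pair<Int, Int> // percentage of users per variant
)

val buttonTextTest = ExperimentConfig(
    name = "checkout_button_text",
    duration = 14.days,
    channels = setOf("ios", "android"),
    trafficSplit = 50 to 50,
)
```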

7. Set up the test’s KPIs and analyze the results

You analyze the test's results against the preset KPIs. Several KPIs can be used to evaluate an A/B test and determine the winning variant, including conversion rates, average session counts, scroll depth, and interaction time with different elements of an app's page.

The task before each test is to make its KPIs neither too general nor too specific.
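One common way to check whether the winner actually won on a conversion-rate KPI is a two-proportion z-test. The sketch below uses made-up numbers; a |z| above roughly 1.96 corresponds to statistical significance at the 95% level.

```kotlin
import kotlin.math.abs
import kotlin.math.sqrt

// Compare the conversion rates of two variants with a two-proportion
// z-test; |z| > 1.96 roughly corresponds to p < 0.05 (95% confidence).
fun zScore(conversionsA: Int, usersA: Int, conversionsB: Int, usersB: Int): Double {
    val pA = conversionsA.toDouble() / usersA
    val pB = conversionsB.toDouble() / usersB
    val pooled = (conversionsA + conversionsB).toDouble() / (usersA + usersB)
    val standardError = sqrt(pooled * (1 - pooled) * (1.0 / usersA + 1.0 / usersB))
    return (pB - pA) / standardError
}

fun main() {
    // Hypothetical results: 4.0% vs 4.8% conversion, 5,000 users per variant
    val z = zScore(conversionsA = 200, usersA = 5000, conversionsB = 240, usersB = 5000)
    println("z = %.2f, significant at 95%%: %s".format(z, abs(z) > 1.96))
}
```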

8. Make changes based on the tests’ results

Sometimes A/B tests show only a small difference in conversion rate between the winner and the runner-up, say 1-5%. Such a gap may not be statistically significant, and the proposed change is unlikely to make a noticeable impact on your audience. On the other hand, if the conversion difference is 10% or much higher, you should definitely consider implementing the new feature as soon as possible.

9. Prepare for a new A/B test for a mobile application

As conversion optimization is an almost never-ending process, start thinking about follow-up tests to experiment with other features of the app. Regular A/B testing team meetings to discuss plans and priorities are a good idea.


Our experience

At Yellow, we are also determined to make solely data-driven product decisions, and we constantly use mobile A/B testing in our various projects.

Our testing platform is Taplytics, which offers a variety of options for iOS, Android, and mobile web. With Taplytics, we launch both code-based and code-free visual experiments, create tests for random groups of users, and then observe test results in real time.

As soon as we finish A/B testing in Taplytics, we select the winning experiment and make it the base segment. From that moment, the winning variation is rolled out to all users. However, experiments cannot be deleted in Taplytics; otherwise, the app will stop functioning properly. Instead, we archive all the experiments and then work with the code itself.
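The cleanup itself is simple in principle. Here is a hypothetical before/after in Kotlin, continuing the checkout-button example from the Taplytics sketch above:

```kotlin
// During the experiment, the value came from the dynamic variable:
// fun checkoutButtonText(): String =
//     TaplyticsVar("checkout_button_text", "Add to cart").get() ?: "Add to cart"

// After the experiment is archived, the winning value is hard-coded:
fun checkoutButtonText(): String = "Buy now"
```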

One of the most vivid examples of how we successfully used A/B testing was our trivia game project for Bible studies. There, we had to test different app screens to determine what users liked the most. In the end, we found the right solution, which not only retained users but also increased conversion rates.

Final thoughts

A/B testing your app is a great instrument for making data-driven decisions about launching new features. You can change icons, descriptions, headlines, screenshots, and more based on their effectiveness.

Furthermore, A/B tests on mobile apps answer two very important questions:

  • Why is this app or feature not working (or not bringing in enough customers)?
  • How can we improve it?

By running a range of A/B tests, you can select your app's best features and thereby increase its conversion rate and, ultimately, its revenue. The most important thing is to test continually, providing the regular product updates that are crucial for success in the app market.
