A/B tests allow you to measure which app changes or messaging strategies are most successful. Each test compares a control group of users (A) against one or more test groups, known as variants (B, C, and so on). The control group receives the default experience, and each variant group receives a different experience or message, so you can determine which is most effective.
Instead of risking negative feedback by launching untested features, you can test new features or messaging strategies on a smaller portion of your customers, optimize, and then launch in full.
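Leanplum assigns users to the control and variant groups for you. As a rough illustration of how this kind of split typically works (this is not Leanplum's actual implementation; all names here are hypothetical), a sketch of deterministic hash-based bucketing:

```python
import hashlib

def assign_variant(user_id: str, test_id: str, variants: list[str]) -> str:
    """Deterministically bucket a user into one of a test's groups.

    Hashing user_id together with test_id keeps each user's assignment
    stable for the life of the test, while different tests split the
    same audience independently of one another.
    """
    digest = hashlib.sha256(f"{test_id}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# A control group (A) and two variants (B, C):
groups = ["A", "B", "C"]
assignment = assign_variant("user-42", "onboarding-test", groups)
```

Because the assignment is a pure function of the user and test IDs, the same user always sees the same experience for a given test, which is what makes the comparison between groups meaningful.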
In the A/B Tests dashboard, you can view all of your current and past A/B tests, their status (active or inactive), and the goals and targeting options for each.
You can A/B test your app's messages, interfaces, and variables, including:
- Message copy
- Delivery channels and timing
- In-app triggers
- Audience segments
- Lifecycle campaigns
- Permission requests
- Onboarding tutorials
- Cart abandonment campaigns
- App content
  - Background, text, and button colors
  - Product flows
  - New features
  - App logic
  - Custom structured data
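App content tests like the ones above come down to overriding default variable values for each variant. As a Leanplum-agnostic sketch (the variable names and helper here are hypothetical), a variant can be modeled as a set of overrides merged over the app's defaults:

```python
# Default app variables: the control experience (A).
defaults = {
    "button_color": "#0066CC",
    "welcome_text": "Welcome!",
    "show_tutorial": True,
}

# Variant B overrides only the values under test.
variant_b = {"button_color": "#CC3300"}

def resolve_variables(defaults: dict, overrides: dict) -> dict:
    """Merge a variant's overrides over the defaults.

    Keys the variant does not touch fall through to the default values,
    so a variant only needs to specify what it changes.
    """
    return {**defaults, **overrides}

experience = resolve_variables(defaults, variant_b)
```

Here `experience` has variant B's button color while every other variable keeps its default, which mirrors the recommendation below to change one thing at a time.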
To get started, jump to Creating an A/B test.
You can see the results of your A/B test in the Analytics dashboard. For details, read A/B test analytics.
You can view A/B test analytics at any time during or after the test, whether the test is active or paused. Note that it may take two to four hours for A/B test results to show up in Analytics.
Once you are satisfied with your test results, you can finish your test and roll out the winning variant. Navigate back to your test’s setup page to merge the winning changes to your app or campaign.
We recommend testing one incremental change per test. While you can test a virtually unlimited number of changes and variants, testing just one change at a time gives you the cleanest data for making informed decisions.
Also note that users can qualify for multiple tests at a time, as long as they fit the target criteria for each test.
Multiple tests for one variable
When a user qualifies for multiple A/B tests affecting the same variable, Leanplum will show that user the experience for whichever test was published first. If you want to ensure that users see the newer test, do one of the following:
- Make sure the targets for the two tests are mutually exclusive, or
- Finish any other tests affecting the variable before creating a new test
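The first-published-wins rule can be expressed as a small sketch (the types and function here are illustrative, not part of any Leanplum API): of all the tests a user qualifies for that affect a given variable, the one with the earliest publish date determines what the user sees.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ABTest:
    name: str
    variable: str       # the variable this test affects
    published_at: datetime

def winning_test(qualifying_tests: list[ABTest], variable: str) -> ABTest:
    """Return the test whose experience the user sees for `variable`.

    Among the tests the user qualifies for that affect this variable,
    the earliest-published test wins.
    """
    candidates = [t for t in qualifying_tests if t.variable == variable]
    return min(candidates, key=lambda t: t.published_at)

tests = [
    ABTest("copy-test", "welcome_text", datetime(2024, 1, 10)),
    ABTest("color-test-v1", "button_color", datetime(2024, 2, 1)),
    ABTest("color-test-v2", "button_color", datetime(2024, 3, 15)),
]
shown = winning_test(tests, "button_color")
```

In this sketch, `color-test-v2` never reaches users who qualify for `color-test-v1`, which is why the guidance above suggests making the targets mutually exclusive or finishing the older test first.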
For more about A/B tests, read A/B test troubleshooting and FAQ.