A/B testing your ads has nothing to do with the alphabet. It's about testing two different creatives to figure out which one works best.
Do you know that moment when you have an idea in your head and it sounds great? Well, often, once we've executed the idea we realize it isn't that great after all. It happens to the best of us, and there is no sure way of telling whether a given ad will work.
Chances are, the ad you thought would perform best turns out to be the worst performer, while the best-performing ad is the one you never expected to work.
To stay on top of your performance and achieve the best results, you need to constantly measure, analyze, test and optimize.
Yes, the game itself matters a lot. But it is a bit unrealistic to expect that everything will work on the first try.
That's why it is crucial to properly track everything that happens inside your marketing campaigns. When something doesn't go as planned, you'll be able to retrace your steps and identify why it happened.
One of the best methods to test your mobile ads is A/B testing.
How is A/B testing performed? You change only one element in your ad and observe the results. Are they improving, or not going as planned?
This method is also called split testing, since it is used to discover which element, visual, or copy works best for your target audience.
Knowing who your target audience is matters both for creating ads and for A/B testing them. Just as you shouldn't rely on gut feeling when deciding which ad is better, you shouldn't rely on it when researching your target audience either.
What are you trying to learn by A/B testing your ads?
To conduct a successful A/B test in your game, you'll first need a clear idea of what you're trying to achieve.
If players are abandoning your game right after they've seen an ad, maybe the ad is interrupting their experience. But there are many reasons why they could be behaving that way.
After you've decided what you're trying to accomplish with your A/B test, form a hypothesis about what the new creatives, placements, or copy will achieve. Make the hypothesis as specific as possible, and make sure you have a way of measuring the results.
Don't be afraid to come up with all sorts of different hypotheses. They need to be tested, and they don't necessarily need to be proven correct.
Segment your users according to the hypothesis
Not knowing and understanding your users is like taking a shot in the dark: you're guessing, with little or no assurance of accuracy.
If you have already set up mobile app analytics, user segmentation will be much easier. For user segmentation within your app, you'll use audiences. An audience is a group of users defined by any combination of attributes that matters to your business, which lets you compare the behavior of different segments. Mobile app analytics typically include segments in which users are grouped by activity, ARPU, the app version they use, age, gender, country/region, and interests.
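As a rough illustration of what an audience is, here is a minimal sketch in plain Python. The user records and the `build_audience` helper are hypothetical; in practice your analytics platform builds these groups for you.

```python
# Hypothetical user records; a real analytics SDK would supply these.
users = [
    {"id": 1, "country": "US", "arpu": 1.20, "app_version": "2.3"},
    {"id": 2, "country": "DE", "arpu": 0.00, "app_version": "2.2"},
    {"id": 3, "country": "US", "arpu": 0.45, "app_version": "2.3"},
]

def build_audience(users, **criteria):
    """Group together users whose attributes match every given criterion."""
    return [u for u in users if all(u.get(k) == v for k, v in criteria.items())]

# All US users on the latest app version:
us_latest = build_audience(users, country="US", app_version="2.3")
print([u["id"] for u in us_latest])  # → [1, 3]
```

Any attribute your analytics tracks (ARPU, age, interests, and so on) can serve as a criterion in the same way.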
You can A/B test almost anything. If you wanted to test which color of the in-app purchase button works better, you could segment users into two groups and change the color for one of them. In that case, the A group would keep the blue in-app purchase button, unchanged, and the B group would get the yellow button you wish to test.
You should always have a control group; it is an integral part of A/B testing. The control group is the one for which you won't change anything. The B group is the one that gets, for example, the button with the new color, and its results are measured against the performance of the control group.
You can do the same with even more segments, just remember to keep one segment as a control group.
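One common way to split users into a control group and one or more test groups is deterministic hashing: each user always lands in the same group for a given experiment, so their experience stays consistent. A minimal sketch, with hypothetical experiment and variant names:

```python
import hashlib

def assign_group(user_id: str, experiment: str,
                 variants=("control", "yellow_button")):
    """Deterministically assign a user to a variant by hashing
    the experiment name together with the user id."""
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same user always lands in the same group:
assert assign_group("user-42", "iap_button_color") == \
       assign_group("user-42", "iap_button_color")
```

Adding more variants to the tuple gives you an A/B/C test while keeping `"control"` untouched, as described above.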
Continue A/B testing
After launching an A/B test, sometimes things will feel off. The preliminary results may be terrible, and your first reaction will be to stop the test. But don't be discouraged by early results; mobile ad networks and platforms need time to learn what works best. Approach this as a scientific experiment and let the results settle over the period you've designated for testing.
If you're unsure how big your groups should be, use a sample size calculator. The more accurate you want to be, the larger the sample size needs to be.
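If you'd rather compute it yourself, the standard approximation for comparing two conversion rates is a sketch like the one below. The default z-values correspond to 95% confidence and 80% power; the example rates are made up for illustration.

```python
import math

def sample_size_per_group(p_baseline, p_expected,
                          z_alpha=1.96, z_power=0.84):
    """Approximate users needed per group to detect a change from
    p_baseline to p_expected (95% confidence, 80% power by default)."""
    variance = p_baseline * (1 - p_baseline) + p_expected * (1 - p_expected)
    effect = (p_expected - p_baseline) ** 2
    return math.ceil((z_alpha + z_power) ** 2 * variance / effect)

# Detecting a lift from a 5% to a 6% conversion rate:
print(sample_size_per_group(0.05, 0.06))  # → 8146
```

Note how halving the effect you want to detect roughly quadruples the required sample; that's why small expected improvements need large groups.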
Implementing the changes
After you've successfully finished your A/B test, you'll have a clear idea of whether you want to implement the change. It's pretty self-explanatory: changes that improved your results you'll probably implement, and changes that didn't improve anything you'll probably just forget.
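To decide whether an observed improvement is real rather than noise, a two-proportion z-test is a common check. A minimal sketch with made-up numbers (500 conversions out of 10,000 users in A, 585 out of 10,000 in B):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

z = two_proportion_z(conv_a=500, n_a=10000, conv_b=585, n_b=10000)
# |z| > 1.96 means the difference is significant at the 95% level
print(f"z = {z:.2f}, significant: {abs(z) > 1.96}")
```

If the difference isn't significant, treat the variant as "didn't improve anything" rather than rolling it out on a lucky streak.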
After finishing the test, you can start it all over again.
And I can't stress this enough: always test just one thing. If you test multiple things at once, you won't be able to tell which change caused the results.
Using A/B testing for app monetization
In 2018, ironSource started a revolution in the mobile gaming market with its new "user-level ad revenue" feature. Up until that point, it was impossible to calculate true ARPU and build accurate LTV models. With this feature, developers can determine ad revenue at the user level, from any ad unit, across every ad network on the ironSource mediation platform.
To sum up:
A/B testing is a never-ending process. There is always something to improve and optimize; after you've finished one A/B test, another is always around the corner. Changes in the world of mobile games are constant, so to keep up with the trends, don't forget to A/B test everything.
We are an award-winning marketing agency specializing in mobile apps & games. We help scale products that people love, with a focus on data and results. Have questions, need help? 🤗 Email us at firstname.lastname@example.org!