
4 Top Mobile A/B Testing Tools

This is a guest post by Natalia Yakavenka from SplitMetrics

Ask any mobile marketer what the best way to optimize conversion rates for your app page is, and you'll most likely get A/B testing as the answer. While A/B testing is still most often associated with the web, the concept of A/B testing mobile app pages is not new. The very first solutions growth hackers used were custom-coded landing pages, but that approach requires time and effort. App page conversion optimization only became popular when self-service platforms like SplitMetrics and Storemaven emerged. These platforms brought A/B testing for mobile pages to a completely new level, providing insights on top of showing the winning variation. Later, the introduction of Google Play Experiments in 2015 moved A/B testing of app landing pages into the "must have" category for app marketers. Since then, plenty of new solutions have emerged, but we recommend sticking to the four most popular tools presented here.

Google Play Experiments

When it comes to selecting the best A/B testing tool, the most common question is why go elsewhere when you have the free Google Play Experiments. Indeed, it allows mobile publishers to run free experiments on their app pages, but it comes with significant limitations. The most serious ones are that you can't test unpublished apps and you'll never find out exactly what worked due to the lack of on-page analytics. Still, Google Play Experiments is the perfect solution for those who are not familiar with paid traffic and user acquisition, as it doesn't require driving traffic to the experiment from ad sources. The other three tools require sending traffic to their experiments and are usually for more advanced marketers.

Distinctive features: absolutely free + requires no additional traffic

SplitMetrics

Founded in 2014, SplitMetrics was among the first to provide every marketer with an easy-to-use, unlimited, and flexible A/B testing tool. In addition to regular icon/screenshot testing, it offers pre-launch experiments for unpublished apps as well as Search, Category, and App Store Search Ads testing. Unlike the Google Play service, it uses a multi-armed bandit approach, which helps reach significant results fast. But it's not as ideal as it seems: you have to pay for it. Though the price is very reasonable and there is a 30-day trial, you will need to pay a monthly subscription fee.

Distinctive feature:  pre-launch experiments and App Store Search Ads testing

Storemaven

Storemaven provides easy-to-use A/B testing for the entire app store landing page experience. One of its advantages is offering benchmarks and best practices based on its broad client base in each of the app store categories. On top of that, Storemaven clients benefit from its money-saving algorithm, StoreIQ, which helps conclude tests with fewer samples and lower costs by leveraging historical data to quickly determine the winning creatives. Storemaven provides a fully dedicated Account Manager to make sure clients make the most of their testing budgets. This is also a paid tool, offered as a monthly subscription.

Distinctive feature: Professional Services

TUNE's A/B Testing

Tune offers many services for app marketers. They are mostly known for their attribution service, which measures paid app installs. However, they also offer an A/B testing and optimization tool for the app landing page. Launched in spring 2016, it already provides solid functionality, covering the basics of testing different types of assets and showing all measurements and stats. While Tune's offering is more complete than the other testing tools, its biggest limitation is that it doesn't work with other attribution providers. The tool is also limited outside the US and only supports a small list of regions. In terms of pricing, Tune's A/B testing tool is not available as a standalone product, so customers have to buy it as part of a suite of services.

Distinctive feature: works very well with other Tune capabilities

A/B testing can be easy with the right tools and is recommended for any app marketer as part of a data-centric growth strategy. Feel free to also try our quiz — test yourself to see how data-driven your game is.


A/B Testing Your Ad-Based App Monetization

A/B testing has been an integral part of the marketer's toolbox for a good reason: it takes a great deal of the guesswork out of marketing. In online and mobile companies it has also become a popular tool for product managers; every time a new version is released, why not A/B test it against the existing version to make sure nothing broke? In ad-based mobile app monetization, however, this tool has not been readily available.

Why ad based app monetization is so hard to A/B test

The core requirement for A/B testing is being able to split your users into two groups, give each group a different experience, and measure the performance of each one so you can compare them later. There are a number of tools that can facilitate the split for you, including Google Play's staged rollout. If you are measuring IAP monetization, it's easy enough to associate purchases with the users who made them and then sum the revenue in group A and group B. In ad monetization, however, it's impossible to associate ad revenue with individual users – the ad partners mostly don't report revenue at this level of granularity.
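As a concrete illustration, here is a minimal Python sketch of the part that already works for IAP: a deterministic hash-based split into groups A and B, and a per-group revenue sum. The function names and the purchase-log format are hypothetical, not any specific SDK's API.

```python
import hashlib
from collections import defaultdict

def assign_group(user_id: str, experiment: str = "new_feature_test") -> str:
    """Deterministically split users 50/50 by hashing the user id."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def iap_revenue_per_group(purchases):
    """purchases: iterable of (user_id, amount_usd) tuples from your IAP logs."""
    totals = defaultdict(float)
    for user_id, amount_usd in purchases:
        totals[assign_group(user_id)] += amount_usd
    return dict(totals)

# Example: three purchases, each attributable to an individual user
print(iap_revenue_per_group([("u1", 4.99), ("u2", 0.99), ("u1", 9.99)]))
```

The same aggregation is exactly what is missing on the ad side, because there is no per-user revenue amount to sum.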

Method 1 – interval testing

One alternative that companies have been using is interval testing. In this method, the app publisher already has one version of the app published and rolls out a version with the new feature to all devices. To make sure all users receive the new version, publishers will normally use a forced update that gives the user no choice. The impact of the new feature is measured by comparing the results over two different time intervals. For example, week 1 might have contained version 1 and week 2 version 2, so the publisher can compare version 1 vs. version 2 by comparing the results in different date ranges.

Pros

  • Very simple to implement – no engineering effort

Cons

  • Highly inaccurate and subject to seasonality
  • The forced update method has a negative impact on retention

Method 2 – using placements or different app keys

This is a pretty clever workaround for the problem. Most ad providers have a concept of placements. In some cases they are called zones or areas, but all three serve the same purpose – they let you identify different areas in your app where ads are shown, for reporting and optimization purposes. The way to use this for A/B testing is to create a zone A and a zone B, then report zone B for users who received the new feature and zone A for the control group. If you are already using the zones feature for its original purpose, you might already have zones 1, 2, 3, 4 and 5, so you would create 1a, 1b, 2a, 2b, and so on.

Of course, if you are using multiple ad networks you would need to repeat this setup for every ad network, and after the test period aggregate the results to conclude your A/B test.

A variation of this method is to create a new app in your ad network's configuration screen. This gives you two app keys, so you can implement one app key in group A and the other app key in group B.
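Here is a minimal sketch of how the zone-per-group mapping might look in code. The zone ids and the commented-out ad-network call are made-up placeholders; your own network's placement names and SDK calls will differ.

```python
# Hypothetical placement ids; replace with your own ad network's placement names.
AD_ZONES = {
    "interstitial_level_end": {"A": "level_end_a", "B": "level_end_b"},
    "rewarded_extra_life": {"A": "extra_life_a", "B": "extra_life_b"},
}

def zone_for(placement: str, test_group: str) -> str:
    """Pick the zone id that encodes both the placement and the A/B group."""
    return AD_ZONES[placement][test_group]

# In the app, once the user has been assigned to a group (e.g. via staged rollout):
# ad_network.show_ad(zone_id=zone_for("rewarded_extra_life", user_group))
```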

Pros

  • More accurate compared to other methods

Cons

  • The effort for implementing a single test is very high and requires engineering work
  • It will be hard to foster a culture of testing and being data-driven

Method 3 – counting Impressions

This method requires some engineering effort to set up: every time an impression is served, the publisher reports an event to their own servers. In addition, the publisher sets up a daily routine that queries the reporting API of each ad network and extracts the eCPM per country. This information is then merged in the publisher's database so that, for every user, the impression count for every ad network is multiplied by the daily average eCPM of that ad network in that country. The result is a (highly inaccurate) estimation of the ad revenue of that user on that day. Once you have this system in place, you can implement A/B tests, split the users into testing groups, and then get the average revenue per user in each group.
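A rough sketch of the daily merge step, assuming hypothetical data shapes (one tuple per impression event, and a dictionary of daily average eCPMs keyed by ad network and country):

```python
from collections import defaultdict

def estimate_daily_ad_revenue(impression_events, avg_ecpm):
    """
    impression_events: iterable of (user_id, ad_network, country) tuples,
        one per served impression, reported by the app to your own servers.
    avg_ecpm: {(ad_network, country): average eCPM in USD per 1000 impressions},
        pulled once a day from each ad network's reporting API.
    Returns a rough per-user ad revenue estimate for that day.
    """
    revenue = defaultdict(float)
    for user_id, network, country in impression_events:
        revenue[user_id] += avg_ecpm.get((network, country), 0.0) / 1000.0
    return dict(revenue)

# 3 impressions for one user at a $10 average eCPM -> estimated $0.03 for the day
print(estimate_daily_ad_revenue([("u1", "networkA", "US")] * 3,
                                {("networkA", "US"): 10.0}))
```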

Pros

  • After the initial set up there is no engineering effort per test

Cons

  • Setting this system up is complex and requires a big engineering effort
  • Highly inaccurate – it uses average eCPM while eCPM variance is very high
  • Can lead to wrong decisions

Method 4 – leveraging true eCPM

This method leverages multiple data sources to triangulate the eCPM of every single impression. It requires significant engineering effort or a 3rd party tool like SOOMLA TRACEBACK. Once the integration of the data into the company's database is completed, publishers can implement A/B tests and get the results directly in their own BI or view them through the dashboard of the 3rd party tool. Implementing A/B tests becomes easy, and a testing and optimization culture can be established.

Pros

  • The most accurate method
  • Low effort for testing allows for establishing a testing culture
  • Improvement in revenue can be in millions of dollars

Cons

  • The 3rd party tool can be expensive but there is usually very quick ROI

 


A Simplified LTV Model

In previous blog posts I shared 6 different LTV calculators and received a lot of feedback about the LTV models. It turns out game publishers found them super useful for calculating the LTV of their games. It was great to hear the positive feedback, which also led to a lot of conversations about how people are calculating their LTV. Here are some of the learnings I can share.

A specific LTV model is always better than a generic one

Our LTV calculators can't be nearly as accurate as the ones you can build in-house. If you have the money to hire a data scientist, or at least contract one to build a formula for you after you have gathered some data, you will end up with a more accurate model. The reason is simple: in predictive modeling, the more signals you have, the more accurate the model will be. All our calculators use retention and ARPDAU because they need to be widely applicable. However, there are many more signals you can feed into a specific model: tutorial completion, level progress, soft currency engagement, challenges completed, and so on. Factoring in such signals would give you a better prediction model. Our generic calculators' main purpose is to get you started, give you a framework for thinking about LTV prediction, and help you do some basic modeling if you are on a budget.

Simplified spreadsheet modeling

Our original spreadsheet model took in 31 data points. However, after talking with readers I learned that most of you only track 4 retention data points and 1 ARPDAU point. This is why I created a version that is simpler on the input side. Another piece of feedback I received is that you want more outputs: Day 60, Day 90, Day 180, and Day 365 LTV. Here is the new calculator based on all that feedback.

Inputs:

  • Day1 retention
  • Day7 retention
  • Day14 retention
  • Day30 retention
  • ARPDAU

Outputs:

  • Day60 LTV
  • Day90 LTV
  • Day180 LTV
  • Day365 LTV

Method:

This spreadsheet is the same one from the retention modeling we presented in this post but with a few tweaks.

The actual spreadsheet
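For readers who prefer code to spreadsheets, here is a rough Python sketch of the same kind of calculation. It assumes a power-law retention curve fitted to the four retention inputs and counts day 0 as one full active day; the exact formula in the spreadsheet may differ, so treat this as an approximation of the approach rather than the spreadsheet itself.

```python
import math

def fit_power_retention(days, retention):
    """Least-squares fit of retention ~ a * day ** (-b) in log-log space."""
    xs = [math.log(d) for d in days]
    ys = [math.log(r) for r in retention]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    a = math.exp(mean_y - slope * mean_x)
    return a, -slope  # retention(day) = a * day ** (-b)

def predicted_ltv(arpdau, days, retention, horizon):
    """ARPDAU multiplied by the expected number of active days up to `horizon`."""
    a, b = fit_power_retention(days, retention)
    retained_days = 1 + sum(a * d ** (-b) for d in range(1, horizon + 1))
    return arpdau * retained_days  # day 0 counted as one full active day

inputs = {"days": [1, 7, 14, 30], "retention": [0.40, 0.20, 0.15, 0.10]}
for horizon in (60, 90, 180, 365):
    print(f"Day {horizon} LTV:", round(predicted_ltv(0.05, horizon=horizon, **inputs), 2))
```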

 

If you want to measure the ads LTV in addition to IAP LTV you should check out SOOMLA Traceback – Ad LTV as a Service.


Double Your Traffic by Tracing Your Ad Revenue

I recently came across a fantastic post by Jeff Gurian. For those of you who don't know Jeff, he is the Director of Marketing at Kongregate. In his post he brings up a super important point: you can double your traffic by tracing the Ad LTV, or "counting the ads" in the language of the article.

Doubling your traffic only takes a 25% increase in LTV

According to Kongregate’s experience with user acquisition, Jeff explains, the correlation between how much traffic you can get and the bids you place is not linear but rather a power function. “There is always a tipping point where your traffic will increase exponentially relative to the increase in your bid.” says Jeff.

The chart in the post does a good job in explaining this point:

[Chart: impression volume available at different bid levels – image from the original article on the Kongregate developer blog]

In this example, acquiring traffic with bids of $12.50 as opposed to $10 will get you twice the amount of traffic. In other words, a bid increase of 25% translates to a volume increase of 100%.
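As a quick sanity check, here is the exponent the example implies if impression volume scales as a power of the bid (a simplifying assumption for illustration only):

```python
import math

# If impression volume scales as bid ** n, the example above (2x volume from a
# 25% bid increase) implies an exponent of roughly:
n = math.log(2) / math.log(12.5 / 10)
print(round(n, 2))  # ~3.1

# With that exponent, another 25% bid increase would roughly double volume again:
print(round(1.25 ** n, 2))  # ~2.0
```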

Tracing Ad LTV allows more room in your CPI bids

Not all games have ads, but the ones that have added in-game advertising are seeing between 10% and 80% of their revenue coming from ads. 25% is a typical scenario in many games and is also close to the ratio reported by public companies such as Glu and Zynga. The example given in the article (see image below) shows that tracing Ad LTV can modify your ARPU/LTV analysis by 25%-30%. As we know, a higher LTV means we can afford to pay a higher CPI, which leads to twice as much traffic per the explanation above.

[Chart: LTV and ARPU calculations with and without tracing back the ad revenue – image from the original article on the Kongregate developer blog]

Let SOOMLA do the work and get you the accurate Ad LTV

Many companies skip the Ad LTV since the process for calculating it is often complicated, time consuming, and in many cases not accurate enough. The article's point, however, is that none of your analysis matters if you are miscounting your Ad LTV. Counting impressions can lead to significant errors in LTV calculations, which means your ROI analysis can be off and end up losing money for the company.

Fortunately, SOOMLA has developed a solution that automates the Ad LTV calculation with much greater accuracy, so now you can enjoy the benefits of Traceback and double your traffic without worrying about accuracy or extra development effort.

To save valuable resources and ensure you are getting the Ad LTV correct for every cohort you need a specialized system like SOOMLA TRACEBACK. The platform traces the ad revenue and sends it to your attribution partner or in-house BI.


The Biggest Mistake in Ad LTV Calculations

Recently I became aware of game publishers that implemented an in-house solution for Ad LTV tracing but were making a huge mistake in how they think about ad revenue. We all know that any LTV calculation has two main factors:

  • Retention
  • Revenue

Ad revenue is the factor that companies get wrong when they build in-house solutions for Ad LTV tracing. These solutions often assume that each impression pays the same CPM. This is a huge mistake that can lead to order-of-magnitude errors and ROI calculations that are way off.

If this is how your company calculates Ad LTV you should read the following examples carefully.

Example 1 – The Rewards Collector

  • User played during the first month and never came back after.
  • Watched 50 rewarded video ad impressions from Vungle – didn’t click or install any ads.
  • Average eCPM for this month from Vungle $15

Ad LTV based on impressions: $0.75 | True Ad LTV: $0 | Error: $0.75

This type of error can lead UA teams to false positive ROI calculations: the UA team thinks the ad spend on this user is ROI positive while it's actually a losing buy.

Example 2 – The Ad Whale

  • User played 5 days during 2 weeks
  • Watched 10 interstitial ads from AppNext, clicked on 2 and installed a Match-3 game and a Strategy game
  • Average eCPM reported by AppNext for those days – $5
  • CPI for that Match-3 game – $2, CPI for the Strategy game – $5

Ad LTV based on impressions: $0.05 | True Ad LTV: $2 | Error: $1.95

Here the ROI calculation could be a false negative. The UA team will stop buying this type of user since the reported Ad LTV is $0.05 while the true value is $2, so the buy was actually a good one.

Example 3 – The Retargeted User

  • User played 10 days during 1 month
  • Watched 20 video ads through Inneractive
  • Average CPM reported by Inneractive for those days – $5
  • This user was a whale in Game of War and was part of a retargeting campaign, so the specific CPM bids for that user were high – $80 x 4 ads, $90 x 2 ads, $100 x 8 ads, $110 x 2 ads, $120 x 4 ads

Ad LTV based on impressions: $0.10 | True Ad LTV: $2 | Error: $1.90

The ROI calculation in this example is also likely to be a false negative. The UA team might think this was a bad user to bring to the game, although his Ad LTV alone was $2.
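The arithmetic behind these examples is simple enough to sketch in a few lines. The snippet below reproduces Example 3: the naive estimate values every impression at the daily average eCPM, while the traced value sums the actual price paid for each impression.

```python
def ltv_from_average_ecpm(impressions: int, avg_ecpm: float) -> float:
    """Naive estimate: every impression assumed to pay the daily average eCPM."""
    return impressions * avg_ecpm / 1000.0

def ltv_from_impression_prices(cpms) -> float:
    """Traced value: sum the actual price paid for each individual impression."""
    return sum(cpms) / 1000.0

# Example 3: 20 impressions at a $5 average eCPM vs. the real retargeting prices
real_cpms = [80] * 4 + [90] * 2 + [100] * 8 + [110] * 2 + [120] * 4
print(ltv_from_average_ecpm(20, 5.0))         # 0.10
print(ltv_from_impression_prices(real_cpms))  # 2.00
```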

 

If your company needs to calculate Ad LTV you should try to avoid these costly mistakes. Check out SOOMLA Traceback – Ad LTV as a Service.


What Are Ad Whales?

Targeting lookalikes of your best users has been the easiest and most effective way to spend mobile ad budgets since Facebook first introduced the feature in 2013. Google and Twitter now offer similar features, and advertisers use them with similar levels of enthusiasm.

What happens if your app is monetizing with ads and not IAP?

Apps that monetize mostly with advertising have a much more complicated job when trying to acquire new users. With ads, it's really hard to figure out who the best users of your app are:

  • The users with the most sessions?
  • The users who watched the most ads?
  • Users who performed social actions?
  • Users who triggered some other in-app event?

Ideally you would want to create a group of the users who generated the most amount of revenue from advertising in your app and get more users like that.

What are Ad Whales and how to find them?

2% of your users install other apps after viewing ads in your app, and these users contribute more than 90% of your ad revenue – they can be referred to as "Ad Whales". This group highly resembles the users who make purchases in your app: a small group that contributes most of the revenue.

Understanding who your ad whales are could be very useful if you want to spend your advertising budget smartly. You could learn more about the demographics and interests of these users and find more users who share similar characteristics. Better yet – you can let the lookalikes algorithm do this job for you and simply sit back and see your user acquisition campaigns target only users who are similar to the Ad Whales you found.

Tracing your ad revenue is critical for discovering Ad Whales

Unlike in-app purchases, ad revenue events are not generated inside your app. Finding the Ad Whales is almost impossible unless you have an ad traceback system in place. Traceback is a technology that allows you to trace ad revenue back to the user level. Once you have such a system in place, it's easy to see which users contribute the most ad revenue.
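Once per-user ad revenue is available, finding the whales is a simple aggregation. Here is a minimal sketch, assuming you already have a mapping of user ids to traced ad revenue (the data shape is hypothetical):

```python
def find_ad_whales(ad_revenue_per_user, revenue_share=0.9):
    """
    ad_revenue_per_user: {user_id: traced ad revenue in USD}.
    Returns the smallest group of top users covering `revenue_share` of the total.
    """
    total = sum(ad_revenue_per_user.values())
    whales, covered = [], 0.0
    for user_id, revenue in sorted(ad_revenue_per_user.items(),
                                   key=lambda item: item[1], reverse=True):
        whales.append(user_id)
        covered += revenue
        if total and covered / total >= revenue_share:
            break
    return whales

print(find_ad_whales({"u1": 12.00, "u2": 0.30, "u3": 0.10, "u4": 0.05}))  # ['u1']
```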

 

SOOMLA TRACEBACK is a platform for tracing ad revenue. It allows you to get granular data about each and every user and identify the users who contribute the most ad revenue.


5 Things Even the Experts Don't Know About Lifetime Value

You might have heard some industry experts talk about LTV (life time value) and how important it is. Here are 5 things even some of the experts don’t know about LTV.

1 – Life time value (LTV) is not just for marketing campaigns

You might have heard that you need to know your lifetime value to do marketing. This is correct, but there are actually more reasons. The first reason for calculating LTV is related to the early design phase: before you even start making the game, you should analyze the potential LTV based on benchmarks from similar games. This is important for fundraising as well as for choosing the right games to build. The second reason is even more important. LTV is the one KPI that wraps up both ARPDAU and retention, and it is highly correlated with long-term success. By actively tracking LTV, your team will be focused on the right thing when making decisions about the game and monetization techniques.

2 – There is no real life time value – only predicted life time value

Knowing the real LTV requires waiting a very long time – technically you would have to wait a lifetime. You can assume some maximal lifetime; in games, 180 days and 365 days are common values. These time frames are just too long to make any meaningful decisions about marketing, product, or monetization. Let's say you made a new feature and want to know if you should keep it or not – waiting 180 days for a decision is just impractical. Whenever someone talks about lifetime value, they mean the predicted lifetime value. That's the only parameter you can actually work with. To predict yours, you can use one of these 6 LTV calculators.

3 – You can succeed with low LTV but not with declining LTV

There are successful games with LTVs as high as $20 and as low as $0.30. You can succeed with a low lifetime value, and many games have – especially if you are able to constantly increase it. However, you can't succeed if your LTV is declining; it means that something is fundamentally broken in your game.

4 – Most companies have both CPI > LTV and CPI < LTV

LTV has to be greater than CPI! There are a ton of articles explaining that if you get this basic formula right you are golden. In fact, there was even a conference with that name (http://ltvgtcpi.com). In real life, however, you can't be golden in all segments, so the trick is more about finding your golden segments and expanding on them. If your app uses ads, you will need to trace ad LTV per segment using a traceback platform.

5 – In successful games most of the life time value is created after day 30

If you build a lifetime value spreadsheet and play around with the numbers, you will soon see that typically the first 30 days contribute between 25% and 50% of the total lifetime value. Plugging in the known benchmark ratios of 40%, 20%, and 10% for D1, D7, and D30 retention shows that the yield in days 31 to 180 is roughly twice that of your first 30 days (see the sketch below). This means that you should invest time in giving your most loyal users reasons to play for a really long time. King has mastered that art well, and Candy Crush has 1,880 levels in the game. I'm sure they are working on some new ones as we speak.

Plugging in 40%, 20% and 10% as the values for d1, d7 and d30 retention shows us that only one third of the LTV is generated in the first 7 days.
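As a quick check on the days 31-180 claim, here is a minimal sketch assuming retention follows a power-law curve fitted through the 40% / 20% benchmarks (an assumption for illustration, not the exact model from the spreadsheet post):

```python
import math

# Power-law retention fitted through the D1 = 40% and D7 = 20% benchmarks;
# D30 then lands close to the 10% figure quoted above.
b = math.log(0.40 / 0.20) / math.log(7)
retained = lambda day: 0.40 * day ** (-b)

first_30 = sum(retained(d) for d in range(1, 31))
days_31_to_180 = sum(retained(d) for d in range(31, 181))

print(round(retained(30), 2))               # ~0.12
print(round(days_31_to_180 / first_30, 1))  # ~2.3x the first 30 days
```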

 

If your game uses ads and you want to track the LTV per cohort, segment and testing groups, you need a traceback platform. Check out SOOMLA Traceback – Ad LTV as a Service.


5 Common Mistakes in Measuring Mobile Ad Revenue

This post is about the mistakes mobile app publishers make when measuring their ad-based monetization. Whether your company is using general-purpose analytics, attribution, the mediation dashboard, or in-house BI to track your advertising revenue, you are probably making at least one of these mistakes.

1 – Week by Week Testing instead of A/B testing

From what I have seen so far, this one is a fail for 100% of the mobile app publishers I have talked with. Let's say you want to test a new feature that increases the number of allowed rewarded videos from 3 to 5. There is a right way and a wrong way to do it. An A/B split is pretty easy to implement on Google Play thanks to the staged rollout feature, and on iOS it's not that hard either. However, when it comes to ad revenue, companies use week-by-week testing: they implement something and compare the ad revenue of this week vs. last week. Here are a few reasons why this is wrong:

  • There could be campaign changes between week 1 and week 2 – campaigns go up and down on the ad-network side all the time, so if week 2 was better due to a big campaign, you might think it's because of the changes you made. A/B tests eliminate that.
  • Your user behavior and usage volume might be impacted by real-world events like a holiday weekend or a big sporting event – with A/B tests the events impact both groups, so it's a fair test.
  • With week-by-week testing you have to go "all-in", and you don't even know if the revenue change came from the group that received the change.
  • It's almost impossible to reach statistical significance with week-by-week testing.

The reason companies don't implement A/B testing for ad revenue is that doing so without a specialized ad revenue tracking solution is very complex. However, optimizing with week-by-week testing is very limited.

2 – Assuming all users are worth the same

Most mobile app publishers assign a very specific value to each user when it comes to IAP revenue but fail to do the same for ad revenue. The typical approach is to assume all users are worth the same amount of revenue. This is in fact very far from reality. First of all, not all users even see ads when it comes to rewarded videos, and even among the group that does see ads, there are users who install a few apps and are worth more than $10, while others who only watch the videos end up not generating any revenue.

3 – Not measuring your eCPM decay

"The 1st impression of a user is worth the same amount of money as the 10th impression" – FALSE. The performance of the 1st impression is higher, so the CPMs that advertisers pay in RTB are higher, and the eCPMs you get from the rewarded video network are also higher for the first impression, for the very same reason. As the same user sees more and more impressions in the same day, he becomes blind to the ads and the eCPM decays. Assuming that all impressions are worth the same amount of money is a common mistake by mobile app companies.
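To see how much this matters, here is a toy sketch comparing a flat average-eCPM valuation with a per-impression value that decays as the user sees more ads. The $15 first-impression eCPM, $8 daily average, and 20% decay rate are made-up numbers for illustration only.

```python
def session_ad_value(first_ecpm: float, impressions: int, decay: float = 0.8) -> float:
    """Value of a user's impressions when each one is worth `decay` of the previous."""
    return sum(first_ecpm * decay ** i for i in range(impressions)) / 1000.0

flat_estimate = 10 * 8.0 / 1000.0            # 10 impressions valued at an $8 daily average
decaying_value = session_ad_value(15.0, 10)  # $15 first impression, decaying afterwards
print(round(flat_estimate, 3), round(decaying_value, 3))  # 0.08 vs ~0.067
```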

4 – Focusing on impressions rather than Opt-in ratio

Rewarded video has become one of the biggest sources of advertising revenue for mobile app companies. However, it's important to understand that this is an opt-in type of interaction. In some games only 10% of the users choose to see the ads, while in others it can be as high as 70%. Since the 1st impression pays a lot more than subsequent impressions, focusing on increasing the number of impressions is a mistake. Companies should focus on increasing the opt-in ratio instead.

5 – Not tracking churn by campaign creative

The last mistake is related to the relationship between ads and churn. There are two types of ad interactions that can cause your users to churn:

  • Ads that create a negative experience – deceptive ads or low-quality creatives
  • Ads for competing apps that might steer your users away from your app

Not tracking the impact of different ad creatives placed by the ad-networks in your app could be dangerous.

 

If you want to improve the way you are measuring your ad revenues and stop making these 5 mistakes – check out SOOMLA Traceback – Ad revenue tracking platform.


7 Analytics Platforms with LTV Reporting

If you want to know your LTV and are using a free analytics platform, you might find our online LTV calculators interesting. You can also see our guides for calculating LTV with Flurry and with Google Analytics.

However, another approach is to upgrade to a paid analytics platform that offers LTV reporting out of the box. Unfortunately, I couldn't find this feature in any of the free analytics platforms, so I guess the only way is to pay the premium. Below you can find 7 tools that offer this option, along with the following details about each one:

  • Depth of LTV reporting they offer:
    • Historic LTV – a report that summarizes the amount of revenue per install; if you wait long enough (say 180 days), it approaches the actual LTV
    • LTV prediction report – an algorithmic calculation that predicts the LTV early in the user lifetime based on a formula such as this one
    • Guide for LTV prediction – some providers offer a resource for using their reports to calculate LTV
  • Platform and engine support – mobile operating systems as well as app-building tools and game engines
  • Popularity – based on the number of apps that use the platform
  • Price for 1M MAU – based on the pricing presented by the provider

  • Swrve – LTV reporting: Historic LTV, guides for prediction | Platforms: iOS, Android, Windows, Unity, PhoneGap | Popularity: Mid | Price (1M MAU): Undisclosed
  • (Vendor not named in source) – LTV reporting: Historic LTV | Platforms: iOS, Android, Unity | Popularity: High | Price (1M MAU): $1,800
  • DeltaDNA – LTV reporting: Historic LTV, LTV prediction report | Platforms: iOS, Android, Unity, GameMaker | Popularity: Low | Price (1M MAU): $15,000
  • DevtoDev – LTV reporting: Historic LTV, LTV forecast report | Platforms: iOS, Android, Windows, Unity, UE4, Adobe, PhoneGap | Popularity: Low | Price (1M MAU): $2,500
  • Localytics – LTV reporting: Historic LTV | Platforms: iOS, Android, Windows, Unity, PhoneGap | Popularity: High | Price (1M MAU): Undisclosed
  • Kissmetrics – LTV reporting: Historic LTV, LTV prediction report | Platforms: iOS, Android | Popularity: Mid | Price (1M MAU): $5,000
  • Omniata – LTV reporting: Historic and predictive LTV | Platforms: iOS, Android, Unity | Popularity: Low | Price (1M MAU): Undisclosed

An honorable mention goes to Upsight. The company offers a very flexible solution and is trusted by some of the industry leaders. Before the merger and rebranding, their Kontagent platform did have LTV prediction, and while the current platform doesn't support this feature, I'm sure it will be added back in the future.

If you want to also analyze and predict the LTV for your advertising revenue – now there is a solution. Check out SOOMLA Traceback – Ad LTV as a Service.
