Reporting on a Firebase AB test

How to set up and report on a Firebase AB test in AdLibertas

A popular use case for our platform is measuring ongoing or completed AB tests running in Firebase. This article will help you get started measuring Firebase AB tests with the AdLibertas platform.

We include popular metrics (cumulative earned revenue, LTV) but also allow you to add your own.

Step 1: View the test in AdLibertas

Once Firebase is connected to AdLibertas, you can automatically select and report on your Firebase AB tests. Firebase assigns a user property to each variant of an AB test by default, which makes it easy to run a report on those users over a timeframe of your choice. To do so, simply create an audience, choose Firebase Experiment, then choose the appropriate Experiment Name that's running in Firebase.

In our example, let's analyze a (fictional) test that changes game complexity for new users.
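
For context, a test like this is usually driven by a Firebase Remote Config parameter that the experiment varies per group, while Firebase tags each participating user with the experiment user property automatically. The sketch below is a minimal, hypothetical example of how the app might read such a parameter; the game_complexity parameter name, its default value, and the configureNewUserExperience hook are assumptions for illustration, not part of AdLibertas or this test.

```kotlin
// Minimal sketch (assumed parameter name "game_complexity"): the app reads the
// Remote Config value that the Firebase experiment varies per group. Firebase
// handles variant assignment and the experiment user property automatically;
// nothing AdLibertas-specific happens in app code.
import com.google.firebase.ktx.Firebase
import com.google.firebase.remoteconfig.ktx.remoteConfig

fun applyGameComplexity() {
    val remoteConfig = Firebase.remoteConfig

    // Hypothetical default used before the first successful fetch.
    remoteConfig.setDefaultsAsync(mapOf("game_complexity" to "normal"))

    remoteConfig.fetchAndActivate().addOnCompleteListener { task ->
        if (task.isSuccessful) {
            // Each experiment variant serves a different value for this parameter.
            val complexity = remoteConfig.getString("game_complexity")
            configureNewUserExperience(complexity)
        }
    }
}

// Hypothetical game-side hook: adjust difficulty, tutorial length, etc.
fun configureNewUserExperience(complexity: String) {
    // ...
}
```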

Next, you can choose one (or more) variants for each audience.

Step 2: Running Reports on a Firebase AB test

Once you've built your audiences, you can run reports on these users over a timeframe of your choice.

Note: this lets you measure your variants before, during, or after the test's actual run dates.

Choose the timeframe.

Once you’ve run the reports, you can measure the impact of these AB tests across multiple metrics and timeframes.

Adding two variants to a report.
You can add counts and averages for custom metrics to any audience, including a user's time in the app.
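
If your custom metrics are derived from Firebase Analytics events you already log, the sketch below shows the kind of event whose count or average could back such a metric. The level_completed event name and session_seconds parameter are hypothetical, chosen only to illustrate the idea.

```kotlin
// Minimal sketch (hypothetical event and parameter names): log a custom
// Firebase Analytics event whose count and averages could back a custom metric.
import com.google.firebase.analytics.ktx.analytics
import com.google.firebase.analytics.ktx.logEvent
import com.google.firebase.ktx.Firebase

fun logLevelCompleted(levelId: Int, sessionSeconds: Long) {
    Firebase.analytics.logEvent("level_completed") {
        param("level_id", levelId.toLong())      // could be counted per audience
        param("session_seconds", sessionSeconds) // could be averaged per audience
    }
}
```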

Now you can explore the performance and behavioral differences through the standard Audience Reporting features.

Measuring the LTV of multiple test variants

Measuring the cumulative impressions/revenue/actions of variants

Firebase AB test related reading: Audience Reporting Walk-through, Creating an audience, Running Reports, Understanding Date Ranges
