Why use revenue to determine Firebase AB test winners?

Combining ad revenue and accurate IAP revenue with your Firebase AB test helps you understand the actual revenue outcome of your tests.



Why we recommend you use revenue to determine the winner of your AB tests

While Firebase is an exceptionally powerful tool, its test outputs are limited to a handful of event metrics, and these counts only give directional confidence in success. The AdLibertas platform combines Firebase Analytics with actual ad impression data, which means we can provide easy exploration of ongoing or completed tests with a variety of metrics, including ARPU and LTV, to better quantify the test outcome.
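As a simple illustration of why this matters (the figures below are hypothetical), per-variant ARPU is just the total ad plus IAP revenue earned by a variant divided by the number of users assigned to it:

$$
\mathrm{ARPU}_{\text{variant}} = \frac{\text{ad revenue}_{\text{variant}} + \text{IAP revenue}_{\text{variant}}}{\text{users}_{\text{variant}}}
$$

If variant B earns $1,200 from 10,000 users while variant A earns $1,000 from 10,000 users, B wins on ARPU ($0.12 vs. $0.10) even when event counts or day-1 retention look nearly identical.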

Linking Firebase AB Tests with Revenue

While the concept of merging these datasets is straightforward, the difficulty is in the details. Doing it yourself generally means building a data pipeline to combine the data sources, choosing a storage technology (Google Cloud, Snowflake, Redshift), and choosing an analytics layer (Looker, Tableau, etc.). The full process often requires 6-12 months of R&D, plus server costs of $30K+ depending on the size of your datasets.

We make this easy

The AdLibertas platform simplifies your process by centralizing all of your revenue in a single location. AppLovin MAX lets app developers send ad revenue callbacks, containing the value of each impression, to the device; from there, the event is relayed to Firebase so that actual ad revenue is consolidated in one place. Working with this data, even in a consolidated format, can still be difficult. Our cost-effective platform removes the need for custom integrations, data engineering, or SQL, and our customers connect with API keys to get up and running quickly, often seeing same-day insights.
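As a rough sketch of that callback-to-Firebase relay (this is not AdLibertas code; the class name AdRevenueForwarder is our own, and the calls follow the public AppLovin MAX and Firebase Analytics Android SDKs, so your integration may differ), a MAX impression-level revenue event can be forwarded to Firebase like this:

```kotlin
import android.os.Bundle
import com.applovin.mediation.MaxAd
import com.applovin.mediation.MaxAdRevenueListener
import com.google.firebase.analytics.FirebaseAnalytics
import com.google.firebase.analytics.ktx.analytics
import com.google.firebase.ktx.Firebase

// Forwards each paid MAX impression to Firebase Analytics so that
// impression-level ad revenue lands in the same dataset as your AB test.
class AdRevenueForwarder : MaxAdRevenueListener {

    override fun onAdRevenuePaid(ad: MaxAd) {
        val params = Bundle().apply {
            putString(FirebaseAnalytics.Param.AD_PLATFORM, "AppLovin MAX")
            putString(FirebaseAnalytics.Param.AD_SOURCE, ad.networkName)
            putString(FirebaseAnalytics.Param.AD_FORMAT, ad.format.label)
            putString(FirebaseAnalytics.Param.AD_UNIT_NAME, ad.adUnitId)
            putDouble(FirebaseAnalytics.Param.VALUE, ad.revenue) // revenue for this single impression
            putString(FirebaseAnalytics.Param.CURRENCY, "USD")   // MAX reports revenue in USD
        }
        Firebase.analytics.logEvent(FirebaseAnalytics.Event.AD_IMPRESSION, params)
    }
}
```

Once these ad_impression events reach Firebase (and its BigQuery export), they sit alongside your experiment assignments, which is what makes it possible to attach actual revenue to each test variant.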

Examples of how revenue can be used to determine winners

It’s not always easy to determine a winner by events or day-1 retention.

It’s easier to determine the winner of a test when you can see the actual revenue projections of the variants.

Related articles:

  • More details on how AdLibertas works.
  • See a walk-through example of AB testing the length of a game level, or watch it on video.
  • Read how VisualBlasters started testing giving away a feature for free and ended up increasing retention and revenue by 21%, then advanced from AB testing to a full-on live ops strategy.
  • Interested in learning how to properly set up AB tests? We’ve created a guide for setting up an optimal framework for app developers.
  • Read how AdLibertas customer Random Logic Games was able to drive user-LTV up 10% by AB testing game mechanics.
  • Read about our recommendations on how to build your own data platform.
  • Why should you link ironSource and Firebase?
  • Connecting Applovin Max with Firebase