
User Measurement & Calculation Details

In-depth details on how we calculate important user measurements -- like installs, activity, retention rate, and LTV -- to help AdLibertas customers understand the subtleties of the platform.


Unique users

To tie users together across datasets, AdLibertas uses the most commonly available pervasive ID to identify unique users:

  • For iOS, this is the IDFV: "The value of this property is the same for apps that come from the same vendor running on the same device." (Apple documentation).

  • For Android, this is the GAID: "The advertising ID is a unique, user-resettable ID for advertising, provided by Google Play services." (Google documentation).

Note: for Google, users who opt out of GAID tracking have a shared ID. This can cause confusion with reporting; read more in Excluding GAID tracking opt-outs.

This may differ from Firebase: by default, Firebase uses a user_pseudo_id to track individual users. This ID is re-generated every time a user re-installs the app and therefore may differ from the centralized IDs depending on user activity.
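To make the distinction concrete, here is a minimal sketch of counting unique users by the platform's pervasive ID versus by Firebase's user_pseudo_id. The field names and data are hypothetical, not AdLibertas's actual schema or pipeline.

```python
# Hypothetical event rows; field names are illustrative only.
events = [
    {"platform": "ios", "idfv": "A1B2", "user_pseudo_id": "p1"},
    # Same device after a re-install: Firebase issues a new pseudo ID,
    # but the IDFV stays the same.
    {"platform": "ios", "idfv": "A1B2", "user_pseudo_id": "p2"},
    {"platform": "android", "gaid": "g-77", "user_pseudo_id": "p3"},
]

def pervasive_id(event: dict) -> str:
    """Pick the most pervasive ID available: IDFV on iOS, GAID on Android."""
    if event["platform"] == "ios":
        return "idfv:" + event["idfv"]
    return "gaid:" + event["gaid"]

unique_by_pervasive_id = {pervasive_id(e) for e in events}
unique_by_pseudo_id = {e["user_pseudo_id"] for e in events}

print(len(unique_by_pervasive_id))  # 2 -- the re-install collapses onto one IDFV
print(len(unique_by_pseudo_id))     # 3 -- the pseudo ID regenerated on re-install
```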

Installs

By default, we opt to use the Firebase user_first_touch_timestamp, which is a measurement of when a user has first opened the app or visited the site.

This may differ from Firebase reporting: by default, Firebase uses the first_open event to tie attribution to installs. However, this event is also known to fire on app updates.
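A toy sketch of why the two install definitions can drift apart (illustrative data; as noted above, first_open can re-fire on app updates, and some users never fire it at all):

```python
from datetime import date

# Per-user toy records: Firebase's user_first_touch_timestamp (as a date)
# and every date a first_open event fired.
users = {
    "u1": {"first_touch": date(2023, 3, 1), "first_open": [date(2023, 3, 1)]},
    # An app update re-fired first_open a month later.
    "u2": {"first_touch": date(2023, 3, 1), "first_open": [date(2023, 3, 1), date(2023, 4, 2)]},
    # A pre-load install that never fired first_open.
    "u3": {"first_touch": date(2023, 3, 2), "first_open": []},
}

installs_by_first_touch = len(users)  # every user has exactly one first touch
installs_by_first_open = sum(len(u["first_open"]) for u in users.values())

print(installs_by_first_touch)  # 3
print(installs_by_first_open)   # 3 as well, but u2 is double-counted and u3 is missing
```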

Active Users

By default, both we and Firebase consider a user "active" when they've fired at least one "engagement event" in a day. For Firebase users, this is the user_engagement event, defined by Google as firing "when the app is in the foreground…for at least one second."

However, for some apps we have seen users who don't have a user_engagement event but do show other activity, either because the event hadn't had time to fire or because developers have modified its behavior.

For this reason, we offer the ability to also consider a user active when they fire a session_start event, defined as firing "when a user engages the app."

For customers who opt into this additional definition of "active" (and whose users fire a session_start event without firing a user_engagement event), you'll see a higher number of installs and active users on a daily basis. Depending on how many additional users this includes, you'll also see a decreased LTV, because all revenue is shared across more active users.
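A minimal sketch of the two "active" definitions described above (toy events; the session_start fallback mirrors the optional behavior, not the platform's exact query):

```python
# The set of Firebase events each user fired today (toy data).
events_today = {
    "u1": {"user_engagement", "session_start"},
    "u2": {"session_start"},   # opened the app, but no user_engagement fired
    "u3": set(),               # not active today
}

def is_active(events, count_session_start=False):
    """Default definition: user_engagement only; optionally accept session_start too."""
    if "user_engagement" in events:
        return True
    return count_session_start and "session_start" in events

strict = sum(is_active(e) for e in events_today.values())                           # 1
broad = sum(is_active(e, count_session_start=True) for e in events_today.values()) # 2

# The broader definition spreads the same revenue across more active users,
# which is why per-user metrics like LTV come out lower.
print(strict, broad)
```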

Why would counts differ across platforms & vendors?

Succinctly put, there are many reasons user counts may differ across analytics platforms; here are some examples:

  • Retention rates may differ between reporting technologies if they use different events to measure installs & active users. As hinted at above, even Firebase can show different times between a first_open event and user_first_touch_timestamp. Each vendor may have its own way of measuring a user's first and subsequent visits, and if your users exhibit a large discrepancy between any of these measurements, retention may vary.

  • Using different IDs: depending on how a platform counts users, there are multiple ways to measure unique users. Advertising IDs can be reset or obfuscated, vendor IDs (IDFVs) may not reset upon reinstall, and certain vendors' IDs may reset more often (e.g., Firebase, as described above).

  • Firebase Installs may differ because the user has installed but has not generated a first_open event. We've seen this happen with pre-load or other low-quality install sources.

  • DAU may differ if users don't trigger a user_engagement event.

  • Firebase unique users may differ if users re-install and reset their user_pseudo_id.

  • Duplicate counts: user re-installs, resetting ad IDs, or ad platforms (e.g., self-attributing networks, or attribution platforms like Adjust) may lead to multiple counts for a single user's install.

The deeper you delve into install rates & unique users, the more complicated you'll find the topic. There's a reason the mobile analytics market is worth over $10B.

Retention Rate

Retention rate is defined as:

A retention rate gives a number to the percentage of users who still use an app a certain number of days after install. It is calculated by counting unique users that trigger at least one session in one day, then dividing this by total installs within a given cohort.

While retention rate is a widely used metric for tracking app performance, there are details that are important to understand when you dive into how the metric is calculated. For most high-level performance measurements this won't be a concern, but for teams buying traffic, any change to retention can alter your LTV calculations and skew your performance measurements.

One such detail in the wording above is "trigger at least one session in one day": if a user isn't active on their day of installation, they will not be counted in subsequent retention rates. Put another way: if a user shows up as active on day two, are they really "retained"?

Understanding "Retention Rate" in the dashboard

In the AdLibertas dashboard there are 3 important metrics in the retention section:

  • Retention Rate - measures users who have returned on a given day, having also visited on day 0. This percentage is calculated as the number of active users on a given day divided by the number of users eligible for retention. For instance, users who installed 29 days ago aren't considered eligible for day-30 retention.

  • Daily Active Users - the number of users who were active on the given day.

  • Audience Representation - the number of active users divided by the entire audience size. This provides guidance for when you're making major decisions based on the activity of a small number of users.
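Here is a toy sketch of how these three numbers relate, including the eligibility rule from the first bullet (illustrative logic, not the platform's exact queries):

```python
from datetime import date, timedelta

TODAY = date(2023, 3, 31)

# Toy audience: each user's install date and the dates they were active.
audience = [
    {"install": date(2023, 3, 1), "active": {date(2023, 3, 1), date(2023, 3, 31)}},
    {"install": date(2023, 3, 1), "active": {date(2023, 3, 1)}},
    # Installed 8 days ago: not yet eligible for day-30 retention.
    {"install": date(2023, 3, 23), "active": {date(2023, 3, 23), date(2023, 3, 30)}},
]

def day_n_metrics(n):
    eligible = [u for u in audience
                if u["install"] + timedelta(days=n) <= TODAY  # old enough for day n
                and u["install"] in u["active"]]              # active on day 0
    active_day_n = [u for u in eligible
                    if u["install"] + timedelta(days=n) in u["active"]]
    retention_rate = len(active_day_n) / len(eligible) if eligible else 0.0
    representation = len(active_day_n) / len(audience)
    return retention_rate, len(active_day_n), representation

# Day 30: only the two March 1st installs are eligible, and one returned.
print(day_n_metrics(30))  # (0.5, 1, 0.333...)
```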

Relative Reports vs Retention (Lifecycle) Reports

Lifecycle reports factor in user eligibility, whereas relative reports do not.

The eagle-eyed report builder may come across a situation where day 30 of a relative report has more users than exist in the retention report. This is because a relative report includes users based on their activity since day 0, whereas a retention rate will only include users who were active on day 30 AND were active on day 0.
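To make the difference concrete, a toy comparison of the two day-30 counts (illustrative data only):

```python
# Toy day-30 snapshot: was each user active on day 0 and on day 30?
users = [
    {"day0": True,  "day30": True},
    {"day0": False, "day30": True},   # skipped install day, returned on day 30
    {"day0": True,  "day30": False},
]

relative_day30 = sum(u["day30"] for u in users)                 # 2
retention_day30 = sum(u["day0"] and u["day30"] for u in users)  # 1

# The relative report counts anyone active on day 30; the retention count also
# requires day-0 activity, so it can never exceed the relative count.
print(relative_day30, retention_day30)
```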

Lifetime Value (LTV) calculation

Lifetime user value is one of the most powerful metrics offered by AdLibertas User-Level Audience Reporting. Simply put, this measures the (estimated) revenue on a per-user basis for your defined cohort of users. More details are available in Finding a pLTV model for your mobile app.

AdLibertas calculates LTV by taking the actual earned revenue (both impression-level ad revenue and in-app purchase revenue) of the cohort and multiplying it by the inverse of the retention rate.

This means the two factors that impact your LTV are your cohort retention and earned revenue. Some hyper-casual apps -- or other apps with high user turnover -- face potential challenges in measuring accurate retention rates (users who don't engage on their day of installation). For that reason, many marketing teams use cumulative revenue earned divided by day-0 users as an alternative measurement for ROI calculation.
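As a toy numeric sketch of the two approaches just described: the cumulative ROI alternative, plus a generic retention-based projection. The numbers and the projection model are illustrative assumptions, not AdLibertas's exact formula.

```python
# All figures below are made up for illustration.
day0_users = 1_000
cohort_revenue = 450.0   # impression-level ad revenue + IAP revenue earned to date

# The ROI alternative described above: cumulative revenue / day-0 users.
ltv_cumulative = cohort_revenue / day0_users          # 0.45 per day-0 user

# A generic retention-based projection (one common pLTV model):
# expected revenue = sum over days of retention(day) * revenue per active user.
retention_curve = [1.00, 0.45, 0.30, 0.22, 0.18]      # day 0..4 retention rates
arpdau = 0.12                                         # avg revenue per daily active user
ltv_projected = sum(r * arpdau for r in retention_curve)  # ~0.26 over five days

print(round(ltv_cumulative, 2), round(ltv_projected, 2))
```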

If you choose to use cumulative revenue earned, be aware: LTV calculations only include users who have reached the appropriate age, whereas cumulative relative reports only report the total earned to date. You may inadvertently understate your user metrics by not letting your cohort mature.

For instance, if on March 30th you're running a 30-day performance report starting on March 1st, your LTV report will use retention rates and earnings only for the users who've aged 30 days (those who installed March 1st). A cumulative report's day 30 will include revenue from users who installed on March 1st but will not include revenue from cohorts less than 30 days old (March 2nd through March 30th), and will therefore be lower.

Related: Forecasting LTVs, Retention
