
Understanding Differences Across Vidmob Reports


Written by Sam Schlessel
Updated over a week ago

When reviewing Vidmob reports—whether you’re looking at scores, KPIs, or creative elements—it’s normal to see different numbers across tools. This guide breaks down what each report is designed to show, why numbers might vary, and how to get the most accurate view of your data.


Report Overview: What Each Tool Is Designed to Show

| Report | Includes KPIs? | Data Level | What It's Used For |
| --- | --- | --- | --- |
| Creative Manager | Yes | Asset-level (toggle between asset- and ad-level for Meta only) | View performance metrics like VTR, CTR, and ROAS for each creative |
| Element Impact Report | Yes | Asset-level | See how specific elements (CTAs, visuals, length) influence performance |
| Media Impact Report | Yes | Asset-level | See how your creative is performing across different KPIs |
| Comparison Report | Yes | Asset-level | Compare creative groups and understand KPI lift based on structure or format |
| Individual Creative View | Yes | Asset-level | Analyze a single creative's performance and quality breakdown, and see where the audience drops off to assess whether key visual elements were seen before they could drive impact |
| Criteria Performance Report | Yes | Criteria rule-level | Compare how creatives performed depending on whether they met certain best practices |
| Pre-Flight Check | No | Asset-level | Evaluate uploaded creatives against platform and brand rules before launch |
| In-Flight Check | No | Ad-level | Monitor creatives in-market for guideline adherence |
| Adoption Report | No | Toggle between asset-level & ad-level | Track how many assets have been scored and how scoring has changed over time |
| Diversity Report | No | Toggle between asset-level & ad-level | Measure how creatives represent DEI-related criteria (e.g., diverse casting) |
| Adherence Report | No | Toggle between asset-level & ad-level | Determine whether uploaded creatives met the set criteria |
| Impression Adherence Report | Partial (impression %) | Toggle between asset-level & ad-level | See what % of impressions met mandatory creative criteria in live campaigns |


Ad-Level vs. Asset-Level: What’s the Difference?

Some Vidmob reports allow you to view data at either the ad level or the asset level — and this choice impacts what you see.

| View Type | What It Shows |
| --- | --- |
| Ad-Level | Includes all creatives configured within an ad, even if some didn't run or didn't receive any impressions. Best for seeing the full scope of what was planned, uploaded, or scored. |
| Asset-Level | Includes only individual creatives that actually ran and returned performance data (like impressions or clicks). This view focuses on what was delivered to the platform. |

The same report may show different creative counts depending on which view you're using.

Example:

An In-Flight Check or Adoption Report might show 20 creatives (ad-level), meaning 20 were uploaded or scored. But Criteria Performance or Creative Manager might only show 12 creatives (asset-level), because only those 12 actually ran and generated impressions.
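
If you export both views, the relationship is easy to check yourself. Here is a minimal sketch, assuming a hypothetical CSV export with one row per configured creative and an impressions column (the file and column names are illustrative, not Vidmob's actual export schema):

```python
# Minimal sketch of why ad-level and asset-level counts differ.
# "inflight_check_export.csv" and the "impressions" column are
# illustrative assumptions, not Vidmob's real export schema.
import pandas as pd

rows = pd.read_csv("inflight_check_export.csv")

ad_level_count = len(rows)                   # everything configured or scored
asset_level = rows[rows["impressions"] > 0]  # only creatives that delivered

print(f"Ad-level count: {ad_level_count}")       # e.g., 20
print(f"Asset-level count: {len(asset_level)}")  # e.g., 12
```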


Filter Differences to Keep in Mind

Vidmob reports allow filtering by several types of tags. Depending on which one you use, the data may change:

| Filter | What It Pulls | What to Watch For |
| --- | --- | --- |
| Brand | Assets assigned to a specific brand. Note: the brand needs to be manually assigned to each ad account ahead of time. Learn how here. | Most consistent for asset counts across tools |
| Market | Regional tags (e.g., Brazil, UK, US) | Some assets belong to multiple markets—counts can vary |
| Workspace | Where the report was created or scored | Assets may exist in more than one workspace |
| Ad Account | Media linked to a specific ad account | May overlap with multiple brands or markets |
| Media Type / Campaign Objective / Platform | Advanced filters for campaign strategy | Small changes can greatly affect results—use with care |

For the cleanest comparison across reports, filter by Brand and use consistent date ranges.
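
To put that into practice when reconciling two exports, apply the same brand filter and date range to both, and count distinct asset IDs rather than rows, since an asset tagged to multiple markets or workspaces can appear more than once. A minimal sketch, assuming hypothetical exports with illustrative "brand", "date", and "asset_id" columns:

```python
# Minimal sketch of aligning two report exports before comparing counts.
# File names and columns ("brand", "date", "asset_id") are illustrative
# assumptions, not Vidmob's actual export schema.
import pandas as pd

def comparable_count(path: str, brand: str, start: str, end: str) -> int:
    df = pd.read_csv(path, parse_dates=["date"])
    df = df[(df["brand"] == brand) & df["date"].between(start, end)]
    # An asset tagged to several markets/workspaces can occupy multiple
    # rows, so count distinct asset IDs instead of rows.
    return df["asset_id"].nunique()

print(comparable_count("creative_manager.csv", "AcmeBrand", "2024-01-01", "2024-03-31"))
print(comparable_count("adherence_report.csv", "AcmeBrand", "2024-01-01", "2024-03-31"))
```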


Breakdowns in Reports: What Each View Tells You

Use the Breakdown dropdown in Vidmob to explore your creative performance from different angles. Each option helps you uncover a unique layer of insight:

| Breakdown | What It Does | What to Watch For |
| --- | --- | --- |
| Objective | Shows which creative elements drive specific goals like Views, Link Clicks, Conversions, or Awareness. | KPIs will vary by objective—compare within the same goal for accurate insights. |
| Campaign | Lets you compare individual campaigns or groups of campaigns against each other or the ad account average. | Campaign size, budget, and timing can affect averages—keep context in mind. |
| Placement | Shows performance based on where the ad ran (Stories, Feed, Reels, etc.). | Each placement has different engagement patterns—optimize creative to match. |
| Audience | Highlights which creative elements resonate with specific audience segments (age, gender, etc.). | Small or overlapping segments can skew data—check sample size when reviewing results. |
| Format | Breaks out performance by ad format (e.g., Video, Carousel, Static). | Some formats perform better for certain goals—avoid comparing formats 1:1 without context. |
| Duration | Shows which creative details perform best based on video length (6s, 15s, 30s+). | Shorter videos tend to have higher completion rates—compare similar lengths for the best insights. |
| Brand | Displays performance by brand, based on how brands are set up in your organization. | Brands must be tagged in advance. |
| Market | Shows performance by geographic region (e.g., US, UK, Brazil), based on your market setup. | Ad accounts can be tagged to multiple markets—counts may vary slightly depending on setup. |


Common Questions and What They Mean

| Client Question | Why This Happens | What You Can Do |
| --- | --- | --- |
| "Why do the numbers or creative counts differ across Vidmob reports?" | Different reports are designed for different purposes. Some show all uploaded creatives (ad-level), others only show creatives with impressions (asset-level). Filters (Brand, Market, Platform, etc.) also affect results. A creative may be tagged to multiple brands, markets, or workspaces, leading to duplicate appearances. Scoring rules and baseline settings can further impact what is shown. | Align your filters (especially Brand and date range) across reports. Understand the data level each tool uses: Creative Manager or Criteria Performance show what actually ran, while Pre-Flight and Adoption show full upload history. Filter by Brand for the most consistent view and to reduce duplication, and export reports for consistency if comparing over time. |
| "Why are my KPIs missing or incomplete for some creatives?" | Some platforms or formats don't support certain KPIs (like CTR or ROAS). KPIs may also appear blank if there are zero impressions or the platform delays reporting. Dashes can also appear for null or missing data. | Focus on the KPIs that are supported for your creative type. Check the platform's known limitations. Use longer date ranges when analyzing conversions or delayed metrics. |
| "Why are some creatives missing from Criteria Performance, or why are rows empty?" | The Criteria Performance report only shows scored media. If media hasn't been scored, hasn't generated impressions, or a criterion doesn't apply to its media type (e.g., image vs. video), it won't show data. | Use a Pre-Flight Check to confirm which media has scores. Check media type eligibility for the criteria you're reviewing. Review recent changes to brand rules that may have triggered rescoring. |
| "Why is Criteria Performance showing more creatives than other reports — or vice versa?" | Some platforms (like TikTok) may include zero-impression records. Criteria Performance includes scored assets even if they didn't generate impressions; other tools may exclude these. | Confirm whether your platform sends records with zero impressions. If so, the difference is expected. Compare asset counts only for creatives with impressions if you want a tighter match. |
| "Why did my Criteria scores change from last week?" | Media can be rescored if the criteria rule changes or if brand-level identifiers are updated. Scoring changes may take up to 24 hours to reflect in reports. | Use the media scoring history to verify if and when rescoring occurred. Contact support if you're unsure what triggered the update. |
| "Why don't my numbers add up to 100% in Criteria Performance?" | Met vs. Not Met groups are calculated separately—each is an independent average, not a share of a whole. | Review each group independently for performance insights. |
| "Why does the CPP (Cost Per Purchase) change even though impressions stayed the same?" | Conversion events are often delayed and reported days after impressions, so purchases keep accruing while impressions don't. | Use a longer date range for more reliable KPIs involving conversions. |
| "Why is one report showing higher lift than another?" | Different baseline settings (e.g., account average vs. column) can affect how lift is calculated. | Adjust your settings to match the comparison goal—column-level baselines are often more intuitive (see the sketch after this table). |
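
As a rough illustration of why the baseline setting matters for lift (last question above), here is the generic lift arithmetic with made-up numbers; Vidmob's exact calculation may differ:

```python
# Generic lift arithmetic: lift = (group KPI - baseline KPI) / baseline KPI.
# All numbers are invented for illustration; this is not Vidmob's
# exact implementation.
group_ctr = 0.90         # avg CTR (%) of creatives that met a criterion
account_avg_ctr = 0.60   # baseline option 1: whole-account average
column_avg_ctr = 0.75    # baseline option 2: average of the compared column

lift_vs_account = (group_ctr - account_avg_ctr) / account_avg_ctr
lift_vs_column = (group_ctr - column_avg_ctr) / column_avg_ctr

print(f"Lift vs. account average: {lift_vs_account:+.0%}")  # +50%
print(f"Lift vs. column average:  {lift_vs_column:+.0%}")   # +20%
```

The same creative group can therefore show very different lift in two reports even though the underlying KPI is identical.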


Best Practices for Accurate Insights

  1. Match Date Ranges and Filters
    Ensure you use consistent date filters (e.g., impression date vs. request date) and tag filters (preferably Brand).

  2. Choose the Right Data Level

    • For what ran, use asset-level tools like Creative Manager, Criteria Performance, or Element Impact.

    • For what was scored or uploaded, use tools like Pre‑Flight Checks, Adoption, and Diversity.

  3. Use Baseline Settings Thoughtfully
    In Comparison and Element Impact reports, ensure you understand which baseline you're comparing against.

  4. Export for Snapshot Consistency
    If you're tracking performance over time, export reports when you first generate them to preserve the data view before live updates change it (see the sketch after this list).

  5. Understand KPI Limitations
    Some creative types don't generate certain metrics. A missing KPI usually reflects a platform or data limitation rather than a reporting error.
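
For step 4, a dated copy of each export is enough to preserve the view. A minimal sketch (the file name is illustrative):

```python
# Minimal sketch: keep a dated copy of an export so later live updates
# don't silently change your historical view. File name is illustrative.
from datetime import date
import shutil

shutil.copy("report_export.csv", f"report_export_{date.today():%Y%m%d}.csv")
```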
