
The Differences in Methodology of Element Impact Reporting and Criteria Performance

Written by Sam Schlessel
Updated over a week ago

Authored by Garrett Woods


Understanding how creative elements influence performance is essential to optimizing your campaigns. VidMob offers several reporting methodologies for analyzing creative effectiveness, including the Element Impact and Criteria Performance reports. While they may appear to evaluate similar attributes, their underlying logic, inclusion criteria, and outputs differ in key ways. This article outlines those differences to help you interpret your reports with clarity.


Overview of the Two Methodologies

Element Impact Report

  • Purpose: Identifies the direct, measurable impact of specific creative elements on a selected KPI.

  • Media Inclusion: Filters media based on strict data availability and attribute alignment with the KPI.

  • KPI Sensitivity: Highly sensitive to the KPI selected. For time-based KPIs (e.g., 3-second View Through Rate), only considers creative elements present within the first 3 seconds.

  • Tag Detection: The tag detection setting defaults to "first 3 seconds" for time-bound KPIs and cannot be changed. For non-time-bound KPIs (e.g., CTR), you can choose whether to detect tags in the first 3 seconds or at any time.

  • Performance Calculation: Calculates the average KPI for media with the element vs. without, using only creatives where the element is both present and detected under the selected rules.
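To make that calculation concrete, here is a minimal sketch, in Python, of how an Element Impact-style lift could be computed. This is an illustration only, not VidMob's actual implementation; the field names (such as tag_times) and the window parameter are assumptions made for the example.

```python
# Minimal sketch (not VidMob's implementation) of an Element Impact-style lift.
# Field names such as "tag_times" and the KPI keys are assumptions for illustration.

def element_impact_lift(creatives, element, kpi, window_seconds=None):
    """Average-KPI lift for creatives where `element` is detected under the
    selected timing rule vs. creatives where it is not."""
    with_element, without_element = [], []
    for c in creatives:
        detected_at = c["tag_times"].get(element)  # first detection time in seconds, or None
        detected = detected_at is not None and (
            window_seconds is None or detected_at <= window_seconds
        )
        (with_element if detected else without_element).append(c[kpi])
    if not with_element or not without_element:
        return None  # not enough media on one side of the comparison
    avg_with = sum(with_element) / len(with_element)
    avg_without = sum(without_element) / len(without_element)
    return (avg_with - avg_without) / avg_without  # percent lift as a fraction

# For a time-bound KPI such as 3-second VTR, the detection window is fixed to 3 seconds:
# lift = element_impact_lift(media, "CTA Presence", kpi="vtr_3s", window_seconds=3)
```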

Criteria Performance Report

  • Purpose: Evaluates how well creatives adhered to best practices and how those practices correlate with performance.

  • Media Inclusion: Includes all media in the dataset, regardless of whether specific criteria were met, provided scores are available.

  • KPI Flexibility: Designed to accommodate multiple KPIs in the same report. Does not apply time-based tag filtering, even for time-bound KPIs.

  • Tag Detection: Criteria performance is based on how elements are tagged and how the criteria are defined. A criterion is considered "passed" if the tagged elements meet the specific rules set within that definition.

  • Performance Calculation: Compares KPI averages for creatives that met vs. did not meet a given best practice, typically using broader and more inclusive groupings.
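For contrast, here is a matching sketch (again illustrative, not VidMob's implementation) of a Criteria Performance-style comparison. The criteria_scores field and pass/fail flag are assumed for the example. Note that tag timing plays no role here, and a creative is only excluded when it has no recorded score for the criterion.

```python
# Minimal sketch (not VidMob's implementation) of a Criteria Performance-style lift.
# The "criteria_scores" field holding pass/fail flags is an assumption for illustration.

def criteria_performance_lift(creatives, criterion, kpi):
    """Average-KPI lift for creatives that met a criterion vs. those that did not,
    using every creative that has a recorded score for that criterion."""
    passed, failed = [], []
    for c in creatives:
        score = c["criteria_scores"].get(criterion)  # None if the criterion was never scored
        if score is None:
            continue  # no score available -> excluded from this comparison
        (passed if score else failed).append(c[kpi])
    if not passed or not failed:
        return None
    avg_pass = sum(passed) / len(passed)
    avg_fail = sum(failed) / len(failed)
    return (avg_pass - avg_fail) / avg_fail  # percent lift as a fraction
```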


Key Differences in Practice

  • Tag Timing: Element Impact is time-restricted (e.g., a CTA must appear in the first 3 seconds for a 3s VTR KPI); Criteria Performance evaluates tag presence at any time.

  • KPI Behavior: Element Impact adjusts the report based on the KPI (time-bound or not); in Criteria Performance, the KPI does not restrict tag timing.

  • Customizability: Element Impact offers a "first 3 seconds" vs. "any time" toggle for non-time-based KPIs; Criteria Performance requires building custom criteria for time-based tag rules.

  • Media Inclusion Criteria: Element Impact includes only media with reliable KPI data and matched elements; Criteria Performance includes all media with criteria scores, even if element timing varies.

  • Ideal Use Case: Element Impact suits tactical optimization and element-specific testing; Criteria Performance suits strategic adherence tracking and guideline validation.


Why Do Percent Lifts Sometimes Differ?

You may see discrepancies between percent lift values across the two reports—even for the same element (e.g., CTA Presence). This is often due to:

  • Different media counts: One report may include more creatives due to relaxed inclusion rules.

  • Different tag timing rules: With a time-based KPI, Element Impact only counts elements detected within the KPI's time window (e.g., the first 3 seconds), whereas Criteria Performance applies no timing restriction.

  • Score availability: Some creatives may not have a recorded score for a given criterion, which excludes them from Criteria Performance metrics.
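The toy numbers below are purely hypothetical, but they show how these rules can produce different lift figures for the very same element: restricting detection to the first 3 seconds changes which creatives fall into each group, and therefore both averages.

```python
# Hypothetical numbers only: each tuple is (KPI value, second at which the CTA
# first appears, or None if there is no CTA at all).
media = [
    (0.30, 1.0), (0.28, 2.5),    # CTA inside the first 3 seconds
    (0.18, 6.0), (0.17, 8.0),    # CTA present, but only after 3 seconds
    (0.15, None), (0.16, None),  # no CTA
]

def lift(with_group, without_group):
    avg_with = sum(with_group) / len(with_group)
    avg_without = sum(without_group) / len(without_group)
    return (avg_with - avg_without) / avg_without

# Element Impact with a 3s KPI: only CTAs detected in the first 3 seconds count as "with".
ei = lift([k for k, t in media if t is not None and t <= 3],
          [k for k, t in media if t is None or t > 3])

# Criteria Performance: any CTA meets the criterion, regardless of timing.
cp = lift([k for k, t in media if t is not None],
          [k for k, t in media if t is None])

print(f"Element Impact lift: {ei:.0%}")         # ~76%
print(f"Criteria Performance lift: {cp:.0%}")   # 50%
```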


Tips for Aligning the Two Reports

  • When using a time-based KPI (like 3s VTR), remember that Element Impact will apply strict timing filters to tags. To compare this to Criteria Performance, consider building a custom criterion that mimics those timing rules (e.g., CTA present in first 3 seconds).

  • For non-time-based KPIs, ensure that tag settings in Element Impact are aligned with your goals (e.g., toggling between “first 3 seconds” and “any time”).

  • If counts seem off, check for missing scores in Criteria Performance or tag settings in Element Impact as potential causes.


Conclusion

Both Element Impact and Criteria Performance serve powerful but distinct purposes. Use Element Impact when you want to identify the precise performance effect of a creative attribute. Use Criteria Performance when evaluating overall adherence to creative best practices and how these practices broadly influence performance.

Understanding their differences empowers you to interpret insights more accurately—and to better inform your creative strategies.
