Can Optimized Ad Creative Make Programmatic Perform 30-50% Better? These 5 Studies Say ‘Yes’

[Chart: performance lift from Thunder ad creative]

Lately, there has been a lot of talk about the burgeoning relationship between programmatic media and ad creative. Most of the discourse has focused on how, in theory, optimized creative can significantly improve media performance. The purpose of this article is to look at the data behind what those gains might be in practice.

Creative optimization is the practice of customizing and testing advertisement artwork and messaging so that it resonates more with audiences. Because programmatic allows advertisers to target specific audiences precisely, it has inspired them to leverage creative variations for greater media effectiveness.

The dominant tactics include:

  1. Customizing ad creative to what matters to specific audience groups
  2. A/B testing messaging
  3. Sequencing or updating messaging over time

The expectation of gains from these tactics comes from the success already established by creative relevance and testing in email, social, search and website content. Armed with programmatic buying, big data and algorithmic optimization, advertisers hope to translate similar gains into display ads, native ads and video.

The strategies here seem sound, but what data is there to back them up?

I set out to locate research to help understand:

  • What is the expected performance difference between generalized messaging and customized messaging?
  • What if you vary ad messages over time?

In my search, I uncovered research by comScore, Yahoo!, Adobe, RocketFuel and Marin Software that indeed demonstrates the effects of creative can be powerful.

Here’s what I found.

A comScore report from 2010 found that creative quality contributed 4X as much sales lift as the quality of the media plan: 52% versus 13%.

Based on this research, not only can creative impact performance quite a bit; its potential for creating lift appears significantly greater than that of the media buy itself.

These findings are reinforced by Yahoo!, which published a report in 2014 demonstrating that personalized creative significantly outperforms generalized creative. Personalized ads were 54% more engaging and 45% more memorable.

Both creative quality and creative personalization are clocking in with non-trivial improvements.

In addition, in Adobe’s digital trends report for this year, marketers rate targeting and personalization (30%) as the highest digital-related priority area, narrowly ahead of content optimization (29%), which has also climbed the pecking order.

“Personalization sets us apart, cuts through clutter,” said one Adobe survey respondent.

“The customer has always been in charge, brands have just been slow to accept that. There is a shift happening from brand experience to customer experience and personalization will be the driving force of this,” wrote another.

Combining Adobe’s report with the evidence from comScore and Yahoo!, we have both quantitative and qualitative reasons to customize ad creative. That explains why it’s at the top of the priority list for so many marketers.

But as I mentioned above, companies don’t stand to gain from personalization alone. A/B testing and updating or sequencing messaging over time also play a role.

I wasn’t able to find aggregate data on the power of A/B testing whose methodology I was happy with. While many case studies show significant gains on specific campaigns, few venture to study the overall impact of A/B or multivariate testing on ad performance. That said, industry professionals are generally unified in the belief that an A/B test is only as good as what is being tested and whether the conditions of the test are scientifically sound.
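To make “scientifically sound” concrete, here is a minimal sketch of how a two-variant creative test might be evaluated, assuming you simply have clicks and impressions for each variant. The figures are hypothetical, and this is a generic two-proportion z-test, not any particular vendor’s methodology.

```python
import math

def two_proportion_ztest(clicks_a, imps_a, clicks_b, imps_b):
    """Compare the CTRs of two ad variants with a two-proportion z-test."""
    ctr_a = clicks_a / imps_a
    ctr_b = clicks_b / imps_b
    # Pooled CTR under the null hypothesis that both variants perform the same.
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    z = (ctr_b - ctr_a) / se
    # Two-sided p-value from the normal approximation.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return ctr_a, ctr_b, z, p_value

# Hypothetical results: a control creative versus a personalized creative.
ctr_a, ctr_b, z, p = two_proportion_ztest(
    clicks_a=420, imps_a=100_000,
    clicks_b=510, imps_b=100_000,
)
print(f"CTR A={ctr_a:.3%}, CTR B={ctr_b:.3%}, z={z:.2f}, p={p:.4f}")
```

The point of a check like this is to avoid declaring a winner on noise alone; a lift that isn’t statistically significant is exactly the kind of result that makes A/B testing look better or worse than it really is.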

Still, research such as the findings of the RocketFuel Guide to Creative Optimization shows that simple optimizations of individual design elements can routinely lead to performance impacts of 30-160%. Even though this doesn’t cover A/B testing of messaging, it hints at the potential for similar gains in a campaign that is fully optimized from a creative standpoint.

Beyond simply testing creatives side-by-side, programmatic is also exposing an opportunity to vary ad messaging over time. Creative variance helps combat creative fatigue, wherein an ad’s message can become less impactful over time if it is not refreshed.

For example, a Marin Software study of Facebook creative rotation strategies demonstrated improved CTRs and engagement rates, driving 35% higher CTRs for direct response campaigns.
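To illustrate what rotation can look like in practice (the Marin study doesn’t publish its mechanics, so this is only a sketch under assumed rules), the snippet below refreshes the message a user sees based on how long they have been exposed to the campaign. The variant names and the one-week refresh window are hypothetical.

```python
from datetime import date

# Hypothetical creative variants, ordered as a message sequence.
CREATIVE_SEQUENCE = [
    "awareness_headline",  # introduce the brand
    "benefit_headline",    # highlight a key benefit
    "offer_headline",      # present a concrete offer
]

def pick_creative(first_exposure: date, today: date, refresh_days: int = 7) -> str:
    """Move a user to the next creative every `refresh_days` days to combat fatigue."""
    days_exposed = (today - first_exposure).days
    step = min(days_exposed // refresh_days, len(CREATIVE_SEQUENCE) - 1)
    return CREATIVE_SEQUENCE[step]

# A user first exposed three weeks ago has reached the final message in the sequence.
print(pick_creative(first_exposure=date(2016, 3, 1), today=date(2016, 3, 22)))
```

A production system would more likely key rotation off impression counts or observed engagement decay rather than the calendar, but the principle is the same: the message changes before it stops landing.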

There you have it. The evidence supports creative’s role as mission-critical to programmatic success, offering reasonable performance gains on the order of 30-50% for key metrics, and possibly more.

To harness these improvements, the next step at any organization is to develop processes to better connect programmatic and creative teams. That’s going to require new collaboration across teams that have traditionally been siloed from each other. It’s also going to require new processes and technologies to enable a higher volume of creative ideation and production.

For more on this emerging era of programmatic and creative teams working together, here is some further reading from other industry leaders weighing in on the topic:

Essential Guide to Programmatic Creative Technologies
