People-based versus Cookie-based Measurement Comparison

For many years, marketers relied on the ubiquitous “cookie” to measure audience reach, frequency, and conversions: a temporary identifier stored in the user’s browser until it is deleted (typically every 30 days or so). This identifier made it possible to track who saw an ad (reach) and who showed up on the advertiser’s website to take some action (conversions).

However, with the advent of mobile devices like smartphones and tablets, the cookie started to crumble as a reliable way to have a complete view of a consumer’s digital ad experience. In fact, with the introduction of mobile in-app experiences and mobile device IDs, the consumer digital identity fragmented further, making it hard to accurately measure and therefore personalize and optimize ads.

Now with people-based marketing, brands are unifying their view of a consumer by connecting multiple cookies and device IDs. Deterministic identity graphs such as Thunder and LiveRamp use authenticated logins across multiple devices to build a complete, accurate picture of who the consumer is across devices. Most of ad tech hasn’t yet been updated for the people-based world and so still runs on just cookies or single device IDs for measurement, causing misleading results.
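To make the difference concrete, here is a toy sketch of how an identity graph collapses multiple cookies and device IDs into people for reach counting. The IDs and the graph below are invented for illustration; they are not a real Thunder or LiveRamp API.

```python
# Illustrative only: a deterministic identity graph maps each cookie or
# device ID (gathered from authenticated logins) to one person ID.
identity_graph = {
    "cookie_a": "person_1",
    "cookie_b": "person_1",      # same person, second browser
    "phone_idfa_1": "person_1",  # same person, mobile in-app
    "cookie_c": "person_2",
}

# One ad impression was served to each of these IDs.
impressions = ["cookie_a", "cookie_b", "phone_idfa_1", "cookie_c"]

# Cookie-based counting treats every ID as a distinct "user"...
cookie_reach = len(set(impressions))
# ...while people-based counting deduplicates IDs to real people.
people_reach = len({identity_graph[d] for d in impressions})

print(cookie_reach, people_reach)  # 4 2
```

The cookie-only view reports twice the true reach here, which is the kind of distortion people-based measurement is meant to correct.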

Thunder teamed up with LiveRamp to produce the first open, people-based ad server. As a result of running billions of impressions a month, Thunder has been able to study the difference between traditional cookie-based measurement and new people-based measurement.

The fact that there is a difference isn’t surprising, but its size was quite shocking: cookie-only measurement has been shown to be only ~50% accurate in the new multi-device world!

Get the full results in our new whitepaper, “Ad Counting Comparison Study.”

Continue Reading

A/B Split Testing Sample Size Calculator

Everyone wants to optimize their advertising, and that means figuring out what works and what doesn’t. To do that, you need not only to pick a winner and a loser but also to have confidence that you are right.

Unfortunately, it’s not as simple as comparing the current performance of one ad with another — that’s like declaring one basketball team the best just because it is ahead in the first half of a game. You need enough time and data (enough matchups, in basketball terms) to determine a true champion.

In statistics, confidence comes from having “statistical significance,” which essentially means knowing that the results are likely not just chance and would likely be repeated if the matchup happened again. To achieve statistical significance, you need a minimum sample size of data, based on the target lift in performance that would make one ad a meaningful winner over another.

Thunder has built an easy-to-use sample size calculator that lets you input the basic variables of your creative experiment to determine the minimum sample size necessary, whether you’re thinking in terms of how many impressions or how many people need to be in the test to get a meaningful result. Thunder Experience Cloud uses minimum sample sizes to ensure it has enough data for each creative version it is testing before its Dynamic Creative Optimization solution serves the winning creative to all consumers, achieving maximum media impact and efficiency with the highest confidence.
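For intuition, the math behind a calculator like this can be sketched with the standard two-proportion z-test sample-size formula. The function below is a minimal sketch of that textbook formula, not Thunder’s actual calculator; the input names (`baseline_rate`, `min_lift`) and the example numbers are assumptions for illustration.

```python
# Minimal sketch: minimum sample size per creative version for an A/B test,
# using the classic two-proportion z-test formula (stdlib only).
from math import ceil, sqrt
from statistics import NormalDist

def min_sample_size(baseline_rate, min_lift, alpha=0.05, power=0.80):
    """Minimum impressions (or people) needed PER creative version to
    detect a relative lift of `min_lift` over `baseline_rate` at the
    given significance level (alpha) and statistical power."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + min_lift)       # rate if the lift is real
    p_bar = (p1 + p2) / 2
    z_a = NormalDist().inv_cdf(1 - alpha / 2) # two-sided significance
    z_b = NormalDist().inv_cdf(power)         # desired power
    numerator = (z_a * sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# e.g. a 2% baseline conversion rate and a 20% minimum detectable lift
print(min_sample_size(0.02, 0.20))  # roughly 21,000 per version
```

Note how the required sample size grows as the lift you want to detect shrinks — which is why testing “thousands of ad versions” on limited media rarely produces statistically significant winners.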

If a vendor tells you there isn’t a minimum, or that they can test thousands of ad versions, ask them whether they have a sample size calculator and how they will achieve statistical significance in their results. Otherwise, it is highly suspect whether they are really testing and optimizing your advertising.

Try out our free testing sample size calculator for A/B and multivariate testing.

Inputs for Thunder sample size calculator

Continue Reading

New California Privacy Law Compared to GDPR – Summary

GDPR v. California Privacy Laws

Digital marketers just rushed to meet GDPR compliance in May 2018 for digital marketing in Europe. Now they need to rush to meet a new California privacy law that goes into effect in January 2020. Compared to GDPR, the California Consumer Privacy Act (also known as CaCPA or CCPA) balances commercial and consumer interests much more, enabling digital marketers to continue data-driven marketing while giving consumers more protections and options.

Similarities:

Both CaCPA and GDPR

  1. apply to businesses that are not located within their borders
  2. assign responsibility for enforcement to a governmental authority
  3. do not permit discrimination against individuals who exercise their legal rights
  4. provide individuals with certain rights with respect to personal data, including the right to access and delete their personal data
  5. address some similar concerns (e.g., the importance of access and transparency)
  6. will require businesses to expend time and money to achieve compliance

Key Distinctions:

  1. GDPR comprehensively addresses many privacy concerns (e.g., disclosures businesses must make to data subjects, process for data breach notification to individuals and regulators, implementation of data security, cross-border data transfers, etc.) while CaCPA is focused on consumer privacy rights and disclosures.
  2. GDPR provides comprehensive private rights of action while CaCPA does not create a private right of action except for data breaches (and with many requirements).
  3. GDPR provides a more comprehensive set of rights to consumers, including the right to data correction and the right to data portability, which CaCPA does not have (unless the business decides to respond to a request for portability by providing the data electronically, in which case it must do so in a readily useable format that can be transmitted to another entity, but only to the extent technically feasible).
  4. GDPR includes considerably more comprehensive requirements on businesses, including privacy by design and default, foreign company registration requirements, data protection impact assessments, 72-hour breach notification, data protection officer requirement, and restrictions on cross border transfers.
  5. GDPR requires data controllers to sign formal, written agreements with processors that meet stated requirements for a processor’s handling of personal data. CaCPA requires a written agreement with a third party only in very limited circumstances.
  6. GDPR requires businesses to assume and contract for appropriate technical and organizational security precautions. CaCPA does not mention security other than to provide a cause of action for lawsuits on behalf of consumers for the unauthorized access, exfiltration, theft, or disclosure of personal information that is not encrypted or redacted that results from the failure to implement and maintain reasonable security procedures and practices.
  7. GDPR requires that businesses have a legal justification before they collect, process, or transfer personal information, with a consumer’s informed and unambiguous consent as one means of achieving that legal justification. CaCPA, on the other hand, does not require businesses to have such legal justification and uses an opt-out approach.

Detailed Comparison

If you’re worried about your compliance with both laws, you should read Part II of GDPR vs. California Consumer Privacy Act, which covers the nuanced differences in more detail and explains why compliance with one law doesn’t ensure compliance with both.

Thunder’s Role

Thunder Experience Cloud enables the advertising ecosystem to balance consumer interests in privacy with commercial interests in data-driven advertising. Thunder helps ad platforms prevent data leakage, consumers gain privacy, and advertisers obtain transparency through its anonymized people-based measurement solution. Ask us how to protect consumer data while supporting data-driven advertising if you’re interested to learn more.

Continue Reading

GDPR vs California Consumer Privacy Act (CaCPA) Detailed Comparison

GDPR v. California Privacy Laws

If you’re a large digital marketer, ad platform, or agency that reaches any consumer in the EU or California, you will soon need to comply with both GDPR, which went into effect in May 2018, and the new California Consumer Privacy Act (also known as CCPA or CaCPA), which goes into effect in January 2020. While GDPR is generally seen as more stringent than CaCPA, there are still some nuanced differences, and compliance with one doesn’t mean compliance with the other.

In Part I of this series, Thunder summarized the key differences and similarities between the two sets of laws.

In this Part II of the series, Thunder provides a detailed breakdown for digital marketers, agencies, and ad platforms comparing GDPR and the California Consumer Privacy Act to make sure they are compliant with both:

Jurisdiction

GDPR: Applies to data collection of persons in the EU (whether the company is based there or not)

CaCPA: Applies to data collection of California residents (whether the company is based there or not)

Personal Data

GDPR: Any information relating to an identified or identifiable natural person.

CaCPA: Any data that “identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.” A “consumer” is a California resident as defined by the tax code. The “personal data” definition is developed through examples, exclusions, and cross-references to other laws. Data subject to HIPAA is exempted from CaCPA, but data subject to the FCRA and GLBA is excluded only to the extent those statutes conflict with the CaCPA.

Data Subject

GDPR: An identified or identifiable natural person. An identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.

CaCPA: A California resident as defined under California tax law.

Data Controller

GDPR: The natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data; where the purposes and means of such processing are determined by Union or member state law, the controller or the specific criteria for its nomination may be provided for by Union or member state law.

CaCPA: For-profit controllers that meet ONE of the following thresholds: (1) Annual gross revenue over $25M; (2) Buys/sells or receives/shares for “commercial purposes” the data of 50,000 California residents; or (3) Derives 50% of revenue from “selling” personal data of California residents. If a controller qualifies under the thresholds, parent companies and subsidiaries in the same corporate group operating under the same brand also qualify.

Processor

GDPR: A natural or legal person, public authority, agency or other body that processes personal data on behalf of a controller. The GDPR also defines a “third party” as a natural or legal person, public authority, agency or body other than the data subject, controller, processor, and persons who, under the direct authority of the controller or processor, are authorized to process personal data.

CaCPA: A “service provider” is a for-profit entity that acts as a processor to a “business” and that receives the data for “business purposes” under a written contract containing certain provisions. The CaCPA uses the term “third party” to refer to entities that are neither businesses nor service providers.

Sensitive Data

GDPR: Per Article 9: Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation is prohibited.

CaCPA: Sensitive data is not addressed.

Transfers of Personal Data

GDPR: Any transfer of personal data that are undergoing processing or are intended for processing after transfer to a third country or to an international organization shall take place only if the controller and processor comply with the conditions set forth in Articles 44-50. Transfers may occur on the basis of an adequacy decision or via mechanisms such as Binding Corporate Rules, standard contractual clauses, or, in the case of EU-US transfers, the Privacy Shield.

CaCPA: Cross-border data transfers are not restricted. All transfers to “service providers” require a written agreement containing certain provisions (the CaCPA equivalent of Article 28 of the GDPR).

Data Portability

GDPR: Per Article 20, the data subject has the right to receive the personal data concerning him or her, which he or she has provided to a controller, in a structured, commonly used, and machine-readable format and have the right to transmit that data to another controller without hindrance from the controller to which the personal data has been provided.

CaCPA: There is a limited recognition of this right under the CaCPA. Cal. Civ. Code Section 1798.100 provides that data subjects who exercise their right of access must receive the data “by mail or electronically and if provided electronically, the information shall be in a portable and, to the extent technically feasible, in a readily useable format that allows the consumer to transmit this information to another entity without hindrance.” There is a related and somewhat contradictory provision on this under Cal. Civ. Code Sec. 1798.130(a)(2).

Consent

GDPR: Opt-in approach requiring informed, freely given, and unambiguous consent

CaCPA: Opt-out approach (for data being sold to third parties) that doesn’t require consent from adults; however, users can ask that their data be deleted

Penalties

GDPR: Under Article 83:

  • Up to 10 000 000 EUR or, in the case of an undertaking, up to 2 percent of the total worldwide annual turnover of the preceding financial year, whichever is higher, for infringements of the obligations of controllers and processors, the certification body, and the monitoring body.
  • Up to 20 000 000 EUR or, in the case of an undertaking, up to 4 percent of the total worldwide annual turnover of the preceding financial year, whichever is higher, for infringements of obligations such as the principles of processing, conditions for consent, data subjects’ rights, and transfers beyond the EU.
  • Under Article 84, each member state can lay down rules on other penalties applicable to infringements of the GDPR, in particular infringements not subject to Article 83, and can take all measures necessary to ensure they are implemented.

CaCPA: No private right of action for most provisions; the California Attorney General takes the role of a DPA and can impose civil penalties of up to $7,500 for each violation with no maximum cap. Violators may avoid prosecution by curing alleged violations within 30 days of notification. For certain data breaches there is a private right of action, with statutory damages set between $100 and $750 per data subject per incident, a requirement to notify the AG before filing a lawsuit, and a requirement to refrain from pursuing the action if the AG’s office prosecutes within six months of the notification.

Thunder’s Role

Thunder Experience Cloud enables the advertising ecosystem to balance consumer interests in privacy with commercial interests in data-driven advertising. Thunder helps ad platforms prevent data leakage, consumers protect privacy, and advertisers obtain transparency through its anonymized people-based measurement solution. Ask us how to protect consumer data while supporting data-driven advertising if you’re interested to learn more.

Continue Reading

LiveRamp and Thunder Experience Cloud Announce Partnership to Enable Omnichannel, People-Based Measurement and Personalization

San Francisco, CA – (Sept 25, 2018) – Thunder Experience Cloud and LiveRamp®, an Acxiom® company (NASDAQ: ACXM) and leading provider of omnichannel identity resolution, today announced a partnership to enable people-based marketing in three key areas: targeting, measurement, and personalization.

The partnership provides marketers with a more holistic view of their customers by giving them the ability to track ad exposure and conversion across devices directly to their own person IDs, rather than relying on less accurate identifiers such as Cookie IDs or third party measurement providers.

LiveRamp customers use its identity graph for CRM targeting across the open web and walled gardens. Now, with the addition of Thunder’s people-based dynamic ad server, marketers can run campaigns from start to finish on LiveRamp IDs without pause. Ads can be dynamically personalized and measured in real-time using LiveRamp’s identity graph.

“This partnership is truly changing the standards of measurement and relevance in advertising,” said Paul Turner, GM of Technology at LiveRamp. “With Thunder Experience Cloud, marketers have a one stop shop for creating and measuring high-performing omnichannel campaigns based on the person, rather than the device or cookie, ensuring the right ad gets in front of the right person on any device, and bringing us closer than ever before to achieving true people-based marketing, while maintaining LiveRamp’s high standards of transparency and customer privacy.”

“Thunder is the only open, deterministic people-based ad serving and tracking solution today,” added Victor Wong, CEO of Thunder Experience Cloud. “By partnering with LiveRamp, one of the most trusted data platforms, we are giving marketers person-level measurement accuracy on their advertising while protecting the privacy of the consumer through state of the art encryption and anonymization.”

To learn more, see MarketingLand’s coverage of the announcement.

Continue Reading

Webinar: Surviving the DoubleClick ID Loss

Alongside Adweek and Neustar, Thunder presented a webinar on the upcoming DoubleClick ID loss in 2019 and how to prepare for it if you’re a data-driven marketer. Learn what sort of advertiser needs to consider switching to an open ID and who is better off sticking with Google’s ID. Watch the full presentation and discussion below:

More on the Ads Data Hub series

  1. What is Google’s Ads Data Hub and is it right for me?
  2. How does Google’s Ads Data Hub Affect My Data Management Platform (DMP)?
  3. How does Google’s Ads Data Hub Affect My Analytics?

Continue Reading

Call for Advertising Industry to Protect Consumer Privacy, Provide Ad Transparency, and Secure Publisher Data

Thunder’s mission is to solve bad ads. To that end, Thunder joined the Coalition for Better Ads at the end of 2017. Now, Thunder is calling for the industry to go beyond just higher standards for creative. Thunder wants to put in place stronger protection for consumers and publishers while also providing greater transparency for advertisers.

Thunder recently had the honor of guest writing for the Association of National Advertisers (ANA) on what Cambridge Analytica taught the ad industry about what consumers expect and what publishers will need to do going forward. In the column, Thunder’s CEO also touches on how advertisers can work with these groups to ensure a better internet where only effective, non-intrusive advertising rules. Here’s an excerpt:

Ultimately, everyone has to give a little something to get much more in return. Moving advertising to an anonymized ID tied to ad exposure will benefit the entire internet. Consumers will get better advertising and privacy, publishers will remove their liability and data leakage, and advertisers will gain transparency into their advertising.

Continue Reading

Neustar and Thunder join forces to deliver better customer experiences, powered by people-based intelligence

SAN FRANCISCO, Aug. 28, 2018 (GLOBE NEWSWIRE) — Thunder Experience Cloud, the leader in people-based ad serving, and Neustar Marketing Solutions (a division of Neustar, Inc.), the leading unified marketing intelligence platform for marketers, today announced the integration of Thunder’s people-based ad server with the Neustar Identity Data Management Platform (IDMP) and the Neustar MarketShare solution. The partnership will enable brands and agencies to quickly customize ad creatives to each customer, as well as measure performance for real-time optimization.

Thunder’s dynamic creative optimization (DCO) solution is a people-based, dynamic ad server that enables advertisers to factor in data signals such as CRM, weather, device type, time, media exposure, and now, audience data from large Data Management Platforms (DMP) like Neustar.

Customers of Neustar and Thunder will be able to target creative messaging for individual, real people and audience segments across digital channels such as display, video and mobile. By synchronizing people IDs on the open web, they can achieve a higher level of personalization, consistency and accuracy, eliminating irrelevant or redundant advertising.

In addition, Thunder’s people-based Experience Measurement solution tracks the performance of ads from exposure through to conversion, allowing for a high level of optimization. From there, joint customers can quickly and easily activate media by person tracked on the open web through the Neustar IDMP. This people-based data set will also be integrated within the Neustar IDMP and the Neustar MarketShare solution.

“Advertisers must be able to have a clear view of how their marketing performs across channels – which creatives and messages are being shown to whom, when and where. Neustar is dedicated to giving the industry access to independent and accurate media exposure data, ensuring brands and agencies have the tools they need for personalized, measurable experiences at scale,” said Steve Silvers, General Manager, IDMP, Neustar.

“There is no excuse for a bad ad,” added Victor Wong, CEO of Thunder. “This integration is another step toward ensuring every ad meets the highest standard of relevancy, frequency and impact, ultimately creating a better customer experience.”

About Thunder:
Thunder solves bad ads. Thunder Experience Cloud enables enterprises to produce, personalize, and track their ads cross-channel to achieve the right consistency, relevancy and frequency. Consumers maintain privacy, publishers safeguard data, and brands gain transparency through Thunder for a better ad experience for all.  To learn more visit: https://www.makethunder.com/

About Neustar Marketing Solutions
Neustar, Inc. helps companies grow and guard their business in a connected world. Neustar Marketing Solutions provides the world’s largest brands with the marketing intelligence needed to drive more profitable programs and to create truly connected customer experiences. Through a portfolio of solutions underpinned by the Neustar OneID® system of trusted identity and through a privacy by design approach, we enhance brands’ CRM and digital audiences, enable advanced segmentation and modeling, and provide measurement and analytics all tied to a persistent identity key. Neustar’s position as a neutral information services provider, and as a partner to Google, Facebook and Amazon, provides marketers access to the most comprehensive customer intelligence and marketing analytics in the industry. More information is available at www.marketing.neustar.

Continue Reading

How does Google’s Ads Data Hub Affect My Analytics? (Part III of the Ads Data Hub Series)

Note: We provided an overview of Ads Data Hub in Part 1 and covered how Ads Data Hub will impact DMPs in Part 2. This post covers data lakes and how analytics will be impacted in the Ads Data Hub world.

Many large brands today have set up “data lakes” where all their data gets stored and made available to other applications for processing and analysis. These data lakes combined with business intelligence tools such as Tableau have created powerful analytics environments where brands can answer questions such as:

  • What customer segment is most responding to my ads?
  • Which ads are leading to the most amount of lifetime customer value?
  • Do people who see my ads spend more with me?
  • Am I spending more money to reach my customers than they are spending with me?

Brands have staffed up data analysts and data scientists to make sense of all this data and answer these important business questions to improve strategy and validate what partners are telling the brand.

Data lakes ultimately rely on data flowing into them. Google’s recent changes with Ads Data Hub keep ad exposure data locked within Google Cloud, where it cannot be combined with data outside Google’s controlled environment. As a result, marketing data lakes are under threat.

Data Lakes without Data

Consequently brands with sensitive customer data are forced to decide whether to upload that data to Google to run in a Google-controlled data lake or keep it off the Google Cloud where they’ll need to find other vendors to solve their needs for tracking, analyzing, and modeling.

If you want to maintain control of your own data lake and prevent it from drying up, talk to Thunder about our Experience Measurement solution.

More on the Ads Data Hub series

Continue Reading

How does Google’s Ads Data Hub Affect My Data Management Platform (DMP)? (Part II of the Ads Data Hub series)

Note: We provided an overview of Ads Data Hub in Part 1. In this post, we look at how Ads Data Hub will impact DMPs in general.

Data management platforms (DMPs) power the marketer’s ability to track, segment, and target audiences across programmatic media. Leading DMP solutions include Salesforce DMP (previously known as Krux), Neustar IDMP, Oracle BlueKai, and Adobe Audience Manager.

If you weren’t paying close attention, you may not have realized that the changes Google has announced have blown a hole in your DMP.

Two major capabilities are affected by the pending DoubleClick ID removal from logs and push toward using Google’s Ads Data Hub: (1) segmentation and (2) frequency capping.

First, marketers currently use DMPs to create new audience segments based on media exposure. A DMP can keep track of media exposure if its own tags/pixels can run with the ad, but on much publisher inventory, such as Google’s Ad Exchange, DMPs are banned from running their code. These publishers are worried about data leakage, which happens when a DMP pixels proprietary audiences on media (such as sports lovers on ESPN.com) and then purchases those users elsewhere without paying the publisher.

Historically, the DMP could still get a record of media exposure from the ad server, such as DoubleClick, which would share data on who saw the ads running. Using DoubleClick’s data, the marketer could then still segment audiences within the DMP based on who saw the ad, who converted, etc.

Now that Google has discontinued the sharing of logs with IDs, DMPs can no longer see media exposure either on inventory from which they are explicitly banned or on inventory where they are allowed to operate but where Google’s DoubleClick ad server is used by the advertiser. If DMPs are to continue to be useful to the marketer, they will need a new source of data.

Second, some marketers use DMPs to create frequency caps across media platforms. By getting their pixel/code to run with an ad, or by ingesting ad serving logs, DMPs can count impressions exposed to a particular user ID and then send a signal to platforms like DSPs to stop buying a user after a certain amount of exposure. However, without log level data, DMPs will not be able to count frequency for inventory in which they are banned, leading to less accurate frequency measurement and therefore less precise frequency capping.
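The frequency-capping mechanics described above can be sketched as follows. The log format, the cap value, and the function name are illustrative assumptions, not any specific DMP’s API; real DMPs work from pixel fires or ingested ad-server log files.

```python
# Hedged sketch: counting impressions per user to build a DSP
# suppression list once a frequency cap is reached.
from collections import Counter

FREQUENCY_CAP = 5  # illustrative: max impressions per person per campaign

def users_to_suppress(impression_log, cap=FREQUENCY_CAP):
    """Given (user_id, campaign_id) impression records, return the user
    IDs that have hit the cap for some campaign and should be signaled
    to DSPs as 'stop buying this user'."""
    counts = Counter((user, campaign) for user, campaign in impression_log)
    return {user for (user, campaign), n in counts.items() if n >= cap}

# One user has seen campaign c1 five times, another only twice.
log = [("u1", "c1")] * 5 + [("u2", "c1")] * 2
print(users_to_suppress(log))  # {'u1'}
```

The key dependency is visible in the code: without log-level records for banned inventory, those impressions never reach `impression_log`, so the counts undershoot and the cap fires too late.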

How do I keep my DMP running at full performance?

Marketers who have invested in a DMP and want to keep its capabilities at full power would be well advised either to buy more digital media that allows DMP tracking or to find an alternative ad tracking or serving solution that can transfer log files to the DMP. A combination of these two strategies would allow a brand to continue using its DMP to its fullest by giving the DMP the complete picture of ad exposure tied to a person.

If you want to add an independent ad tracker to your DoubleClick stack or to keep powering your DMP with data, talk to Thunder about our Experience Measurement solution. 

More on the Ads Data Hub series

Continue Reading

What is Google’s Ads Data Hub and is it right for me? (Part I of the Ads Data Hub series)

What happened to DoubleClick?

Most marketers today use DoubleClick Campaign Manager (DCM) as their primary ad server for delivering ads and tracking ad exposure to conversion. The largest and most sophisticated advertisers relied on DCM data for analytics, attribution, and media activation.

These advertisers would “data transfer” log-level data (the raw data for each impression rather than the aggregate data that hides user-level and impression-level information) to their data management platform, data lakes, and vendors that do analytics or attribution modeling.

In April, Google announced it would no longer allow data transfer of DoubleClick log-level data with IDs. This decision effectively destroyed most of the value of the log-level data exported from DCM because advertisers would no longer know who saw the ads, only how many ads in total were served. DoubleClick could be used only to verify that the total number of impressions bought was actually delivered, but all the other powerful use cases, like analytics, attribution, and data management, would no longer be possible with DoubleClick data.

In June, Google announced it was sunsetting DoubleClick as a brand and folding everything under Google’s brand.

R.I.P. DoubleClick.

Enter Google Ads Data Hub

At the same time, Google pushed forward its own solution to this new problem for marketers — Ads Data Hub. This product is essentially a data warehouse where ad exposure data is housed and can be connected to Google’s own solutions for attribution, analytics, and data management.

One new benefit is access to the Google ID, which is a powerful cross-device ID that uses data from users logging into Google services like Android, Maps, YouTube, etc. Previously, DoubleClick was only tracking and sharing a cookie-based DoubleClick ID, which neither connected cross-device ad exposure and conversion nor reconciled multiple IDs to the same person. For many advertisers doing log-level data analysis and activation, this new ID is a big upgrade because it provides more accurate measurement.

One major downside is that this data cannot leave Ads Data Hub. Consequently, you cannot do independent verification of Google’s attribution or analytics modeling. If Google says Google performs better than its competitors, you will have to trust Google at its word. In the past, you would at least have the raw data to apply your own attribution model if you so wanted, or to re-run Google’s calculations to verify its accuracy (since big companies are not infallible).

By extension, outside ad tech providers (such as DMPs, MTA, etc.) who may be best in-class will have a much harder time working with Google solutions. As a result, you will be dependent on Google.

To do matching of off-Google data such as other ad exposure or conversions that happen offline, Ads Data Hub now requires you to upload and store your customer data in the Google Cloud. In that environment, it can be matched with Google’s ID and tracking so you can build a Google-powered point of view of the consumer journey.

In a way, Ads Data Hub is for those who trust but don't need to verify. It is a good solution for advertisers who today spend the vast majority (75%+) of their ad budget with Google, because if their advertising isn't working, Google is ultimately accountable for the results no matter what it reports about performance. You wouldn't need to verify calculations to know whether your ad budget is wasted.

What else can I do?

Another solution is to add independent ad serving and/or tracking in addition to or in replacement of Google. By doing so, you can still generate log-level data for Google-sold media but it will not be tied to a Google ID. Instead, you will be using your own ID or a vendor’s cross-device ID to understand who saw what ad when, where, and how often.

This approach is best suited for large advertisers who want best in class ad tech solutions to work together, and who cannot spend all their money on a single media platform to achieve their desired results. Typically brands large enough to afford data lakes, independent attribution providers, and data management platforms are the ones who will have the most to lose by moving to Ads Data Hub.

If you already realize you want to take a "trust, but verify" approach in your ads, talk to Thunder about our Experience Measurement solution.

More on the Ads Data Hub series

Continue Reading

What CMOs Say About Ad Experiences

Marc Pritchard famously said “It is time for marketers and tech companies to solve the problem of annoying ads and make the ad experience better for consumers.”

What do his peers think? The CMO Club has partnered with Thunder to publish a “Guide to Solving Bad Ad Experiences,” which includes a survey of over 80 CMOs and an interview with the CMO of Farmers Insurance on the impact of bad ads and how people-based marketing can fix them.

Some key findings include:
  • 74% of CMOs consider brand loyalty as most negatively affected by bad ads
  • 55%+ of CMOs consider frequency and relevancy as the top factors in bad ads
  • 78% of CMOs consider it “inexcusable” to serve ads for products the customer already bought from them
  • 71% of CMOs consider frequency capping important for ad experience but 60% aren’t confident in even their frequency counting!

Click here to download the full research report.

Continue Reading

What is a CMP?

CMP is the hot adtech acronym of 2018. There are actually two meanings to this term: (1) Creative Management Platform and (2) Consent Management Platform. Here’s an overview of both these products and why you may need one.

Creative Management Platform

Introduced in 2016 by Thunder, the CMP acronym originally stood for "creative management platform," a tool for producing and trafficking ad creatives. Unlike general-purpose creative editors such as Adobe Photoshop or Animate, which are built for a single designer working alone, CMPs are meant for enterprises that have a scale problem with creative.

Many brands, agencies and publishers increasingly need to build ads in different sizes and versions for different audiences and media formats. Consequently, creative production demands have grown exponentially, while most creative organizations can only scale linearly by adding more designers and programmers. Because traditional creative editors were built for highly advanced users, a creative bottleneck formed as demand went up and not enough talent or payroll existed to fill the void.

Creative Management Platforms radically simplified ad production by providing easier interfaces and automated production tasks like re-sizing. Forrester began recognizing CMPs in 2017 as part of their broader creative ad tech research which has been timed with the rise in enterprise demand for new marketing creative technologies.

Consent Management Platform

Introduced in 2018, the new CMP acronym stands for “consent management platform.” The European privacy laws known as GDPR required publishers and marketers to obtain explicit consent for certain tracking and targeting data. As a result, a new category of tools emerged to specifically help these enterprises collect and keep track of user consent.

The CMP then feeds that consent information tied to an ID to other selected partners in the digital advertising supply chain. As a result, every party in a publisher’s supply chain understands what data they may use and for what.
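The flow above can be sketched as a toy consent store (a hypothetical schema for illustration only; real consent management platforms exchange consent via the IAB's standardized consent string rather than an in-memory dictionary):

```python
# Hypothetical consent store: per-user purposes consented, keyed by an ID.
consent_store = {
    "user-a1f3": {"measurement", "personalization"},
    "user-9c2e": {"measurement"},
}

def may_use(user_id, purpose):
    """A downstream partner checks consent before using data for a purpose."""
    return purpose in consent_store.get(user_id, set())

print(may_use("user-9c2e", "measurement"))       # True: consent was given
print(may_use("user-9c2e", "personalization"))   # False: no consent given
```

Every party in the supply chain performs a check like `may_use` before applying a user's data to tracking or targeting.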

Which CMP do I need?

It depends if you’re looking to solve a creative problem or a data privacy problem. Talk to Thunder if you need help with your data-driven creative problems or digital creative production problems. Check out these consent management vendors if you’re looking to solve a privacy preference problem.

Continue Reading

Why Marketers Need More Than One DSP – Understanding The Risks

The average advertiser uses 3 DSPs.  Part #1 of this series examined the reasons digital advertisers make use of multiple DSPs in their programmatic bidding.  Of course, the use of multiple DSPs also creates its own challenges. So in Part #2 below, we look at the challenges created around frequency and bidding against oneself by using multiple DSPs, and how the smart marketer overcomes these challenges.

Don’t multiple DSPs just bid against each other for the same inventory?

When advertisers think of using multiple DSPs to bid on inventory, the most common concern that comes to mind is that the inventory between the DSPs will overlap, and the DSPs will be bidding against each other.  In other words, the advertiser will be bidding against itself, thus inflating its bids and artificially driving up media costs.

And in a world of 2nd price auctions, marketers can see why this is a scary prospect.  We discussed in detail how bidding worked in Part #1 of this series, but here’s a brief summary:

First, DSPs conduct internal auctions and then send the winning bid to an exchange or SSP for a subsequent auction.  These DSP internal auctions are conducted on a 2nd price basis, which means that an advertiser bidding $25 for an impression will really only bid $5 in the SSP auction if the 2nd highest bid in the DSP's internal auction was $5.

What does this mean if the same advertiser had multiple DSPs? Well, if the 2nd highest bid for the same impression in the advertiser's other DSP was $10, then the SSP is choosing between bids of $10 and $5 from the same advertiser.  And if the 3rd price in the other DSP was $4, then the advertiser would have cleared the SSP auction at $4 if it had only used the first DSP.

This scenario is certainly possible, but marketers have increasingly overlooked this concern for two reasons. These reasons both stem from the rise of header bidding.

First, for all the bids that are inflated due to the use of multiple DSPs, there are as many auctions that a single-DSP marketer will simply lose in a header bidding world. As was explained in Part #1 of this series, precisely because SSPs conduct 2nd price auctions, an advertiser can win an exchange's auction but lose the unified auction to another exchange whose 2nd price was higher, yet still lower than the advertiser's actual bid price.  So, if the advertiser's main goal is to reach its audience, then it will want to use more DSPs (and win more internal auctions). This inevitably translates to more exchanges submitting the advertiser's winning 2nd price bid to the header bidding unified auctions, and more wins overall.

Is this bidding against oneself?  Perhaps, but with header bidding, this is often required to simply win enough auctions to achieve desired scale.

Second, header bidding is bringing about a seismic shift in real-time bidding from 2nd price auctions to 1st price auctions within SSPs and exchanges, in order to eliminate this scenario in the first place.  Since header bidding unified auctions select the highest price submitted by participating SSPs and exchanges, SSPs and exchanges are incentivized to maximize their chance of winning, which means submitting the bid with the highest price. In practice, they follow 1st price auctions and submit the winner at its 1st price bid rather than the 2nd price.  Many SSPs, such as PubMatic and OpenX, now follow this practice for precisely this reason. Once SSPs and exchanges are using 1st price auctions, the risk of inflating one's bid goes away as long as an advertiser bids the same amount for the same category of inventory across their multiple DSPs.

How to control frequency with multiple DSPs?

A more serious challenge raised by the use of multiple DSPs than inflated bid prices is the loss of control over ad frequency.  And here, the challenge remains largely underserved, even as demand for solutions continues to grow among large advertisers.

The main reasons why managing the frequency of ads served to individuals matters are (i) to limit over-serving to individuals in order to reduce waste, and (ii) to avoid burnout and negative brand associations from over-exposure.  We have all seen bad ad experiences where a brand bombards us with the same ads. So when using a single DSP, advertisers often follow the best practices of capping frequency by day (otherwise known as pacing) and by month, campaign duration or user's lifetime (to limit the overall exposure to a brand's advertising).

However, when using multiple DSPs, frequency capping becomes impossible to accomplish at the DSP level (since the DSPs don't actually talk to each other). What solutions are there?

One solution is to control frequency capping on the ad server.  DoubleClick Campaign Manager supports frequency capping, but rather than suppress media buying (as a DSP would via frequency capping), DCM serves a blank ad. This solution is pretty unsatisfying to the advertiser, as it results in significant wasted media spend.

DMPs, such as Adobe Audience Manager and Oracle BlueKai, claim to offer cross-DSP frequency capping, by tracking ad impressions and then suppressing users via existing integrations with DSPs.  It’s not uncommon to use a DMP to create suppression audiences, so this seems like a natural extension of this capability. Unfortunately, Google blocks DMPs from tracking impressions on GDN inventory.  Currently 14 DMPs are blocked by Google from tracking impressions in GDN. Since Google touches a significant portion of display inventory, frequency capping becomes much less useful without cooperation from the Google ecosystem.
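The DMP-style approach described above can be sketched as a shared impression store feeding a suppression audience (an illustrative model only; real DMPs collect impressions via trackers and push suppression audiences to DSPs through their existing integrations, and the cap value here is hypothetical):

```python
# Toy cross-DSP frequency cap: count impressions per user in one shared
# store, then build a suppression audience once a user hits the cap.
from collections import Counter

FREQUENCY_CAP = 3  # hypothetical lifetime cap
impression_counts = Counter()

def record_impression(user_id):
    """Called for every tracked impression, regardless of which DSP bought it."""
    impression_counts[user_id] += 1

def suppression_audience():
    """Users at or over the cap, to be synced to every DSP for exclusion."""
    return {u for u, n in impression_counts.items() if n >= FREQUENCY_CAP}

for _ in range(3):
    record_impression("user-42")
record_impression("user-7")

print(suppression_audience())  # {'user-42'}
```

The key point is that the counter spans all DSPs; each individual DSP only ever sees its own slice of the impressions.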

We expect the use of multiple DSPs to be a growing trend for major advertisers that require scale given the evolving mechanics of real-time bidding auctions. Spurred on by this new trend, these same advertisers who need scale will also be the ones most concerned with solving for control over frequency. Stay tuned for solutions that emerge in the marketplace.

Continue Reading

Why Marketers Need More Than One DSP – Understanding Demand Side Platforms

The average advertiser uses 3 DSPs.  There are strong reasons for digital advertisers to make use of multiple DSPs in their programmatic bidding – if you have wondered why advertisers use multiple DSPs, then Part #1 of this explainer is for you.  

Of course, the use of multiple DSPs also creates its own challenges. So in Part #2, we will look at the challenges created around frequency and bidding against oneself by using multiple DSPs, and how the smart marketer overcomes these challenges.

Why do Marketers use Multiple DSPs?

The primary benefits to advertisers of using multiple DSPs are: (i) differentiated DSP features which are needed to execute each campaign, (ii) accessing DSP-specific audience data, and (iii) scaling out the reach of campaigns. Let’s deep dive into each reason.

Benefit #1: Competition among DSPs around Features and Take Rates

DSPs are differentiated in many ways.  One key area is their take rates – the percentage of media spend they charge advertisers.  Another is that DSPs vary in ease of use and level of support. For example, AppNexus has lower take rates than others, but also offers less hands-on support and a powerful but complicated API.  The Trade Desk and MediaMath, conversely, are well known for their customer education and easier-to-use interface. The targeting options they offer and the reporting and analytics available for media insights also vary between each platform.  

By employing multiple DSPs, trading desks also are able to pressure the DSPs to add features and lower take rates by moving spend across DSPs easily.  Most recently, some DSPs have agreed to increased transparency by revealing the fees charged by exchanges, and SSPs that provide the ad inventory. This is a great example of DSPs accommodating customer demands in a competitive environment.

Benefit #2: Audience Data

Many DSPs have unique sources of audience data.  DoubleClick Bid Manager, of course, brings data on users of Google Display Network sites to make targeting options available for AdX sites (most of AdX inventory is GDN) that are not available in other DSPs.  Amazon Audience Platform brings audience data unique to Amazon. MediaMath has a 2nd party data co-op called Helix that benefits many advertisers. Some DSPs, like AppNexus and The Trade Desk, offer IP-range targeting.  

Marketers may be running different strategies with various campaigns, and leveraging multiple targeting options across DSPs empowers them to do so.

Benefit #3: Scale

Ultimately, the primary driver for using multiple DSPs may be the challenge of achieving scale in large budget campaigns with only a single DSP.  A trading desk may simply be unable to spend the budget for a target audience in a large campaign without using additional DSPs.

Why is that?  It’s complicated.  But the explanation below breaks it down.

First, bidding on multiple DSPs increases the odds of winning auctions.  

How?  There are a couple of reasons:

Each DSP conducts its own internal auction before submitting a winning bid to an exchange, which then conducts its own auction to decide which DSP wins.  An advertiser can lose an internal auction in one DSP (for example, DoubleClick Bid Manager), and win an auction in another DSP (say, AppNexus) for the same ad impression.  That’s because DSPs select winning bids not based on bid price alone, but also on the profile of the user and performance factors specific to each advertiser (whether the viewer is likely to click on the ad).  As such, one strategy some trading desks pursue to maximize their chances of winning is to intentionally add a smaller DSP to the mix because they will face less competition winning that DSP internal auction for this reason.

But even once an advertiser wins the DSP auction and the exchange auction, there is increasingly another auction that comes next that they might still not win – the header bidding unified auction.  Before header bidding, publishers would run an auction through a single exchange, and if the winning bid is rejected for some reason, it would run a subsequent auction through another exchange, all in a waterfall process.  With header bidding, publishers run a unified auction across multiple exchanges. Because the exchanges conduct 2nd price auctions (the advertiser pays the price of the 2nd highest bidder), an advertiser could win an exchange’s auction, but lose the unified auction to an exchange that had a higher 2nd price but lower than the advertiser’s actual bid price.  So, the more DSPs with the advertiser’s bid, the more exchanges will have the advertiser’s winning bid, the better chance the advertiser will win header bidding unified auctions.

Here’s an example auction to put this in illustration:

DSP A: The bids are Advertiser A – $2.00, Advertiser B – $1.00, and Advertiser C – $0.50. The winning bid is Advertiser A at $1.00 (the 2nd price, set by Advertiser B's bid).

DSP B: The bids are Advertiser C – $1.50, Advertiser D – $1.25, and Advertiser E – $0.75. The winning bid is Advertiser C at $1.25 (the 2nd price, set by Advertiser D's bid).

The Exchange would look at DSP A and DSP B, and decide the winner to be Advertiser C paying $1.25.
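The worked example can also be sketched in code (an illustrative model of the mechanics described here, not any platform's actual implementation; the advertiser names and prices mirror the example):

```python
# Two-stage auction model: second-price internal auctions at each DSP,
# then an exchange picking the highest clearing price among DSP winners.

def dsp_internal_auction(bids):
    """Second-price auction: highest bidder wins, pays the 2nd-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    clearing_price = ranked[1][1]  # the 2nd price
    return winner, clearing_price

def exchange_auction(dsp_results):
    """The exchange selects the highest clearing price submitted by the DSPs."""
    return max(dsp_results, key=lambda r: r[1])

dsp_a = {"Advertiser A": 2.00, "Advertiser B": 1.00, "Advertiser C": 0.50}
dsp_b = {"Advertiser C": 1.50, "Advertiser D": 1.25, "Advertiser E": 0.75}

result_a = dsp_internal_auction(dsp_a)  # Advertiser A wins at the $1.00 2nd price
result_b = dsp_internal_auction(dsp_b)  # Advertiser C wins at the $1.25 2nd price
winner, price = exchange_auction([result_a, result_b])
print(winner, price)  # Advertiser C 1.25
```

Note how Advertiser A's $2.00 bid, the highest anywhere, loses at the exchange because it was reduced to its DSP's 2nd price before being submitted.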

Second, DSPs can’t always bid on every impression on behalf of every advertiser. The infrastructure demands on DSPs to bid on every auction are considerable even before header bidding became ubiquitous.  With the mass adoption of header bidding, a process which duplicates the auction across multiple exchanges at the same time, DSPs’ infrastructure demands become further compounded.

As a result, DSPs can’t always factor every advertiser line item in every internal auction.  There’s a lot of confusion around whether all DSPs can see and bid on all inventory. But that’s really the wrong way of thinking about it.  

In reality, even though DSPs have access to over 90% of the same inventory, they don’t necessarily use their sophisticated and resource-intensive algorithms to score and bid on every single impression they have access to.  They have to filter (partly for cost, partly for other performance factors). This process, of course, leads us back to the first reason advertisers gain scale from using multiple DSPs – you can lose the internal auction of one DSP because you weren’t included in the auction, and win the auction of another DSP, for the exact same impression.

So, there are several benefits to advertisers from using multiple DSPs – scale, audience data and competition for your business.  In fact, this trend has somewhat altered the move toward in-housing digital advertising operations within brands. Supporting multiple DSPs would be a lot of work for a brand, and is generally handled by trading desks, both agency trading desks and independent trading desks.

However, the use of multiple DSPs is not without its challenges, as we’ll learn in Part #2 of this blog series.

Continue Reading

How to Test Ad Creatives: Beginner’s Guide to Optimize Your Display Ad Tests

There are so many creative elements that digital marketers can test in their banner ads – from value propositions to taglines to images and styling – that it can be hard to know where to start.  

A/B testing your creatives takes a couple of weeks to reach proper statistical significance, so it's often difficult to test every possible creative variation.  So, how should a digital marketer get started with A/B testing their banner ads?

Thunder has conducted hundreds of A/B tests, and distilled our learnings into the best practices for designing creative tests.  When followed, these tips can reduce the amount of time required to optimize your creative!

What is Test Significance?

Before we begin, we should address a commonly misunderstood concept: test significance. Marketers with no background in statistics often miss a critical fact: your tests may tell you less than you think.  

The reason is simple: our testing approach basically surveys the opinions of a smaller group of people within our target population, and sometimes, these small groups don’t completely represent the true opinion of our target population. This can expose marketers to faulty decisions that are based on false positives, that is, tests in which the apparent winner is not the actual over-performer in the target population.  

Statisticians correct for these sampling errors with tests of "statistical significance," and you should always ask your A/B test vendor how they control for sampling errors, including false positives.  If our goal is to learn from our creative testing, then we must ensure that our outcomes are statistically significant!
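To make this concrete, here is a minimal sketch of one standard significance check for an A/B test of click-through rates, a two-proportion z-test (the click and impression counts are hypothetical, and real testing tools may use different or more sophisticated methods):

```python
# Two-sided two-proportion z-test for the difference between two CTRs,
# using only the standard library.
from math import sqrt, erf

def ab_test_p_value(clicks_a, imps_a, clicks_b, imps_b):
    """p-value for the null hypothesis that both variants have the same CTR."""
    p_a, p_b = clicks_a / imps_a, clicks_b / imps_b
    p_pool = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / imps_a + 1 / imps_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# 1.2% vs 1.0% CTR on 50,000 impressions each
p = ab_test_p_value(600, 50_000, 500, 50_000)
print(p < 0.05)  # True: significant at the 95% confidence level
```

The smaller the true difference between creatives, the more impressions each variant needs before the p-value drops below the chosen threshold, which is why small tweaks take longer to test than large ones.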

#1 Test Hypotheses, Not Ads

The first question to ask when designing a creative A/B test is this: What hypothesis do we want to test?  Common hypotheses to test include:

  • Value Proposition (ex: 10% off vs. $25 off)
  • Image (ex. red car vs. blue car)
  • Tagline (ex. “Just do it” vs. “Do it”)
  • Call to Action Text (ex. “Subscribe now!” vs. “Learn more”)
  • Single Frame vs Multi-Frame

Each test should allow you to answer a question, for example: “do my customers like 10% off, or do they like $25 off?”

Many creative tests make the mistake of testing creatives that were created independently of each other, and thus vary in more than one way.  The reason why these tests are ineffective is that the marketer can’t distill the test into a lesson to be applied to future creative design. The only learning from such a test is that the brand should shift traffic to the winning ad.  But no lessons for the next new ad result from such a test.

For example, the A/B test below is comparing different layouts, images, value propositions and CTA text all at the same time.  Let’s say Creative B wins. What have we learned? Not much, other than in this particular set of ads, Creative B outperforms Creative A.  But we don’t know why, and thus have learned nothing that we can apply to future ads.

A/B Test with No Hypothesis

 

By comparison, the following two A/B tests have specific hypotheses – “do red cars work better than blue cars?”  At the end of this test, we will learn that either red SUV’s or blue sports cars outperform the other, and can apply this learning to future creatives.

Hypothesis-Driven A/B Test: Car Type Drives Performance

 

In this next A/B test, the hypothesis is that the value proposition in the tagline drives performance.  A common first A/B test for a brand is to compare feature-based vs value-based taglines.

Hypothesis-Driven A/B Test: Value Proposition Drives Performance

 

#2 Test Large Changes before Small Changes

Large changes should be tested first because they generate larger differences in performance, so you want these learnings to be uncovered and applied first.  

Larger changes – such as value proposition and image – are also more likely to perform differently for different audience segments than small changes – like the background of the CTA button.  As such, by breaking out your A/B test results by audience segment, you can learn which tagline or image pops with particular segments, which can guide the design of a creative decision tree.

Large changes: Value Proposition, Brand Tagline, Image, Product Category, Price/Value vs Feature, Competitive Claims

Smaller changes: CTA text, CTA background, Styling and formatting, Multiframe vs Single Frame

Small changes are likely to drive small lift.  Only test this after testing bigger changes.

 

#3 Test multiple creative changes with Multivariate Test Design

Multivariate test designs (MVT) sound more complex than they are.  Multivariate tests simply allow you to run 2 or 3 A/B tests at the same time, using the same target population.  They are a statistically rigorous way to bend the one-change-at-a-time principle behind Rule #1 above.  In an MVT design, you can test more than one change by creating a separate creative for every combination of changes, and then learning from these tests.

For example, if, as below, you are testing 2 changes – message and image – each of which have 2 variations, you have a 2×2 MVT test and need to create 4 ads.

Multivariate test that tests Image and Message at the same time

 

When the test is done, aggregate test results along each dimension to evaluate the results of each A/B test independently. If you have enough sample, you can even evaluate all the individual creatives against each other to look for particular interactions of message and image that drive performance.

To Summarize:

To drive more optimizations more quickly and generate demand and budget for more testing, follow these simple tips:

  1. Test hypotheses that generate learnings for subsequent creative design
  2. Test large changes before small changes
  3. Test one change at a time, or set up a multivariate test framework

Happy testing!

Continue Reading

DoubleClick ID Alternatives for my DoubleClick Campaign Manager (DCM) Logs?

tl;dr DoubleClick logs are used today by marketers for verification, attribution modeling, and other analysis beyond what is available in standard DCM dashboards.  

Log-based analytics require a device or user identifier, so DoubleClick’s removal of the DoubleClick ID represents a disruption of the status quo for log-based analytics solutions.  

Fortunately, DCM logs are not the only source of log-level data, or even the best.  Brands and agencies increasingly use tracking pixels from measurement vendors that have access to deterministic IDs as a replacement for ad server logs and to support more advanced analysis. Skip to the end if you are just looking for a list of recommendations.

How important are logs in digital advertising?

What Happened

Google’s announcement last Friday that DoubleClick is removing the Doubleclick ID from its logs resulted in panic in many corners of the digital advertising world.  What is the DoubleClick ID? For that matter, what are logs and why do people use them? Confused as to what the big deal is?

Here are the answers:

Beginning on May 25, DCM will stop populating the hashed UserID field (which stores the DoubleClick cookie ID and mobile device IDs) in DoubleClick Campaign Manager and DoubleClick Bid Manager (DBM) logs for impressions, clicks and website activities associated with users in the European Union. DoubleClick intends to apply these changes globally, and will announce timing for non-EU countries later this year.

What this Means for Advertisers

DoubleClick, like most adtech platforms, provides reporting dashboards to monitor performance KPIs.  While dashboards provide a good summary on performance, they can’t answer more granular questions that marketers want of their data.  That’s why many marketers ingest logs from their ad servers and DSPs. These logs are broken out into impression logs, click logs and site activity logs.

In order to perform custom analytics with these logs, the logs need to share a common identifier, so that the marketer can tie together recorded impressions from multiple sources (DCM, DSP, etc.) that belong to the same person, as well as clicks and site actions from that person.  

That common identifier is generally the cookie ID or, in the case of mobile app ads, mobile device ID.  DoubleClick currently has a field in all of their logs called UserID that stores a hashed version of the DoubleClick cookie ID or the mobile device ID tied to an impression, a click or a site action.

By removing this field from their logs, DoubleClick is effectively ending their support for ad server logs that are used for analytics, verification, measurement, or attribution modeling. Without the UserID field, marketers can no longer tie together impressions, clicks and site actions. For example, if you were previously filtering suspicious traffic based on frequency of engagement, you will no longer be able to do so (because each row becomes unique without a deduplicating identifier).
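A minimal sketch shows why the shared identifier matters, using hypothetical log rows rather than DCM's actual schema:

```python
# Joining impression and click logs on a shared (hashed) user ID to compute
# per-person frequency, reach, and click attribution.
from collections import Counter

impressions = [
    {"user_id": "a1f3", "placement": "homepage"},
    {"user_id": "a1f3", "placement": "article"},
    {"user_id": "9c2e", "placement": "homepage"},
]
clicks = [{"user_id": "a1f3", "placement": "article"}]

frequency = Counter(row["user_id"] for row in impressions)
clickers = {row["user_id"] for row in clicks}

print(frequency["a1f3"])   # 2: this person saw two impressions
print(len(frequency))      # 2: reach is two unique people
print("a1f3" in clickers)  # True: this person's clicks tie back to exposure

# Strip user_id from every row and none of the above is computable:
# each log line becomes an anonymous, un-joinable event.
```

This join on `user_id` is exactly what the removal of the hashed UserID field takes away.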

The alternative proposed by Google is for marketers to pay to use the dashboard found in the Google Ads Data Hub.  The big issue with this approach is that the marketer has to trust Google to grade their own homework, making the marketing standard “trust, but verify” approach all but impossible.

As a result, brands and agencies using DoubleClick logs will no longer be able to independently:

  • Verify frequency by cookie or person
  • Count total ad exposure by person
  • Analyze true reach of media placements and campaigns
  • Compare reach and duplication by media placement and campaign
  • Attribute or de-duplicate conversions and clicks
  • Report on user conversion rates
  • Identify unique site traffic

What’s the Back Story

This announcement is part of two trends in the market – GDPR as a pretext for raising the walls of walled gardens, and the shift from logs to trackers to collect data for custom analytics.  

First, Google is saying that the upcoming EU law, GDPR, is forcing them to do this, something many pundits have questioned. Walled gardens are continuing to grow taller, and increasingly are leveraging privacy concerns as the pretext for doing so. Media sellers are also now further pushing their own measurement and attribution solutions in a bid to grade their own homework and prevent cross-platform comparison.  

Google has built a more full-featured measurement and attribution product that is currently in pilot with selected large brands known as Google Attribution 360, part of Google Ads Data Hub.  The announcement to remove the DoubleClick ID from logs is connected strategically to the broader release of Attribution 360 later this year. In fact, Google Ads Data Hub was even plugged in the email to agencies informing them of this change.

Second, this announcement is a reaction to the trend of measurement and attribution vendors disrupting the importance of ad server logs, making Google’s decision seemingly reasonable.

Marketers are increasingly relying on vendors to improve their accuracy through features that are not a part of the traditional ad server log. Specifically, savvy marketers want (a) cross-device graphs and (b) the ability to perform causal attribution modeling. Neither of these goals are unlocked by DCM logs today, leading to the emergence of an ecosystem of measurement platforms, each with their own trackers tied to a cross-device graph for data collection. Of course, one such vendor is Google, whose Attribution 360 offering has both of these advanced features.

As such, DoubleClick’s announcement simply represents a formal passing of the torch in responsibilities from the ad server to the measurement provider for those marketers who have already reduced their dependence on DCM logs.

Recommendations

Brands and agencies need to identify vendors who can provide tracking and measurement capabilities (full disclosure – Thunder Experience Cloud is one such vendor). This change needs to occur before current dashboards built off of DCM logs become disrupted.  

If you are evaluating vendors to address this change, we recommend the following as requirements:

  • Ability to source data from impression trackers rather than logs
  • Visibility across all ad exchanges (several vendors are classified as DMPs by Google and thus blocked from tracking impressions on AdX)
  • Can provide the following categories of metrics:
    • Frequency by person and total ad exposure by person
    • True reach and overlap of media placements and campaigns
    • Attribution using any configurable attribution model, both position-based and algorithmic
  • Media agnostic (be wary of solutions that grade their own homework)
  • Independent of any arbitrage of audience data segments that are evaluated by their measurement product

In addition, some “nice to haves” include:

  • Backed by a deterministic people-based graph
  • Can provide reliable logs with interoperable customer ID to other identified vendors within the brand’s adtech stack if requested

Continue Reading

Digiday eBook: The ABC’s of People-Based Testing

Ad testing is meant to solve a very specific problem: Marketers are tired of launching their ads into a void, crossing their fingers and hoping for a boost in conversions. But, as Digiday reports in a new eBook, a number of widely used ad testing techniques dodge the question by failing to keep track of the individual on the other side of the screen.

As a result, people-based testing techniques are slowly but surely catching on, making it far easier for industry pros to identify real effectiveness and impact to put more media budget behind.  To learn more, check out Digiday’s Did Your Ad Work: The ABC’s of People-Based Testing.

Continue Reading

Thunder Unveils Experience Cloud, Appoints BJ Fox as VP of Engineering


San Francisco, CA – (Feb 21, 2018) – Thunder, the original and leading Creative Management Platform (CMP), today announces it has expanded its offering to three enterprise products. In addition to its Creative Management Platform, Thunder now also offers Dynamic Creative Optimization and Experience Measurement. In addition, the company has added industry veteran BJ Fox to manage and scale product development. Fox brings over 20 years of experience across Internet-based software companies, ranging from startups to large enterprise companies.

The Thunder Experience Cloud comprises:

  • Creative Management Platform (CMP)
    • Thunder’s CMP enables users to produce brand experiences across channels and traffic them to the appropriate media platforms.
  • Dynamic Creative Optimization (DCO)
    • Thunder’s DCO personalizes and optimizes brand experiences to increase conversions and media efficiency.
  • Experience Measurement
    • Thunder’s Experience Measurement tracks a brand’s customer lifetime experience, decreasing media waste and improving understanding of how ads work through creative, media, and exposure data.

“We are excited to bring BJ on board as we roll out the Thunder Experience Cloud,” said Victor Wong, CEO of Thunder. “His experience in building and managing large engineering teams will be a major asset as we take our platform to the next level of scale and innovation.”

In his new role, Fox will be responsible for overseeing and growing the product development team for Thunder, which expects to grow headcount by 25% in the coming months alone.

Previously, Fox was vice president of engineering at Getjar, a late-stage venture-backed mobile ad network that sold to Sungy Mobile. He then worked at Glympse, the popular consumer app for location sharing, where he built out the company’s enterprise offerings for Fortune 100 customers and partners. Before that, Fox was director of software development at Microsoft, where he held key roles in Windows Azure, Research, and Xbox, managing a team across three continents while driving the initial launch of Windows Azure. He also served as chief architect of Signio, an angel-funded payment startup that was purchased by VeriSign in 2000 for $820M. Most recently, he was vice president of engineering at Xevo, a connected-car platform for the auto industry.

Fox said, “Thunder has long been an industry leader; the first Creative Management Platform, it is now paving the way for true people-based marketing with Experience Measurement. I have tremendous respect for Victor and am thrilled to join his team of boundary-pushing thinkers and innovators.”

About Thunder:

Named one of Forbes’ 100 Most Promising Companies in America, Thunder powers ad creative personalization, decisioning and analytics for advertisers, agencies, and publishers across the globe.

Thunder Experience Cloud enables brands to personalize, optimize, and connect ad experiences cross-channel for people-based marketing. Leading brands such as Anheuser-Busch and McCormick rely on Thunder for its creative management platform, dynamic creative optimization and experience measurement to increase conversions and decrease media waste.

###

Press Contact:
Cassady Nordeen
Blast PR on behalf of Thunder
Cassady@blastpr.com

Continue Reading

Panel discussion at OMMA Programmatic – Can Robots Fix Programmatic Creative?

Thunder CEO Victor Wong had the pleasure of sitting in on a panel about programmatic creative at this year’s OMMA Programmatic event.

“Less than one-third of online users today feel that internet advertising is relevant to them.

“Through programmatic creative and using data, if we can bring that up to 50%, that’s going to lead to hugely impactful outcomes for our clients and hopefully cut down on ad blocking and improve the way people perceive advertising in the future,” said Andrew Sandoval, Director, Biddable Media at The Media Kitchen in an opening statement.

Watch the video recording:

Below is a summary version of the main takeaways from the talk.

Continue Reading