Introduction to Differential Privacy

Privacy has become an increasingly hot topic in ad tech. From GDPR to ITP 2.0, marketers are more conscious than ever of the importance of privacy, which they now have to actively balance against the need for transparency and accountability. Recently, industry leaders have started talking about differential privacy, and how this technology could be the solution that balances privacy with transparency. Digiday provides a good introduction here.

Before diving into differential privacy, it’s helpful to keep in mind how marketers actually consume data. It may seem counterintuitive, but a savvy data-driven marketer doesn’t actually care about any specific individual in their campaigns. Rather, the marketer is optimizing for the behavior (and results) of the entire group or segment being targeted. (If you are a marketer, ask yourself: in your last analysis, did you care that User #123 converted, or did you care how many users in your target population spent money?) This insight suggests that a system that hides the behavior of any given individual but still reports accurate aggregate behavior can strike the balance between user privacy and transparency. Does such a solution exist? It can, with differential privacy.

Differential privacy is a set of statistical techniques that introduce noise into a data set in order to protect user anonymity without changing the overall conclusions. Does it sound too good to be true?

Here’s an oversimplified example of differential privacy principles at work. If you wanted to ask a group of people a sensitive question such as “Have you cheated on your spouse?” you would likely get few people to tell you the true answer. However, imagine that before people answered, they were told to privately flip a coin. If the coin lands heads, they tell the truth – yes or no. If the coin lands tails, they flip the coin again privately. If it is heads, they say “yes” no matter what the truth is. If it is tails, they say “no,” again regardless of the truth.

As a result of this basic obfuscation, any outsider who looks at the data won’t know whether an individual participant’s recorded answer is the truth, because it could easily have been an arbitrary answer. That said, thanks to the random coin flips there is a known statistical distribution of truthful answers (50% of answers) versus arbitrary answers (25% forced “no,” 25% forced “yes”). For a large enough sample, you can then recover the true rate of spousal cheating without risking any individual’s privacy!
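The coin-flip protocol above can be sketched in a few lines of Python. This is an illustrative simulation only, not a production differential-privacy library; the 30% “true” rate and the population size are made-up inputs for demonstration. Because the observed “yes” rate equals 0.5 × (true rate) + 0.25, the true rate can be recovered by inverting that formula:

```python
import random

def randomized_response(truth):
    """One respondent's answer under the coin-flip protocol."""
    if random.random() < 0.5:        # first flip heads: answer truthfully
        return truth
    return random.random() < 0.5     # first flip tails: second flip decides

def estimate_true_rate(answers):
    """Recover the population rate from the noisy answers.

    Observed "yes" rate = 0.5 * true_rate + 0.25, so invert that relation.
    """
    observed_yes = sum(answers) / len(answers)
    return 2 * (observed_yes - 0.25)

# Simulate 100,000 respondents, 30% of whom would truthfully answer "yes"
# (both numbers are invented for this example).
random.seed(42)
population = [random.random() < 0.30 for _ in range(100_000)]
answers = [randomized_response(t) for t in population]
print(round(estimate_true_rate(answers), 3))  # close to the true 0.30
```

No individual answer in `answers` reveals anything certain about that respondent, yet the aggregate estimate lands very near the true rate.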

This example, of course, oversimplifies the actual mechanics of differential privacy. In reality, more complex techniques can be applied to each data set to provide more robust data security and greater transparency. But that discussion is better left for the Data Privacy 201 course…

Continue Reading

Truth in Measurement: Evolution of Digital Measurement

Brian Andersen of Luma Partners recently spoke at the Truth in Measurement summit, where leading brands and publishers gathered to discuss adopting a common approach for measurement that balances transparency, privacy, and consumer data protection. His presentation on the evolution of digital measurement touches upon the historical and current ways of measurement as background for understanding how things came to be, and what marketers want today. The full presentation is included below, but here are some highlights from his talk:

The Highlights:

  • Measurement started out focused purely on desktop website traffic, with metrics such as page views, click path, exit rates, etc.
  • The industry became increasingly complicated with the rise of mobile, programmatic, and walled gardens
  • Mobile became particularly complicated because 90%+ of time spent was in apps rather than on the mobile web. This led to the need for specific mobile analytics and measurement companies
  • The emergence of programmatic advertising led to more complicated processes, which created opportunities for bad actors to exploit
  • At the same time, walled gardens have become more ubiquitous. Unfortunately, each takes a slightly different approach towards measurement
  • People-based measurement has emerged as the solution embraced by marketers, with a focus on real results (such as revenue) rather than proxy metrics (such as impressions, cookies)
  • The biggest challenge facing marketers today is applying these principles across platforms to get log-level data, which is exactly what Truth in Measurement is trying to tackle

Continue Reading

Hershey’s Talks “Truth in Measurement” and Driving Transparency

 

Vincent Rinaldi, head of addressable media at Hershey’s, talked to Digiday recently about the famous candy maker’s efforts to improve measurement and ad transparency. Hershey’s is part of a growing number of brands participating in the “Truth in Measurement” industry group founded by Thunder to promote discussion and standardization around measurement data sharing.

Here’s an excerpt from Digiday’s February 5th interview:

Hershey’s is one of the founding advertisers behind the ‘Truth in Measurement’ initiative led by ad tech firm Thunder Experience Cloud, a model that will allow advertisers, media owners and ad tech vendors to support a transparent measurement of ad data across platforms.

While nothing concrete has been drawn up yet — a summit will be held in the U.S. later this year between the initiative’s backers to find one — Rinaldi said it would focus on measurement like verification and attribution data, rather than targeting data. The approach is similar to the cross-platform measurement model Unilever is testing in several markets now. It’s more likely that online media owners will back initiatives like ‘Truth in Measurement’ and Unilever’s alternative when they don’t insist on them having to part with the valuable targeting performance data that serves as the backbone of their commercial strategies.

“We want to create community gardens not walled ones,” said Victor Wong, CEO at Thunder Experience Cloud. “The initiative we’re working on will be a technical standard that any advertiser or platform can follow to share data. First, we have to figure out what data at a minimum to get ad transparency into what they buy and then we need to work out what the media owners are willing to share so they’re not compromising the privacy of their users or their own business models.”

If you’re interested in joining the discussion and setting the new standard, please join the Truth in Measurement advisory board.

 

Continue Reading

People-Based Measurement from TV to Digital

As more and more people consume media across multiple screens, device-based measurement has become increasingly inaccurate and incomplete. Thunder and TiVo recently partnered up to discuss some of the challenges with people-based measurement from TV to digital, and ways marketers are tackling this tricky problem.

Challenges with Measuring:

Marketers are scrutinizing their approach towards measurement to make sure they are truly understanding what goes into their media spend, and how this spend translates into results. Some of the areas they’re focusing on include:

  • Quantifying sales and brand impact
  • Increasing marketing ROI
  • Measuring omnichannel campaigns
  • Integrating in-store transactions with digital media data
  • Linking cross device data
  • Improving media attribution and optimizing media mix

Applying traditional cookie- or device-based measurement approaches to these areas leads to imprecise, incomplete, and sometimes incorrect insights. As such, marketers have embraced People-Based Measurement as the new way forward.

What is People-Based Measurement?

People-based measurement refers to the use of persistent identifiers to capture user behavior across channels and devices. This approach provides a more holistic view of user behavior compared with traditional cookie- or device-specific measurement. Here’s a simple video that explains the differences in approach.

There are two common ways people-based measurement is done: panel-based measurement and direct measurement.

Panel-based measurement refers to the use of certain technologies that monitor how certain subgroups of individuals behave, and uses those observations to draw general conclusions about the population. The advantage of this method is that you can draw conclusions with far fewer data points. The downside, of course, is that your extrapolations are prone to sample bias and may inadvertently distort reality.

Direct measurement, in contrast, provides far higher accuracy. Unfortunately, this approach requires collecting more data via a persistent identity, which is a technological challenge for marketers who do not have access to this type of technology.

Both approaches to people-based measurement provide far greater accuracy relative to cookie- and device-based approaches. In a cookie-based world, IDs are temporary rather than persistent, and impressions are subject to fraud, deletion, and blocking. In the device-based approach, each device may offer a persistent, unique identity, but one individual may have multiple devices.

Omnichannel with People-based Measurement

People-based measurement connects ad exposures across all environments – from open web to walled gardens, including linear and OTT video. Ad exposures can be connected to a persistent identifier, which can then be tracked against both online and offline conversions. For example, TiVo’s data set includes exposures from three million active households across 210 DMAs. Using TiVo and Thunder’s people-based measurement, marketers can combine data from television with data from the open web and walled gardens to get a true view of the customer journey.

What is the Impact of Measuring by Person?

When marketers evolve from device- or cookie-based measurement to persistent people-based measurement, they typically notice some startling changes in their reach and frequency. These observations include:

  • A decrease in reach (which happens when you connect multiple devices and cookies to a single person)
  • An increase in frequency (which happens because your audience may see the ads on multiple screens)
  • An increase in conversions that are ad attributed (which happens via events that are not tracked by cookies or devices such as offline purchases).
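To see why reach falls and frequency rises, here is a toy Python sketch using a hypothetical exposure log and identity graph. All the IDs and the device-to-person mapping are invented for illustration; real identity resolution is far more involved:

```python
# Hypothetical exposure log: each entry is the device ID that saw one impression.
impressions = ["cookie_A", "cookie_B", "phone_1", "cookie_A",
               "tablet_1", "cookie_C", "phone_2", "cookie_B"]

# Hypothetical identity graph resolving each device ID to a person ID.
identity_graph = {
    "cookie_A": "person_1", "phone_1": "person_1", "tablet_1": "person_1",
    "cookie_B": "person_2", "phone_2": "person_2",
    "cookie_C": "person_3",
}

def reach_and_frequency(ids):
    """Reach = unique IDs exposed; frequency = average exposures per ID."""
    ids = list(ids)
    reach = len(set(ids))
    frequency = len(ids) / reach
    return reach, frequency

device_reach, device_freq = reach_and_frequency(impressions)
person_reach, person_freq = reach_and_frequency(
    identity_graph[d] for d in impressions)

# Reach drops from 6 devices to 3 people; frequency rises from ~1.3 to ~2.7.
print(device_reach, round(device_freq, 2))
print(person_reach, round(person_freq, 2))
```

The same eight impressions look like six lightly exposed “users” in the device view, but only three more heavily exposed people once devices are resolved to persons.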

Looking to make the transition to People-based measurement?

There are two ways for marketers to embrace people-based measurement.

The quick and easy approach is a Wrap & Measure test, which uses a people-based ad server to track ad exposures, person counts, and digital conversions for a particular campaign, and then reports on that campaign to show the people-based difference.

The more comprehensive approach is the “Always-On” People-based ad serving, which uses a people-based ad server to track and personalize your message across all campaigns by person. Marketers using a people-based ad server switch from “per campaign” or “per channel” mindsets to “customer-centric” mindsets that focus more on customer journeys and lifetime customer value. Relative to the quick and easy approach, this more comprehensive approach can fill your data lake in real-time with people-based data.

To get the full scoop of the webinar, see the video below:

Continue Reading

Thunder Announces Industry Group to Improve Advertising Measurement Standards for Brands, Publishers and Consumers

(GLOBE NEWSWIRE) — Thunder Experience Cloud, the leader in people-based ad serving, today announced Truth in Measurement (TIM), a new industry group that aims to develop a technical standard for the sharing and measurement of advertising data in a transparent way that protects both consumer privacy and publisher data. The initiative is led by top executives from some of the world’s biggest brands and ad platforms.

One of the major industry issues TIM aims to improve is ad transparency in the age of consumer privacy concerns and data leakage. Historically, these concerns have made it difficult for advertisers to measure their ads and verify the results. To tackle these challenges, TIM plans to develop a framework for providing ad exposure measurement tied to anonymized IDs for measurement only.

“Publishers can simultaneously protect consumer privacy and provide transparency to advertisers with TIM’s framework of sharing ad exposure data tied to a privacy-conscious ID,” said Victor Wong, CEO of Thunder. “Rather than be a walled garden, you can be a ‘community garden’ to share data while protecting your business model and consumer privacy.”

“TIM’s measurement solution protects data and is transparent for advertisers, publishers and consumers,” said Alysia Borsa, chief data officer at Meredith. “The ability to accurately map measurement is crucial for every industry player and TIM is leading the way in new verification, analytics and attribution.”

At the same time, TIM will focus efforts on enabling people-based measurement. While the industry has been focused on developing channel-specific measurement for several years, there has been a lack of attention around people-based measurement, i.e. tracking ad exposure and conversion by person across channels. Without this type of de-duplicated measurement, it is impossible to know the true ROI of any campaign.

“The need for higher data protection standards and rise of omni-channel advertising campaigns has made it increasingly difficult for advertisers to track, measure and verify the results of various consumer interactions across multiple platforms,” said Vincent Rinaldi, head of addressable media at Hershey’s. “TIM provides a first step to solving a problem numerous stakeholders face – promoting transparency while simultaneously respecting the consumer’s privacy.”

“Some major platforms have begun redacting data such as user IDs they previously provided, and this is something that can no longer occur in the industry,” said Trace Rutland, director of media innovation at Tyson Foods. “We are excited to be partnering with industry leaders on this initiative to encourage advertisers to holistically and accurately measure impact with TIM’s framework.”

Brands, publishers and consumers can rely on TIM to establish a framework that will solve for their different needs.

To learn more about TIM’s efforts and upcoming events, please visit: http://truthinmeasurement.com

Continue Reading

People-based Measurement From TV to Digital

As people have begun using multiple screens, device-based measurement has become more inaccurate and incomplete. Thunder and TiVo are co-hosting a webinar to share insights on how brands are building people-based measurement stacks from the ground up to measure everything from TV to digital.

RSVP here to learn how new research from TiVo and Thunder, a people-based ad server, is uncovering the impact of adding TV and people-based measurement to measurement methodologies and technology stacks.

Continue Reading

People-based versus Cookie-based Measurement Comparison

Marketers for many years relied on measuring their audience reach, frequency, and conversions using the ubiquitous “cookie,” a temporary identifier dropped onto the user’s browser and kept until deleted (typically every 30 days or so). This identifier made it possible to track people who saw an ad (reach) and who showed up on the advertiser’s website to take some action (conversions).

However, with the advent of mobile devices like smartphones and tablets, the cookie started to crumble as a reliable way to have a complete view of a consumer’s digital ad experience. In fact, with the introduction of mobile in-app experiences and mobile device IDs, the consumer digital identity fragmented further, making it hard to accurately measure and therefore personalize and optimize ads.

Now, with people-based marketing, brands are unifying their view of a consumer by connecting multiple cookies and device IDs. Deterministic identity graph providers such as Thunder and LiveRamp are able to use authenticated logins across multiple devices to build a complete, accurate picture of who the consumer is across devices. Most of ad tech hasn’t yet been updated for the people-based world and so still runs on just cookies or single device IDs for measurement, causing misleading results.

Thunder teamed up with LiveRamp to produce the first open, people-based ad server, and as a result of running billions of impressions a month, Thunder has been able to study the difference between traditional cookie-based measurement and new people-based measurement.

The fact that there is a difference isn’t surprising, but its size was quite shocking: cookie-only measurement has been shown to be only ~50% accurate in the new multi-device world!

Get the full results in our new whitepaper, “Ad Counting Comparison Study.”

Continue Reading

A/B Split Testing Sample Size Calculator

Everyone wants to optimize their advertising, and that means figuring out what works and what doesn’t. To do that, you need not only to identify a winner and a loser but also to have confidence that you are right.

Unfortunately, it’s not as simple as comparing the current performance of one ad with another – that’s like saying a basketball team is the best just because it is ahead in the 1st half of a game. You need enough time and data (enough matchups in basketball) to determine a true champion.

In statistics, confidence comes from having “statistical significance,” which essentially means knowing the results are likely not just chance but would likely be repeated if the matchup happened again. To achieve statistical significance, you need a minimum sample size of data, based on a target lift in performance that would make one ad a meaningful winner over another.
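The minimum sample size for a two-ad test can be sketched with the standard two-proportion power calculation. This is the generic statistical formula, not Thunder’s actual calculator, and the 2% baseline conversion rate and 10% relative lift below are assumed inputs for illustration:

```python
from statistics import NormalDist

def min_sample_size(baseline_rate, lift, alpha=0.05, power=0.8):
    """Minimum impressions (or people) per variant for a two-proportion test.

    baseline_rate: control conversion rate, e.g. 0.02 for 2%
    lift: relative improvement to detect, e.g. 0.10 for +10%
    alpha: false-positive rate (two-sided); power: chance of detecting the lift
    """
    p1 = baseline_rate
    p2 = baseline_rate * (1 + lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # e.g. 1.96 for alpha=0.05
    z_beta = NormalDist().inv_cdf(power)           # e.g. 0.84 for power=0.8
    pooled = (p1 + p2) / 2
    n = ((z_alpha * (2 * pooled * (1 - pooled)) ** 0.5
          + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
         / (p2 - p1) ** 2)
    return int(n) + 1

# Detecting a 10% lift on a 2% conversion rate needs tens of thousands of
# impressions per ad variant:
print(min_sample_size(0.02, 0.10))
```

Note how the required sample grows as the lift you want to detect shrinks: small differences between ads take far more data to confirm.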

Thunder has built an easy-to-use sample size calculator that lets you input the basic variables of your creative experiment to determine the minimum sample size necessary, whether you’re thinking about how many impressions or how many people need to be in the test to get a meaningful result. Thunder Experience Cloud uses minimum sample sizes to ensure it has enough data for each creative version it is testing before its Dynamic Creative Optimization solution serves the winning creative to all consumers, thereby achieving maximum media impact and efficiency with the highest confidence.

If a vendor tells you there isn’t a minimum, or that they can test thousands of ad versions, ask them whether they have a sample size calculator or how they will achieve statistical significance with their results. Otherwise, it is highly suspect that they are really testing and optimizing your advertising.

Try out our free testing sample size calculator for A/B and multivariate testing.

Inputs for Thunder sample size calculator

 

Continue Reading

New California Privacy Law Compared to GDPR – Summary

GDPR v. California Privacy Laws

Digital marketers just rushed to meet GDPR compliance in May 2018 for digital marketing in Europe. Now they need to rush to meet a new California privacy law that will go into effect in January 2020. Compared to GDPR, the California Consumer Privacy Act (also known as CaCPA or CCPA) balances commercial and consumer interests much more, enabling digital marketers to continue data-driven marketing while giving consumers more protections and options.

Similarities:

Both CaCPA and GDPR

  1. apply to businesses that are not located within their borders
  2. assign responsibility for enforcement to a governmental authority
  3. do not permit discrimination against individuals who exercise their legal rights
  4. provide individuals with certain rights with respect to personal data, including the right to access and delete their personal data
  5. address some similar concerns (e.g., the importance of access and transparency)
  6. will require businesses to expend time and money to achieve compliance

Key Distinctions:

  1. GDPR comprehensively addresses many privacy concerns (e.g., disclosures businesses must make to data subjects, process for data breach notification to individuals and regulators, implementation of data security, cross-border data transfers, etc.) while CaCPA is focused on consumer privacy rights and disclosures.
  2. GDPR provides comprehensive private rights of action while CaCPA does not create a private right of action except for data breaches (and with many requirements).
  3. GDPR provides a more comprehensive set of rights to consumers, including the right to data correction and the right to data portability, which CaCPA does not have (unless the business decides to respond to a request for portability by providing the data electronically, in which case it must do so it in a readily useable format that can be transmitted to another entity only to the extent it is technically feasible).
  4. GDPR includes considerably more comprehensive requirements on businesses, including privacy by design and default, foreign company registration requirements, data protection impact assessments, 72-hour breach notification, data protection officer requirement, and restrictions on cross border transfers.
  5. GDPR requires data controllers to sign formal, written agreements with processors that meet stated requirements for a processor’s handling of personal data. CaCPA requires a written agreement with a third party only in very limited circumstances.
  6. GDPR requires businesses to assume and contract for appropriate technical and organizational security precautions. CaCPA does not mention security other than to provide a cause of action for lawsuits on behalf of consumers for the unauthorized access, exfiltration, theft, or disclosure of personal information that is not encrypted or redacted that results from the failure to implement and maintain reasonable security procedures and practices.
  7. GDPR requires that businesses have a legal justification before they collect, process, or transfer personal information, with a consumer’s informed and unambiguous consent as one means of achieving that legal justification. CaCPA, on the other hand, does not require businesses to have such legal justification and uses an opt-out approach.

Detailed Comparison

If you’re worried about your compliance with both laws, you should read Part II of GDPR vs California Consumer Protection Act that covers in more detail the nuanced differences and why compliance with one law doesn’t ensure compliance with both.

Thunder’s Role

Thunder Experience Cloud enables the advertising ecosystem to balance consumer interests in privacy with commercial interests in data-driven advertising. Thunder helps ad platforms prevent data leakage, consumers gain privacy, and advertisers obtain transparency through its anonymized people-based measurement solution. Ask us how to protect consumer data while supporting data-driven advertising if you’re interested to learn more.

Continue Reading

GDPR vs California Consumer Privacy Act (CaCPA) Detailed Comparison

GDPR v. California Privacy Laws

If you’re a large digital marketer, ad platform, or agency that reaches any consumer in the EU or California, you will soon need to comply with both GDPR, which went into effect in May 2018, and the new California Consumer Privacy Act (also known as CCPA or CaCPA), which will go into effect in January 2020. While GDPR is generally seen as more stringent than CaCPA, there are still some nuanced differences, and compliance with one doesn’t mean compliance with the other.

In Part I of this series, Thunder summarized the key differences and similarities between the two sets of laws.

In this Part II of the series, Thunder provides a detailed breakdown for digital marketers, agencies, and ad platforms comparing GDPR and the California Consumer Privacy Act (CCPA or CaCPA for short) to make sure they are compliant with both:

Jurisdiction

GDPR: Applies to data collection of persons in the EU (whether the company is based there or not)

CaCPA: Applies to data collection of California residents (whether the company is based there or not)

Personal Data

GDPR: Any information relating to an identified or identifiable natural person.

CaCPA: Any data that “identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.” A “consumer” is a California resident as defined by the tax code. The “personal data” definition is developed through examples, exclusions, and cross-references to other laws. Data subject to HIPAA is exempted from CaCPA, but data subject to FCRA and GLBA is excluded only to the extent those statutes conflict with the CaCPA.

Data Subject

GDPR: An identified or identifiable natural person. An identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.

CaCPA: A California resident as defined under California tax law.

Data Controller

GDPR: The natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data; where the purposes and means of such processing are determined by Union or member state law, the controller or the specific criteria for its nomination may be provided for by Union or member state law.

CaCPA: For-profit controllers that meet ONE of the following thresholds: (1) Annual gross revenue over $25M; (2) Buys/sells or receives/shares for “commercial purposes” the data of 50,000 California residents; or (3) Derives 50% of revenue from “selling” personal data of California residents. If a controller qualifies under the thresholds, parent companies and subsidiaries in the same corporate group operating under the same brand also qualify.

Processor

GDPR: A natural or legal person, public authority, agency or other body that processes personal data on behalf of a controller. The GDPR also defines a “third party” as a natural or legal person, public authority, agency or body other than the data subject, controller, processor, and persons who, under the direct authority of the controller or processor, is authorized to process personal data.

CaCPA: A “service provider” is a for profit entity that acts as a processor to a “business” and that receives the data for “business purposes” under a written contract containing certain provisions. The CaCPA uses the term “third party” to refer to entities that are neither business nor service providers.

Sensitive Data

GDPR: Per Article 9: Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation is prohibited.

CaCPA: Sensitive data is not addressed.

Transfers of Personal Data

GDPR: Any transfer of personal data that are undergoing processing or are intended for processing after transfer to a third country or to an international organization shall take place only if the controller and processor comply with the conditions set forth in Articles 44-50. Transfers on the basis of an adequacy decision and methods such as Binding Corporate Rules, Contract Clauses, etc. or in the case of EU-US transfer, the Privacy Shield.

CaCPA: Cross-border data transfers are not restricted. All transfers to “service providers” require a written agreement containing certain provisions (this is the CaCPA equivalent of Article 28 of the GDPR).

Data Portability

GDPR: Per Article 20, the data subject has the right to receive the personal data concerning him or her, which he or she has provided to a controller, in a structured, commonly used, and machine-readable format and have the right to transmit that data to another controller without hindrance from the controller to which the personal data has been provided.

CaCPA: There is a limited recognition of this right under the CaCPA. Cal. Civ. Code Section 1798.100 provides that data subjects who exercise their right to access must receive the data “by mail or electronically, and if provided electronically, the information shall be in a portable and, to the extent technically feasible, in a readily useable format that allows the consumer to transmit this information to another entity without hindrance.” There is a related and somewhat contradictory provision on this under Cal. Civ. Code Sec. 1798.130(a)(2).

Consent

GDPR: Opt-in approach requiring informed, freely given, and unambiguous consent

CaCPA: Opt-out approach (for data being sold to 3rd-parties) that doesn’t require consent for adults; however users can ask that their data be deleted

Penalties

GDPR: Under Article 83:

  • Up to 10,000,000 EUR, or in the case of an undertaking, up to 2 percent of the total worldwide annual turnover of the preceding financial year, whichever is higher, for infringements of obligations of controllers and processors, the certification body, and the monitoring body.
  • Up to 20,000,000 EUR, or in the case of an undertaking, up to 4 percent of the total worldwide annual turnover of the preceding financial year, whichever is higher, for infringements of obligations such as the principles of processing, conditions for consent, data subjects’ rights, transfers beyond the EU, etc.
  • Under Article 84, each member state can lay down rules on other penalties applicable to infringements of the GDPR, in particular infringements that are not subject to Article 83, and can take all measures necessary to ensure they are implemented.

CaCPA: No private right of action for most provisions, with the AG of California taking the role of DPA and able to impose civil penalties of up to $7,500 for each violation, with no maximum cap. Violators may avoid prosecution by curing alleged violations within 30 days of notification. For certain data breaches there is a private right of action, with statutory damages set between $100 and $750 per data subject per incident, a requirement to notify the AG before filing a lawsuit, and a requirement to refrain from pursuing the action if the AG’s office prosecutes within six months of the notification.

Thunder’s Role

Thunder Experience Cloud enables the advertising ecosystem to balance consumer interests in privacy with commercial interests in data-driven advertising. Thunder helps ad platforms prevent data leakage, consumers protect privacy, and advertisers obtain transparency through its anonymized people-based measurement solution. Ask us how to protect consumer data while supporting data-driven advertising if you’re interested to learn more.

Continue Reading

LiveRamp and Thunder Experience Cloud Announce Partnership to Enable Omnichannel, People-Based Measurement and Personalization


San Francisco, CA – (Sept 25, 2018) – Thunder Experience Cloud and LiveRamp®, an Acxiom® company (NASDAQ: ACXM) and leading provider of omnichannel identity resolution, today announced a partnership to enable people-based marketing in three key areas: targeting, measurement, and personalization.

The partnership provides marketers with a more holistic view of their customers by giving them the ability to track ad exposure and conversion across devices directly to their own person IDs, rather than relying on less accurate identifiers such as Cookie IDs or third party measurement providers.

LiveRamp customers use its identity graph for CRM targeting across the open web and walled gardens. Now, with the addition of Thunder’s people-based dynamic ad server, marketers can run campaigns from start to finish on LiveRamp IDs without pause. Ads can be dynamically personalized and measured in real-time using LiveRamp’s identity graph.

“This partnership is truly changing the standards of measurement and relevance in advertising,” said Paul Turner, GM of Technology at LiveRamp. “With Thunder Experience Cloud, marketers have a one stop shop for creating and measuring high-performing omnichannel campaigns based on the person, rather than the device or cookie, ensuring the right ad gets in front of the right person on any device, and bringing us closer than ever before to achieving true people-based marketing, while maintaining LiveRamp’s high standards of transparency and customer privacy.”

“Thunder is the only open, deterministic people-based ad serving and tracking solution today,” added Victor Wong, CEO of Thunder Experience Cloud. “By partnering with LiveRamp, one of the most trusted data platforms, we are giving marketers person-level measurement accuracy on their advertising while protecting the privacy of the consumer through state of the art encryption and anonymization.”

To learn more, read MarketingLand's coverage of the announcement.

Continue Reading

Webinar: Surviving the Doubleclick ID Loss

Alongside Adweek and Neustar, Thunder engaged in a webinar on the topic of the upcoming Doubleclick ID loss in 2019 and how to prepare for it if you’re a data-driven marketer. Learn what sort of advertiser needs to consider switching to an open ID and who is better off sticking with Google’s ID. Watch the full presentation and discussion below:

More on the Ads Data Hub series

  1. What is Google’s Ads Data Hub and is it right for me?
  2. How does Google’s Ads Data Hub Affect My Data Management Platform (DMP)?
  3. How does Google’s Ads Data Hub Affect My Analytics?

 

 

Continue Reading

Call for Advertising Industry to Protect Consumer Privacy, Provide Ad Transparency, and Secure Publisher Data

Thunder’s mission is to solve bad ads. To that end, Thunder joined the Coalition for Better Ads at the end of 2017. Now, Thunder is calling for the industry to go beyond just higher standards for creative. Thunder wants to put in place stronger protection for consumers and publishers while also providing greater transparency for advertisers.

Thunder recently had the honor of guest writing for the Association of National Advertisers (ANA) on what Cambridge Analytica taught the ad industry about what consumers expect and what publishers will need to do going forward. In this column, Thunder's CEO also touches on how advertisers can work with these groups to ensure a better Internet where only effective, non-intrusive advertising rules. Here's an excerpt:

Ultimately, everyone has to give a little something to get much more in return. Moving advertising to an anonymized ID tied to ad exposure will benefit the entire internet. Consumers will get better advertising and privacy, publishers will remove their liability and data leakage, and advertisers will gain transparency into their advertising.

 

 

Continue Reading

Neustar and Thunder join forces to deliver better customer experiences, powered by people-based intelligence

SAN FRANCISCO, Aug. 28, 2018 (GLOBE NEWSWIRE) — Thunder Experience Cloud, the leader in people-based ad serving, and Neustar Marketing Solutions (a division of Neustar, Inc.), the leading unified marketing intelligence platform for marketers, today announced the integration of Thunder’s people-based ad server with the Neustar Identity Data Management Platform (IDMP) and the Neustar MarketShare solution. The partnership will enable brands and agencies to quickly customize ad creatives to each customer, as well as measure performance for real-time optimization.

Thunder’s dynamic creative optimization (DCO) solution is a people-based, dynamic ad server that enables advertisers to factor in data signals such as CRM, weather, device type, time, media exposure, and now, audience data from large Data Management Platforms (DMP) like Neustar.

Customers of Neustar and Thunder will be able to target creative messaging for individual, real people and audience segments across digital channels such as display, video and mobile. By synchronizing people IDs on the open web, they can achieve a higher level of personalization, consistency and accuracy, eliminating irrelevant or redundant advertising.

In addition, Thunder’s people-based Experience Measurement solution tracks the performance of ads from exposure to viewer to conversion to allow for a high level of optimization. From there, joint customers can quickly and easily activate media by person tracked on the open web through the Neustar IDMP. This people-based data set will also be integrated within the Neustar IDMP and the Neustar MarketShare solution.

“Advertisers must be able to have a clear view of how their marketing performs across channels – which creatives and messages are being shown to whom, when and where. Neustar is dedicated to giving the industry access to independent and accurate media exposure data, ensuring brands and agencies have the tools they need for personalized, measurable experiences at scale,” said Steve Silvers, General Manager, IDMP, Neustar.

“There is no excuse for a bad ad,” added Victor Wong, CEO of Thunder. “This integration is another step toward ensuring every ad meets the highest standard of relevancy, frequency and impact, ultimately creating a better customer experience.”

About Thunder:
Thunder solves bad ads. Thunder Experience Cloud enables enterprises to produce, personalize, and track their ads cross-channel to achieve the right consistency, relevancy and frequency. Consumers maintain privacy, publishers safeguard data, and brands gain transparency through Thunder for a better ad experience for all.  To learn more visit: https://www.makethunder.com/

About Neustar Marketing Solutions
Neustar, Inc. helps companies grow and guard their business in a connected world. Neustar Marketing Solutions provides the world’s largest brands with the marketing intelligence needed to drive more profitable programs and to create truly connected customer experiences. Through a portfolio of solutions underpinned by the Neustar OneID® system of trusted identity and through a privacy by design approach, we enhance brands’ CRM and digital audiences, enable advanced segmentation and modeling, and provide measurement and analytics all tied to a persistent identity key. Neustar’s position as a neutral information services provider, and as a partner to Google, Facebook and Amazon, provides marketers access to the most comprehensive customer intelligence and marketing analytics in the industry. More information is available at www.marketing.neustar.

Continue Reading

How does Google’s Ads Data Hub Affect My Analytics? (Part III of the Ads Data Hub Series)

Note: We provided an overview of Ads Data Hub in Part 1, and covered how Ads Data Hub will impact DMPs in Part 2. This post covers data lakes and how analytics will be impacted in the Ads Data Hub world.

Many large brands today have set up “data lakes” where all their data gets stored and made available to other applications for processing and analysis. These data lakes combined with business intelligence tools such as Tableau have created powerful analytics environments where brands can answer questions such as:

  • Which customer segments respond most to my ads?
  • Which ads drive the most customer lifetime value?
  • Do people who see my ads spend more with me?
  • Am I spending more money to reach my customers than they are spending with me?
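As a rough illustration, the third question above could be answered from a data lake export with a few lines of analysis code. This is a minimal sketch under assumed data shapes; the field names, person IDs, and spend figures are all hypothetical.

```python
# Hypothetical sketch: "Do people who see my ads spend more with me?"
# Joins log-level ad exposure records to CRM purchase records by person ID.
# All record layouts and values below are illustrative assumptions.

exposures = [  # log-level ad exposure rows keyed to a person ID
    {"person_id": "p1"}, {"person_id": "p2"}, {"person_id": "p2"},
]
purchases = [  # CRM/transaction rows in the same ID space
    {"person_id": "p1", "spend": 120.0},
    {"person_id": "p2", "spend": 80.0},
    {"person_id": "p3", "spend": 40.0},
]

def avg_spend_by_exposure(exposures, purchases):
    """Return (avg spend of exposed buyers, avg spend of unexposed buyers)."""
    exposed = {row["person_id"] for row in exposures}
    spend = {}
    for row in purchases:
        spend[row["person_id"]] = spend.get(row["person_id"], 0.0) + row["spend"]
    def avg(ids):
        return sum(spend[i] for i in ids) / len(ids) if ids else 0.0
    buyers = set(spend)
    return avg(buyers & exposed), avg(buyers - exposed)

exposed_avg, unexposed_avg = avg_spend_by_exposure(exposures, purchases)
print(exposed_avg, unexposed_avg)  # 100.0 40.0
```

In practice this join runs in a BI tool or SQL warehouse over millions of rows, but the logic is the same: it only works if person-level exposure data can flow into the data lake in the first place.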

Brands have staffed up data analysts and data scientists to make sense of all this data and answer these important business questions to improve strategy and validate what partners are telling the brand.

Data lakes ultimately rely on data flowing into them. Google's recent changes keep Ads Data Hub data locked within Google Cloud, where it cannot be combined with data outside Google's controlled environment. As a result, marketing data lakes are under threat from these changes.

Data Lakes without Data

Consequently, brands with sensitive customer data are forced to decide whether to upload that data to Google and run a Google-controlled data lake, or keep it off Google Cloud, where they'll need to find other vendors to meet their needs for tracking, analysis, and modeling.

If you want to maintain control of your own data lake and prevent it from drying up, talk to Thunder about our Experience Measurement solution.

More on the Ads Data Hub series

Continue Reading

How does Google’s Ads Data Hub Affect My Data Management Platform (DMP)? (Part II of the Ads Data Hub series)

Note: We provided an overview of Ads Data Hub in Part 1. In this post, we look at how Ads Data Hub will impact DMPs in general.

Data management platforms (DMPs) power the marketer's ability to track, segment, and target audiences across programmatic media. Leading DMP solutions include Salesforce DMP (previously known as Krux), Neustar IDMP, Oracle BlueKai, and Adobe Audience Manager.

If you weren't paying close attention, you may not have realized that the changes Google has announced have blown a hole in your DMP.

 

Two major capabilities are affected by the pending DoubleClick ID removal from logs and push toward using Google’s Ads Data Hub: (1) segmentation and (2) frequency capping.

First, marketers currently use DMPs to create new audience segments based on media exposure. A DMP can keep track of media exposure if its own tags/pixels can run with the ad, but on much publisher inventory, such as Google's Ad Exchange, DMPs are banned from running their code. These publishers are worried about data leakage, which happens when a DMP pixels proprietary audiences on their media (such as sports lovers on ESPN.com) and then purchases those users elsewhere without paying the publisher.

Historically, the DMP could still get a record of media exposure from the ad server, such as DoubleClick, which would share data on who saw the ads running. Using DoubleClick's data, the marketer could then still segment audiences within the DMP based on who saw the ad, who converted, and so on.

Now that Google has discontinued the sharing of logs with IDs, DMPs can no longer see media exposure either on inventory from which they are explicitly banned or on inventory where they are allowed to operate but the advertiser uses Google's DoubleClick ad server. If DMPs are to remain useful to the marketer, they will need a new source of data.

Second, some marketers use DMPs to create frequency caps across media platforms. By getting their pixel/code to run with an ad, or by ingesting ad serving logs, DMPs can count impressions exposed to a particular user ID and then send a signal to platforms like DSPs to stop buying a user after a certain amount of exposure. However, without log level data, DMPs will not be able to count frequency for inventory in which they are banned, leading to less accurate frequency measurement and therefore less precise frequency capping.
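The cross-DSP counting described above can be sketched in a few lines: merge impression logs from each source, count exposures per user ID, and emit a suppression list once a cap is reached. This is a hypothetical illustration of the mechanic, not any vendor's implementation; the log format and cap value are assumptions.

```python
# Sketch of log-based cross-platform frequency counting: a DMP-style
# system ingests impression logs from multiple DSPs/ad servers, counts
# exposures per user ID, and builds a suppression audience at the cap.
from collections import Counter

def build_suppression_list(log_sources, cap):
    """log_sources: iterable of impression logs, each a list of user IDs.
    Returns the set of users at or over the frequency cap."""
    counts = Counter()
    for log in log_sources:
        counts.update(log)
    return {user for user, n in counts.items() if n >= cap}

# Illustrative logs: user "u1" is seen 3 times across two DSPs.
dsp_a_log = ["u1", "u2", "u1"]
dsp_b_log = ["u1", "u3"]
print(sorted(build_suppression_list([dsp_a_log, dsp_b_log], cap=3)))  # ['u1']
```

The key dependency is visible in the function signature: without log-level IDs from every source (including Google inventory), the counts undercount and the suppression list is incomplete.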

How do I keep my DMP running at full performance?

Marketers who have invested in a DMP and want to keep its capabilities at full power would be advised either to buy more digital media that allows DMP tracking or to find an alternative ad tracking or serving solution that can data-transfer log files to the DMP. A combination of these two strategies would allow a brand to keep using its DMP to the fullest by giving it the complete picture of ad exposure tied to a person.

If you want to add an independent ad tracker to your DoubleClick stack or to keep powering your DMP with data, talk to Thunder about our Experience Measurement solution. 

More on the Ads Data Hub series

Continue Reading

What is Google’s Ads Data Hub and is it right for me? (Part I of the Ads Data Hub series)

What happened to DoubleClick?

Most marketers today use DoubleClick Campaign Manager (DCM) as their primary ad server for delivering ads and tracking ad exposure through to conversion. The largest and most sophisticated advertisers relied on DCM data to do analytics, attribution, and media activation.

These advertisers would “data transfer” log-level data (the raw data for each impression rather than the aggregate data that hides user-level and impression-level information) to their data management platform, data lakes, and vendors that do analytics or attribution modeling.

 

In April, Google announced it will no longer allow data transfer of DoubleClick log-level data with IDs. This decision effectively destroyed most of the value of the log-level data exported from DCM: advertisers would no longer know who saw the ads, only how often an ad was served in total. DoubleClick could be used only to verify that the total number of impressions bought was actually delivered, but all the other powerful use cases, like analytics, attribution, and data management, would no longer be possible with DoubleClick data.

In June, Google announced it was sunsetting DoubleClick as a brand and folding everything under Google’s brand.

R.I.P. DoubleClick.

Enter Google Ads Data Hub

At the same time, Google pushed forward its own solution to this new problem for marketers — Ads Data Hub. This product is essentially a data warehouse where ad exposure data is housed and can be connected to Google’s own solutions for attribution, analytics, and data management.

One new benefit is access to the Google ID, which is a powerful cross-device ID that uses data from users logging into Google services like Android, Maps, YouTube, etc. Previously, DoubleClick was only tracking and sharing a cookie-based DoubleClick ID, which neither connected cross-device ad exposure and conversion nor reconciled multiple IDs to the same person. For many advertisers doing log-level data analysis and activation, this new ID is a big upgrade because it provides more accurate measurement.

One major downside is that this data cannot leave Ads Data Hub. Consequently, you cannot do independent verification of Google’s attribution or analytics modeling. If Google says Google performs better than its competitors, you will have to trust Google at its word. In the past, you would at least have the raw data to apply your own attribution model if you so wanted, or to re-run Google’s calculations to verify its accuracy (since big companies are not infallible).

By extension, outside ad tech providers (such as DMPs, MTA, etc.) who may be best in-class will have a much harder time working with Google solutions. As a result, you will be dependent on Google.

To do matching of off-Google data such as other ad exposure or conversions that happen offline, Ads Data Hub now requires you to upload and store your customer data in the Google Cloud. In that environment, it can be matched with Google’s ID and tracking so you can build a Google-powered point of view of the consumer journey.

In a way, Ads Data Hub is for those who trust but don't need to verify. It is a good solution for advertisers who today spend the vast majority (75%+) of their ad budget with Google: if their advertising isn't working, Google is ultimately accountable for the results no matter what its reports say, and you wouldn't need to verify calculations to know whether your ad budget is wasted.

What else can I do?

Another solution is to add independent ad serving and/or tracking in addition to or in replacement of Google. By doing so, you can still generate log-level data for Google-sold media but it will not be tied to a Google ID. Instead, you will be using your own ID or a vendor’s cross-device ID to understand who saw what ad when, where, and how often.

This approach is best suited for large advertisers who want best in class ad tech solutions to work together, and who cannot spend all their money on a single media platform to achieve their desired results. Typically brands large enough to afford data lakes, independent attribution providers, and data management platforms are the ones who will have the most to lose by moving to Ads Data Hub.

If you already realize you want to take a "trust, but verify" approach to your ads, talk to Thunder about our Experience Measurement solution.

More on the Ads Data Hub series

Continue Reading

What CMO’s Say About Ad Experiences

Marc Pritchard famously said “It is time for marketers and tech companies to solve the problem of annoying ads and make the ad experience better for consumers.”

What do his peers think? The CMO Club has partnered with Thunder to publish a “Guide to Solving Bad Ad Experiences,” which includes a survey of over 80 CMOs and an interview with the CMO of Farmers Insurance on the impact of bad ads and how people-based marketing can fix them.

Some key findings include:
  • 74% of CMOs consider brand loyalty as most negatively affected by bad ads
  • 55%+ of CMOs consider frequency and relevancy as the top factors in bad ads
  • 78% of CMOs consider it “inexcusable” to serve ads for products the customer already bought from them
  • 71% of CMOs consider frequency capping important for ad experience but 60% aren’t confident in even their frequency counting!

Click here to download the full research report.

Continue Reading

What is a CMP?

CMP is the hot adtech acronym of 2018. There are actually two meanings to this term: (1) Creative Management Platform and (2) Consent Management Platform. Here’s an overview of both these products and why you may need one.

Creative Management Platform

Introduced in 2016 by Thunder, the CMP acronym originally stood for "creative management platform," a tool for producing and trafficking ad creatives. Rather than a general-purpose creative editor like Adobe Photoshop or Animate, which are applications built for a single designer working alone, CMPs are meant for an enterprise that has a scale problem with creative.

Many brands, agencies, and publishers increasingly need to build ads in different sizes and versions for different audiences and media formats. Consequently, creative production demands have grown exponentially, while most creative organizations can scale their capacity only linearly by adding more designers and programmers. Because traditional creative editors were built for highly advanced users, a creative bottleneck formed as demand went up and not enough talent or payroll existed to fill the void.

Creative Management Platforms radically simplify ad production by providing easier interfaces and by automating production tasks like resizing. Forrester began recognizing CMPs in 2017 as part of its broader creative ad tech research, which coincided with the rise in enterprise demand for new marketing creative technologies.

Consent Management Platform

Introduced in 2018, the newer CMP acronym stands for "consent management platform." The European privacy law known as the GDPR requires publishers and marketers to obtain explicit consent for certain tracking and targeting data. As a result, a new category of tools emerged specifically to help these enterprises collect and keep track of user consent.

The CMP then feeds that consent information tied to an ID to other selected partners in the digital advertising supply chain. As a result, every party in a publisher’s supply chain understands what data they may use and for what.

Which CMP do I need?

It depends on whether you're looking to solve a creative problem or a data privacy problem. Talk to Thunder if you need help with data-driven creative or digital creative production. Check out these consent management vendors if you're looking to solve a privacy preference problem.

Continue Reading

Why Marketers Need More Than One DSP – Understanding The Risks

The average advertiser uses 3 DSPs.  Part #1 of this series examined the reasons digital advertisers make use of multiple DSPs in their programmatic bidding.  Of course, the use of multiple DSPs also creates its own challenges. So in Part #2 below, we look at the challenges created around frequency and bidding against oneself by using multiple DSPs, and how the smart marketer overcomes these challenges.

Don’t multiple DSPs just bid against each other for the same inventory?

When advertisers think of using multiple DSPs to bid on inventory, the most common concern that comes to mind is that the inventory between the DSPs will overlap, and the DSPs will be bidding against each other.  In other words, the advertiser will be bidding against itself, thus inflating its bids and artificially driving up media costs.

And in a world of 2nd price auctions, marketers can see why this is a scary prospect.  We discussed in detail how bidding worked in Part #1 of this series, but here’s a brief summary:

First, DSPs conduct internal auctions and then send the winning bid to an exchange or SSP for a subsequent auction.  These DSP internal auctions are conducted on a 2nd-price basis, which means that an advertiser bidding $25 for an impression will really only bid $5 in the SSP auction if the 2nd-highest bid in the DSP's internal auction was $5.

What does this mean if the same advertiser had multiple DSPs? Well, if the 2nd highest bid for the same impression in the advertiser’s other DSP was $10, then now the SSP is choosing between bids of $10 and $5 from the same advertiser.  And if the 3rd price in the other DSP was $4, then the advertiser would have cleared the SSP auction at $4 if it had only used the first DSP.
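The two-stage arithmetic above can be sketched in code. This is a deliberately simplified model (the bid values are the illustrative numbers from the text, and real DSP/SSP auctions involve floors, fees, and many bidders), but it shows how an advertiser's own second bid can set its clearing price.

```python
# Minimal sketch of the two-stage auction described above.
# All bid values are illustrative; real auction mechanics are more involved.

def second_price_winner(bids):
    """Return (winning_bid, price_paid) in a 2nd-price auction."""
    ordered = sorted(bids, reverse=True)
    return ordered[0], ordered[1]

# DSP #1 internal auction: the advertiser bids $25, the next-highest
# internal bid is $5, so DSP #1 forwards a $5 bid to the SSP.
_, dsp_one_submit = second_price_winner([25, 5])      # forwards 5
# DSP #2's internal auction for the same impression has a 2nd price of $10.
_, dsp_two_submit = second_price_winner([25, 10, 4])  # forwards 10

# The SSP's own 2nd-price auction now sees two bids from the same
# advertiser ($10 and $5) -- the $5 bid sets the clearing price.
winner, clearing_price = second_price_winner([dsp_one_submit, dsp_two_submit])
print(winner, clearing_price)  # 10 5
```

In the single-DSP case the advertiser's $5 bid would compete only against other advertisers, so its own duplicate bid could not push up the price it pays.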

This scenario is certainly possible, but marketers have increasingly overlooked this concern for two reasons. These reasons both stem from the rise of header bidding.

First, for every bid that is inflated by the use of multiple DSPs, there is an auction that a single-DSP marketer will lose in a header bidding world. As was explained in Part #1 of this series, precisely because SSPs conduct 2nd-price auctions, an advertiser can win an exchange's auction but lose the unified auction to another exchange whose submitted 2nd price was higher, even though that price was still lower than the advertiser's actual bid.  So, if the advertiser's main goal is to reach its audience, it will want to use more DSPs (and win more internal auctions). This inevitably translates to more exchanges submitting the advertiser's winning 2nd-price bid to the header bidding unified auctions, and more wins overall.

Is this bidding against oneself?  Perhaps, but with header bidding, this is often required to simply win enough auctions to achieve desired scale.

Second, header bidding is bringing about a seismic shift in real-time bidding from 2nd-price to 1st-price auctions within SSPs and exchanges, eliminating this scenario in the first place.  Since header bidding unified auctions select the highest price submitted by participating SSPs and exchanges, those SSPs and exchanges are incentivized to maximize their chance of winning the auction, which means submitting the bid with the highest price. In practice, they run 1st-price auctions and submit the winner's 1st-price bid rather than the 2nd price.  Many SSPs, such as PubMatic and OpenX, now follow this practice for precisely this reason. Once SSPs and exchanges use 1st-price auctions, the risk of inflating one's own bid goes away as long as an advertiser bids the same amount for the same category of inventory across its multiple DSPs.
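A minimal sketch of why the 1st-price shift neutralizes self-competition, under the assumption stated above (the advertiser bids the same amount in each DSP; values are illustrative):

```python
# Sketch: in a 1st-price auction, each DSP forwards the advertiser's
# actual bid, so duplicate entries from multiple DSPs are identical and
# cannot raise the price above what the advertiser intended to pay.

def first_price_winner(bids):
    """Return (winning_bid, price_paid) in a 1st-price auction."""
    top = max(bids)
    return top, top

# The advertiser bids $25 for the same inventory in both of its DSPs,
# and each DSP forwards that full bid to the SSP.
winner, price = first_price_winner([25, 25])
print(winner, price)  # 25 25
```

Contrast this with the 2nd-price case, where the two DSPs could forward different prices derived from their internal auctions and the advertiser's lower bid could set its own clearing price.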

How to control frequency with multiple DSPs?

A more serious challenge raised by the use of multiple DSPs than inflated bid prices is the loss of control over ad frequency.  Here, the challenge remains largely underserved, even as demand for solutions continues to grow among large advertisers.

Managing the frequency of ads served to individuals matters for two main reasons: (i) to limit over-serving and reduce wasted spend, and (ii) to avoid burnout and negative brand associations from over-exposure.  We have all seen bad ad experiences where a brand bombards us with the same ads. So when using a single DSP, advertisers often follow the best practices of capping frequency by day (otherwise known as pacing) and by month, campaign duration, or user lifetime (to limit the overall exposure to a brand's advertising).
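The single-DSP capping best practices above boil down to two counters per user: a per-day count for pacing and a running lifetime count. Here is a hypothetical sketch of that logic; the cap values and the serve/record interface are assumptions, not any DSP's actual API.

```python
# Hypothetical frequency-cap check: a daily cap (pacing) plus a
# campaign-lifetime cap per user. Limits and event data are illustrative.
from collections import defaultdict

class FrequencyCap:
    def __init__(self, daily_cap, lifetime_cap):
        self.daily_cap = daily_cap
        self.lifetime_cap = lifetime_cap
        self.daily = defaultdict(int)     # (user, day) -> impressions
        self.lifetime = defaultdict(int)  # user -> impressions

    def should_serve(self, user, day):
        return (self.daily[(user, day)] < self.daily_cap
                and self.lifetime[user] < self.lifetime_cap)

    def record(self, user, day):
        self.daily[(user, day)] += 1
        self.lifetime[user] += 1

cap = FrequencyCap(daily_cap=2, lifetime_cap=3)
served = []
for day, user in [(1, "u1"), (1, "u1"), (1, "u1"), (2, "u1"), (2, "u1")]:
    if cap.should_serve(user, day):
        cap.record(user, day)
        served.append((day, user))
print(len(served))  # 3 (two on day 1, one on day 2, then lifetime cap hit)
```

This works only because a single DSP sees every impression it serves; once impressions are split across DSPs that don't share counts, no one system can maintain these counters accurately.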

However, when using multiple DSPs, frequency capping becomes impossible to accomplish at the DSP level (since the DSPs don't actually talk to each other). What solutions are there?

One solution is to control frequency capping at the ad server.  DoubleClick Campaign Manager supports frequency capping, but rather than suppress media buying (as a DSP would via frequency capping), DCM serves a blank ad. This solution is pretty unsatisfying to the advertiser, as it results in significant wasted media spend.

DMPs, such as Adobe Audience Manager and Oracle BlueKai, claim to offer cross-DSP frequency capping by tracking ad impressions and then suppressing users via existing integrations with DSPs.  It's not uncommon to use a DMP to create suppression audiences, so this seems like a natural extension of that capability. Unfortunately, Google blocks DMPs from tracking impressions on GDN inventory; currently 14 DMPs are blocked by Google from tracking impressions in GDN. Since Google touches a significant portion of display inventory, frequency capping becomes much less useful without cooperation from the Google ecosystem.

We expect the use of multiple DSPs to be a growing trend for major advertisers that require scale given the evolving mechanics of real-time bidding auctions. Spurred on by this new trend, these same advertisers who need scale will also be the ones most concerned with solving for control over frequency. Stay tuned for solutions that emerge in the marketplace.

Continue Reading