Advertising Effectiveness Measurement
How Do I Know Which Half of My Advertising Isn’t Working?
John Wanamaker wasn’t the only one to ask this question. In fact, with at least 60% of all ads underperforming, wasting millions of dollars in advertising investment, most marketers today are asking the same thing. Sadly, long-standing creative testing processes are broken, so answers are elusive. The standard practice of spending one’s full advertising effectiveness budget on pretests for 20% of one’s creative sheds no light on the remaining 80%. How many millions of dollars are at risk?
So, a solution was finally built. A creative testing system now measures every client ad, and every competitor ad, across all media types, with results in 24 hours, at a cost well below that of the old processes. Does this new process answer Wanamaker’s question? You be the judge.
Advertising Effectiveness Measurement is Broken
Measuring advertising effectiveness began in the late 1800s and has been the focus of countless studies ever since. Methodologies have included focus groups, readership studies, polling, Day-After Recall, breakthrough, brain-wave tracking, galvanic skin response, eye-tracking, moment-by-moment analysis, integrated market research, digital audience response and much more. Given the size of big-brand advertising budgets, market researchers have long sought better ways to justify media and production spend and to link advertising to sales results.
The predominant creative testing model has been the custom pretest, which determines an ad’s effectiveness based on consumer responses before it airs. Another is custom post-testing, which provides periodic or continuous in-market research monitoring a brand’s performance, including brand awareness, brand preference, product usage and attitudes.
Advertisers determine the top 10-20% of their ads in terms of planned media spend and have them pretested before they air. Some advertisers send finished art to the pretest, and others send concepts, unfinished art or animatics. All pretesting can be expensive and slow. Traditionally, a client waits days, weeks or longer for results, meaning there may not be time to edit the pretested ads if problems are found. Major advertisers like Unilever and Procter & Gamble have both expressed concern that the current timeline and cost for ad testing do not fit today’s needs. See this startling article in Net Imperative, “The end of testing? Unilever shakes-up market research in real-time world.”
The reason advertisers test only 20% of their advertising, and usually only TV or video, is that their budget goes only so far. At the cost of pretesting and post-testing today, they can afford to test only the ads with the biggest planned media buys. That still leaves 80% of ads untested. Since every ad, news story, social media post and so on contributes to a company’s reputation, that’s a lot of risk.
How much risk? The study below, provided by ABX, shows ad scores for all ads tested during a 60-day period. Note how many fall beneath the horizontal yellow line, which marks the average. Each tiny dot is a separate ad, and each vertical cluster is a different media type.

The higher the score, the more likely an ad will positively affect business results; the lower the score, the more likely the ad budget has been wasted. With at least 60% of ads underperforming in-market, how can an advertiser know which ones they are? With that knowledge, the advertiser could adjust the media plan and spend. In addition, creative teams receive very little feedback, so how can they learn and continually improve the creative process?
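To put a rough, hypothetical number on that risk, the short Python sketch below simply multiplies an assumed media budget by the untested and underperforming shares discussed above. The budget figure is an invented assumption for illustration, not data from any study.

```python
# Rough, illustrative arithmetic only -- every figure here is a hypothetical assumption.
media_budget = 50_000_000        # assumed annual media spend
share_pretested = 0.20           # share of ads covered by traditional pretesting
share_underperforming = 0.60     # assumed share of ads scoring below average in-market

# Spend running behind ads whose effectiveness is simply unknown
unmeasured_spend = media_budget * (1 - share_pretested)

# Expected spend behind underperforming ads, if underperformance is spread evenly
spend_at_risk = media_budget * share_underperforming

print(f"Spend behind unmeasured ads:           ${unmeasured_spend:,.0f}")   # $40,000,000
print(f"Expected spend behind weak performers: ${spend_at_risk:,.0f}")      # $30,000,000
```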
The Solution: Measuring Every Ad – and Every Competitor Ad
On the face of it, the idea that anyone would consider measuring “every ad” sounds ludicrous. Certainly, using the custom pretest method, the cost alone would be prohibitive. So how can one affordably test the lion’s share of one’s creative?
“There are two types of research, customized and syndicated. Customized research is conducted for a specific client to address that client’s needs. Only that client has access to the results of the research. Syndicated research is a single research study conducted by a research company with its results available, for sale, to multiple companies.”
Thanks to new technology and know-how in creative testing, it is now possible to measure all of one’s ads, and one’s competitors’ ads, at a cost well below that of custom research. These solutions test a universe of ads for entire industry categories on an ongoing basis, with results provided to paying clients.
Some syndicated research firms focus on single industries in their syndicated ad testing, and some cover the entire market but limit their work to certain media only, such as TV and video. One relatively new firm measures the entire market and all types of media through bespoke technology and market research know-how. In fact, this firm (ABX) has now measured more than 365,000 ads and has the largest norms in the industry, with hundreds more ads being added each day.
Major advertisers who elect to use a syndicated solution can understand the competitive picture and how their own ads stack up. More importantly, because some of these firms deliver results very quickly, advertisers can identify an under-performing ad and replace it, or adjust the original ad. Without the speed of syndicated creative testing, clients wouldn’t be as successful in moving media spend from losers to winners.
The best measurement programs utilize both custom and syndicated services. For some guidance on choices, please see, “Should I Look at Faster, Lower Cost Options to Legacy Ad Testing?”
Syndicated Creative Testing and Links to Sales?
If one has relied on custom pretesting and post-analysis, one may have wondered why advertising results don’t correlate well with outcomes or perform well in market mix models. Traditionally, models utilize media spend data, but not creative testing results. The reason is simple: media spend data is readily available for every ad, but creative scores are not, since only 20% or so of ads have been tested. The modeler therefore has no way to qualify the various ads and assumes they are all equal. Correlations are often poor, and the true effect of advertising remains a mystery.
When all ads are tested in a syndicated system, creative scores are available for every one of them. The market mix modeler can merge media spend with creative scores, providing the needed qualitative weighting. In several case studies done by market mix modelers, advertising results show tremendous improvements in correlation to sales. Through further work, modelers have found that creative accounts for more than 60% of ad success. A major study from Nielsen Catalina Solutions (NCS) in 2016-2017, “Five Keys to Advertising Effectiveness,” agrees that creative is by far the most important element. To see several exciting case studies, go to “Can Creative Testing Improve ROAS, ROI and Predictive Analytics.”
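To make the weighting idea concrete, here is a minimal sketch, assuming synthetic data and a simple least-squares fit, of how a modeler might merge per-ad media spend with creative scores and compare a spend-only model against a creative-weighted one. The column names, figures and method are illustrative assumptions, not ABX’s or any modeler’s actual approach.

```python
import numpy as np
import pandas as pd

# Hypothetical weekly data: per-ad media spend and the sales observed that week.
# All numbers are synthetic and constructed purely for illustration.
spend = pd.DataFrame({
    "week": [1, 2, 1, 2, 1, 2],
    "ad_id": ["A", "A", "B", "B", "C", "C"],
    "media_spend": [100_000, 120_000, 100_000, 120_000, 100_000, 120_000],
    "sales": [428_000, 487_000, 271_000, 314_000, 353_000, 397_000],
})

# Creative scores for every ad -- available only when all ads are tested.
scores = pd.DataFrame({
    "ad_id": ["A", "B", "C"],
    "creative_score": [1.3, 0.7, 1.0],   # indexed so 1.0 = an average ad
})

df = spend.merge(scores, on="ad_id")

def r_squared(x, y):
    """Fit ordinary least squares (with intercept) and return R^2."""
    X = np.column_stack([np.ones(len(x)), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

# Spend-only model: treats every ad as equally effective.
r2_spend_only = r_squared(df["media_spend"].to_numpy(), df["sales"].to_numpy())

# Creative-weighted model: scales each dollar by the ad's creative score.
df["weighted_spend"] = df["media_spend"] * df["creative_score"]
r2_weighted = r_squared(df["weighted_spend"].to_numpy(), df["sales"].to_numpy())

print(f"R^2, spend only:        {r2_spend_only:.2f}")   # weak fit
print(f"R^2, creative-weighted: {r2_weighted:.2f}")     # much stronger fit
```

In this synthetic example, weighting each dollar of spend by the ad’s creative score explains most of the variation that raw spend alone cannot, which is the intuition behind adding creative scores to market mix models.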
The New Creative Process
Historically, custom testing has provided the creative team with in-depth feedback on the top 20% of its ads. But for the 80% of untested ads, the creative team has been in the dark. Once untested ads were placed in media, it was harder to know what worked and what didn’t. Today’s digital media testing offers fast feedback on behavioral measures like clicks, likes and retweets. But even then, creatives don’t know what it was about their designs that yielded results. And what about results on radio, print, OOH, FSIs, in-store signage and other media types beyond the most-tested medium, television?
Syndicated creative measurement provides answers readily. Most firms deliver in several days, and a few, like ABX, deliver in 24 hours. Now the creative team can see for itself how its work is playing out and whether certain ads need adjustment. For more on this topic, see “How is Speed Changing Advertising Effectiveness Measurement?”
In addition, online dashboard delivery systems offer a treasure trove of creative ideas from one’s own past campaigns as well as competitors’ and other industry campaigns. If copywriters can see what types of ads yield a high Reputation or Purchase score, or a high Dislike score, they can start projects well informed and ready to go.
To get a better idea of what makes a great ad across all media, see our Top Ads Of The Week Measured For Ad Effectiveness.
Comparing Ads Across Competitors, Categories, Media Types
The only way to know how one’s creative really stacks up is to compare the scores for the overall ads, and for their individual KPIs, against other ads measured with a common demographic target. That requires syndicated creative testing that uses at least one common demographic audience across the board. Most ads are measured against very specific target audiences and thus are not comparable horizontally.
An interesting solution was devised by JJ Klein, Chairman of ABX, and his team. If every ad is measured by a GenPop audience as well as a targeted demographic audience, ads can be compared across the entire database by KPI, Category, Media Type and more. Thus, a company gets a true measure of how its ads stack up against competitors. Klein’s team has now measured more than 365,000 ads, and international norms are being added in 14 countries.
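One way to picture the GenPop comparison is a percentile rank computed within each Category and Media Type norm. The sketch below uses a small, invented table of scores and is only an illustration of the idea, not ABX’s actual scoring or norming methodology.

```python
import pandas as pd

# Hypothetical GenPop scores for ads from several brands; the columns and
# values are illustrative assumptions, not ABX data.
ads = pd.DataFrame({
    "ad_id":      ["A1", "A2", "B1", "B2", "C1", "C2", "C3"],
    "brand":      ["Us", "Us", "Rival1", "Rival1", "Rival2", "Rival2", "Rival2"],
    "category":   ["Auto"] * 7,
    "media_type": ["TV", "Print", "TV", "Print", "TV", "TV", "Print"],
    "score":      [112, 95, 104, 88, 120, 99, 101],
})

# Because every ad shares a common GenPop audience, each score can be ranked
# directly against the norm for the same category and media type.
ads["pct_rank_in_norm"] = (
    ads.groupby(["category", "media_type"])["score"]
       .rank(pct=True)
       .round(2)
)

print(ads.sort_values(["media_type", "pct_rank_in_norm"], ascending=[True, False]))
```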
Measuring Local Advertising with National Creative Testing Panels
Many clients are anxious about using national ad testing panels for local ads. After all, national panelists may not be aware of the local company or local events. JJ Klein and his team looked into this by testing the same local ads first with local panels and then with national panels. They found no statistically significant difference in the creative scores between the two panels.
Why is this? A good ad is a good ad and a bad ad is a bad ad, no matter whether it is measured by a national or a local panel. In national creative testing, panelists may not have heard of many of the brands whose advertising they evaluate, and the same holds for local ads.
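For illustration only, the sketch below shows how such a panel comparison might be checked with a two-sample t-test on hypothetical score sets from a local and a national panel. The numbers and the choice of test are assumptions for the example, not a description of the study Klein’s team ran.

```python
from scipy import stats

# Hypothetical creative scores for the same set of local ads, measured once
# with a local panel and once with a national panel (illustrative numbers only).
local_panel_scores = [101, 96, 110, 88, 105, 99, 93, 107, 102, 95]
national_panel_scores = [100, 98, 108, 90, 103, 101, 94, 105, 100, 97]

# Welch's two-sample t-test: do the two panels produce different mean scores?
t_stat, p_value = stats.ttest_ind(local_panel_scores, national_panel_scores, equal_var=False)

print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A large p-value (e.g. above 0.05) indicates no statistically significant
# difference between the panels -- consistent with the finding described above.
```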
Global In-Market Creative Testing
On the other end of the spectrum from local creative testing is global. Until recently, global testing had to be handled on a custom basis in pretests and post-market tests, and several of the largest research firms have these capabilities across many countries.
There was no syndicated creative testing process providing an in-market point of view across the globe until 2017, when methods ABX had used successfully in the US were exported to 14 countries. Norms are available within and across countries by Category and Media Type, giving advertisers international advertising effectiveness data never before available. Countries to date include Brazil, Canada, China, Australia, France, Germany, India, Italy, Japan, Mexico, Russia, Spain, the United Kingdom and the U.S.
Gender Equality in Advertising
The Association of National Advertisers (ANA) has created the #SeeHer program with a goal of improving the portrayal of women and girls in advertising by 20% by 2020. Major advertisers including ad giants like P&G and Unilever are working to improve the way they portray women and girls in advertising and to associate their brands with programming that positively portrays females.
This is not only the right thing to do socially; data from Advertising Benchmark Index also shows that as gender portrayal improves, ads become more effective and both short- and long-term purchase intent improves significantly. Work is now being done with multicultural groups to make sure people of all ethnicities are respectfully represented.
For more information on this critical area in advertising click HERE.