Data Without Context, Results Without Consequence, Counting Without Analysis… An Industry Without Conscience?



 "There are three kinds of lies: lies, damned lies, and statistics."

Leonard H. Courtney



Is something rotten in the state of adland?
An app for a very big brand is reported as having been downloaded 51,000 times… A significant global brand is reported as having added 3,580 fans on Facebook as a result of marketing activity… A marketer touts the size of their Facebook fanbase – 3,600,112 fans.

And that’s all we’re told. 

Case study after case study, opinion piece after piece, article after industry article unthinkingly regurgitates this kind of data in the naive belief that it actually signifies anything – or that all of us are fool enough to swallow this nonsense.

I don’t think I am the only one it irks.

If we want both ourselves and our efforts to be taken seriously – better, if we want our output to be valued appropriately – then this habit must be exposed, ridiculed, and stopped.


All noise, no signal
Until we put data into some kind of context – of total category users, total brand users, total potential prospects – or of marketing investment, competitor activity and performance, etc., this kind of data is utterly meaningless. 

It's schoolchild statistics.

(Incidentally, a back-of-envelope calculation of what percentage of total brand or category users some of these numbers represent reveals some of them to be so small as to be utterly laughable.)
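By way of illustration, here is a minimal sketch of that back-of-envelope check. The download and fan counts are the ones quoted above; the audience sizes are purely hypothetical assumptions chosen for illustration, not real figures for any brand.

```python
def audience_share(metric: int, audience: int) -> float:
    """Return the metric as a percentage of the total audience."""
    return 100 * metric / audience

# Counts are the ones quoted above; audience sizes are
# illustrative assumptions only.
examples = {
    "app downloads": (51_000, 40_000_000),
    "new Facebook fans": (3_580, 250_000_000),
}

for label, (metric, audience) in examples.items():
    share = audience_share(metric, audience)
    print(f"{label}: {metric:,} is {share:.4f}% of {audience:,}")
```

On those (made-up) audience sizes, 51,000 downloads is roughly an eighth of one percent of the brand's users, and 3,580 new fans is about a thousandth of one percent of the category – exactly the kind of context the case studies omit.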

A data point on its own  – what the web analytics guru Avinash Kaushik has termed a “lonely metric” – is devoid of meaning. 

Yet too many of us stand in apparent awe at apparently big numbers. Or expect others to “ooh”  and “ahh” with childish and gullible glee.

We’re behaving like hack journalists. Not experts in and advocates of the effectiveness of creativity.


Noise without consequence
Not only is this kind of context-free data devoid of meaning and insight; much of it is also entirely without real consequence.

All that intermediate measures tell us is that someone saw our content. And perhaps that they did something  – or if you like, ‘engaged’  – with that content.  But just because people  were exposed in some way to our content does not mean it was successful. As Rob Campbell has put it: “I am aware of Hitler. But I am not a Nazi.”

They are evidence that something was happening in response to our communications. They might tell us about the spread and popularity of our content.   But they are not evidence of effectiveness.  They are not business results and they tell us nothing about the financial value created for a business.  

However, judging by some of our industry’s public discourse, it would seem that large parts of ad- and marketingland are behaving as if they don’t know the difference between effects and effectiveness. Or as if they think they’re in the entertainment business. In which getting people to watch – and maybe ‘engage’ with – our content is the whole end purpose of the enterprise.

And we complain that advertising and marketing don’t get taken seriously enough by those occupying the boardrooms of the corporate world.


Counting versus evaluating
Our industry’s predilection for the slick case study combined with the plenitude of new forms of data seems to be encouraging the habit of counting. At the expense of real insight and analysis.

If there’s one thing digital stuff is good at, it’s leaving behind it a vast trail of data.  It gives us more and more things we can easily and immediately count – searches, views, visits, time on page, bounce rate, exit rate, time on site, linking,  forwarding, following, referring, clicking, friending, liking, +ing, and so on. 

All these things are easy to monitor and count, they tend to move quickly, and they’re relatively easy to link to marketing activity. And they give us lots of (seemingly) big numbers to tout in those case study videos. Tens of thousands of this! Hundreds of thousands of that!

But counting and analysis are very different things. 

As anybody who has submitted a paper for the EFFIEs, IPA Effectiveness Awards or the Cannes Effectiveness Lions will know, analyzing the impact of communications content (as distinct from the myriad other factors that might have made a difference) on actual business performance is complex, challenging, and takes time to do. And it requires meaningful timeframes of data to investigate, given that advertising of whatever form rarely pays back in the short term.

Those content with merely counting intermediate effects either don’t understand what the point of our business is.  Or they’re lazy.

In truth, most of industry is possessed of a conscience. The EFFIEs, IPA Effectiveness Awards, and the Cannes Effectiveness Lions are evidence of this. 

So I don’t mind if some in ad- and marketingland  want to behave as idiots or lazy journalists.

But I do mind when their noisy behaviours risk bringing the rest of our industry into disrepute. Worse, when they encourage the very short-termism that we so often complain is the scourge of marketing and the creation of real, sustainable, long-term value for businesses.


Dear planners, publicists, and journalists
If you want yourself, your agency, your industry and its output to be taken seriously…

  1. Stop merely peddling vanity metrics
  2. Learn the difference between effects and effectiveness
  3. Stop kidding yourself that intermediate measures are proof of effectiveness
  4. Distinguish between counting and real analysis
  5. Don’t succumb to the standards of hack reporting
  6. And wake up to the fact that some of us see straight through this kind of bullshit




Avinash Kaushik, Web Analytics 2.0: The Art Of Online Accountability And Science Of Customer Centricity


  1. Phil Adams

    I spent 18 years in advertising before spending the last 5 “in digital”. It’s so true that digital activity throws off more inconsequential data than you can shake a stick at.
    Another interesting angle on this is one shared with me many years ago by Charlie Robertson.
    Effectiveness = has effects.
    Efficacy = has the DESIRED effects.
    Efficiency = delivers effects that represent a positive return on investment.

  2. mrbarrington

    Is this entirely the fault of the agencies that publish meaningless statistics, or should journalists shoulder more of the blame, since they’re the ones who ought to give context to the noise around us?
    Journalists in all fields have developed the habit of regurgitating press releases verbatim, rather than adding the layer of analysis that you’re looking for.
    So if a campaign only converts, say, 1% of 10m views into Facebook Likes, wouldn’t you expect an ad agency – of all things – to opt always for the more impressive-sounding meaningless stat (“100k Likes!”)?

  3. Martin

    Thank you for visiting, mrbarrington!
    I think it’s a case of “a pox on both their houses.”
    In your example, I’d absolutely expect many an agency to do that.
    But that doesn’t make it right.

  4. Tokumoto

    Ad agencies’ tendency to be held unaccountable for the creation of value is an epidemic. I see this in traditional and digital agencies alike: we deliver “brand metrics” that somehow create more business for the client.
    This sort of black-box thinking is supported by marketing textbooks and scholars who find it difficult to measure advertising efficiency, or are too lazy to explain the math.

  5. Gerald

    Yes, no doubt. I mean, there are genuinely smart digital people who have been discussing this topic for a while already. Counting fans is the least interesting but most simple way to express “I was successful” or not. Nothing more than that.

  6. tom

    Hi Phil
    I like this
    “Effectiveness = has effects.
    Efficacy = has the DESIRED effects.
    Efficiency = delivers effects that represent a positive return on investment”
    I would be a pedant and say effectiveness = effects against stated (and benchmarked) objectives rather than simply having an effect. Efficiency then being a ratio statement of input to output (again against benchmarked objectives).
    Great stuff as usual Martin. My twopenny to add to the point is that digital (broadly speaking) has somewhat fallen victim to measuring that which can be measured rather than measuring that which we would wish to change.
    This results in a lot of “intermediate measures” being bandied around – it is easier to measure the effect of online advertising on online behaviour.
    Establishing causal links between online advertising and shifts in brand equity, whether behavioural, financial or mental, is harder to do and so, to the detriment of the channel, has been shied away from.
    Or something.

  7. Martin

    Thanks Tom!
    I’m with both yourself and Phil on the issue of distinguishing between effectiveness and efficiency.
    The efficiency argument is a dangerous one. Not least of all because the quickest way to increase one’s ROI ratio is to reduce the marketing investment.
    Your insight that we should be measuring “that which we would wish to change” rather than “that which can be measured” is a vital lesson we could all take to heart. I may steal it and pass it off as my own thinking 😉
    Thanks for visiting and commenting!

  8. Martin

    Hey Gerald,
    I almost agree. Counting likes is easy.
    Except that I do take issue with your suggestion that “Counting fans is the… most simple way to express ‘I was successful’ or not.”
    Likes don’t constitute ‘success’. They merely suggest that something happened. Success is revenue and profit – no profit and loss account ever counts likes!

  9. Warren

    Hey Martin,
    Happy NY. Good to have your fingers back on the keyboard.
    I’ll preface the next comment by saying I enjoy your blog and would buy you a beer any day of the week… but
    You are starting to sound like one of those bitter old CDs in the corner office.
    Think of any form of advertising/marketing from the past 50 years and your post is equally valid.
    I’ll declare my hand and say I work mostly in digital and I do because it’s quick to market, responsive, measurable and (almost) infinitely flexible. I bring a measure of cynicism to it though to ensure it’s always a bridge to something lasting and profitable for brands.
    There’s as many numbskulls in digital as there are in Direct or PR or any stream of comms. Not even sure why you pay that much attention to the reporting/’analysis’ you bemoan to be honest.
    95% of the industry would agree that type of ‘success’ needs to be looked at more deeply.
    Without knowing exactly what your bent is, I’d suggest you make your new year’s resolution to start looking positively at digital and see what it can do for your work.
    After all, you wouldn’t be here if you didn’t think it worked right?
    Warren (@nottheword)

  10. Martin

    Hi Warren,
    Happy New Year to you!
    You are of course right. For better or for worse, no one part of our industry has the monopoly on stupidity!
    So my beef is certainly not with the contribution of ‘digital’. Much of my time these days is focused on helping work out what that can be. My beef is with how people report and think about data (which digital is good at generating).
    I love digital. I just hate the practice of counting without thought or analysis.
    I hope that reassures!