Martin Weigel

Data Without Context, Results Without Consequence, Counting Without Analysis... An Industry Without Conscience?

"There are three kinds of lies: lies, damned lies and statistics”

Leonard H. Courtney

Is something rotten in the state of adland?

An app for a very big brand is reported as having been downloaded 51,000 times... A significant global brand is reported as having added 3,580 fans on Facebook as a result of marketing activity... A marketer touts the size of their Facebook fanbase - 3,600,112 fans.

And that’s all we’re told. 

Case study after case study, opinion piece after opinion piece, industry article after industry article unthinkingly regurgitates this kind of data in the naive belief that it actually signifies anything - or that the rest of us are foolish enough to swallow this nonsense.

I don’t think I am the only one it irks.

If we want both ourselves and our efforts to be taken seriously - better, if we want our output to be valued appropriately - then this habit must be exposed, ridiculed, and stopped.

All noise, no signal

Until we put data into some kind of context - of total category users, total brand users, total potential prospects - or of marketing investment, competitor activity and performance, etc., this kind of data is utterly meaningless. 

It's schoolchild statistics.

(Incidentally, a back-of-envelope calculation of what percentage of total brand or category users some of these numbers represent exposes some of them as so small as to be utterly laughable.)
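By way of illustration, here is a minimal sketch of that back-of-envelope arithmetic in Python, using the figures quoted at the start of this piece. The base sizes - 50 million brand users, 100 million category prospects - are purely hypothetical assumptions for illustration, not reported numbers.

```python
# Back-of-envelope check: what share of the addressable base does a
# touted metric actually represent? The touted figures are from the
# examples above; the base sizes are hypothetical assumptions.

def share_of_base(touted: int, base: int) -> str:
    """Express a touted metric as a percentage of an addressable base."""
    return f"{touted:,} of {base:,} = {100 * touted / base:.4f}%"

# 51,000 app downloads, against an assumed 50 million brand users
print(share_of_base(51_000, 50_000_000))    # 51,000 of 50,000,000 = 0.1020%

# 3,580 new Facebook fans, against an assumed 100 million category prospects
print(share_of_base(3_580, 100_000_000))    # 3,580 of 100,000,000 = 0.0036%
```

Even on generous assumptions about the size of the base, the touted numbers shrink to fractions of a percent.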

A data point on its own - what the web analytics guru Avinash Kaushik has termed a “lonely metric” - is devoid of meaning.

Yet too many of us stand in apparent awe at apparently big numbers. Or expect others to “ooh” and “ahh” with childish and gullible glee.

We’re behaving like hack journalists. Not experts in and advocates of the effectiveness of creativity.

Noise without consequence

Not only is this kind of context-free data devoid of all meaning and insight; much of it is entirely without real consequence.

All that intermediate measures tell us is that someone saw our content. And perhaps that they did something - or if you like, ‘engaged’ - with that content. But just because people were exposed in some way to our content does not mean it was successful. As Rob Campbell has put it: “I am aware of Hitler. But I am not a Nazi.”

They are evidence that something was happening in response to our communications. They might tell us about the spread and popularity of our content. But they are not evidence of effectiveness. They are not business results, and they tell us nothing about the financial value created for a business.

However, judging by some of our industry’s public discourse, it would seem that large parts of ad- and marketingland are behaving as if they don’t know the difference between effects and effectiveness. Or as if they think they’re in the entertainment business. In which getting people to watch - and maybe ‘engage’ with - our content is the whole end purpose of the enterprise.

And we complain that advertising and marketing don’t get taken seriously enough by those occupying the boardrooms of the corporate world.

Counting versus evaluating

Our industry’s predilection for the slick case study combined with the plenitude of new forms of data seems to be encouraging the habit of counting. At the expense of real insight and analysis.

If there’s one thing digital stuff is good at, it’s leaving behind it a vast trail of data. It gives us more and more things we can easily and immediately count - searches, views, visits, time on page, bounce rate, exit rate, time on site, linking, forwarding, following, referring, clicking, friending, liking, +ing, and so on.

All these things are easy to monitor and count, they tend to move quickly, and they’re relatively easy to link to marketing activity. And they give us lots of (seemingly) big numbers to tout in those case study videos. Tens of thousands of this! Hundreds of thousands of that!

But counting and analysis are very different things. 

As anybody who has submitted a paper for the EFFIEs, the IPA Effectiveness Awards, or the Cannes Effectiveness Lions will know, analyzing the impact of communications content (as distinct from the myriad other factors that might have made a difference) on actual business performance is complex, challenging, and takes time. And it requires meaningful timeframes of data to investigate, given that advertising of whatever form rarely pays back in the short term.

Those content with merely counting intermediate effects either don’t understand what the point of our business is.  Or they’re lazy.

In truth, most of the industry is possessed of a conscience. The EFFIEs, IPA Effectiveness Awards, and the Cannes Effectiveness Lions are evidence of this.

So I don’t mind if some in ad- and marketingland want to behave as idiots or lazy journalists.

But I do mind when their noisy behaviours risk bringing the rest of our industry into disrepute. Worse, when they encourage the very short-termism that we so often complain is the scourge of marketing and the creation of real, sustainable, long-term value for businesses.

Dear planners, publicists, and journalists

If you want yourself, your agency, your industry and its output to be taken seriously...

  1. Stop merely peddling vanity metrics

  2. Learn the difference between effects and effectiveness

  3. Stop kidding yourself that intermediate measures are proof of effectiveness

  4. Distinguish between counting and real analysis

  5. Don’t succumb to the standards of hack reporting

  6. And wake up to the fact that some of us see straight through this kind of bullshit

Sources

Avinash Kaushik, Web Analytics 2.0: The Art of Online Accountability and Science of Customer Centricity