Hitting EJECT on the escape pod


It’s time to get off the grid, and off the hamster wheel.

To everyone who visited, read, commented, debated, and shared over the past year, thank you for coming to this little corner of the internet.

The usual intermittent service will resume at some point in 2017.

Whatever you’re celebrating, may it be a good and peaceful one.

 

Just because you think you’re a ‘native’ doesn’t mean you’re smarter


If there is one bias our industry has yet to properly face up to, it is its systematic bias against age. According to the UK’s Institute of Practitioners in Advertising, the average age of an employee in the advertising industry is thirty-four. More than two fifths are said to be aged thirty or under. Less than six per cent are over fifty. One need not look far for evidence of bias and judgement. According to Keith Weed, the Chief Marketing and Communications Officer for the packaged goods giant Unilever:

We have three versions of people. On one side we have the digital natives who have been born and bred in a digital world. On the other side we have people like myself in my 50s who have children in their 20s and if I didn’t engage with them on digital platforms I wouldn’t have a relationship with my children. In the middle we have what I call the ‘lost generation’, people in their late 30s and early 40s who don’t yet have grown up children who are digital natives and weren’t digital natives themselves. These are the very people who are leading so many of our brands and businesses, and they’re bluffing too much about digital from what they read in the Financial Times or Marketing Week.

There it is. The ‘digital native’. A creaking, unevidenced metaphor of dubious origins and toxic ethics.

Now I doubt that anybody wielding the term ‘digital native’ has any idea of its origins, let alone any interest in whence it came. But it is worth reviewing those origins briefly, if only to recognise how spectacularly lacking in evidence the idea really was. It was Marc Prensky, an educational consultant, who published a paper in 2001 arguing that:

A really big discontinuity has taken place. One might even call it a “singularity” – an event which changes things so fundamentally that there is absolutely no going back. This so-called “singularity” is the arrival and rapid dissemination of digital technology in the last decades of the 20th century. Today’s students – K through college – represent the first generations to grow up with this new technology. They have spent their entire lives surrounded by and using computers, videogames, digital music players, video cams, cell phones, and all the other toys and tools of the digital age. Today’s average college grads have spent less than 5,000 hours of their lives reading, but over 10,000 hours playing video games (not to mention 20,000 hours watching TV). Computer games, email, the Internet, cell phones and instant messaging are integral parts of their lives. It is now clear that as a result of this ubiquitous environment and the sheer volume of their interaction with it, today’s students think and process information fundamentally differently from their predecessors. … What should we call these “new” students of today? Some refer to them as the N-[for Net]-gen or D-[for digital]-gen. But the most useful designation I have found for them is Digital Natives. Our students today are all “native speakers” of the digital language of computers, video games and the Internet.

Now the language we use matters. And it invariably betrays the mind and motives of those who utter it. The language of ‘natives’ versus ‘immigrants’ is no exception. Indeed it is freighted with conscious or unconscious bias, value judgement, and assumptions of power.

Look to the dictionary and we see that the term “native” refers to something innate, inherent, belonging. The term “immigrant”, by contrast, describes someone who comes from somewhere else to live and settle in a place. So the ‘native’ belongs. And the ‘immigrant’ does not. One has rights and privilege – the other must negotiate and earn. One will always belong. The other will always be from somewhere else. As Prensky notes:

As Digital Immigrants learn – like all immigrants, some better than others – to adapt to their environment, they always retain, to some degree, their “accent,” that is, their foot in the past. The “digital immigrant accent” can be seen in such things as turning to the Internet for information second rather than first, or in reading the manual for a program rather than assuming that the program itself will teach us to use it. Today’s older folk were “socialized” differently from their kids, and are now in the process of learning a new language… Those of us who are Digital Immigrants can, and should, laugh at ourselves and our “accent.”

We’ve inhaled this odious and condescending nonsense uncritically. And the toxic and prejudicial nature of the metaphor goes further. For the language of ‘immigrant’ and ‘native’ is the language of place and space. As Bayne and Ross note, “the metaphor demands a territory, a nation-state or a landmass”. And here the metaphor becomes still more odious. Just as the world is becoming ever more digitised – as everyday behaviours, transactions, connections, conversations, environments, and objects are enabled through digital networks, and the digital world becomes ubiquitous – the metaphor of the digital native insists that the immigrant does not belong in this world. They do not belong in the world into which they were born. To assume the position of grand arbiter, to play border control to the entire world and suggest that some people are not at home in their own world, is to condemn them to a permanent state of alienation. It is an inexcusable and repugnant social position to take, and should be rejected wholesale.

But let’s put – as Marketing- and Adland so easily do – the corrosive quality of value-laden language, and the niceties of ethics and civic values, to one side. The notion of the ‘digital native’ is a glib generalisation, and as with all glib generalisations, it obscures the real truth. The entertainment and information diet of young people is not entirely digital in its delivery. Look to the UK and we see that live TV accounts for 43.5% of the total daily video-viewing time of 16-24 year olds. Facebook and YouTube combined account for just 16% of their total daily video-viewing. And when one zeroes in on their digital diet, we see that not all so-called ‘natives’ behave equally. Some are deeply engaged and active. Some are intermittently engaged and active. And the vast majority are fairly passive consumers, observers or lurkers. It’s what some have termed ‘participation inequality’. Yet the language of digital nativism ignores and bulldozes over the fact that behaviours in any marketplace are segmented.

The idea of digital nativism not only treats an entire generation as one homogeneous mass, but assumes that every member of it possesses familiarity and mastery. They have nothing to learn. They already know it. Now so-called ‘digital natives’ have plenty of practice at using certain forms of technology – say, Snapchat or Instagram. They may be versed in the rites, rituals, unwritten codes and rules that can come with these platforms. But that familiarity does not automatically confer upon them a mastery of technology in all its forms and all its applications. After all, I can hold a pen and write. But that does not mean that I am qualified and capable as a screenwriter, newspaper columnist, poet, author, speech writer, or novelist. It baffles the mind that more people have not called bullshit on the vacuous claims of digital nativism.

And what of the non-natives? The so-called immigrants? Mr Clift, the former marketing chief of Unilever, worried in an interview that “Most of our brands are managed by people who have had to learn [this stuff]”, as if active learning were somehow inferior to inherited knowledge. Yet the fact of the matter is that familiarity can dull the mind as much as it can lend it advantage – and here I am reminded of the essay by David Foster Wallace, ‘This Is Water: Some Thoughts, Delivered on a Significant Occasion, about Living a Compassionate Life’. Delivered as a commencement address at Kenyon College in 2005 and published in 2009, it provides us all with a cautionary tale:

There are these two young fish swimming along and they happen to meet an older fish swimming the other way, who nods at them and says “Morning, boys. How’s the water?” And the two young fish swim on for a bit, and then eventually one of them looks over at the other and goes “What the hell is water?”

Wallace’s point of course is a simple one – that being born into a reality, a system, can make us entirely blind to that system. Only by stepping outside a system, after all, can we truly apprehend it. And that is the risk every so-called ‘native’ must contend with. In sharp contrast, so-called ‘immigrants’ or newcomers, by the very fact of their unfamiliarity with the system, can be far more curious, more deeply questioning, less likely to take things at face value or subscribe to inherited, received wisdom.

We could, then – natives and immigrants alike – do worse than look to the British painter David Hockney for inspiration and example. After a decades-long career as an innovative painter, and at the age of seventy-four, Hockney started drawing on his iPhone and, eventually, the iPad. His recording of the shifting seasons became a project of almost a hundred iPad drawings, entitled The Arrival of Spring in Woldgate, East Yorkshire, 2011. In an interview he spoke of his willingness to experiment with technology:

I’m really only interested in technology that is about pictures. I’m interested in anything that makes a picture. I was always interested in photography because it makes a picture. And even fax machines, when I found out you could make a picture if you did them right. There’s no such thing as a bad printing machine. So long as it prints, it’s doing something. If you feed the right things into it, the right things will come out of it. I’ve always gone into anything technological. I’m convinced that technology and art go together – and always have, for centuries.

And this is the point. What matters is having the intellectual energy and disposition to be interested, to keep being interested, to keep pace with change, and to change and adapt in response to it.

There is another truth about being a non-‘native’. We’ve seen more of everything. And we’ve seen a greater variety of everything. We’ve seen more brands, more opportunities to be seized, more problems to be solved, more briefs, more work, more pitches, more presentations, more ideas. We’ve had more client conversations, met more different kinds of clients, worked with more kinds of businesses, sat in more meetings, endured more conference calls, grappled with more feedback, done more presentations, sat through more research, talked to more consumers. And with divining the regularities and patterns, the recurring challenges, the repeated mistakes, comes knowledge, and dare I say it, a form of wisdom. And yet as Jay Haines, founder of executive search firm Grace Blue, puts it, “Singly, the most undervalued commodity in this business is experience. It’s overlooked in favour of the bright shiny object”. Phil Adams articulates well all that is lost or ignored when we undervalue experience:

We have a deeper understanding of brands, what they are for, how they work, how they relate to corporate culture and, most importantly, how to make them charming, disarming and therefore tolerable interruptions to people’s lives. We have a deeper knowledge of what it takes to have branded ideas serve valuable commercial purpose. And this is far more difficult than understanding the nuts, bolts and nuances of a new digital platform… We will be quicker to spot a crap idea which is in danger of putting down roots. And we will be more ruthless with the necessary mercy-killing so that creative attention can be diverted to more productive matters. Conversely we will have a more educated nose for that little acorn idea whose potential would go unrecognised by less experienced eyes, including those of its authors. Young idea farmers have a lot to learn from their elders about the cultivation, nurture and stewardship of creativity.

One more thing. We should beware making broad claims about cognitive decline in later adult years. As Timothy A. Salthouse, professor of psychology and director of the Cognitive Aging Laboratory at the University of Virginia, has written:

Although there is no shortage of opinions about cognitive ageing, it sometimes seems that relatively few of the claims are based on well-established empirical evidence … assertions about cognitive ageing may be influenced as much by the authors’ preconceptions and attitudes as by systematic evaluations of empirical research.

Research conducted by the University of Michigan, the University of Rochester Medical Center and the University of North Carolina has shown that in some important respects the brain actually improves with age. Research has shown, for example, that empathy follows an inverse U-shaped function across the adult life span, with middle-aged adults scoring higher than young adults and older adults. According to Sarah Konrath, one of the co-authors of the study, late middle-aged adults reported that they were more likely to react emotionally to the experiences of others – and they were also more likely to try to understand how things looked from the perspective of others. Professor Kathleen Taylor of St. Mary’s College of California, an expert on adult learning, argues that in contrast to middle-aged adults, younger people are more likely to connect with others from their own place of need.

Meanwhile, according to Dr. Gary Small, professor of psychiatry and director of the UCLA Center on Ageing, an ageing brain can better tease out patterns and see the big picture. Whereas younger people may have better short-term and get-to-the-point-quickly memory, older folks have had a greater variety of experiences and are better able to build a wider image out of a lot of different parts of memory. They can make more connections because they have more things that have happened to them. As Steve Jobs (who would have failed the ‘digital native’ vetting test) said when he spoke to Wired in the 1990s:

A lot of people in our industry haven’t had very diverse experiences. So they don’t have enough dots to connect, and they end up with very linear solutions without a broad perspective on the problem. The broader one’s understanding of the human experience, the better design we will have.

The idea of digital ‘natives’ is as unevidenced and as socially and ethically noxious as any other form of nativism. It is used as an easy passport by bullshit artists looking to hoodwink the gullible, and valued by those who do not know how to respond to change. We should be valuing curiosity, empathy, insight, energy, and adaptability – not the dates on our birth certificates.

Sources:

Financial Times, ‘Alert at “lost generation” of marketeers’, 4.6.10

Interview Magazine

CNN, ‘The aging brain: Why getting older just might be awesome’

Phil Adams, ‘No content marketing for old men’

O’Brien Ed., Konrath S.H., Grühn D., & Hagen A.L., (2012). Empathic concern and perspective taking: linear and quadratic effects of age across the adult life span. The Journals of Gerontology, Series B: Psychological Sciences and Social Sciences, doi:10.1093/geronb/gbs055

Marc Prensky, ‘Digital natives, digital immigrants’

Are we giving up on intensity?


Back in 1986, Stephen King wrote about what he believed made for a good advertising idea:

A good advertising idea has to be original enough to stimulate people and draw an intense response from them… Any advertisement is competing not just with other advertisements but also with editorial, programmes, people, events and life itself… if an advertisement is to succeed it has to involve the receiver and entice him into participating actively in whatever is being communicated about the brand.

Ripple dissolve to the present day…

“The daily need for new content”

“Creating more work for less money”

“Stories are superior to ads”

“In the whole history of mass advertising, the number of transformative ideas that have created wealth via advertising you can count on one set of fingers and toes”

“It’s about delivering relevant content at the right time”

Now of course the media landscape is shifting beneath our feet.

Of course we must think about the new ways we can and must connect people to what we make.

Of course we must work on new stages.

Create new shapes for new spaces.

And new experiences for new kinds of attention.

But when did it happen?

When did we give up on intensity?

When did we decide that trading intensity of response for reach was the great leap forwards for marketing communications?

When did we become so fixated upon production and distribution efficiencies that we stopped asking ourselves what kind of ideas the world needs?

When did we fall out of love with ideas?

When did we mistake borrowing the reach of celebrities (sorry, influencers) and packaging it up in hyper-relevant mediocrity as the great, necessary innovation in marketing communications?

When did we decide we need to bring so little to the table?

When did we decide that a steady stream of assiduously targeted, contextually relevant wallpaper was the way to go?

When did we decide that the measure of success was production efficiencies?

When did we decide that striving for media efficiencies was preferable to striving for behavioural change and real business effectiveness?

When did we decide that always having nothing to say was preferable to sometimes actually having something to say?

When did we fall for the siren call of infinite inventory and conclude that we must fill it?

When did quantity become more desirable than quality?

When did we conclude that the essence of our creativity was clever distribution strategies?

When did we give up on the idea that we are in the memory business and opt instead to be in the exposure business?

When did we decide that relevance was to be preferred over the capturing of imaginations?

When did we reduce the implications of marketing’s new-found “physical and mental availability” orthodoxy down to the mere need for reach?

When did we decide to vote for entropy?

When did we decide to erase the first lessons of branding – vividness, coherence, consistency?

When did we decide that the coherence and shape and form of a brand is worth giving up for a million tiny forgettable moments of cost-effective relevance?

Of course the media landscape is shifting beneath our feet.

Of course we must think about the new ways we can and must connect people to what we make.

Of course we must work on new stages.

Create new shapes for new spaces.

And new experiences for new kinds of attention.

But when will we look beyond the narrow horizon of reach and relevance?

When will we stop squeezing the idea out of what we call (without so much as a trace of irony) ‘content’?

When will the flight to quality begin?

When will we embrace intensity again?

Going beyond ourselves


“Art is one of the prime ways we have of opening ourselves and going beyond ourselves. That’s what art is, it’s the product of the human being in the world and imagination, all coming together. The irrepressibility of the life in the works, regardless of the times, the histories, the life stories, it’s like being given the world, its darks and lights. At which point we can go about the darks and lights with our imagination energised.”

Ali Smith

Going beyond ourselves feels like the most urgent necessity for all those in the marketing community. And yet nothing is so guaranteed to shut out the world and close down minds as the institutional and intellectual paraphernalia with which we surround ourselves. Happily, some of us are digging tunnels out into the daylight.