Category: Technology

The machines are not coming


There is a dark secret lurking behind every critique of technology, and the secret is us. Smartphones, CPUs and algorithms may indeed be inescapable in modern life, but they are ubiquitous because we demand them. We like our newly endless capacity for self-photography. We like our ability to weigh in, to be consulted, everywhere, all the time. We like the friendships we develop online, however odd, thin or ephemeral. So it isn’t the machines that are coming for us. We are coming for ourselves.”

Michelle Dean reviewing Speak by Louisa Hall, Guardian, 21.07.15

The fracking of attention

[Image: workers tend to a well head during a hydraulic fracturing operation at a gas well outside Rifle, in western Colorado. AP Photo/Brennan Linsley]

“Tell me to what you pay attention and I will tell you who you are”

José Ortega y Gasset

Empires of the mind

If buyers of a brand do not think their brand is different or unique (Ehrenberg)…

And if what matters is the creation of ‘mental presence’ (Moran)…

Or ‘memory structures’ (Sharp)…

And if we define this as “The degree to which a given brand comes to consumers’ minds in the context of a particular purchase occasion or consumption occasion.” (Moran)…

Or as “The probability that a brand will be recalled early in a consumer’s consideration set, under a variety of situations and via a variety of stimuli, to the exclusion of competing brands” (Vieceli and Shaw)…

And if this is dependent on “The quantity (how many) and the quality (how fresh and relevant) of the network of brand information in memory, or the brand’s ‘share of mind’” (Romaniuk and Sharp)…

And if the brain is not some rigid filing system in which memories reside as fully formed recordings but is something altogether more dynamic and malleable…

And if our brain’s networks of nerve cells or ‘neurons’ are not fixed entities, but are dynamic and continuously modified by experience…

And if thinking, learning, and behaviour all actually change the brain’s physical structure and organization, rewiring the brain, creating and eliminating new connections between neurons (Buonomano & Merzenich, Heilman & Nadeau, Schacter, Svoboda)…

Then perhaps it is not unreasonable to regard our task as the creation, expansion, maintenance, and defence of mental real estate.

And so perhaps it was Judith Williamson (Marxist and academic) who put it better than any brand expert, when she wrote that advertising’s purpose was to create “empires of the mind.”

And if she did put it better than anyone, then perhaps our real work is to struggle and jostle for space in memory.

And perhaps we shouldn’t be quite so shy about the language and metaphors of conquest and battle that have become so unfashionable and politically incorrect in adland.

And if we think of our task as some kind of (benign) synaptic imperialism, perhaps we’ll think beyond the mere act of contact or engagement, and think about what mental infrastructure we leave behind, maintain, and expand.

So far so good.

Hijack people’s cognitive processes, leave some memory traces, reinforce and refresh them repeatedly over time, and (see Binet and Field) extract the financial rewards that lie in so doing.

High-fives and cigars all round.

And yet.

And yet.

(At this point I should note that I have Ted Florea, CSO at PNYC, to thank for challenging me to think harder, and to go beyond this convenient and possibly self-serving conclusion).

It’s personal

The easy industry rhetoric of interruption, disruption, attention, and engagement disguises the truth that building brands involves a deeply intimate and personal process.

As the philosopher Matthew Crawford reminds us:

Attention is the thing that is most one’s own… in a very real sense this determines what is real for us; what is actually present to our consciousness. Appropriations of our attention are then an especially intimate matter.”

The attention we marketers make claims upon is not ours.

Yet the quantity of things that – from the sublime to the banal – compete and jostle to divert and capture this, our most personal of resources, expands at a rate that is hard to truly grasp.

As Charles Clavey has put it, reviewing Crawford’s book:

From the quotidian – the daily onslaught of emails, texts, tweets, and updates – to the innovative – the use of ambient perfume to market coffee, for instance – the world around us relentlessly colonizes our precious attention.”

Modernity offers us almost no escape from the frackers and hackers of our attention. And we have only just begun to build the internet of things.

Inhabiting an environment ever more highly engineered to distract us and to redirect our attention, there remains but one uncolonised and unmonetised part of our lives. As Jonathan Crary puts it:

The huge portion of our lives that we spend asleep, freed from a morass of simulated needs, subsists as one of the great human affronts to the voraciousness of contemporary capitalism. Sleep is an uncompromising interruption of the theft of time from us by capitalism. Most of the seemingly irreducible necessities of human life – hunger, thirst, sexual desire, and recently the need for friendship – have been remade into commodified or financialized forms. Sleep poses the idea of a human need and interval of time that cannot be colonised and harnessed to a massive engine of profitability, and thus remains an incongruous anomaly and site of crisis in the global present… The stunning, inconceivable reality is that nothing of value can be extracted from it.”

All aboard the hedonic treadmill

Of course the resource of attention in our waking hours is not an infinite one.

As the oft-quoted political scientist Herbert Simon predicted, information consumes attention:

In an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.”

In the name of both efficiency and survival, we’re adapted to allocate our attention to the new and novel.

We are built to focus our finite mental resources on new sights, sounds, thoughts and feelings and to filter out the rest. After all, potential threats and new resources are much more likely to be novel than familiar. Simply put, survival prospects are not good for an animal that is not suspicious of novelty.

However, there is also survival value in letting that initial caution fade if the stimulus actually proves to be non-threatening. What the Polish psychologist Robert Zajonc terms the ‘exposure effect’ occurs because nothing life-threatening follows the repeated exposure of a stimulus. With time and repetition, then, this stimulus eventually becomes a safety signal.

Even the most exciting and novel of stimuli eventually becomes, with repetition, part of life’s wallpaper of familiarity. The author, poet and naturalist Diane Ackerman vividly captures how the new and exciting eventually becomes submerged within that taken-for-granted world:

Novelty will always rivet one’s attention. There is that unique moment where one confronts something new and astonishment begins. Whatever it is, it looms brightly, its edges sharp, its details ravishing… It is a form of revelation, a new sensory litany. But the second time one sees it, the mind says, Oh, that again, another wing walker, another moon landing. And soon, when it’s become commonplace, the brain begins slurring the details, recognising it too quickly, by just a few of its features, it doesn’t have to bother scrutinising it. Then it is lost to astonishment, no longer an extraordinary instance but a generalised piece of the landscape.”

So essential to any species’ survival is this arousal by and adaptation to novelty that infants less than a day old will stare at a new image for about forty-one seconds – and then tune it out when repeated exposures render it familiar.

To combat what Pinker has termed the ‘anaesthetic of the familiar’, the industries of entertainment and marketing are compelled to keep bringing us new sensations.

This is why you get what the Dutch Nobel Prize-winning ethologist and ornithologist Niko Tinbergen dubbed the ‘supernormal stimulus’.

It was Tinbergen’s research into how phoney, exaggerated stimuli could appeal to natural instincts far more than the original stimuli for which those instincts had evolved that led him to coin the term.

In fact Tinbergen’s work demonstrated how dummies could actually exert far more influence than any real, natural stimulus.

For example, he showed how song birds preferred to feed a fake baby bird on a stick if the dummy beak was wider and redder than the chick’s.

He also demonstrated how song birds would abandon their normal, real pale eggs dappled with grey to sit on polka-dot Day-Glo blue dummies so large that they would constantly fall off them.

Of course animals only tend to encounter supernormal stimuli when they are the subjects of experiments.

However, we humans can produce our own.

As Deirdre Barrett puts it in her examination of the contemporary manifestations of supernormal stimuli:

Candy sweeter than any fruit, stuffed animals with eyes wider than any baby, pornography… Instincts arose to call attention to rare necessities; now we let them dictate the manufacture of useless attention-grabbers.”

This is why you get IHOP’s Country Fried Steak & Eggs – an 8oz steak with gravy, two eggs, hash browns and two buttermilk pancakes that provides almost an entire day’s worth of calories, two and a half days’ worth of sodium, and up to two days’ worth of our sugar requirements.

This is why you get social media headlines such as “Disney Princesses Twerking Will Shatter Your Childhood.”

This is why even The Telegraph peddles headlines such as “You’re doing your morning routine wrong.”

This is why politics isn’t about policies but about media-baiting soundbites.

This is why you get the grotesquerie that is Donald Trump.

This is why movies for adults look like movies for children.

This is why Kim Kardashian exists.

This is why the language of marketing has infiltrated the common, everyday discourse.

This is why Guy Debord was moved to argue (even before the internet and its consequences) that:

All of life presents itself as an immense accumulation of spectacles. Everything that was directly lived has moved away into a representation.”

And such is the relentless, never-ending parade of attention-hacking novelty, argues Professor Tomlinson, that our contemporary consumer culture is now characterized by an expectation of ‘delivery’, rather than of real satisfaction:

What I mean by this is that there is probably a widespread disbelief in the capacity of consumer goods to provide ultimate satisfaction; but that there is, none the less, an expectation that the capitalist system will – and should – continue to deliver the goods… We expect consumer goods to be functional or novel or amusing… We expect warranties and after sale service. But in the vast majority of cases these expectations stop short of a conviction that they will, in any profound sense, satisfy our deepest desires. This is not however an obstacle to continued consumption, particularly since it is combined with the happy expectation that something new is always on the way and so it is not necessary – nor does it do – to invest too much into the thing of the moment.”

A question of ethics

So attention is personal – what we attend to defines our reality.

Attention is finite – it is a scarce and thus valuable resource – not just to those who wish to monetise it, but to those to whom it belongs.

And attention is hackable – the world is overpopulated with those skilled in the art of capturing and redirecting attention for their own purposes.

Surely then, we have a responsibility – dare one say, an ethical duty – to the audience.

And to the attention we seek to hack.

We enjoy pouring scorn on the reckless bankers and tax-evading corporations for behaving as if they operated outside society, with no duty to the broader body politic. But the fact of the matter is that much of ad- and marketingland behaves towards people with little more sense of duty, obligation, and responsibility than these popular pariahs.

So let us start by recalling the words of Howard Gossage:

Our first duty is not to the old sales curve, it is to the audience. It is simply not right to treat the audience in this fashion. If we can’t look at it from a broad, ethical point of view, then we ought to look at it personally, to please ourselves – we are all members of the audience too, we are bored or irritated right along with everybody else.”

And yet judging by its output, much of ad- and marketingland holds its audience in something approaching contempt.

With all the grace and social skills of a drunken boor gatecrashing a wedding party, most advertising makes no concession to the fact that it is interrupting something that people have chosen, or opted in, to watch. In return for absolutely nothing, it loudly demands our attention. It’s the marketing equivalent of being robbed at gunpoint.

If you think contempt is too strong a word, just listen to the stream of snark and condescension muttered towards those on the other side of the focus group mirror and upon whose interest and custom our livelihoods and lifestyles ultimately depend.

If you still think contempt is too strong a word, just look at the imagination- and empathy-free wasteland that is called marketing to women. The best we believe we can do for women through advertising is ‘empower’ them. Because unlike men, they’re weak, down-trodden, esteem-challenged creatures, who aren’t yet ready for some jolly good entertainment. That’s for men. Who are empowered.

The fact of the matter is that no insight department or process can be a substitute for genuine, human empathy. And if we must turn to them for a more generous perspective on those we serve, we are probably more fucked than we think we are.


The implications of our First Duty run far and wide.

Tom Goodwin rightly rails against the digital landfill that is thrown our way – and all because in their myopic quest to grab our attention, marketers choose to ignore the truth that the foundation of advertising’s contract with the consumer is value exchange:

I search on Google and find myself on nasty sites like or second tier media websites designed only around selling my eyeballs to advertisers. I feel violated. I visit the BBC News site and find myself unable to see any video news without being forced to waste 30 seconds watching a crudely cut down TV ad. Preroll ads insert themselves midway in articles… Pages take forever to load, swamped by cookies and content I don’t need. I’m led to articles on websites where I’m “welcomed” by welcome screens and where pop-ups increasingly barge their way past browser settings… it’s a disgrace for people who now face their attention being stolen, their data plans killed and their time wasted for things they don’t want.”

If we were to hold ourselves to that First Duty, we would cease the shameless, intrusive, low-value hijacking of attention.

If we were to hold ourselves to that Duty, we would play it straight with consumers, stop passing off self-interest as altruistic, impartial information or entertainment, and stop kidding ourselves as to the ethical integrity of native advertising. For its essence is to make marketing content appear similar to the publication’s look and feel.

And yet we insist on kidding ourselves that we are not in the business of camouflaging our attention hacking. Here, for example, is Advertising Age:

When executed well, with time and care put into ensuring the content is in the right tone and has the right message, native formats blend into vertical streams in a manner that’s much less disruptive to users (while still marked as “sponsored” so as not to be deceptive).”

Only an industry breathing in the fumes of its own inward-looking and self-regarding rhetoric, uncaring about its duty to the audience, could with a straight face suggest that content can both blend in and be clearly marked.

Beyond the sizzle

Better targeted, more thoughtfully personalised, more context-responsive content may help ameliorate some of the worst excesses of thoughtless attention hacking.

But perhaps we can and should go further than all this.

Perhaps we should do more than merely engage in a never-ending arms race in the pursuit of people’s attention.

Perhaps rather than add ever more to the unending torrent of cultural spam that we delight in characterising as ‘content’, we should turn our own attention to things of real, enduring substance.

Indeed, perhaps we should stop asking more of people’s attention, and find ways of giving it back to them.

Rather than use the wonders of technology and digital connections to hack ever further at consumers’ attention and give them ever more things to do, see, click on, download, read, scroll through, interact with, engage with, with ever more competitions to participate in, in-store QR codes to scan, long-form content films to endure, pre-roll advertising to suffer, quizzes to answer, ad campaigns to upload their faces into, and content to co-create… perhaps we should actually find ways of helping consumers save time, and preserve their cognitive resources.

People are working harder. They’re saddled with debt. Or unable to acquire the debt to give them the head start they need. They’re facing uncertain futures. Many are having their futures disrupted into oblivion. And everywhere are the agents of attention hacking.

For the most part they don’t need advertising. What they need is better products. And easier, more enjoyable lives. 

So perhaps more of us should refuse to take advantage of human cognition’s Achilles heel, and cease fuelling the toxic arms race of novelty for the sake of novelty.

We can rail all we want about the polluting effects of marketing’s more egregious attention robbing, but when much of it is merely in the service of polluting products, it’s tantamount to complaining about the quality of the sizzle, and ignoring the quality of the actual sausage.

I leave the last words to Helen Edwards:

What about customer service? You ring your own call centre for the first time in years, navigate the endless bifurcations and hang on limply, listening to mindless music just like your consumers do. What about product quality? You look afresh at your packs, and wonder why they can’t be more user-friendly; you remind yourself of those little formulation compromises. What about fair trading and sustainable sourcing? Don’t even go there.

Suddenly, it all becomes clear. Yes, the fourth wave of content marketing has arrived, but consumers aren’t waving, they’re drowning in an ocean of branded pap, and the interactive lifeline they really want to be thrown is the one whereby companies promptly answer calls in person, keep their promises, make better products and contribute to a better world.

Improve the substance. That’s how competition really works, and it’s what you’ve always known deep down. It will mean big investment, though: in R&D, in a new call centre, in a greener supply chain and in ethnographic research. How do you get that past the board?

Frankly, you have no idea how to solve that one. But at least you have now defined the problem.”

The biggest challenge facing marketing is not how we hack people’s attention.

It’s what we direct that attention towards.



Diane Ackerman, A Natural History Of The Senses

Deirdre Barrett, Supernormal Stimuli: How Primal Urges Overran Their Evolutionary Purpose

Buonomano, D.V., and Merzenich, M.M., ‘Cortical plasticity: from synapses to maps’, Annual Review of Neuroscience, 1998; 21: 149-86.

Charles Clavey, ‘Are you out of your mind?’, LA Review Of Books, 05.10.20

Matthew Crawford, The World Beyond Your Head: How to Flourish in an Age of Distraction

Matt Crenshaw, ‘Is native advertising about to go the way of pop-up ads?’, AdvertisingAge, 04.09.2015

Jonathan Crary, 24/7: Late Capitalism and the Ends of Sleep

Guy Debord, The Society Of The Spectacle

Helen Edwards, ‘Don’t drown your consumers in an ocean of branded pap’, Marketing, 05.11.2014

Andrew Ehrenberg, Neil Barnard, John Scriven, ‘Differentiation or Salience’, Journal of Advertising Research, November/December 1997

Andrew Ehrenberg, ‘Repetitive advertising and the consumer’, Journal of Advertising Research, Vol. 40, No. 6, November/December 2000

Andrew Ehrenberg, ‘What Brand Loyalty Can Tell Us’, Admap, October 2004, Issue 454

Andrew Ehrenberg, Neil Barnard, Rachel Kennedy, Helen Bloom, ‘Brand Advertising As Creative Publicity’, Journal of Advertising Research: Vol. 42, No. 4, July/August 2002

Tom Goodwin, ‘I miss the days of expensive advertising’, 13.08.2015

Kenneth M Heilman, MD, Stephen E. Nadeau, MD, and David Q. Beversdorf, MD. “Creative Innovation: Possible Brain Mechanisms” Neurocase (2003)

William Moran, ‘Brand Presence And The Perceptual Frame’, Journal of Advertising Research, October/November 1990

Jenni Romaniuk, Byron Sharp, Andrew Ehrenberg, ‘Evidence concerning the importance of perceived brand differentiation’, Australasian Marketing Journal, 15 (2), 2007

Jenni Romaniuk, Byron Sharp, ‘Where knowledge of your brand resides: the Pareto share of brand knowledge’, in Report 44 for Corporate Sponsors, 2008, Ehrenberg-Bass Institute for Marketing Science

Jenni Romaniuk, Byron Sharp, ‘Using known patterns in image data to determine brand positioning’, International Journal of Market Research, Vol. 42, No. 2, 2000

Jenni Romaniuk, Andrew Ehrenberg, ‘Do brands lack personality?’, Report 14 for Corporate Members, March 2003

Jenni Romaniuk, Byron Sharp, ‘Conceptualizing and measuring brand salience’, Marketing Theory, Volume 4(4), 2004

Daniel Schacter, The Seven Sins of Memory

Daniel Schacter, Searching for Memory

Byron Sharp, How Brands Grow: What Marketers Don’t Know

John Tomlinson, The Culture Of Speed: The Coming Of Immediacy

Julian Vieceli, Robin Shaw, ‘A Model of Brand Salience’, in Mark Uncles, ed., Perspectives on Brand Management, 2011

Judith Williamson, Decoding Advertisements: Ideology and Meaning in Advertising


Your brand is software, not a person: A new metaphor for changed times


“The price of metaphor is eternal vigilance”

Norbert Wiener

It is a truth universally acknowledged that consumers find it natural to imbue brands with human personality characteristics, such as ‘honest’, ‘optimistic’, ‘cool’ or ‘tough’. 

That brands are like human beings, that they have differentiating personality traits, and that consumers have relationships with them remains, still, largely unchallenged dogma in marketing circles.

And yet – the legend of Ernest Dichter, and the acres of subsequent academic theorizing notwithstanding – the fact remains that personality traits are poor at differentiating brands, within any given category different brands do not attract different kinds of buyers, and people are not that attached to brands.

Surely it is time to retire the brand-as-person metaphor.

Indeed perhaps it’s time to find a metaphor (if we must have one) more suited to our times. For while much of our inherited brand theory baggage values the intangible, as Guy Murphy has noted:

The democracy of information has allowed consumers to focus hard on the more rational and ‘real’ aspects of the product itself. Social commentary and ratings have particularly brought tangible product performance itself to the fore. The most emotionally rich brands can be undermined by a one-star product rating. This is as true for products as it is for the companies that make them… There is a shift occurring from adding intangible value to adding tangible value. Instead of adding emotion to function, brands are beginning to add function to function. We don’t have to play out a brand as a metaphor any more, when we can do it for real. Brands will seem more valuable not just because they feel better but because they work better, or serve you better. This is not a denial of the emotional component of brands, it is just a different way of generating it. It is not about the power of associations, it is about the power of reality”

In such a world, perhaps we should be seeking a metaphor and perspective that more consciously attempts to reconcile the invisible, mental dimensions of a brand’s existence with its practical and concrete manifestations in the world.

And in a world which, as Marc Andreessen has argued, is being eaten by software, we could perhaps do worse than think of brands as software.

After all, as he puts it:

In many industries, new software ideas will result in the rise of new Silicon Valley-style start-ups that invade existing industries with impunity. Over the next 10 years, the battles between incumbents and software-powered insurgents will be epic. Joseph Schumpeter, the economist who coined the term “creative destruction,” would be proud.”

No, a brand is not a person.  But it is software. Both of the mind. And in the world.


This much we know. Brands are memory structures in the mind. They are the sum total of all our memories, encounters, impressions and associations connected with a brand.

And consciously or unconsciously accessed, they work to make a brand thought of in purchase and consumption occasions.

In this way, they operate as heuristics, rules of thumb that by-pass the necessity to conduct an elaborate cost-benefit analysis of all the options available every time we are faced with choice.

Viewed through this lens, brands are decision-making software.
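Since the heuristic is the load-bearing part of this metaphor, a minimal sketch may help make it concrete. Everything below is invented for illustration – the brand names, recall scores, and attribute weights are assumptions, not data – but it shows the difference between choosing by salience (whatever comes to mind for the occasion) and choosing by exhaustive comparison.

```python
# Illustrative sketch only: hypothetical brands, scores, and weights.

# Strength of association between each brand and a buying occasion ("grab a snack").
recall_strength = {"BrandA": 0.9, "BrandB": 0.4, "BrandC": 0.1}

# The attribute data a fully deliberative comparison would need for every option.
attributes = {
    "BrandA": {"price": 0.5, "taste": 0.7, "availability": 0.9},
    "BrandB": {"price": 0.9, "taste": 0.7, "availability": 0.5},
    "BrandC": {"price": 0.8, "taste": 0.9, "availability": 0.7},
}

def choose_by_recall(strengths, threshold=0.5):
    """Heuristic route: buy whichever brand comes to mind strongly enough."""
    candidates = {brand: s for brand, s in strengths.items() if s >= threshold}
    return max(candidates, key=candidates.get) if candidates else None

def choose_by_analysis(attrs):
    """Deliberative route: weigh every attribute of every option, then pick the best."""
    return max(attrs, key=lambda brand: sum(attrs[brand].values()))

print(choose_by_recall(recall_strength))   # BrandA - decided without comparing attributes
print(choose_by_analysis(attributes))      # BrandC - what exhaustive cost-benefit analysis returns
```

The point of the sketch is simply that the first route never consults the attribute table at all: the brand that owns the memory structure gets bought before the "better" option is even evaluated.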


This, of course, has always been the case – it is the oldest and indeed original function of branding. But today we are subcontracting our mental processes to software.

For now we have technology that can remember, find, anticipate, recommend, remind, analyse, compute, manage, measure, monitor for us. And in the wake of this technological – and with it, cultural – upheaval, brands are evolving into software that resides in the world.

Brands can now remember what we like, and what we bought.

They can anticipate when we need to restock, repurchase, or renew.

They can suggest purchases, content, and experiences we will probably like.

They can compare and recommend purchase options.

They can respond to our service, upgrade, and replacement needs.

And of course as brands become more like software, unshackled from the constraints of the physical world, they and their functions can inevitably flow through our lives with ever greater ease.

So the brand is software. Of the mind. And in the world.
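To make the "software in the world" half of that claim concrete, here is a speculative sketch of those capabilities as a toy interface. All the names, fields, and rules below are invented for illustration; no real brand system or platform API is implied.

```python
# Speculative sketch only: a toy "brand as service" with the capabilities listed above.
from datetime import date, timedelta

class BrandService:
    def __init__(self, repurchase_cycle_days=30):
        self.purchase_history = []                       # remembers what we bought, and when
        self.repurchase_cycle = timedelta(days=repurchase_cycle_days)  # assumed replenishment rhythm

    def remember(self, item, when):
        """Record a purchase so later interactions can build on it."""
        self.purchase_history.append((item, when))

    def anticipate(self, today):
        """Flag items probably due for restocking, based on the assumed repurchase cycle."""
        return [item for item, when in self.purchase_history
                if today - when >= self.repurchase_cycle]

    def recommend(self):
        """Crude 'you will probably like this': the most frequently bought item."""
        if not self.purchase_history:
            return None
        items = [item for item, _ in self.purchase_history]
        return max(set(items), key=items.count)

service = BrandService()
service.remember("coffee beans", date(2015, 7, 1))
print(service.anticipate(date(2015, 8, 15)))   # ['coffee beans'] - time to restock
print(service.recommend())                     # 'coffee beans'
```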


The brand-as-software metaphor certainly feels better aligned than the brand-as-person metaphor to the real role of branding in consumer decision-making.

And it certainly appears better able to encourage us to take advantage of the new opportunities technology opens up for us.

Indeed the brand-as-software metaphor is much less of a metaphor, much less a mere figure of speech, than the brand-as-person metaphor. It better reflects how brands actually operate, in the mind, and in the world.


So what might thinking of brands as software actually encourage us to do differently?

Perhaps it will encourage us to think of brands as fluid, adaptable, responsive, upgradable, and permeating.

Perhaps it will encourage us to think how utility (not just storytelling) can create meaning.

Perhaps it might encourage more of a rapprochement and productive relationship between those who make products and services, and those who tell stories.

Perhaps we might demand that brands be just as useful in the mind as they are in the world.

Perhaps we will grant equal status to the creation of memory structures, and to the creation of utility in the world, rather than set them in competition with each other.

And perhaps thinking of brands as an external manifestation of human mental processes will encourage us to think of software in human terms, rather than think of humans in machine terms.


So far, so good, possibly. But we cannot talk about the brand-as-software without talking about the OS or platform. If brands are indeed software, then what do they run on?

In a multi-platform, multi-device, multi-OS world, the brand-as-software must of course be able to run effortlessly on, by way of example, both iOS and Android. If brands wish to be part of the fabric of the world, then they must be able to adapt to and run on all the dominant platforms out there. And that demands a new degree of adaptability of what we build.

But there is a more important OS the brand-as-software must run on.

The human OS.

That sounds easy and obvious, but the fact of the matter is that most brand builders don’t give enough of a damn about creating for the human platform. They ignore being interesting and pursue difference. They fail to work back from human drives, needs, wants, and interests to strategy, and instead insist on working forwards from brand and product. They either fail to reinforce and sustain long-term memory structures, or they fail to refresh them.

Brands must work on the human OS, or fail. For the human OS always takes precedence over platform-level OSs, and is the one that we should all be designing for. Always.



Special thanks to Ben Malbon for prompting me to ask some hard but good questions.

Guy Murphy, ‘Brands in the digital age: The opt-in age of brands’, Admap, December 2013

Marc Andreessen, ‘Why software is eating the world’, The Wall Street Journal, August 20, 2011

All watched over by machines of loving grace? Notes of disquiet and disbelief




“I like to think (and the sooner the better!) of a cybernetic meadow where mammals and computers live together in mutually programming harmony… and all watched over by machines of loving grace.”

Richard Brautigan, ‘All Watched Over By Machines Of Loving Grace’

“How is it that when I speak of belief, I become aware always of a shadow, the shadow of disbelief haunting my belief?”

Graham Greene, Monsignor Quixote

“Anyone who makes up his mind to evade the uncertainty of belief will have to experience the uncertainty of unbelief”

Pope Benedict XVI, Introduction to Christianity



Preface: What we build will shape us

Innovation is never a neutral quantity. Technologies and artifacts are shaped by the values, priorities, and assumptions of their developers, and often their users as well. Of course, many technologies are designed or refined with particular goals in mind, but here I am referring to a different and less deliberate shaping process, through which artifacts come to reflect and reproduce beliefs about the types of functions and ways of living and working that are important.” 

Julie Cohen, ‘What is privacy for’

We shape our buildings, and afterwards our buildings shape us.” 

Winston Churchill

“Our artifacts organize the world for us, subtly shaping the ways that we make sense of it. Over time we come to perceive the world through the lenses that our artifacts create. For example, an automobile club map and the step-by-step instructions uttered by an in-car GPS or the iPhone’s Siri interface represent local geography in radically different ways. Intriguingly, there is evidence to suggest that over time the processes of mediation and configuration become deeply encoded in our neurobiology, producing individuals who are literally wired to think and act differently” 

Julie Cohen, ‘What is privacy for’

As goes adtech, so goes our social code” 

John Battelle, ‘Why the banner is heroic and adtech is our greatest artifact’

So what is the impact of technological change, of our emergent ‘technium’, as Kevin Kelly has dubbed it – the ecology of ‘technological species’ and their complex interactions with us, and with each other? And what are the effects of what we are building on how we live, and indeed on who we are?

I have chosen to rely upon the observations, analysis and opinion of others. Caught between a head filled with sci-fi wonders since my teens and a tendency to worry, I make little commentary but provide extracts instead. This is as much a personal exploration as anything else.

So let’s start with the very good news.


We’re building a new infrastructure

What we today call “adtech” will tomorrow become the worldwide real-time processing layer driving much of society’s transactions.”

John Battelle, ‘Why the banner is heroic and adtech is our greatest artifact’ 

Data embodies a priceless collection of experience from which to learn. Every medical procedure, credit application, Facebook post, movie recommendation, fraudulent act, spammy e-mail, and purchase of any kind – each positive or negative outcome, each successful or failed sales call, each incident, event, and transaction – is encoded as data and warehoused. The glut grows by an estimated 2.5 quintillion bytes per day… And so a veritable Big Bang has set off, delivering an epic sea of raw materials, a plethora of examples so great in number, only a computer could manage to learn from them. Used correctly, computers avidly soak up this ocean like a sponge… The process of machines learning from data unleashes the power of this exploding resource. It uncovers what drives people and the actions they take – what makes it tick and how the world works. With the new knowledge gained, prediction is possible.”

Eric Siegel, Predictive Analytics: The power to predict who will click, buy, lie, or die

Think about what programmatic adtech makes possible. An individual requests a piece of content through a link or an action (like touching something on a mobile device). In milliseconds, scores of agents execute thousands of calculations based on hundreds of parameters, all looking to market-price the value of that request and deliver a personalized response. This happens millions of times a second, representing hundreds of millions, if not billions, of computing cycles each second. What’s most stunning about this system is that it’s tuned to each discrete individual – every single request/response loop is unique, based on the data associated with each individual… Every time you turn up your thermostat, this infrastructure will engage, determining in real time the most efficient response to your heating needs. Each time you walk into a doctor’s office, the same kind of system could be triggered to determine what information should appear on your health care provider’s screen, and on yours, and how best payment should be made (or insurance claims filed). Every retail store you visit, every automobile you drive (or are driven by), every single interaction of value in this world can and will become data that interacts with this programmatic infrastructure.”

John Battelle, ‘Why the banner is heroic and adtech is our greatest artifact’
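For readers who want the loop Battelle describes made concrete, here is a deliberately toy sketch of one request being priced by competing bidding agents. It is not any real exchange's protocol: the signals, the bidders, and the simple second-price rule are all assumptions made for illustration.

```python
# Toy illustration of a programmatic request/response loop; all data is invented.

request = {
    "user_id": "u123",
    "context": "news_article",
    "signals": {"recent_search": "running shoes"},
}

def sportswear_bidder(req):
    """Values the impression highly when the signals suggest interest in its category."""
    interested = req["signals"].get("recent_search") == "running shoes"
    return (2.40 if interested else 0.20, "Trainer ad, personalised to the recent search")

def generic_bidder(req):
    """Bids a flat price regardless of who is asking."""
    return (0.80, "Untargeted brand ad")

def run_auction(req, bidders):
    """Collect bids and award the impression to the highest bidder at the second price."""
    bids = sorted((bidder(req) for bidder in bidders), key=lambda bid: bid[0], reverse=True)
    clearing_price = bids[1][0] if len(bids) > 1 else bids[0][0]
    return bids[0][1], clearing_price

creative, price = run_auction(request, [sportswear_bidder, generic_bidder])
print(creative, "- cleared at", price)   # the personalised response wins, paying the second price
```

Multiply that loop by millions of requests a second, under latency budgets measured in milliseconds, and you have the processing layer Battelle is pointing at.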

What we today call ‘adtech’ will tomorrow become the worldwide real-time processing layer driving much of society’s transactions. That layer deserves to be named as perhaps the most important artifact extant today.”

John Battelle, ‘Why the banner is heroic and adtech is our greatest artifact’


We’re building a better world

The web has transformed the way we work, play and interact with each other. Some of health’s greatest problems are being rethought. Millions of so-called citizen scientists are plotting cancer gene patterns via online games. Early intervention in dementia is more common after the success of an online test for potential patients. Education is being opened up on a global scale through the use of massive open online courses – from Khan Academy’s tutorials to Coursera’s degrees. All industries are being disrupted. Farmers in Ghana are saving time and money by using their smartphones to trade their products before the long walk to market begins and underprivileged women in South Africa are breaking out of poverty after training that means they can answer the tech problems of US customers. Even the animal videos posted online – much mocked by some – are now enabling scientists to gather meaningful data from around the world about animal relationships that would never have been accessible before.”

Martha Lane Fox

Utopia confronts reality not with a measured assessment of the possibilities of change but with the demand for change. This is the way the world should be. It refuses to accept current definitions of the possible because it knows these to be a part of the reality it seeks to change. In its guise as utopia, by the very force of its imaginative presentation, it energizes reality, infusing it with ‘the power of the new.’”

Krishan Kumar, ‘Utopianism’

The critics of early computers capitalized on a common fear: that a Big Brother brain would watch over us and control us. What we know now of our own brains is that they too are only networks of mini-minds, a society of dumber minds linked together, and that when we peer into them deeply we find that there is no “I” in charge. Not only does a central-command economy not work; a central-command brain won’t either.

In its stead, we can make a nation of personal computers, a country of decentralized nodes of governance and thought. Almost every type of large-scale governance we can find, from the body of a giraffe, to the energy-regulation in a tidal marsh, to the temperature regulation of a beehive, to the flow of traffic on the Internet, resolves into a swarmy distributed net of autonomous units and heterogeneous parts.

No one has been more wrong about computerization than George Orwell in 1984. So far, nearly everything about the actual possibility-space that computers have created indicates they are not the beginning of authority but its end.

In the process of connecting everything to everything, computers elevate the power of the small player. They make room for the different, and they reward small innovations. Instead of enforcing uniformity, they promote heterogeneity and autonomy. Instead of sucking the soul from human bodies, turning computer-users into an army of dull clones, networked computers — by reflecting the networked nature of our own brains and bodies — encourage the humanism of their users. Because they have taken on the flexibility, adaptability, and self-connecting governance of organic systems, we become more human, not less so, when we use them.”

Kevin Kelly, ‘The electronic hive: Embrace it’

I like to think (and the sooner the better!) of a cybernetic meadow where mammals and computers live together in mutually programming harmony like pure water touching clear sky… where we are free of our labors and joined back to nature, returned to our mammal brothers and sisters, and all watched over by machines of loving grace.”

Richard Brautigan, ‘All Watched Over By Machines Of Loving Grace’

There is only one time in the history of each planet when its inhabitants first wire up its innumerable parts to make one large Machine. Later that Machine may run faster, but there is only one time when it is born.

You and I are alive at this moment.

We should marvel, but people alive at such times usually don’t. Every few centuries, the steady march of change meets a discontinuity, and history hinges on that moment. We look back on those pivotal eras and wonder what it would have been like to be alive then. Confucius, Zoroaster, Buddha, and the latter Jewish patriarchs lived in the same historical era, an inflection point known as the axial age of religion. Few world religions were born after this time. Similarly, the great personalities converging upon the American Revolution and the geniuses who commingled during the invention of modern science in the 17th century mark additional axial phases in the short history of our civilization.

Three thousand years from now, when keen minds review the past, I believe that our ancient time, here at the cusp of the third millennium, will be seen as another such era. In the years roughly coincidental with the Netscape IPO, humans began animating inert objects with tiny slivers of intelligence, connecting them into a global field, and linking their own minds into a single thing. This will be recognized as the largest, most complex, and most surprising event on the planet. Weaving nerves out of glass and radio waves, our species began wiring up all regions, all processes, all facts and notions into a grand network. From this embryonic neural net was born a collaborative interface for our civilization, a sensing, cognitive device with power that exceeded any previous invention. The Machine provided a new way of thinking (perfect search, total recall) and a new mind for an old species. It was the Beginning.”

Kevin Kelly, ‘We are the web’


The necessity of doubt

This is Hieronymus Bosch’s extraordinary painting The Garden of Earthly Delights:


The image below is a detail.

No strangers to the perils of skating, it was the Dutch who gave us the expression “skating on thin ice”. What appears to be at first glance a boat or sled is actually a giant skate. We’re looking at a tiny man on an oversized ice skate.


Bosch was fond of using oversized objects to represent objects in man’s imagination. Our skating man here thinks he is the one controlling his movement and the direction he’s going in. But his vehicle is imaginary, a figment of his fantasies about his own ability to do things. He’s riding forces much greater than he is. And there in the upper left we witness his inevitable fall through the ice.

For despite all the powers and joys technology is yielding today, and despite all the breathless prophecies of the frictionless future to come, there is a very real possibility that it is our over-sized ice skate. And that while we think we are in control of it, we are in truth riding forces we do not fully comprehend, and which we cannot fully control.

No doubt there will be readers smarting already with righteous indignation. But the fact of the matter is that if we are to make any real, intelligent progress then criticism must be entertained. Technology is a tool, not an article of faith. And to question it is not an act of apostasy.

Room must be allowed for doubt and disquiet. For as the writer and social critic Lillian Smith said, “Faith and doubt, both are needed, not as antagonists but working side by side, to take us around the unknown curve.”

So let’s now exercise a little doubt, try to peer around the unknown curve, and speculate about what potential scenarios and options might await us.


Disquiet #1: It’s an infrastructure of surveillance

Networked information technologies enable surveillance to become modulation. Surveillance may be defined generically as attention that is purposeful, routine, systematic, and focused. Networked information technologies enable surveillant attention to become continuous, pervasively distributed, and persistent.”

Julie Cohen, ‘What is privacy for’

Spying on you is… the official primary business of the information economy”

Jaron Lanier, ‘Who Owns The Future?’

From the cash-register scanners in a million supermarkets, videostores, and pharmacies, there already pours a steady flow of statistical data about customers and their purchases, ready to be correlated, helping companies serve us more efficiently – or else giving them an unfair advantage, knowing vastly more about us than we do about them. Soon, computers will hold financial and educational records, legal documents, and medical analyses that parse you all the way down to your genes. Any of this might be accessed by strangers without your knowledge or even against your stated will.”

David Brin, ‘The transparent society’


Disquiet #2: There is more to come

It’s about to get worse, though. Companies such as Google may know more about your personal interests than your spouse, but so far it’s been limited by the fact that these companies only see computer data. And even though your computer habits are increasingly being linked to your offline behaviour, it’s still only behaviour that involves computers.

The Internet of Things refers to a world where much more than our computers and cell phones is internet-enabled. Soon there will be internet-connected modules on our cars and home appliances. Internet-enabled medical devices will collect real-time health data about us. There’ll be internet-connected tags on our clothing. In its extreme, everything can be connected to the internet. It’s really just a matter of time, as these self-powered wireless-enabled computers become smaller and cheaper.

“Lots has been written about the ‘Internet of Things’ and how it will change society for the better. It’s true that it will make a lot of wonderful things possible, but the “Internet of Things” will also allow for an even greater amount of surveillance than there is today.”

Bruce Schneier, ‘Will giving the internet eyes and ears mean the end of privacy?’

Moore’s Law has also made mass automated surveillance dirt cheap. Government surveillance that used to cost millions of dollars can now be carried out for a fraction of that. We have yet to fully grasp the implications of cheap surveillance. The only thing that is certain is that we will be seeing a great deal more surveillance—of ordinary citizens, potential terrorists, and heads of state—and that it will have major consequences.” 

Bill Davidow, ‘With Great Computing Power Comes Great Surveillance: The dark side of Moore’s Law’

Items of interest will be located, identified, monitored, and remotely controlled through technologies such as radio-frequency identification, sensor networks, tiny embedded servers, and energy harvesters — all connected to the next-generation internet using abundant, low-cost, and high-power computing.” 

Former director of the CIA, David Petraeus


Disquiet #3: The surveillance-industrial complex emerges

You should care about surveillance because once the system for surveillance is built into the networks and the phones, bad guys (or dirty cops) can use it to attack you. In Greece, someone used the police back door on the national phone company’s switches to listen in on the prime minister during the 2005 Olympic bid. Chinese hackers used Google’s lawful interception back door to hack Gmail and figure out who dissidents talked to. Our communications systems are more secure if they’re designed to keep everyone out – and adding a single back door to them blows their security models up. You can’t be a little bit pregnant, and the computers in your pocket and on your desk and in your walls can’t be a little bit insecure. Once they’re designed for surveillance, anyone who can bribe or impersonate a cop can access them.”

Cory Doctorow, ‘The NSA’s Prism: Why we should care’ 

Information gathered by private firms is, as demonstrated by the experience of banks and credit card companies, again and again processed by the government. The boundaries between the public and the private sectors are thus blurred. Personal data, once stored, tend to become a flexible, common base of information”

Spiros Simitis, ‘Reviewing privacy in an information society’, University of Pennsylvania Law Review, Vol. 135, 1987

Federal agencies collect and use personal information for various purposes, both directly from individuals and from other sources, including information resellers–companies that amass and sell data from many sources.”

U.S. Government Accountability Office, ‘Personal Information: Agency and Reseller Adherence to Key Privacy Principles’, April 2006

In fiscal year 2005, the Departments of Justice, Homeland Security, and State and the Social Security Administration reported that they used personal information obtained from resellers for a variety of purposes. Components of the Department of Justice (the largest user of resellers) used such information in performing criminal investigations, locating witnesses and fugitives, researching assets held by individuals of interest, and detecting prescription drug fraud. The Department of Homeland Security used reseller information for immigration fraud detection and border screening programs. Uses by the Social Security Administration and the Department of State were to prevent and detect fraud, verify identity, and determine eligibility for benefits. The agencies spent approximately $30 million on contractual arrangements with resellers that enabled the acquisition and use of such information. About 91 percent of the planned fiscal year 2005 spending was for law enforcement (69 percent) or counterterrorism (22 percent).”

U.S. Government Accountability Office, ‘Personal Information: Agency and Reseller Adherence to Key Privacy Principles’, April 2006

The major information resellers that do business with the federal agencies we reviewed have practices in place to protect privacy, but these measures are not fully consistent with the Fair Information Practices. For example, the principles that the collection and use of personal information should be limited and its intended use specified are largely at odds with the nature of the information reseller business, which presupposes that personal information can be made available to multiple customers and for multiple purposes. Resellers said they believe it is not appropriate for them to fully adhere to these principles because they do not obtain their information directly from individuals.”

U.S. Government Accountability Office, ‘Personal Information: Agency and Reseller Adherence to Key Privacy Principles’, April 2006

Resellers generally limit the extent to which individuals can gain access to personal information held about themselves, as well as the extent to which inaccurate information contained in their databases can be corrected or deleted. Agency practices for handling personal information acquired from information resellers did not always fully reflect the Fair Information Practices. That is, some of these principles were mirrored in agency practices, but for others, agency practices were uneven. For example, although agencies issued public notices on information collections, these did not always notify the public that information resellers were among the sources to be used. This practice is not consistent with the principle that individuals should be informed about privacy policies and the collection of information.”

U.S. Government Accountability Office, ‘Personal Information: Agency and Reseller Adherence to Key Privacy Principles’, April 2006


Disquiet #4: Resistance is futile

Even more importantly, dysfunction in privacy markets has a dynamic aspect. Over time, people can be expected to (over)disclose more and more information, both because they have become inured to disclosure and because the equilibrium in the marketplace has tipped toward disclosure as a condition of market entry.”

Julie Cohen, ‘Irrational Privacy?’

It is already far too late to prevent the invasion of cameras and databases. The djinn cannot be crammed back into the bottle. No matter how many laws are passed, it will prove quite impossible to legislate away the new tools and techniques. They are here to stay. Light is going to shine into every aspect of our lives… Oh, we can try. We might agitate, demonstrate, legislate. But in rushing to pass so-called privacy laws, we will not succeed in preventing hidden eyes from peering into our lives. The devices will get tinier, more mobile, and more clever. In software form, they will cruise the data highways. The rich, the powerful, police agencies, and a technologically skilled élite will always be able to find out whatever they want to know about you and me…”

David Brin, ‘The transparent society’

So far, most European politicians have reached for the low-hanging fruit – law – thinking that if only they can better regulate American companies – for example, by forcing them to disclose how much data and when they share with NSA – this problem will go away. This is a rather short-sighted, naïve view that reduces a gigantic philosophical problem – the future of privacy – to seemingly manageable size of data retention directives. If only things were that simple! Our current predicaments start at the level of ideology, not bad policies or their poor implementation. This is not to oppose more regulation of technology companies – Europe should have done this a decade ago instead of getting caught in the heady rhetoric of ‘cloud computing’ – but only to point out that the task ahead is far more intellectually demanding.”

Evgeny Morozov, ‘The price of hypocrisy’


Disquiet #5: We accept the Faustian bargain

We choose to do it because of the convenience, redundancy, automation, and shareability. We like it when we can access our e-mail anywhere, from any computer. We like it when we can restore our contact lists after we’ve lost our phones. We want our calendar entries to automatically appear on all of our devices. These cloud storage sites do a better job of backing up our photos and files than we would manage by ourselves; Apple does a great job keeping malware out of its iPhone apps store.”

Bruce Schneier, ‘When it comes to security, we’re back to feudalism’


Disquiet #6: We accept digital feudalism

They tether us like serfs; just try to take data from one digital lord to another. Ultimately, they will always act in their own self-interest, as companies do when they mine our data in order to sell more advertising and make more money. These companies own us, so they can sell us off – again, like serfs – to rival lords … or turn us in to the authorities.”

Bruce Schneier, ‘When it comes to security, we’re back to feudalism’


Disquiet #7: Pawns of the adjustment bureau

Networked information technologies enable surveillant attention to become continuous, pervasively distributed, and persistent. This in turn produces modulation: a set of processes in which the quality and content of surveillant attention is continually modified according to the subject’s own behavior, sometimes in response to inputs from the subject but according to logics that ultimately are outside the subject’s control.”

Julie Cohen, ‘What is privacy for’

Now is Google’s attempt at becoming the real time interface to our lives – moving well beyond the siloed confines of “search” and into the far more ambitious world of “experience.” As in – every experience one has could well be lit by data delivered through Google Now. Google knows that this moment – the moment of our lives becoming data – is happening now, and the company is very, very focused on seizing it.”

John Battelle, ‘Google Now: The tip of a very long spear’

In the modulated society, surveillance is not heavy-handed; it is ordinary, and its ordinariness lends it extraordinary power. The surveillant assemblages of informational capitalism do not threaten the ‘normalized soul training’ of the Orwellian nightmare. They beckon with seductive appeal. Individual citizen-consumers willingly and actively participate in processes of modulation, seeking the benefits that increased personalization can bring. For favored consumers, these may include price discounts, enhanced products and services, more convenient access to resources, and heightened social status. Within surveillant assemblages, patterns of information flow are accompanied by discourses about why the patterns are natural and beneficial, and those discourses foster widespread internalization of the new norms of information flow.”

Julie Cohen, ‘What is privacy for’

Modulation is a mode of privacy invasion, but it is also a mode of knowledge production designed to produce a particular way of knowing and a mode of governance designed to produce a particular kind of subject. Its purpose is to produce tractable, predictable citizen-consumers whose preferred modes of self-determination play out along predictable and profit-generating trajectories. Yet to speak of networked processes of surveillance and modulation in the industrial-era vernacular, as systems for “manufacturing consent” would be too crude. Rather, in a much more subtle process of continual feedback, stimuli are tailored to play to existing inclinations, nudging them in directions that align with profit-maximizing goals. So too with political inclinations; particularly as search and social networking become more seamlessly integrated, networked citizen-consumers move within personalized “filter bubbles” that conform the information environment to their political and ideological commitments. This is conducive to identifying and targeting particular political constituencies, but not necessarily to fostering political dialogue among diverse constituencies in ways that might enable them to find common ground.”

Julie Cohen, ‘What is privacy for’

Habits, activities, and preferences are compiled, registered, and retrieved to facilitate better adjustment, not to improve the individual’s capacity to act and decide. Whatever the original incentive for computerization may have been, processing increasingly appears as the ideal means to adapt an individual to a predetermined, standardized behavior that aims at the highest possible degree of compliance with the model patient, consumer, taxpayer, employee, or citizen. Furthermore, interactive systems do not, despite all contrary assertions, restore a long lost individuality by correcting the effects of mass production in a mass society. On the contrary, the telemetric integration forces the individual once more into a preset scheme. The media supplier dictates the conditions under which the communication takes place, fixes the possible subjects of the dialogue, and, due to the personal data collected, is in an increasingly better position to influence the subscriber’s behavior. Interactive systems, therefore, suggest individual behaviors where in fact no more than stereotyped reactions occur.”

Spiros Simitis, ‘Reviewing privacy in an information society’, University of Pennsylvania Law Review, Vol. 135, 1987

In essence, information-rich democracies have reached a point where they want to try to solve public problems without having to explain or justify themselves to citizens. Instead, they can simply appeal to our own self-interest—and they know enough about us to engineer a perfect, highly personalized, irresistible nudge.”

Evgeny Morozov, ‘The real privacy problem’

A non-interpretable process might follow from a data-mining analysis which is not explainable in human language. Here, the software makes its selection decisions based upon multiple variables (even thousands) … It would be difficult for the government to provide a detailed response when asked why an individual was singled out to receive differentiated treatment by an automated recommendation system. The most the government could say is that this is what the algorithm found based on previous cases.”

Tal Zarsky, ‘Transparent Predictions’

“While much of the world’s concern has been focused on NSA spying, I believe the greatest threat to my freedom will result from my being placed in a virtual algorithmic prison. Those algorithmic predictions could land me on no-fly lists or target me for government audits. They could be used to deny me loans and credit, screen my job applications and scan LinkedIn to determine my suitability for a job. They could be used by potential employers to get a picture of my health. They could predict whether I will commit a crime or am likely to use addictive substances, and determine my eligibility for automobile and life insurance. They could be used by retirement communities to determine if I will be a profitable resident, and employed by colleges as part of the admissions process. Especially disturbing is the notion that once you become an algorithmic prisoner, it is very difficult to get pardoned. Ask anyone who has tried to get off a no-fly list or correct a mistake on a credit report.”

Bill Davidow, ‘With Great Computing Power Comes Great Surveillance’

It is all too easy to forget that ‘free’ inevitably means that someone will be deciding how you live.”

Jaron Lanier, Who owns the future?


Disquiet #8: The threat to democracy

By these increasingly ordinary processes, both public and private regimes of surveillance and modulation diminish the capacity for democratic self-government.”

Julie Cohen, ‘What is privacy for’

Citizens of the modulated society are not the same citizens that the liberal democratic political tradition assumes, nor do their modulated preferences even approximately resemble the independent decisions, formed through robust and open debate, that liberal democracy requires in order to sustain and perfect itself. The modulated society is the consummate social and intellectual rheostat, continually adjusting the information environment to each individual’s comfort level. Liberal democratic citizenship requires a certain amount of discomfort—enough to motivate citizens to pursue improvements in the realization of political and social ideals. The modulated citizenry lacks the wherewithal and perhaps even the desire to practice this sort of citizenship.  If this sounds like science fiction, it shouldn’t. Like the liberal self, liberal democracy has always been an ideal to be pursued and approximated. A polity’s ability to approximate liberal democracy has both institutional and material preconditions. In the generations following the framing of the U.S. Constitution, those who sought to build a functioning liberal democracy had to contend with the gulf between liberalism’s aspirations to egalitarianism and the concentration of political power in an entitled minority of white male property- and slaveowners. In the generations to come, those who seek to maintain a functioning liberal democracy will need to contend with the gulf between liberalism’s aspirations to self-government by an informed and vigilant citizenry and the relatively blunted capacities of a modulated citizenry.”

Julie Cohen, ‘What is privacy for’

For if we are observed in all matters, we are constantly under threat of correction, judgment, criticism, even plagiarism of our own uniqueness. We become children, fettered under watchful eyes, constantly fearful that—either now or in the uncertain future—patterns we leave behind will be brought back to implicate us, by whatever authority has now become focused upon our once-private and innocent acts. We lose our individuality, because everything we do is observable and recordable.”

Bruce Schneier, ‘The eternal value of privacy’

Networked information technologies mediate our experiences of the world in ways directly related to both the practice of citizenship and the capacity for citizenship, and so configure citizens as directly as, or even more directly than, institutions do. The practice of citizenship requires access to information and access to the various communities in which citizens claim membership. In the networked information society, those experiences are mediated by search engines, social networking platforms, and content formats. Search engines filter and rank search results, tailoring both the results and the accompanying advertising to what is known about the searcher and prioritizing results in ways that reflect popularity and advertising payments. Social networking platforms filter and systematize social and professional relationships according to their own logics. Content formats determine the material conditions of access to information—for example, whether a video file can be copied or manipulated, or whether a news forum permits reader comments. Each set of processes structures the practice of citizenship, and also subtly molds network users’ understanding of the surrounding world. To an increasing degree, then, the capacity for democratic self-government is defined in part by what those technologies and other widely used technologies allow, and by exactly how they allow it. The resulting changes in our political culture may be more profound and permanent than we have allowed ourselves to imagine.”

Julie Cohen, ‘What is privacy for’

Thanks to smartphones or Google Glass, we can now be pinged whenever we are about to do something stupid, unhealthy, or unsound. We wouldn’t necessarily need to know why the action would be wrong: the system’s algorithms do the moral calculus on their own. Citizens take on the role of information machines that feed the techno-bureaucratic complex with our data. And why wouldn’t we, if we are promised slimmer waistlines, cleaner air, or longer (and safer) lives in return? This logic of preemption is not different from that of the NSA in its fight against terror: let’s prevent problems rather than deal with their consequences. Even if we tie the hands of the NSA—by some combination of better oversight, stricter rules on data access, or stronger and friendlier encryption technologies—the data hunger of other state institutions would remain. They will justify it. On issues like obesity or climate change—where the policy makers are quick to add that we are facing a ticking-bomb scenario—they will say a little deficit of democracy can go a long way.”

Evgeny Morozov, ‘The real privacy problem’


Disquiet #9: Human capital is diminished

An environment that was always and everywhere congenial to the straightaway execution of our impulsions would set a term to growth as surely as one always hostile would irritate and destroy. Impulsion forever boosted on its forward way would run its course thoughtless, and dead to emotion. For it would not have to give an account of itself in terms of the things it encounters, and hence they would not become significant objects. The only way it can become aware of its nature and its goal is by obstacles surmounted and means employed; means which are only means from the very beginning are too much one with an impulsion, on a way smoothed and oiled in advance, to permit of consciousness of them. Nor without resistance from surroundings would the self become aware of itself; it would have neither feeling nor interest, neither fear nor hope, neither disappointment nor elation. Mere opposition that completely thwarts, creates irritation and rage. But resistance that calls out thought generates curiosity and solicitous care, and, when it is overcome and utilized, eventuates in elation.”

John Dewey, Art as Experience

The ground for valuing liberty is the claim that we are pre-eminently rational agents, each of us well suited to determining what goes in our own life. There is ample evidence, however, from the fields of psychology and behavioral economics, that in many situations this is simply not true. The incidence of irrationality is much higher than our Enlightenment tradition has given us to believe, and keeps us from making the decisions we need to reach our goals. The ground for respecting autonomy is shaky.”

Sarah Conly, Against Autonomy

“A commitment to privacy expresses a different kind of “sound reason” that we might choose to value—one that prizes serendipity as well as predictability and idiosyncrasy as well as assimilation.”

Julie Cohen, ‘What is privacy for’

Habits, activities, and preferences are compiled, registered, and retrieved to facilitate better adjustment, not to improve the individual’s capacity to act and to decide. Whatever the original incentive for computerization may have been, processing increasingly appears as the ideal means to adapt an individual to a predetermined, standardized behavior that aims at the highest possible degree of compliance with the model patient, consumer, taxpayer, employee, or citizen.”

Spiros Simitis, ‘Reviewing privacy in an information society’

Privacy is important both because it promotes innovative practice and because it promotes the kinds of innovation that matter most. The human innovative drive is both unpredictable and robust, but it does not follow either that all environments are equally favorable to innovation or that all environments will produce the same kinds of innovation. If privacy and serendipity are critical to innovation—by which I mean critical both to the likelihood that innovation will occur and to the substance of that innovation—there is reason to worry when privacy is squeezed to the margins, and when the pathways of serendipity are disrupted and rearranged to serve more linear, commercial imperatives. Environments that disfavor critical independence of mind, and that discourage the kinds of tinkering and behavioral variation out of which innovation emerges will, over time, predictably and systematically disfavor innovation, and environments designed to promote consumptive and profit-maximizing choices will systematically disfavor innovations designed to promote other values. The modulated society is dedicated to prediction but not necessarily to understanding or to advancing human material, intellectual, and political wellbeing. Data processing offers important benefits, but so does privacy. A healthy society needs both to thrive.”

Julie Cohen, ‘What is privacy for’

Even if Big Data did not continually alter its own operands, it does not operate in a vacuum. It is a mistake to think of the techniques of Big Data as simply adding to the amount of information circulating within society. The valorization of predictive rationality and risk management inevitably displaces other kinds of knowledge that might be generated instead. Stimuli tailored to consumptive preferences crowd out other ways in which preferences and self-knowledge might be expressed, and also crowd out other kinds of motivators—altruism, empathy, and so on—that might spur innovation in different directions. In a consumption-driven economy, the innovations that emerge and find favor will be those that fulfill consumption-driven needs. Contemporary applications of Big Data extend beyond marketing and advertising to core social and cultural functions, including the study of intellectual preferences and the delivery of higher education. Systematizing those functions according to the dictates of predictive rationality threatens important social values. It crowds out the ability to form and pursue agendas for human flourishing, which is indispensable both to maintaining a vital, dynamic society and to pursuing a more just one.”

Julie Cohen, ‘What is privacy for’

Privacy shelters dynamic, emergent subjectivity from the efforts of commercial and government actors to render individuals and communities fixed, transparent, and predictable. It protects the situated practices of boundary management through which the capacity for self-determination develops.”

Julie Cohen, ‘What is privacy for’


Disquiet #10: Everyone should worry

The problem with the nothing-to-hide argument is the underlying assumption that privacy is about hiding bad things. By accepting this assumption, we concede far too much ground and invite an unproductive discussion about information that people would very likely want to hide. As the computer-security specialist Schneier aptly notes, the nothing-to-hide argument stems from a faulty “premise that privacy is about hiding a wrong.” Surveillance, for example, can inhibit such lawful activities as free speech, free association, and other First Amendment rights essential for democracy.”

Daniel Solove, ‘Why Privacy Matters Even if You Have “Nothing to Hide”’

“My life’s an open book,” people might say. “I’ve got nothing to hide.” But now the government has large dossiers of everyone’s activities, interests, reading habits, finances, and health. What if the government leaks the information to the public? What if the government mistakenly determines that based on your pattern of activities, you’re likely to engage in a criminal act? What if it denies you the right to fly? What if the government thinks your financial transactions look odd—even if you’ve done nothing wrong—and freezes your accounts? What if the government doesn’t protect your information with adequate security, and an identity thief obtains it and uses it to defraud you? Even if you have nothing to hide, the government can cause you a lot of harm.”

Daniel Solove, ‘Why Privacy Matters Even if You Have “Nothing to Hide”’

‘But the government doesn’t want to hurt me,’ some might argue. In many cases, that’s true, but the government can also harm people inadvertently, due to errors or carelessness.”

Daniel Solove, ‘Why Privacy Matters Even if You Have “Nothing to Hide”’


Disquiet #11: A new inequality emerges

An exclusively technical civilization … is threatened … by the splitting of human beings into two classes—the social engineers and the inmates of closed social institutions.”

Jürgen Habermas, Theory and Practice

How does one get privacy these days? Just ask any hacker: only by studying how the right tools work. Privacy is no longer something to be taken for granted or enjoyed for free: you have to expend some resources to master the tools. Those resources could be money, patience, attention – you might even hire a consultant to do all this for you – but the point is that privacy is becoming expensive.”

Evgeny Morozov, ‘Why we are allowed to hate Silicon Valley’

And what of those who can’t afford tools and consultants? How do their lives change? When the founder of a prominent lending start-up – the former CIO of Google, no less – proclaims that “all data is credit data, we just don’t know how to use it yet” I can’t help but fear the worst. If “all data is credit data” and poor people cannot afford privacy, they are in for some dark times. How can they not be anxious when their every move, their every click, their every phone call could be analyzed to predict if they deserve credit and at what rates? If the burden of debt wasn’t agonizing enough, now we’ll have to live with the fact that, for the poor people, anxiety begins well before they get the actual loan. Once again, one doesn’t have to hate or fear technology to worry about the future of equality, mobility and the quality of life. The “digital debate,” with its inevitable detours into cultural pessimism, simply has no intellectual resources to tackle these issues.”

Evgeny Morozov, ‘Why we are allowed to hate Silicon Valley’


Disquiet #12: We turn our selves and our lives into commodities

The data-centric model of Silicon Valley capitalism seeks to convert every aspect of our everyday existence – what used to be our only respite from the vagaries of work and the anxieties of the marketplace – into a productive asset.  This is done not just by blurring the distinction between work and nonwork  but also by making us tacitly accept the idea that our reputation is a work-in-progress – something that we could and should be honing 24/7. Therefore, everything is turned into a productive asset: our relationships, our family life, our vacations, our sleep (you are now invited to “hack” it so that you can get most of your sleep in the shortest amount of time).”

 Evgeny Morozov, ‘Why we are allowed to hate Silicon Valley’

What eludes Mr Snowden – along with most of his detractors and supporters – is that we might be living through a transformation in how capitalism works, with personal data emerging as an alternative payment regime. The benefits to consumers are already obvious; the potential costs to citizens are not. As markets in personal information proliferate, so do the externalities – with democracy the main victim.”

 Evgeny Morozov, ‘The Snowden saga heralds a radical shift in capitalism’


Disquiet #13: We witness acts of sabotage

Second, we must learn how to sabotage the system—perhaps by  refusing to self-track at all. If refusing to record our calorie intake or our whereabouts is the only way to get policy makers to address the structural causes of problems like obesity or climate change—and not just tinker with their symptoms through nudging—information boycotts might be justifiable. Refusing to make money off your own data might be as political an act as refusing to drive a car or eat meat. Privacy can then reemerge as a political instrument for keeping the spirit of democracy alive: we want private spaces because we still believe in our ability to reflect on what ails the world and find a way to fix it, and we’d rather not surrender this capacity to algorithms and feedback loops.”

Evgeny Morozov, ‘The real privacy problem’


A note of hope

Then again perhaps there is no need to worry, let alone be paranoid.

Perhaps for all the inevitable side effects, all technological change takes us along a trajectory of progress.

Perhaps we are merely witnessing teething pains.

Perhaps we are seeing the first tremors of a fundamental recalibration of our culture and its values.

Perhaps the notion of what constitutes privacy will be remade.

Perhaps we will indeed give up on believing we are rational actors and surrender ourselves to oversight and adjustment.

Perhaps we will become less precious about wanting to feel autonomous.

Perhaps the goods and services we receive will be so remarkable that handing over our data and privacy will make total sense.

Perhaps we shall gain unimaginably more than we lose.

Perhaps in time these anxieties and oppositions will be regarded as the quaint relics of an age long left behind.

Perhaps we are merely grappling with the problem that the civilisation of Iain M. Banks’ Culture novels had solved:

One of the most important tasks in setting up and running a stable and internally content civilisation is finding an acceptable balance between the desire for freedom of choice in one’s actions (and the freedom from mortal fear in one’s life) and the need to feel that even in a society so self-correctingly Utopian one is still contributing something.”

And perhaps we’ll get there too.


Thinking beyond the market

Pandora’s box cannot be closed. The good and the not so good have been let loose upon the world. Just as the abundant benefits are undeniable, so we cannot afford to deny the issues and challenges they raise: issues of personal identity, liberty, social values, ethics, and politics.

Ad- and marketing land would love for these issues to be merely ones of the marketplace, and for them to be framed narrowly in terms of data collection, privacy, consent, legislation, and consumer benefits.

But the issues are far broader and deeper than this. Technology, as Julie Cohen has reminded us, is never a neutral quantity. What we make reflects and reproduces our beliefs about what is important, and about how we should live. What we make is shot through with value judgments. Consciously or unconsciously, the things we build are informed by ideology.

So however inconvenient it might be, we cannot discuss technology and its consequences purely in terms of the market. The impact of what we are building and the issues it raises extend far beyond it.

They are issues about how we want to live. Not merely how we want to buy.



Iain M. Banks, ‘A few notes on the Culture’

John Battelle, ‘Google Now: The tip of a very long spear’

John Battelle, ‘Why the banner is heroic and adtech is our greatest artifact’

Richard Brautigan, ‘All Watched Over By Machines Of Loving Grace’

David Brin, ‘The transparent society’

Julie E. Cohen, ‘What is privacy for’

Julie E. Cohen, ‘Irrational Privacy?’

Sarah Conly, Against Autonomy

Bill Davidow, ‘With Great Computing Power Comes Great Surveillance: The dark side of Moore’s Law’

Cory Doctorow, ‘The NSA’s Prism: Why we should care’

Martha Lane Fox, speech to the House of Lords

Jürgen Habermas, Theory and Practice

Kevin Kelly, ‘We are the web’

Kevin Kelly, ‘The Technium’

Krishnan Kumar, ‘Utopianism’

Jaron Lanier, Who owns the future?

Evgeny Morozov, ‘The real privacy problem’

Evgeny Morozov, ‘The price of hypocrisy’

Evgeny Morozov, ‘Why we are allowed to hate Silicon Valley’

Evgeny Morozov, ‘The Snowden saga heralds a radical shift in capitalism’

Bruce Schneier, ‘The eternal value of privacy’

Bruce Schneier, ‘When it comes to security, we’re back to feudalism’

Eric Siegel, Predictive Analytics: The power to predict who will click, buy, lie, or die

Spiros Simitis, ‘Reviewing privacy in an information society’, University of Pennsylvania Law Review, Vol. 135, 1987

Daniel Solove, ‘Why Privacy Matters Even if You Have “Nothing to Hide”’

U.S. Government Accountability Office, ‘Personal Information: Agency and Reseller Adherence to Key Privacy Principles’, April 2006

Tal Zarsky, ‘Transparent Predictions’