Category: Technology

Your brand is software, not a person: A new metaphor for changed times

[Image: still from The Matrix]

“The price of metaphor is eternal vigilance”

Norbert Wiener

It is a truth universally acknowledged that consumers find it natural to imbue brands with human personality characteristics, such as ‘honest’, ‘optimistic’, ‘cool’ or ‘tough’. 

That brands are like human beings, that they have differentiating personality traits, and that consumers have relationships with them remains, still, largely unchallenged dogma in marketing circles.

And yet – the legend of Ernest Dichter, and the acres of subsequent academic theorizing notwithstanding – the fact remains that personality traits are poor at differentiating brands; that within any given category, different brands do not attract different kinds of buyers; and that people are not that attached to brands.

Surely it is time to retire the brand-as-person metaphor.

Indeed perhaps it’s time to find a metaphor (if we must have one) more suited to our times. For while much of our inherited brand theory baggage values the intangible, as Guy Murphy has noted:

“The democracy of information has allowed consumers to focus hard on the more rational and ‘real’ aspects of the product itself. Social commentary and ratings have particularly brought tangible product performance itself to the fore. The most emotionally rich brands can be undermined by a one-star product rating. This is as true for products as it is for the companies that make them… There is a shift occurring from adding intangible value to adding tangible value. Instead of adding emotion to function, brands are beginning to add function to function. We don’t have to play out a brand as a metaphor any more, when we can do it for real. Brands will seem more valuable not just because they feel better but because they work better, or serve you better. This is not a denial of the emotional component of brands, it is just a different way of generating it. It is not about the power of associations, it is about the power of reality.”

In such a world, perhaps we should be seeking a metaphor and perspective that more consciously attempts to reconcile the invisible, mental dimensions to a brand’s existence with its practical, and concrete manifestations in the world.

And in a world which, as Marc Andreessen has argued, is being eaten by software, we could perhaps do worse than to think of brands as software.

After all, as he puts it:

“In many industries, new software ideas will result in the rise of new Silicon Valley-style start-ups that invade existing industries with impunity. Over the next 10 years, the battles between incumbents and software-powered insurgents will be epic. Joseph Schumpeter, the economist who coined the term ‘creative destruction,’ would be proud.”

No, a brand is not a person. But it is software. Both of the mind. And in the world.

***

This much we know. Brands are memory structures in the mind. They are the sum total of all our memories, encounters, impressions and associations connected with a brand.

And consciously or unconsciously accessed, they work to make a brand thought of in purchase and consumption occasions.

In this way, they operate as heuristics, rules of thumb that by-pass the necessity to conduct an elaborate cost-benefit analysis of all the options available every time we are faced with choice.

Viewed through this lens, brands are decision-making software.
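The heuristic can be made concrete with a toy sketch: the difference between an elaborate cost-benefit analysis of every option and a brand-recognition short-cut that stops at the first familiar name. Every product, attribute, and weight below is invented for illustration.

```python
# Toy contrast between deliberate choice and a brand-recognition heuristic.
# All products, attributes, and weights are hypothetical.

PRODUCTS = [
    {"name": "KnownCola",   "price": 1.20, "rating": 4.1, "recognised": True},
    {"name": "GenericCola", "price": 0.90, "rating": 4.3, "recognised": False},
    {"name": "NewCola",     "price": 1.00, "rating": 3.8, "recognised": False},
]

def deliberate_choice(products):
    """Elaborate cost-benefit analysis: score every option on every attribute."""
    return max(products, key=lambda p: p["rating"] - p["price"])

def heuristic_choice(products):
    """Brand as heuristic: take the first recognised option and stop searching."""
    for product in products:
        if product["recognised"]:
            return product
    return deliberate_choice(products)  # no familiar brand: fall back to deliberation

print(deliberate_choice(PRODUCTS)["name"])  # GenericCola (best score on the numbers)
print(heuristic_choice(PRODUCTS)["name"])   # KnownCola (recognition wins, unanalysed)
```

The point of the sketch is not the arithmetic but the short-circuit: the recognised brand is chosen without the full comparison ever running.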

***

This, of course, has always been the case – it is the oldest, and indeed the original, function of branding. But today we are subcontracting our mental processes to software.

For now we have technology that can remember, find, anticipate, recommend, remind, analyse, compute, manage, measure, monitor for us. And in the wake of this technological – and with it, cultural – upheaval, brands are evolving into software that resides in the world.

Brands can now remember what we like, and what we bought.

They can anticipate when we need to restock, repurchase, or renew.

They can suggest purchases, content, and experiences we will probably like.

They can compare and recommend purchase options.

They can respond to our service, upgrade, and replacement needs.

And of course as brands become more like software, unshackled from the constraints of the physical world, they and their functions can inevitably flow through our lives with ever greater ease.

So the brand is software. Of the mind. And in the world.

***

The brand-as-software metaphor certainly feels better aligned than the brand-as-person metaphor to the real role of branding in consumer decision-making.

And it certainly appears better able to encourage us to take advantage of the new opportunities technology opens up for us.

Indeed the brand-as-software metaphor is much less of a metaphor – much less of a mere figure of speech – than the brand-as-person metaphor. It better reflects how brands actually operate, in the mind, and in the world.

***

So what might thinking of brands as software actually encourage us to do differently?

Perhaps it will encourage us to think of brands as fluid, adaptable, responsive, upgradable, and permeating.

Perhaps it will encourage us to think how utility (not just storytelling) can create meaning.

Perhaps it might encourage more of a rapprochement and productive relationship between those who make products and services, and those who tell stories.

Perhaps we might demand that brands be just as useful in the mind as they are in the world.

Perhaps we will grant equal status to the creation of memory structures, and to the creation of utility in the world, rather than set them in competition with each other.

And perhaps thinking of brands as an external manifestation of human mental processes will encourage us to think of software in human terms, rather than think of humans in machine terms.

***

So far, so good, possibly. But we cannot talk about the brand-as-software without talking about the OS or platform. If brands are indeed software, then what do they run on?

In a multi-platform, multi-device, multi-OS world, the brand-as-software must of course be able to run effortlessly on, by way of example, both iOS and Android. If brands wish to be part of the fabric of the world, then they must be able to adapt to and run on all the dominant platforms out there. And that demands a new degree of adaptability in what we build.

But there is a more important OS the brand-as-software must run on.

The human OS.

That sounds easy and obvious, but the fact of the matter is that most brand builders don’t give enough of a damn about creating for the human platform. They ignore being interesting and pursue difference. They fail to work back from human drives, needs, wants, and interests to strategy, and instead insist on working forwards from brand and product. They either fail to reinforce and sustain long-term memory structures, or they fail to refresh them.

Brands must work on the human OS, or fail. For the human OS always takes precedence over platform-level OSs, and is the one that we should all be designing for. Always.

***

Sources

Special thanks to Ben Malbon for prompting me to ask some hard but good questions.

Guy Murphy, ‘Brands in the digital age: The opt-in age of brands’, Admap, December 2013

Marc Andreessen, ‘Why software is eating the world’, The Wall Street Journal, August 20, 2011

All watched over by machines of loving grace? Notes of disquiet and disbelief

 

[Image: HAL 9000 from 2001: A Space Odyssey]

 

“I like to think (and the sooner the better!) of a cybernetic meadow where mammals and computers live together in mutually programming harmony… and all watched over by machines of loving grace.”

Richard Brautigan, ‘All Watched Over By Machines Of Loving Grace’

“How is it that when I speak of belief, I become aware always of a shadow, the shadow of disbelief haunting my belief?”

Graham Greene, Monsignor Quixote

“Anyone who makes up his mind to evade the uncertainty of belief will have to experience the uncertainty of unbelief”

Pope Benedict XVI, Introduction to Christianity

***

 

Preface: What we build will shape us

“Innovation is never a neutral quantity. Technologies and artifacts are shaped by the values, priorities, and assumptions of their developers, and often their users as well. Of course, many technologies are designed or refined with particular goals in mind, but here I am referring to a different and less deliberate shaping process, through which artifacts come to reflect and reproduce beliefs about the types of functions and ways of living and working that are important.”

Julie Cohen, ‘What privacy is for’

“We shape our buildings, and afterwards our buildings shape us.”

Winston Churchill

“Our artifacts organize the world for us, subtly shaping the ways that we make sense of it. Over time we come to perceive the world through the lenses that our artifacts create. For example, an automobile club map and the step-by-step instructions uttered by an in-car GPS or the iPhone’s Siri interface represent local geography in radically different ways. Intriguingly, there is evidence to suggest that over time the processes of mediation and configuration become deeply encoded in our neurobiology, producing individuals who are literally wired to think and act differently” 

Julie Cohen, ‘What privacy is for’

“As goes adtech, so goes our social code.”

John Battelle, ‘Why the banner is heroic and adtech is our greatest artifact’

So what is the impact of technological change, of our emergent ‘technium’, as Kevin Kelly has dubbed it – the ecology of ‘technological species’ and their complex interactions with us, and with each other? And what are the effects of what we are building on how we live, and indeed on who we are?

I have chosen to rely upon the observations, analysis and opinion of others. Caught between a head filled with sci-fi wonders since my teens and a tendency to worry, I make little commentary but provide extracts instead. This is as much a personal exploration as anything else.

So let’s start with the very good news.

***

We’re building a new infrastructure

“What we today call ‘adtech’ will tomorrow become the worldwide real-time processing layer driving much of society’s transactions.”

John Battelle, ‘Why the banner is heroic and adtech is our greatest artifact’ 

“Data embodies a priceless collection of experience from which to learn. Every medical procedure, credit application, Facebook post, movie recommendation, fraudulent act, spammy e-mail, and purchase of any kind – each positive or negative outcome, each successful or failed sales call, each incident, event, and transaction – is encoded as data and warehoused. The glut grows by an estimated 2.5 quintillion bytes per day… And so a veritable Big Bang has set off, delivering an epic sea of raw materials, a plethora of examples so great in number, only a computer could manage to learn from them. Used correctly, computers avidly soak up this ocean like a sponge… The process of machines learning from data unleashes the power of this exploding resource. It uncovers what drives people and the actions they take – what makes us tick and how the world works. With the new knowledge gained, prediction is possible.”

Eric Siegel, Predictive Analytics: The power to predict who will click, buy, lie, or die
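As a minimal sketch of what Siegel means by learning from warehoused outcomes, consider estimating a click probability per audience segment simply by counting past events. The data and segment names below are invented for illustration.

```python
# Minimal sketch of "learning from data": estimate a click probability
# per segment by counting past outcomes. All data here is invented.
from collections import defaultdict

history = [
    ("sports", True), ("sports", True), ("sports", False),
    ("news", False), ("news", False), ("news", True), ("news", False),
]

counts = defaultdict(lambda: [0, 0])  # segment -> [clicks, impressions]
for segment, clicked in history:
    counts[segment][1] += 1
    if clicked:
        counts[segment][0] += 1

def predict_click_rate(segment):
    clicks, shown = counts[segment]
    # Laplace smoothing: an unseen segment gets a neutral prior of 0.5
    return (clicks + 1) / (shown + 2)

print(predict_click_rate("sports"))  # (2 + 1) / (3 + 2) = 0.6
print(predict_click_rate("news"))    # (1 + 1) / (4 + 2) ≈ 0.333
```

Real predictive systems replace the counting with statistical models over hundreds of features, but the principle is the same: past outcomes, aggregated, become a forecast.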

“Think about what programmatic adtech makes possible. An individual requests a piece of content through a link or an action (like touching something on a mobile device). In milliseconds, scores of agents execute thousands of calculations based on hundreds of parameters, all looking to market-price the value of that request and deliver a personalized response. This happens millions of times a second, representing hundreds of millions, if not billions, of computing cycles each second. What’s most stunning about this system is that it’s tuned to each discrete individual – every single request/response loop is unique, based on the data associated with each individual… Every time you turn up your thermostat, this infrastructure will engage, determining in real time the most efficient response to your heating needs. Each time you walk into a doctor’s office, the same kind of system could be triggered to determine what information should appear on your health care provider’s screen, and on yours, and how best payment should be made (or insurance claims filed). Every retail store you visit, every automobile you drive (or are driven by), every single interaction of value in this world can and will become data that interacts with this programmatic infrastructure.”

John Battelle, ‘Why the banner is heroic and adtech is our greatest artifact’
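The request/response loop Battelle describes can be caricatured in a few lines: a drastically simplified second-price auction in which each bidding agent prices an incoming request from its own data. All bidders, segments, and prices below are invented.

```python
# Hypothetical sketch of one programmatic ad auction (second-price style).
# Bidders, user segments, and prices are all invented for illustration.

def run_auction(request, bidders):
    """Each agent prices the request; the highest bidder wins
    and pays the second-highest bid. Assumes at least two bidders."""
    bids = sorted(
        ((agent(request), name) for name, agent in bidders.items()),
        reverse=True,
    )
    (top, winner), (second, _) = bids[0], bids[1]
    return winner, second  # winner pays the runner-up's price

request = {"user_segment": "frequent_traveller", "context": "news_article"}

bidders = {
    "airline": lambda r: 2.50 if r["user_segment"] == "frequent_traveller" else 0.10,
    "hotel":   lambda r: 1.80 if r["user_segment"] == "frequent_traveller" else 0.10,
    "grocer":  lambda r: 0.40,
}

winner, price = run_auction(request, bidders)
print(winner, price)  # the airline wins and pays the hotel's bid
```

In production this loop runs millions of times a second across distributed exchanges; the sketch only shows the shape of a single request/response cycle.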

“What we today call ‘adtech’ will tomorrow become the worldwide real-time processing layer driving much of society’s transactions. That layer deserves to be named as perhaps the most important artifact extant today.”

John Battelle, ‘Why the banner is heroic and adtech is our greatest artifact’

***

We’re building a better world

“The web has transformed the way we work, play and interact with each other. Some of health’s greatest problems are being rethought. Millions of so-called citizen scientists are plotting cancer gene patterns via online games. Early intervention in dementia is more common after the success of an online test for potential patients. Education is being opened up on a global scale through the use of massive open online courses – from Khan Academy’s tutorials to Coursera’s degrees. All industries are being disrupted. Farmers in Ghana are saving time and money by using their smartphones to trade their products before the long walk to market begins and underprivileged women in South Africa are breaking out of poverty after training that means they can answer the tech problems of US customers. Even the animal videos posted online – much mocked by some – are now enabling scientists to gather meaningful data from around the world about animal relationships that would never have been accessible before.”

Martha Lane Fox

“Utopia confronts reality not with a measured assessment of the possibilities of change but with the demand for change. This is the way the world should be. It refuses to accept current definitions of the possible because it knows these to be a part of the reality it seeks to change. In its guise as utopia, by the very force of its imaginative presentation, it energizes reality, infusing it with ‘the power of the new.’”

Krishan Kumar, ‘Utopianism’

“The critics of early computers capitalized on a common fear: that a Big Brother brain would watch over us and control us. What we know now of our own brains is that they too are only networks of mini-minds, a society of dumber minds linked together, and that when we peer into them deeply we find that there is no “I” in charge. Not only does a central-command economy not work; a central-command brain won’t either.

“In its stead, we can make a nation of personal computers, a country of decentralized nodes of governance and thought. Almost every type of large-scale governance we can find, from the body of a giraffe, to the energy-regulation in a tidal marsh, to the temperature regulation of a beehive, to the flow of traffic on the Internet, resolves into a swarmy distributed net of autonomous units and heterogeneous parts.

“No one has been more wrong about computerization than George Orwell in 1984. So far, nearly everything about the actual possibility-space that computers have created indicates they are not the beginning of authority but its end.

“In the process of connecting everything to everything, computers elevate the power of the small player. They make room for the different, and they reward small innovations. Instead of enforcing uniformity, they promote heterogeneity and autonomy. Instead of sucking the soul from human bodies, turning computer-users into an army of dull clones, networked computers — by reflecting the networked nature of our own brains and bodies — encourage the humanism of their users. Because they have taken on the flexibility, adaptability, and self-connecting governance of organic systems, we become more human, not less so, when we use them.”

Kevin Kelly, ‘The electronic hive: Embrace it’

“I like to think (and the sooner the better!) of a cybernetic meadow where mammals and computers live together in mutually programming harmony like pure water touching clear sky… where we are free of our labors and joined back to nature, returned to our mammal brothers and sisters, and all watched over by machines of loving grace.”

Richard Brautigan, ‘All Watched Over By Machines Of Loving Grace’

“There is only one time in the history of each planet when its inhabitants first wire up its innumerable parts to make one large Machine. Later that Machine may run faster, but there is only one time when it is born.

“You and I are alive at this moment.

“We should marvel, but people alive at such times usually don’t. Every few centuries, the steady march of change meets a discontinuity, and history hinges on that moment. We look back on those pivotal eras and wonder what it would have been like to be alive then. Confucius, Zoroaster, Buddha, and the latter Jewish patriarchs lived in the same historical era, an inflection point known as the axial age of religion. Few world religions were born after this time. Similarly, the great personalities converging upon the American Revolution and the geniuses who commingled during the invention of modern science in the 17th century mark additional axial phases in the short history of our civilization.

“Three thousand years from now, when keen minds review the past, I believe that our ancient time, here at the cusp of the third millennium, will be seen as another such era. In the years roughly coincidental with the Netscape IPO, humans began animating inert objects with tiny slivers of intelligence, connecting them into a global field, and linking their own minds into a single thing. This will be recognized as the largest, most complex, and most surprising event on the planet. Weaving nerves out of glass and radio waves, our species began wiring up all regions, all processes, all facts and notions into a grand network. From this embryonic neural net was born a collaborative interface for our civilization, a sensing, cognitive device with power that exceeded any previous invention. The Machine provided a new way of thinking (perfect search, total recall) and a new mind for an old species. It was the Beginning.”

Kevin Kelly, ‘We are the web’

***

The necessity of doubt

This is Hieronymus Bosch’s extraordinary painting The Garden of Earthly Delights:

[Image: Hieronymus Bosch, The Garden of Earthly Delights]

The image below is a detail.

No strangers to the perils of skating, it was the Dutch who gave us the expression “skating on thin ice”. What appears at first glance to be a boat or sled is actually a giant skate. We’re looking at a tiny man on an oversized ice skate.

[Image: detail from the right panel of The Garden of Earthly Delights]

Bosch was fond of using oversized objects to represent objects in man’s imagination. Our skating man here thinks he is the one controlling his movement and the direction he’s going in. But his vehicle is imaginary, a figment of his fantasies about his own ability to do things. He’s riding forces much greater than he is. And there in the upper left we witness his inevitable fall through the ice.

For despite all the powers and joys technology is yielding today, and despite all the breathless prophecies of the frictionless future to come, there is a very real possibility that it is our oversized ice skate. And that while we think we are in control of it, we are in truth riding forces we do not fully comprehend, and which we cannot fully control.

No doubt there will be readers smarting already with righteous indignation. But the fact of the matter is that if we are to make any real, intelligent progress then criticism must be entertained. Technology is a tool, not an article of faith. And to question it is not an act of apostasy.

Room must be allowed for doubt and disquiet. For as the writer and social critic Lillian Smith said, “Faith and doubt, both are needed, not as antagonists but working side by side, to take us around the unknown curve.”

So let’s now exercise a little doubt, try to peer around the unknown curve, and speculate about what potential scenarios and options might await us.

***

Disquiet #1: It’s an infrastructure of surveillance

“Networked information technologies enable surveillance to become modulation. Surveillance may be defined generically as attention that is purposeful, routine, systematic, and focused. Networked information technologies enable surveillant attention to become continuous, pervasively distributed, and persistent.”

Julie Cohen, ‘What privacy is for’

“Spying on you is… the official primary business of the information economy.”

Jaron Lanier, ‘Who owns the future?’

“From the cash-register scanners in a million supermarkets, video stores, and pharmacies, there already pours a steady flow of statistical data about customers and their purchases, ready to be correlated, helping companies serve us more efficiently – or else giving them an unfair advantage, knowing vastly more about us than we do about them. Soon, computers will hold financial and educational records, legal documents, and medical analyses that parse you all the way down to your genes. Any of this might be accessed by strangers without your knowledge or even against your stated will.”

David Brin, ‘The transparent society’

***

Disquiet #2: There is more to come

“It’s about to get worse, though. Companies such as Google may know more about your personal interests than your spouse, but so far it’s been limited by the fact that these companies only see computer data. And even though your computer habits are increasingly being linked to your offline behaviour, it’s still only behaviour that involves computers.

“The Internet of Things refers to a world where much more than our computers and cell phones is internet-enabled. Soon there will be internet-connected modules on our cars and home appliances. Internet-enabled medical devices will collect real-time health data about us. There’ll be internet-connected tags on our clothing. In its extreme, everything can be connected to the internet. It’s really just a matter of time, as these self-powered wireless-enabled computers become smaller and cheaper.

“Lots has been written about the ‘Internet of Things’ and how it will change society for the better. It’s true that it will make a lot of wonderful things possible, but the “Internet of Things” will also allow for an even greater amount of surveillance than there is today.”

Bruce Schneier, ‘Will giving the internet eyes and ears mean the end of privacy?’

“Moore’s Law has also made mass automated surveillance dirt cheap. Government surveillance that used to cost millions of dollars can now be carried out for a fraction of that. We have yet to fully grasp the implications of cheap surveillance. The only thing that is certain is that we will be seeing a great deal more surveillance—of ordinary citizens, potential terrorists, and heads of state—and that it will have major consequences.”

Bill Davidow, ‘With Great Computing Power Comes Great Surveillance: The dark side of Moore’s Law’

“Items of interest will be located, identified, monitored, and remotely controlled through technologies such as radio-frequency identification, sensor networks, tiny embedded servers, and energy harvesters — all connected to the next-generation internet using abundant, low-cost, and high-power computing.”

David Petraeus, former director of the CIA

***

Disquiet #3: The surveillance-industrial complex emerges

“You should care about surveillance because once the system for surveillance is built into the networks and the phones, bad guys (or dirty cops) can use it to attack you. In Greece, someone used the police back door on the national phone company’s switches to listen in on the prime minister during the 2005 Olympic bid. Chinese hackers used Google’s lawful interception back door to hack Gmail and figure out who dissidents talked to. Our communications systems are more secure if they’re designed to keep everyone out – and adding a single back door to them blows their security models up. You can’t be a little bit pregnant, and the computers in your pocket and on your desk and in your walls can’t be a little bit insecure. Once they’re designed for surveillance, anyone who can bribe or impersonate a cop can access them.”

Cory Doctorow, ‘The NSA’s Prism: Why we should care’ 

“Information gathered by private firms is, as demonstrated by the experience of banks and credit card companies, again and again processed by the government. The boundaries between the public and the private sectors are thus blurred. Personal data, once stored, tend to become a flexible, common base of information.”

Spiros Simitis, ‘Reviewing privacy in an information society’, University of Pennsylvania Law Review, Vol. 135, 1987

“Federal agencies collect and use personal information for various purposes, both directly from individuals and from other sources, including information resellers – companies that amass and sell data from many sources.”

U.S. Government Accountability Office, ‘Personal Information: Agency and Reseller Adherence to Key Privacy Principles’, April 2006

“In fiscal year 2005, the Departments of Justice, Homeland Security, and State and the Social Security Administration reported that they used personal information obtained from resellers for a variety of purposes. Components of the Department of Justice (the largest user of resellers) used such information in performing criminal investigations, locating witnesses and fugitives, researching assets held by individuals of interest, and detecting prescription drug fraud. The Department of Homeland Security used reseller information for immigration fraud detection and border screening programs. Uses by the Social Security Administration and the Department of State were to prevent and detect fraud, verify identity, and determine eligibility for benefits. The agencies spent approximately $30 million on contractual arrangements with resellers that enabled the acquisition and use of such information. About 91 percent of the planned fiscal year 2005 spending was for law enforcement (69 percent) or counterterrorism (22 percent).”

U.S. Government Accountability Office, ‘Personal Information: Agency and Reseller Adherence to Key Privacy Principles’, April 2006

“The major information resellers that do business with the federal agencies we reviewed have practices in place to protect privacy, but these measures are not fully consistent with the Fair Information Practices. For example, the principles that the collection and use of personal information should be limited and its intended use specified are largely at odds with the nature of the information reseller business, which presupposes that personal information can be made available to multiple customers and for multiple purposes. Resellers said they believe it is not appropriate for them to fully adhere to these principles because they do not obtain their information directly from individuals.”

U.S. Government Accountability Office, ‘Personal Information: Agency and Reseller Adherence to Key Privacy Principles’, April 2006

“Resellers generally limit the extent to which individuals can gain access to personal information held about themselves, as well as the extent to which inaccurate information contained in their databases can be corrected or deleted. Agency practices for handling personal information acquired from information resellers did not always fully reflect the Fair Information Practices. That is, some of these principles were mirrored in agency practices, but for others, agency practices were uneven. For example, although agencies issued public notices on information collections, these did not always notify the public that information resellers were among the sources to be used. This practice is not consistent with the principle that individuals should be informed about privacy policies and the collection of information.”

U.S. Government Accountability Office, ‘Personal Information: Agency and Reseller Adherence to Key Privacy Principles’, April 2006


***

Disquiet #4: Resistance is futile

“Even more importantly, dysfunction in privacy markets has a dynamic aspect. Over time, people can be expected to (over)disclose more and more information, both because they have become inured to disclosure and because the equilibrium in the marketplace has tipped toward disclosure as a condition of market entry.”

Julie Cohen, ‘Irrational Privacy?’

“It is already far too late to prevent the invasion of cameras and databases. The djinn cannot be crammed back into the bottle. No matter how many laws are passed, it will prove quite impossible to legislate away the new tools and techniques. They are here to stay. Light is going to shine into every aspect of our lives… Oh, we can try. We might agitate, demonstrate, legislate. But in rushing to pass so-called privacy laws, we will not succeed in preventing hidden eyes from peering into our lives. The devices will get tinier, more mobile, and more clever. In software form, they will cruise the data highways. The rich, the powerful, police agencies, and a technologically skilled élite will always be able to find out whatever they want to know about you and me…”

David Brin, ‘The transparent society’

“So far, most European politicians have reached for the low-hanging fruit – law – thinking that if only they can better regulate American companies – for example, by forcing them to disclose how much data and when they share with NSA – this problem will go away. This is a rather short-sighted, naïve view that reduces a gigantic philosophical problem – the future of privacy – to the seemingly manageable size of data retention directives. If only things were that simple! Our current predicaments start at the level of ideology, not bad policies or their poor implementation. This is not to oppose more regulation of technology companies – Europe should have done this a decade ago instead of getting caught in the heady rhetoric of ‘cloud computing’ – but only to point out that the task ahead is far more intellectually demanding.”

Evgeny Morozov, ‘The price of hypocrisy’

***

Disquiet #5: We accept the Faustian bargain

“We choose to do it because of the convenience, redundancy, automation, and shareability. We like it when we can access our e-mail anywhere, from any computer. We like it when we can restore our contact lists after we’ve lost our phones. We want our calendar entries to automatically appear on all of our devices. These cloud storage sites do a better job of backing up our photos and files than we would manage by ourselves; Apple does a great job keeping malware out of its iPhone apps store.”

Bruce Schneier, ‘When it comes to security, we’re back to feudalism’


***

Disquiet #6: We accept digital feudalism

“They tether us like serfs; just try to take data from one digital lord to another. Ultimately, they will always act in their own self-interest, as companies do when they mine our data in order to sell more advertising and make more money. These companies own us, so they can sell us off – again, like serfs – to rival lords … or turn us in to the authorities.”

Bruce Schneier, ‘When it comes to security, we’re back to feudalism’


***

Disquiet #7: Pawns of the adjustment bureau

“Networked information technologies enable surveillant attention to become continuous, pervasively distributed, and persistent. This in turn produces modulation: a set of processes in which the quality and content of surveillant attention is continually modified according to the subject’s own behavior, sometimes in response to inputs from the subject but according to logics that ultimately are outside the subject’s control.”

Julie Cohen, ‘What is privacy for’

“[Google] Now is Google’s attempt at becoming the real time interface to our lives – moving well beyond the siloed confines of “search” and into the far more ambitious world of “experience.” As in – every experience one has could well be lit by data delivered through Google Now. Google knows that this moment – the moment of our lives becoming data – is happening now, and the company is very, very focused on seizing it.”

John Battelle, ‘Google Now: The tip of a very long spear’

“In the modulated society, surveillance is not heavy-handed; it is ordinary, and its ordinariness lends it extraordinary power. The surveillant assemblages of informational capitalism do not threaten the ‘normalized soul training’ of the Orwellian nightmare. They beckon with seductive appeal. Individual citizen-consumers willingly and actively participate in processes of modulation, seeking the benefits that increased personalization can bring. For favored consumers, these may include price discounts, enhanced products and services, more convenient access to resources, and heightened social status. Within surveillant assemblages, patterns of information flow are accompanied by discourses about why the patterns are natural and beneficial, and those discourses foster widespread internalization of the new norms of information flow.”

Julie Cohen, ‘What is privacy for’

“Modulation is a mode of privacy invasion, but it is also a mode of knowledge production designed to produce a particular way of knowing and a mode of governance designed to produce a particular kind of subject. Its purpose is to produce tractable, predictable citizen-consumers whose preferred modes of self-determination play out along predictable and profit-generating trajectories. Yet to speak of networked processes of surveillance and modulation in the industrial-era vernacular, as systems for “manufacturing consent,” would be too crude. Rather, in a much more subtle process of continual feedback, stimuli are tailored to play to existing inclinations, nudging them in directions that align with profit-maximizing goals. So too with political inclinations; particularly as search and social networking become more seamlessly integrated, networked citizen-consumers move within personalized “filter bubbles” that conform the information environment to their political and ideological commitments. This is conducive to identifying and targeting particular political constituencies, but not necessarily to fostering political dialogue among diverse constituencies in ways that might enable them to find common ground.”

Julie Cohen, ‘What is privacy for’

“Habits, activities, and preferences are compiled, registered, and retrieved to facilitate better adjustment, not to improve the individual’s capacity to act and decide. Whatever the original incentive for computerization may have been, processing increasingly appears as the ideal means to adapt an individual to a predetermined, standardized behavior that aims at the highest possible degree of compliance with the model patient, consumer, taxpayer, employee, or citizen. Furthermore, interactive systems do not, despite all contrary assertions, restore a long lost individuality by correcting the effects of mass production in a mass society. On the contrary, the telemetric integration forces the individual once more into a preset scheme. The media supplier dictates the conditions under which the communication takes place, fixes the possible subjects of the dialogue, and, due to the personal data collected, is in an increasingly better position to influence the subscriber’s behavior. Interactive systems, therefore, suggest individual behaviors where in fact no more than stereotyped reactions occur.”

Spiro Simitis, ‘Reviewing privacy in an information society’, University of Pennsylvania Law Review, Vol. 135, 1987

“In essence, information-rich democracies have reached a point where they want to try to solve public problems without having to explain or justify themselves to citizens. Instead, they can simply appeal to our own self-interest—and they know enough about us to engineer a perfect, highly personalized, irresistible nudge.”

Evgeny Morozov, ‘The real privacy problem’

“A non-interpretable process might follow from a data-mining analysis which is not explainable in human language. Here, the software makes its selection decisions based upon multiple variables (even thousands) … It would be difficult for the government to provide a detailed response when asked why an individual was singled out to receive differentiated treatment by an automated recommendation system. The most the government could say is that this is what the algorithm found based on previous cases.”

Tal Zarsky, ‘Transparent Predictions’

“While much of the world’s concern has been focused on NSA spying, I believe the greatest threat to my freedom will result from my being placed in a virtual algorithmic prison. Those algorithmic predictions could land me on no-fly lists or target me for government audits. They could be used to deny me loans and credit, screen my job applications and scan LinkedIn to determine my suitability for a job. They could be used by potential employers to get a picture of my health. They could predict whether I will commit a crime or am likely to use addictive substances, and determine my eligibility for automobile and life insurance. They could be used by retirement communities to determine if I will be a profitable resident, and employed by colleges as part of the admissions process. Especially disturbing is the notion that once you become an algorithmic prisoner, it is very difficult to get pardoned. Ask anyone who has tried to get off a no-fly list or correct a mistake on a credit report.”

Bill Davidow, ‘With Great Computing Power Comes Great Surveillance’

“It is all too easy to forget that ‘free’ inevitably means that someone will be deciding how you live.”

Jaron Lanier, Who owns the future?


***

Disquiet #8: The threat to democracy

“By these increasingly ordinary processes, both public and private regimes of surveillance and modulation diminish the capacity for democratic self-government.”

Julie Cohen, ‘What is privacy for’

“Citizens of the modulated society are not the same citizens that the liberal democratic political tradition assumes, nor do their modulated preferences even approximately resemble the independent decisions, formed through robust and open debate, that liberal democracy requires in order to sustain and perfect itself. The modulated society is the consummate social and intellectual rheostat, continually adjusting the information environment to each individual’s comfort level. Liberal democratic citizenship requires a certain amount of discomfort—enough to motivate citizens to pursue improvements in the realization of political and social ideals. The modulated citizenry lacks the wherewithal and perhaps even the desire to practice this sort of citizenship. If this sounds like science fiction, it shouldn’t. Like the liberal self, liberal democracy has always been an ideal to be pursued and approximated. A polity’s ability to approximate liberal democracy has both institutional and material preconditions. In the generations following the framing of the U.S. Constitution, those who sought to build a functioning liberal democracy had to contend with the gulf between liberalism’s aspirations to egalitarianism and the concentration of political power in an entitled minority of white male property- and slaveowners. In the generations to come, those who seek to maintain a functioning liberal democracy will need to contend with the gulf between liberalism’s aspirations to self-government by an informed and vigilant citizenry and the relatively blunted capacities of a modulated citizenry.”

Julie Cohen, ‘What is privacy for’

“For if we are observed in all matters, we are constantly under threat of correction, judgment, criticism, even plagiarism of our own uniqueness. We become children, fettered under watchful eyes, constantly fearful that—either now or in the uncertain future—patterns we leave behind will be brought back to implicate us, by whatever authority has now become focused upon our once-private and innocent acts. We lose our individuality, because everything we do is observable and recordable.”

Bruce Schneier, ‘The eternal value of privacy’

“Networked information technologies mediate our experiences of the world in ways directly related to both the practice of citizenship and the capacity for citizenship, and so configure citizens as directly as, or even more directly than, institutions do. The practice of citizenship requires access to information and access to the various communities in which citizens claim membership. In the networked information society, those experiences are mediated by search engines, social networking platforms, and content formats. Search engines filter and rank search results, tailoring both the results and the accompanying advertising to what is known about the searcher and prioritizing results in ways that reflect popularity and advertising payments. Social networking platforms filter and systematize social and professional relationships according to their own logics. Content formats determine the material conditions of access to information—for example, whether a video file can be copied or manipulated, or whether a news forum permits reader comments. Each set of processes structures the practice of citizenship, and also subtly molds network users’ understanding of the surrounding world. To an increasing degree, then, the capacity for democratic self-government is defined in part by what those technologies and other widely used technologies allow, and by exactly how they allow it. The resulting changes in our political culture may be more profound and permanent than we have allowed ourselves to imagine.”

Julie Cohen, ‘What is privacy for’

“Thanks to smartphones or Google Glass, we can now be pinged whenever we are about to do something stupid, unhealthy, or unsound. We wouldn’t necessarily need to know why the action would be wrong: the system’s algorithms do the moral calculus on their own. Citizens take on the role of information machines that feed the techno-bureaucratic complex with our data. And why wouldn’t we, if we are promised slimmer waistlines, cleaner air, or longer (and safer) lives in return? This logic of preemption is not different from that of the NSA in its fight against terror: let’s prevent problems rather than deal with their consequences. Even if we tie the hands of the NSA—by some combination of better oversight, stricter rules on data access, or stronger and friendlier encryption technologies—the data hunger of other state institutions would remain. They will justify it. On issues like obesity or climate change—where the policy makers are quick to add that we are facing a ticking-bomb scenario—they will say a little deficit of democracy can go a long way.”

Evgeny Morozov, ‘The real privacy problem’


***

Disquiet #9: Human capital is diminished

“An environment that was always and everywhere congenial to the straightaway execution of our impulsions would set a term to growth as surely as one always hostile would irritate and destroy. Impulsion forever boosted on its forward way would run its course thoughtless, and dead to emotion. For it would not have to give an account of itself in terms of the things it encounters, and hence they would not become significant objects. The only way it can become aware of its nature and its goal is by obstacles surmounted and means employed; means which are only means from the very beginning are too much one with an impulsion, on a way smoothed and oiled in advance, to permit of consciousness of them. Nor without resistance from surroundings would the self become aware of itself; it would have neither feeling nor interest, neither fear nor hope, neither disappointment nor elation. Mere opposition that completely thwarts, creates irritation and rage. But resistance that calls out thought generates curiosity and solicitous care, and, when it is overcome and utilized, eventuates in elation.”

John Dewey, Art as Experience

“The ground for valuing liberty is the claim that we are pre-eminently rational agents, each of us well suited to determining what goes in our own life. There is ample evidence, however, from the fields of psychology and behavioral economics, that in many situations this is simply not true. The incidence of irrationality is much higher than our Enlightenment tradition has given us to believe, and keeps us from making the decisions we need to reach our goals. The ground for respecting autonomy is shaky.”

Sarah Conly, Against Autonomy

“A commitment to privacy expresses a different kind of “sound reason” that we might choose to value—one that prizes serendipity as well as predictability and idiosyncrasy as well as assimilation.”

Julie Cohen, ‘What is privacy for’

“Habits, activities, and preferences are compiled, registered, and retrieved to facilitate better adjustment, not to improve the individual’s capacity to act and to decide. Whatever the original incentive for computerization may have been, processing increasingly appears as the ideal means to adapt an individual to a predetermined, standardized behavior that aims at the highest possible degree of compliance with the model patient, consumer, taxpayer, employee, or citizen.”

Spiro Simitis, ‘Reviewing privacy in an information society’

“Privacy is important both because it promotes innovative practice and because it promotes the kinds of innovation that matter most. The human innovative drive is both unpredictable and robust, but it does not follow either that all environments are equally favorable to innovation or that all environments will produce the same kinds of innovation. If privacy and serendipity are critical to innovation—by which I mean critical both to the likelihood that innovation will occur and to the substance of that innovation—there is reason to worry when privacy is squeezed to the margins, and when the pathways of serendipity are disrupted and rearranged to serve more linear, commercial imperatives. Environments that disfavor critical independence of mind, and that discourage the kinds of tinkering and behavioral variation out of which innovation emerges will, over time, predictably and systematically disfavor innovation, and environments designed to promote consumptive and profit-maximizing choices will systematically disfavor innovations designed to promote other values. The modulated society is dedicated to prediction but not necessarily to understanding or to advancing human material, intellectual, and political wellbeing. Data processing offers important benefits, but so does privacy. A healthy society needs both to thrive.”

Julie Cohen, ‘What is privacy for’

“Even if Big Data did not continually alter its own operands, it does not operate in a vacuum. It is a mistake to think of the techniques of Big Data as simply adding to the amount of information circulating within society. The valorization of predictive rationality and risk management inevitably displaces other kinds of knowledge that might be generated instead. Stimuli tailored to consumptive preferences crowd out other ways in which preferences and self-knowledge might be expressed, and also crowd out other kinds of motivators—altruism, empathy, and so on—that might spur innovation in different directions. In a consumption-driven economy, the innovations that emerge and find favor will be those that fulfill consumption-driven needs. Contemporary applications of Big Data extend beyond marketing and advertising to core social and cultural functions, including the study of intellectual preferences and the delivery of higher education. Systematizing those functions according to the dictates of predictive rationality threatens important social values. It crowds out the ability to form and pursue agendas for human flourishing, which is indispensable both to maintaining a vital, dynamic society and to pursuing a more just one.”

Julie Cohen, ‘What is privacy for’

“Privacy shelters dynamic, emergent subjectivity from the efforts of commercial and government actors to render individuals and communities fixed, transparent, and predictable. It protects the situated practices of boundary management through which the capacity for self-determination develops.”

Julie Cohen, ‘What is privacy for’

***

Disquiet #10: Everyone should worry

“The problem with the nothing-to-hide argument is the underlying assumption that privacy is about hiding bad things. By accepting this assumption, we concede far too much ground and invite an unproductive discussion about information that people would very likely want to hide. As the computer-security specialist Schneier aptly notes, the nothing-to-hide argument stems from a faulty “premise that privacy is about hiding a wrong.” Surveillance, for example, can inhibit such lawful activities as free speech, free association, and other First Amendment rights essential for democracy.”

Daniel Solove, ‘Why Privacy Matters Even if You Have “Nothing to Hide”’

“‘My life’s an open book,’ people might say. ‘I’ve got nothing to hide.’ But now the government has large dossiers of everyone’s activities, interests, reading habits, finances, and health. What if the government leaks the information to the public? What if the government mistakenly determines that based on your pattern of activities, you’re likely to engage in a criminal act? What if it denies you the right to fly? What if the government thinks your financial transactions look odd—even if you’ve done nothing wrong—and freezes your accounts? What if the government doesn’t protect your information with adequate security, and an identity thief obtains it and uses it to defraud you? Even if you have nothing to hide, the government can cause you a lot of harm.”

Daniel Solove, ‘Why Privacy Matters Even if You Have “Nothing to Hide”’

“‘But the government doesn’t want to hurt me,’ some might argue. In many cases, that’s true, but the government can also harm people inadvertently, due to errors or carelessness.”

Daniel Solove, ‘Why Privacy Matters Even if You Have “Nothing to Hide”’

***

Disquiet #11: A new inequality emerges

“An exclusively technical civilization … is threatened … by the splitting of human beings into two classes—the social engineers and the inmates of closed social institutions.”

Jürgen Habermas, Theory and Practice

“How does one get privacy these days? Just ask any hacker: only by studying how the right tools work. Privacy is no longer something to be taken for granted or enjoyed for free: you have to expend some resources to master the tools. Those resources could be money, patience, attention – you might even hire a consultant to do all this for you – but the point is that privacy is becoming expensive.”

Evgeny Morozov, ‘Why we are allowed to hate Silicon Valley’

“And what of those who can’t afford tools and consultants? How do their lives change? When the founder of a prominent lending start-up – the former CIO of Google, no less – proclaims that “all data is credit data, we just don’t know how to use it yet” I can’t help but fear the worst. If “all data is credit data” and poor people cannot afford privacy, they are in for some dark times. How can they not be anxious when their every move, their every click, their every phone call could be analyzed to predict if they deserve credit and at what rates? If the burden of debt wasn’t agonizing enough, now we’ll have to live with the fact that, for the poor, anxiety begins well before they get the actual loan. Once again, one doesn’t have to hate or fear technology to worry about the future of equality, mobility and the quality of life. The “digital debate,” with its inevitable detours into cultural pessimism, simply has no intellectual resources to tackle these issues.”

Evgeny Morozov, ‘Why we are allowed to hate Silicon Valley’

***

Disquiet #12: We turn our selves and our lives into commodities

“The data-centric model of Silicon Valley capitalism seeks to convert every aspect of our everyday existence – what used to be our only respite from the vagaries of work and the anxieties of the marketplace – into a productive asset. This is done not just by blurring the distinction between work and nonwork but also by making us tacitly accept the idea that our reputation is a work-in-progress – something that we could and should be honing 24/7. Therefore, everything is turned into a productive asset: our relationships, our family life, our vacations, our sleep (you are now invited to “hack” it so that you can get most of your sleep in the shortest amount of time).”

Evgeny Morozov, ‘Why we are allowed to hate Silicon Valley’

“What eludes Mr Snowden – along with most of his detractors and supporters – is that we might be living through a transformation in how capitalism works, with personal data emerging as an alternative payment regime. The benefits to consumers are already obvious; the potential costs to citizens are not. As markets in personal information proliferate, so do the externalities – with democracy the main victim.”

Evgeny Morozov, ‘The Snowden saga heralds a radical shift in capitalism’

***

Disquiet #13: We witness acts of sabotage

“Second, we must learn how to sabotage the system—perhaps by refusing to self-track at all. If refusing to record our calorie intake or our whereabouts is the only way to get policy makers to address the structural causes of problems like obesity or climate change—and not just tinker with their symptoms through nudging—information boycotts might be justifiable. Refusing to make money off your own data might be as political an act as refusing to drive a car or eat meat. Privacy can then reemerge as a political instrument for keeping the spirit of democracy alive: we want private spaces because we still believe in our ability to reflect on what ails the world and find a way to fix it, and we’d rather not surrender this capacity to algorithms and feedback loops.”

Evgeny Morozov, ‘The real privacy problem’

***

A note of hope

Then again perhaps there is no need to worry, let alone be paranoid.

Perhaps for all the inevitable side effects, all technological change takes us along a trajectory of progress.

Perhaps we are merely witnessing teething pains.

Perhaps we are seeing the first tremors of a fundamental recalibration of our culture and its values.

Perhaps the notion of what constitutes privacy will be remade.

Perhaps we will indeed give up on believing we are rational actors and surrender ourselves to oversight and adjustment.

Perhaps we will become less precious about wanting to feel autonomous.

Perhaps the goods and services we receive will be so remarkable that handing over our data and privacy will make total sense.

Perhaps we shall gain unimaginably more than we lose.

Perhaps in time these anxieties and oppositions will be regarded as the quaint relics of an age long left behind.

Perhaps we are merely grappling with the problem that the civilisation of Iain M. Banks’ Culture novels had solved:

“One of the most important tasks in setting up and running a stable and internally content civilisation is finding an acceptable balance between the desire for freedom of choice in one’s actions (and the freedom from mortal fear in one’s life) and the need to feel that even in a society so self-correctingly Utopian one is still contributing something.”

And perhaps we’ll get there too.

***

Thinking beyond the market

Pandora’s box cannot be closed. The good, and the not so good, have been let loose upon the world. Just as the abundant benefits are undeniable, so too we cannot afford to deny the consequent issues and challenges they raise. Issues of personal identity, liberty, social values, ethics, and politics.

Ad- and marketing land would love for these issues to be merely ones of the marketplace, and for them to be framed narrowly in terms of data collection, privacy, consent, legislation, and consumer benefits.

But the issues are far broader and deeper than this. Technology, as Julie Cohen has reminded us, is never a neutral quantity. What we make reflects and reproduces our beliefs about what is important, and about how we should live. What we make is shot through with value judgments. Consciously or unconsciously, the things we build are informed by ideology.

So however inconvenient it might be, we cannot discuss technology and its consequences purely in terms of the market. The impact of what we are building and the issues it raises extend far beyond it.

They are issues about how we want to live. Not merely how we want to buy.

***

Sources

Iain M. Banks, ‘A few notes on the Culture’

John Battelle, ‘Google Now: The tip of a very long spear’

John Battelle, ‘Why the banner is heroic and adtech is our greatest artifact’

Richard Brautigan, ‘All Watched Over By Machines Of Loving Grace’

David Brin, ‘The transparent society’

Julie E. Cohen, ‘What is privacy for’

Julie E. Cohen, ‘Irrational Privacy?’

Sarah Conly, Against Autonomy

Bill Davidow, ‘With Great Computing Power Comes Great Surveillance: The dark side of Moore’s Law’

Cory Doctorow, ‘The NSA’s Prism: Why we should care’

Martha Lane Fox, speech to the House of Lords

Jürgen Habermas, Theory and Practice

Kevin Kelly, ‘We are the web’

Kevin Kelly, ‘The Technium’

Krishnan Kumar, ‘Utopianism’

Jaron Lanier, Who owns the future?

Evgeny Morozov, ‘The real privacy problem’

Evgeny Morozov, ‘The price of hypocrisy’

Evgeny Morozov, ‘Why we are allowed to hate Silicon Valley’

Evgeny Morozov, ‘The Snowden saga heralds a radical shift in capitalism’

Bruce Schneier, ‘The eternal value of privacy’

Bruce Schneier, ‘When it comes to security, we’re back to feudalism’

Eric Siegel, Predictive Analytics: The power to predict who will click, buy, lie, or die

Spiro Simitis, ‘Reviewing privacy in an information society’, University of Pennsylvania Law Review, Vol. 135, 1987

Daniel Solove, ‘Why Privacy Matters Even if You Have “Nothing to Hide”’

U.S. Government Accountability Office, ‘Personal Information: Agency and Reseller Adherence to Key Privacy Principles’, April 2006

Tal Zarsky, ‘Transparent predictions’

 

Modern times: The case against perpetual progress

Thomas Cole, The Course of Empire: Destruction

“For so many in the techno-elite, even those who don’t entirely subscribe to the unlimited optimism of the Singularity, the notion of perpetual progress is somehow taken for granted. As a former classicist turned technologist, I’ve always lived with the shadow of the fall of Rome, the failure of its intellectual culture, and the stasis that gripped the Western world for the better part of a thousand years…

History teaches us that conservative, backward-looking movements often arise under conditions of economic stress. As the world faces problems ranging from climate change to the demographic cliff of ageing populations, it’s wise to imagine widely divergent futures.

Yes, we may find technological solutions that propel us into a new golden age of robots, collective intelligence, and an economy built around ‘the creative class’. But it’s at least as probable that, as we fail to find those solutions quickly enough, the world will fall into apathy, disbelief in science and progress, and, after a melancholy decline, a new dark age.

Civilisations do fail. We have never yet seen one that hasn’t. The difference is that the torch of the past has always passed to another region of the world. But we’ve now, for the first time, got a single global civilisation. If it fails, we all fail together.”

Tim O’Reilly, ‘The rise of anti-intellectualism and the end of progress’, in John Brockman, ed., What Should We Be Worried About?