Some years ago, I was interviewed for a management role at a prestigious organisation whose accounts, it turned out, were a total mess. Inflated income expectations coupled with a failed hiring policy transformed its finances into a river of red.

Meeting the administrative team responsible, I was struck by their skill and intelligence. It was their CEO, since departed, who had led them over a cliff.


But why did they follow? If they knew it was a disastrous strategy, why did they comply? “Because change was necessary,” I was told. To which I replied (before refusing the job), “If change was necessary, did it mean that any change would do?”


I vividly recall the sepulchral silence that followed. In the contemporary world, “change” has the power of a Harry Potter spell. Phrases such as “the need for change”, “the importance of change”, “the fear of change” are sprayed around like verbal confetti, without context or further explanation. Like “the will of God” for medieval theologians, or “the class struggle” for revolutionary Communists, “change” is the answer to every question… even before it is posed.


When did we stop thinking about what the word means? When did we stop asking “change from what, to what?” When did we give up trying to calculate all the costs and benefits of change, substituting mantras about its “inevitability” and a need for “resilience” in the face of it?


A little etymological digging provides illumination. The story is not one of the rise of a single word, but of a raft of related concepts. When we bow down, as my administrators did, to demands for “change” without critical inquiry, we are sucked into what linguist George Lakoff calls its “entailments”. “Change” is a metaphor, as much poetic as political. To understand its hold on us, we must explore the semantic priming activated in our minds when we hear it.

Change, Innovation & Progress

“Change” has other terms baked into its meaning, to the point where they are almost synonymous with it. The most important are “innovation” and “progress”. These days, change = innovation, and innovation = progress. Let’s look at them in turn.


“Innovation” comes from the Latin root novus, meaning “new”. It made its debut in thirteenth-century Europe, in legal discourse. For the next four centuries it had a negative connotation. A “novator” was a holder of heretical or deviant opinions. It was not a desirable label in societies that valued conformity of behaviour and beliefs.


In the nineteenth century that began to alter. The French sociologist Gabriel Tarde was the first to use “innovation” in its modern sense. In constructing a different definition for it, he integrated two opposing concepts: (i) “invention”, the creation of something new; and (ii) “imitation”, its diffusion across social and economic processes. “Innovation” implies not just originality, therefore, but also use. It is the name for what happens when a new thing is taken up as part of our collective life. Canadian sociologist Benoit Godin has written about the derivation of the word “innovation”. He comments:

Over [time], innovation came to be defined as useful innovation… As E.M. Rogers put it: “the adoption of a new idea almost always entails the sale of a new product”. Many factors contributed to this shift: the political and economic context, the industrial and consumer revolutions… and, above all, the institutionalization of technological invention via patent laws, and industrial development through R&D laboratories… Innovation as a category during the twentieth century is witness to a certain context – capitalism – and to changes in political values.[1]

Tracking the history of a word is tricky. Once it enters everyday language its meaning is dispersed. However, Godin’s observation is indisputable. In the minds of most people, innovation = technological innovation, and technological innovation = new commercial products.


This has profound implications. Think about your phone. It’s likely it replaced the one you had before. In considering your purchase, you no doubt did a comparison between the two, using a mix of functional and aesthetic criteria, with the first being the more dominant. After all, the important thing about a phone is that it works. Our expectation when we “upgrade” to a new phone is that there is no loss of functionality. Otherwise, why buy it? Even if it does some tasks better, we expect it to do the old tasks too. No one gets a new phone that has a bigger screen but can’t send texts.


In technological innovation, the basis of comparison is clear. We compare like with like. Judgements are grounded in an observable reality that can be widely agreed and objectively measured.


Now consider the claim in The Oxford Handbook of Australian Politics (2020) that Australia is a “democratic innovator”, that it has introduced a number of “electoral innovations that have defined and distinguished its democracy”.[2] In what sense has Australia “upgraded” democracy? If we compare the electoral system we have now with one from our past, or in another country, we are not comparing like with like. How do we ground our judgements?


It can be done, but it takes careful research and it is unlikely our conclusions will be indisputable. Yet the term “innovation” has a technological aura that lends a patina of objectivity to opinions that are anything but. If “innovation” is associated with “change”, then the problem is virally transmitted, and a Harry Potter spell is cast. When we hear the word “change” we think “improved use”, forgetting to ask the vital question, “change from what, to what?”.


“Progress” appears less often in public pronouncements these days, probably because we live in a world more aware of the value of traditional cultures, and more sceptical of the advantages of Westernisation. But every time a politician uses the phrase “going forwards” we cop a version of history where “forwards” is tantamount to “up”. We live better lives than our parents: better housed, better fed, better waged. They, in turn, lived better lives than their parents. And so on. Actually, this isn’t true for many in the population. But it is the assumption that underpins the entire “progress narrative” of modern society.


This is called the “Whig interpretation of history”. The phrase was devised by Herbert Butterfield in 1931, in a book of the same title. He defines it as,

… the tendency in many historians to write on the side of Protestants and Whigs, to praise revolutions provided they have been successful, to emphasize certain principles of progress in the past and to produce a story which is the ratification if not the glorification of the present.[3]

If this sounds like an “upgrade” version of history, that’s because it is. The Whigs emerged on the winning side of England’s traumatic seventeenth-century upheavals: wealthy merchants and influential gentry who saw a world destined to improve endlessly. Especially for them. New technology had a crucial role here too, as first the Agricultural Revolution and then the Industrial Revolution transformed every aspect of social and economic life.


Whether technology was the cause of these transformations or the result of them is a moot point. But in the Whig gloss, history = progress and progress = technological innovation. It’s preposterous, of course. If it is hard to compare electoral systems, how can we possibly compare historical eras? But one can see how terms like “change”, “innovation” and “progress” fly in formation. They emerge out of concrete experiences. Everyday language being the dirty wash it is, these get thrown together and used interchangeably, so that their separate meanings converge.


The logic then emerges: if innovation = technological innovation, then innovation = progress, because history = progress = technological innovation. We glide over crucial differences like an ice-skater. It remains only to make a final contraction, change = progress, and wrap it in Godin’s observation, to get progress = new products.


This is why my administrators reacted as they did. A proposal to change a product range had the force of moral command. Even though it was financially disastrous, they were sucked into its Harry Potter spell. New products = technological innovation = progress = change. Who wants to be on “the wrong side of history”, despite the fact that history has no sides, and being wrong is a feature of human agents, not events?

Changing (our idea of) Change

A book with a big impact on me as a student was RG Collingwood’s The Idea of History (1946). In it, he charts a change in the Western conception of knowledge. The ancient and medieval worlds were postlapsarian and precedental. Knowledge was a matter of consulting “authorities”, and if there was a golden age, it lay in the past not the future. Scholarship involved “cut and paste”. If you had an opinion, however original, you found someone who had said it before, and quoted them. Scholars looked back in lament rather than forward in hope. “Change” meant change for the worse. It meant decay and decline.


As the idea of knowledge modified, so did the status of “change”. Knowledge became investigation and discovery, observation, induction, deduction and data. Asking questions replaced quoting authorities, and the future became an epic canvas where critical inquiry could be conducted without constraint. No wonder “change” has the allure it does. In the future, the grass is greener, by definition. Even if things don’t turn out right the first time, eventually a future will arrive that is better than the past. That’s the thing about the future. There’s always more of it to be had.


Or perhaps not. It has been almost 70 years since Hans Suess demonstrated that CO2 released by fossil fuels is not immediately absorbed into the ocean. The world’s climate is a limited resource. Our lives are a limited resource. Our money, patience and attention are limited resources. The assumption that change = progress = technological innovation = new products is environmentally, economically, and ethically suspect. Would the world be worse off if we had not “innovated” the nuclear bomb? Or plastic bags? Or Twitter, for that matter?


Using the word “change” does not exempt us from carefully categorising change proposals.


When we are choosing consumer products, “change” is synonymous with technological innovation. When we are talking about electoral systems, it needs to be considered alongside terms of equal value, like “tradition”. When we are discussing historical “progress” we should be clear “change” is a word without moral valency until we impart it.


So when the boss insists on “the need for change” we must not be soft-soaped. Are all the benefits and costs being evaluated? Probably not. Until recently, “environmental costs” were left out of economic calculations. Under these circumstances, change = climate degradation = extinction of the human species. No one wants that change. To avoid it, we need to challenge the use of the word in the daily round.


Expelliarmus, as Harry would say.

REFERENCES

[1] http://www.prime-noe.org

[2] https://www.oxfordhandbooks.com/view/10.1093/oxfordhb/9780198805465.001.0001/oxfordhb-9780198805465-e-2

[3] http://www.eliohs.unifi.it/testi/900/butterfield/preface.html

Dr Julian Meyrick is Professor of Creative Arts in the Griffith Centre for Creative Industries. The son of an English father and an Australian mother, Julian studied politics and economics at the University of Exeter in the UK, then took an MA in theatre directing in the US. He was Artistic Director of kickhouse theatre 1990-1998, and Associate Director and Literary Adviser at Melbourne Theatre Company 2002-2007. He has a PhD in the history of Australian theatre and was a Research Fellow at La Trobe University 2008-2011. From 2012 to 2019 he was Professor of Creative Arts at Flinders University. Julian has directed over forty theatre shows and won the Helpmann Award for Best New Work in 2012. He is a General Editor of the Currency House New Platform Paper series, a board member of both CHASS and NORPA, and Literary Adviser for the Queensland Theatre. He is a regular media commentator on matters of Australian arts and cultural policy. His book What Matters?: Talking Value in Australian Culture, co-authored with Robert Phiddian and Tully Barnett, was published by Monash University Publishing in 2018.

Professional Learning Hub

The above article is part of Griffith University’s Professional Learning Hub’s Thought Leadership series.
