Picking up the pound notes

‘What’s needed is a consortium of laboratory companies and a network of inspired innovators all eager to reinvent the way human beings work at scale’. Thus Gary Hamel earlier this year launching a ‘[New] Human Movement’ along the lines of the Human Relations Movement – ‘the most important movement in management history’ – to kill bureaucracy and make business an environment ‘where human beings are free to flourish’.

Read it – it’s a stirring message. But with due respect for Hamel, I think that here he’s on the wrong track. What we lack is not labs or management inventors dreaming up flashy new ways of doing things. Instead, it’s patient thinkers able to integrate and take forward the basic things that we’ve known, or at least intuited, since the original human relations movement. As Jeff Pfeffer, another respected management professor, remarked at a 2008 conference on developing management ‘moonshots’, ‘we need an implementation, as much as an innovation, engine’.

Yet an ‘engine’ isn’t the right metaphor either. Consider what have been termed (including by Hamel himself) the ‘positive deviants’ of the business world: companies in a variety of industries that thrive and prosper by doing things so differently from their industry counterparts that they might be a different species. Think, among others, of Toyota in motors, W L Gore in plastics and materials science, Handelsbanken in banking and, a much more recent example, Buurtzorg, the fast-growing Netherlands nursing and social care organisation. By size, shape, industry and national culture, they are very diverse. Yet what they share is far more important than what they don’t. And it is something that their conventional congeners conspicuously lack: a commitment to building organisations around human beings.

Which prompts the thought: maybe they are a different species.

Thus, when Bill and Genevieve Gore founded their eponymous firm in the 1950s they wanted to create a framework in which entrepreneurially-minded engineers and scientists (as they were) could work on projects that mattered to them and would contribute to the world. They deliberately took as their starting point the human-relations-school theories of Abraham (‘hierarchy of needs’) Maslow and Douglas (‘Theory X, Theory Y’) McGregor.

It is probably no coincidence that the deviants share many qualities with Gore. They all exhibit a strong sense of purpose and identity – essential for cohesion, and powerful enough to deter those who don’t fit. They emphasise small groups acting as teams – nurses in Buurtzorg, multi-skilled craftsmen at Toyota. At Handelsbanken people talk of ‘the view from the church spire’ to illustrate the maximum geographical dimensions of a branch’s constituency; at both Handelsbanken and Gore expansion takes place as and when a leader emerges and a team forms around them to launch a new internal start-up. All depend on distributed decision-making around a few simple principles (Toyota crew members can and do halt the line many times during a shift to solve minor issues; Buurtzorg nurses switch between caring roles as required; Gore has ‘leaders’ – those who have followers – but no formal management ranks; Handelsbanken is run by its branches).

In all these organisations, personal commitments matter and are upheld by peer pressure. Since those commitments can be trusted, management bureaucracy (Hamel’s bane) is kept to a minimum. None of them has budgets, the linchpin of conventional command-and-control management, in the ordinary sense. The prime motivation is intrinsic, and the only performance management is in the work itself. Headquarters overhead is typically tiny – at rapidly expanding Buurtzorg, 50 administrators and 20 trainers for 10,000 professionals deployed in 900 teams; Berkshire Hathaway, another positive deviant and one of the largest companies in the world, is run by 25 people from an office in an unprepossessing block in Omaha. While all of them use technology, none makes a fetish of it or boasts about it – at Toyota, the current manufacturing director, who worked his way up from the assembly line, is busy cutting and simplifying its automation.

Oh, and I nearly forgot: they are all outperformers. Toyota (which has been honing its production system, a genuine wonder of the business world, for 60 years) is consistently the most profitable car company in the world. The privately-held W L Gore, in its way hardly less remarkable than Toyota, and also 60 years old, has made a profit in every year of its existence; Fast Company labelled it ‘pound for pound, the most innovative company in America’. Handelsbanken has a 40-year profits record, ever since it adopted its current form. Even though it is formally a non-profit, Buurtzorg more than breaks even, ploughing the surplus back into R&D.

Intriguingly, the design principles that the deviants have in their own ways arrived at find echoes in significant research findings in other fields. Take the research that won Elinor Ostrom, originally a political scientist, the Nobel memorial economics prize in 2009. Ostrom researched ‘the tragedy of the commons’, the all-too-common occurrence, for obvious reasons highly unpopular with traditional economists, where the invisible hand of self-interest leads not to the common good but the reverse: overfishing, overgrazing, competitive nationalism and the exhaustion of planet earth.

Working from observation rather than mathematical equations (another slap in the face for conventional economists) Ostrom isolated eight core design principles that distinguished groups that made a success of managing a common resource from those that failed. Lo and behold, they closely replicate those used by our deviants: strong purpose, fairness in reward and cost, inclusive decision-making, commitment and peer-group monitoring, just conflict resolution, local autonomy and consistent governance.

Which brings us straight back to our opening theme – how to advance to a better paradigm – and the obvious puzzle it raises. Managers don’t customarily leave pound notes lying around on the ground: if we know what works, why doesn’t every company do it? One answer, ironically, is that groups, which as Ostrom found work by cooperation that favours all participants, can easily be derailed by self-interested individuals (aka the invisible hand) – after all, the joint capacity of a business group is a common resource that is just as vulnerable to over- or unfair exploitation as a village green or fish-rich bay. Perversely, this behaviour is sanctioned and even encouraged by current economic and management theory, which consistently turns a blind eye to the commons problem and persists in viewing companies as machines and people as purely economically-driven robots.

Or think about it as an evolutionary struggle. Public corporations run on conventional lines are a waning breed, having lost half their US and UK numbers over recent decades. They are not well equipped for survival in a world that’s evolving towards broader, ecosystem-like constellations. But the new still struggles to make headway. The deviants are hard to copy, because (like Gore) they are born from different impulses, with as it were a new genetic make-up. As the company’s CEO told me recently, it’s tough for would-be imitators to implement a Gore-type organisation piecemeal ‘because it’s an ecosystem – a truly holistic way of working’. In a forthcoming book, John Seddon notes that the only way conventionally-trained managers ‘get’ a systems approach (common to all the deviants) is by experiencing it at first hand. Abstract rational explanation doesn’t change their thinking – since the theory determines what they see, they perceive the paper on the ground as litter, not pound notes. What this means is that growth at the deviants is constrained not by opportunity or even competition, but by the capacity to socialise new people into their different-in-kind way of working. Seddon calls this process ‘crossing the Rubicon’: the good news is that those who reach the other side don’t come back. The bad news: they have to do it one by one.

If this is halfway true, Hamel’s labs need a different agenda. Top of it would be removing the blinkers that prevent people seeing what’s before their noses – primarily ‘the absurd conception of human nature known as Homo economicus’, in the words of evolutionary biologist David Sloan Wilson, and secondly the simplistic notion of equilibrium or a natural state of things that is the only justification for Adam Smith’s invisible hand. As the inventor Edwin Land once remarked, the best innovations are often not a blinding realisation of something new but ‘a sudden cessation of stupidity … not having a new thought but stopping having an old one’. Now, where did all those pound notes go?

This changes everything

Change is responsible for more angst among managers than almost anything else on their agenda. With good reason: apart from leadership, it is the most discussed and least understood element in the discipline – witness a failure rate among large change initiatives that is commonly put at 70 per cent.

Why does change so often go wrong? One clue might be that in many big organisations the custodians of the change budget are… the IT department. There’s a plausible logic to this. IT is routinely presented as the ‘driver’ or less assertively the ‘enabler’ of change, the assumption being that going digital will automatically improve efficiencies and outcomes and thus should lead. Yet, as we all know to our cost, IT projects are even more prone to failure than change programmes.

The truth is that the logic is back to front. IT and change initiatives have a fatal flaw in common, which is that they start at the end, not the beginning. So managers debate strategic options, then draw up plans and schedules complete with quick wins, intermediate milestones, deliverables, and carefully orchestrated communications campaigns to keep people and programme marching towards the desired destination. Of course they do. Where else would you start than with a plan?

Unfortunately, however, there’s a snag. And it’s a fundamental one. While life – as the philosopher Søren Kierkegaard put it – is understood backwards, it can only be lived forwards. This deceptively simple truth means that managing by preordained result is both epistemologically and practically a nonsense. Organisations are human entities consisting of too many variables, too many shifting pieces, and too many feedback loops, all interacting with each other all the time, for them to submit to a fixed plan. Evolution, as someone said, is cleverer than you are. In other words, change is an effect as much as a cause. It is emergent.

This changes everything. The result is not just a different ‘change model’ but a different way of thinking. Conventional change models come straight out of the command-and-control (aka central planning) playbook, decreed from above and driven down through the organisation. In the alternative, systems view, on the other hand, change is better thought of as a process of discovery, proceeding not by way of an abstract plan plotted to a fixed destination, but through open-ended investigation and iterative experiment leading to constantly improving outcomes.

In this version, change starts by establishing not the destination but, much more prosaically, the starting point – something that, extraordinarily, most managers fail to do. Root-and-branch analysis of the current state of play on the ground – that is, face to face with the customer, rather than regulators, shareholders and inspectors – invariably confronts managers with two unwelcome findings. The first is that their real problems aren’t the ones they thought they were (and they are never the lack of fancy IT – sometimes indeed the reverse, as is the case with the shambolic Universal Credit roll-out).

The second, closely related, is that your product or service is way less good than your (or your regulators’) figures led you to believe. Only when you have digested that news (having passed through shock and denial on the way) are you in a position to figure out where and what to improve; and it’s only when the hypothesis has been tested in action that it is possible to envisage what the final change will actually look like.

Such a modest, empirical approach to change has two enormous advantages. The first is that it prevents managers wasting large amounts of money and effort on top-down change programmes that are doomed to fail. The second is that, conversely, cumulative improvements can eventually lead to the kind of gains that no one would have dared to put in a plan. Which all goes to show that the hoary Irish joke had it right all along. If you start from here, you can’t be sure where you’ll end up. But it will be a lot better than imagining you know where you want to go – because in that case you’d have to start from somewhere quite else.

This article first appeared in Professional Manager, June 2019

The management X-factor

There are plenty of theories about the well-established declining rate of innovation in the capitalist West. One large, but often neglected, contributor is the behaviour of all-powerful capital markets. In one much-quoted study, US directors admitted they would pass up promising investment opportunities rather than disappoint Wall Street expectations and attract the attention of activists, private equity and other predators. (Here is an explanation for two other ‘mysteries’: the increasing number of companies shunning the public markets and remaining private, and the sharply declining life expectancy of quoted companies – today’s shareholder primacy model is simply not compatible with long-term corporate health and wealth.)

Another plausible reason for slowing innovation, though, is the laws of science, which make it ever harder to replicate the ‘10X’ improvements that have fuelled growth spurts of the past (rail equalling 10X canal, car 10X horse, electricity 10X candle, internet 10X print, jet 10X train, etc). Note in this context that platforms such as Uber, while certainly disruptive, aren’t actually market-creating or indeed even value-creating innovations to any large degree, their undoubted value to consumers being significantly undercut by the destruction of value to the same people as workers. Since Uber’s current fare levels are avowedly unsustainable, the value to consumers may in any case only be temporary, lasting just as long as it takes to establish a de facto monopoly and operationalise autonomous vehicles, at which point prices will speedily rise to cover full costs and begin to compensate long-suffering shareholders. Meanwhile, even in medicines innovation is stalling as development costs rise and antibiotics lose their potency; and only the most ardent believers think Californian commuters will benefit from Elon Musk’s mooted 700 mph ‘hyperloop’ link between San Francisco and Los Angeles any time soon.

There is, however, one area where a multiple-times gain is immediately available, requiring neither expensive technology nor indeed much capital investment at all. All its components are already known. And it would benefit not just one sector, but the whole of the economy. It is, of course, management, the most ignored factor of all. The cost of mismanaging the resources we have at our disposal is beyond colossal. The late Peter Drucker got a lot of things right, but when he confidently predicted that by 2010 large businesses would be operating with half the management levels and one-third the number of managers of 20 years ago, he was spectacularly wrong. On the contrary: it is the explosive growth of management jobs that is the most remarkable feature of corporate employment over the last 20-30 years.

As I have noted before, many or most of them are bullshit jobs that only exist as a result of a kind of management ‘failure demand’ – failure to do something or do something right the first time. Good recent examples would be the expensive advertising and marketing of dubious products that no one would otherwise buy – now conducted through deeply duplicitous online manipulation and surveillance – and people checking up on and performance-managing others. Gary Hamel reckons that excess bureaucracy costs the US $3tr, the UK £400bn annually. And that’s direct costs: factor in the opportunity costs of doing the wrong thing, the ‘unknown and unknowable’ cost of workplace stress, physical and mental ill health, absence and lack of engagement, and ‘pretty soon you’re talking real money’, as the US senator famously said. If, as John Seddon argues, systematically better management can at the same time permit private-sector organisations to create greater wealth and the public sector to consume fewer resources, that would be a double whammy of improvement. The side effects of better morale, higher engagement and greater wellbeing as a result of more effective public services would constitute a third.

At the broadest level, there are things we could do to remove energy-absorbing frictions that are so obvious they shouldn’t need saying. The great development economist and historian Carlota Perez believes that the IT and communications technology underpinning the fifth industrial revolution could yet lead to a new ‘golden age’ of social development, as previous ones have – but only if as in the past we shape and support it with institutions that make it a win-win for business and society both. It’s not unfettered markets that brought about past golden ages, but judicious channelling of technology and enterprise into areas that benefited wider society – so automobiles and mass-production drove suburbanisation supported by housebuilding and savings and ownership institutions, then pension arrangements and healthcare, all in a benevolent employment-creating circle that enabled ordinary folk to participate in the markets that new configurations had created.

Now things have changed. It’s no use demanding more of the ‘business-friendly’ policies that have led us into the current impasse. The overriding imperative is for all the actors – including business – ‘to make peace with the planet’, in Perez’s striking phrase, acting within its limits instead of treating it as a giant quarry-cum-rubbish-tip. That’s the first friction. In the same way, it must be obvious by now that there can be no golden age while economic incentives are so wildly out of alignment with social interests. We need to use markets – ‘the only natural economy, the only kind that makes sense, the only one that leads to prosperity, because it is the only one that reflects the nature of life itself’ (Vaclav Havel) – but not in a way that allows the proceeds to be monopolised by just one constituency, shareholders, at the expense of all the others. As we are now learning to our cost, inequality is a social friction on which both the economy and the polity can come to a grinding halt.

Just as importantly – the third great damaging cause of friction, and for most people the most immediate and pressing – the way our organisations are currently run is so grotesquely against the grain of everything we know about how human beings work that it would be comical if it weren’t so perversely destructive. Despite the best efforts of mainstream economics and management based on it, we don’t belong to the fantasy race of Homo economicus; treating people as if they did is Orwellian anti-management. It makes people unhappy and distrustful, the work dismal, and the places they do it, especially big companies, corrupt, inefficient and – the ultimate waste – stuffed with people doing jobs that they hate and that have no purpose.

There’s no excuse for this. We know that humans scale one at a time. They work on intrinsic motivation (Frederick Herzberg’s ‘To get people to do a good job, give them a good job to do’ is still the unimprovable formulation here), appreciation, face-to-face communication and small teams with as much autonomy as possible. They need a worthwhile purpose they can identify with, one that turns innovation and improvement into a part of the daily routine. If they have those, the rest follows, including investment used at the service of humans, not to replace them. The most important investment needed to eliminate these frictions is the thought, energy and will involved in reshaping law, regulation and incentives to favour the planet and its inhabitants rather than an abstract idea of profit maximisation. The payoff: 10X happiness. Both input and outcome are admittedly hard to measure. But it’s a measure of how far management has strayed from its human roots that that may be the highest hurdle to getting it done.

Needed – another extinction rebellion

In a recent column, The Guardian’s George Monbiot noted that ‘Of all the varieties of media bias, the deepest is the bias against relevance. The more important the issue, the less it is discussed’, since opening it up might trigger demands for change that would threaten powerful vested interests.

What aroused Monbiot’s ire was the lack of concern over the collapse of nature. But he could have been talking about management. Management is ubiquitous, the invisible link between the ideology of the day and what we do every day at work. It’s astonishing that although the discipline draws more students at both undergraduate and postgraduate level (God help them, and us) than any other subject, no one writes or talks about what they are being taught in the mainstream media. It is simply treated as a given. So no one would guess that our conceptual understanding of management lags far behind that of any proper science, while the practice of management is not even standing still: it is clearly getting worse. At least we are beginning to understand our effect on the rest of nature – but it’s not clear that the same can be said for the effects on ourselves.

The ultimate reasons go right to the top, to the heart of corporate governance. But here I want to invoke a few of the small absurdities and inhumanities we are subjected to, which also speak of the bigger ones behind.

Take first a despairing article by an anonymous policeman in The Guardian describing the near intolerable pressures of policing in the capital. Like one in five of his colleagues in the Met, according to a recent report, the writer suffers from PTSD. He believes the police have lost control of the streets and, with just 10 officers in his borough available at any one time to attend to an emergency, admits to being scared himself. Direct cuts of 20,000 police officers (thank you, Mrs May) are bad enough; but equally sapping are reductions not only in police support staff but also in related services – ‘our duties are being stretched beyond our capabilities to include non-criminal matters regarding mental health and social services, because cuts have debilitated those sectors too.’ Unlike hospitals, the police can’t close their doors when they’re full. Instead, they turn off their telephones – so, with foot patrols almost non-existent, intelligence suffers and the chance of anything less than serious crime being dealt with falls to zero. Finally, unofficial targets for things like stop and search not only divert attention from real purpose but increase the public disengagement and distrust that make police work harder. This is the very opposite of smart policing: dumb, stupid, bullshit public service that disillusions both people who suffer crime and those who are supposed to prevent it.

Item two is my GP of the last 15 years, a cheerful, no-nonsense senior partner of a busy central London practice. At my last appointment, she announced that she was retiring early because although she loved the work, the strain had become unbearable. The last straw was an inspection, preparation for which had terrified staff and consumed huge amounts of time that should have been devoted to patients. The surgery passed the inspection – but only after the doctor had endured five hours of hostile interrogation which, she said, clearly started from the assumption that she was incompetent or hiding something. She went home, wept for two hours, persuaded her husband not to track down the inspectors and punch them in the face, and resigned the same evening.

My doctor isn’t alone. Most GPs are intending to retire before the age of 60, according to a recent Pulse survey, blaming overwork, rampant bureaucracy and a plummeting standard of living. GP leaders said the flux of doctors was a ‘genuine tragedy and waste’. Again, this is anti-management – practice that makes the condition worse.

Item three may seem trivial, but it’s a symptom of the same deadly disease. At the university where my wife works, departments used to have their own administrators, who knew both staff and students and formed a friendly and effective link between them. To ‘streamline and professionalise’ this arrangement, administrators were brought into a central grouping, redubbed ‘professional services’. This performed the feat of upsetting staff, students and administrators themselves, who, having lost their links with people they had worked with for years, now stay in the job for months rather than years. To remedy the situation, managers are now proposing to edit and circulate a monthly newsletter describing new developments and ‘enhancements’ to a service which is infinitely less effective, more cumbersome and more costly than before. Words fail me.

Writ large, the dire financial consequences of such institutionalised idiocy can be read almost every week on the website of the National Audit Office in a report on a new shared services, outsourcing or IT project disaster. Recent examples include the complete failure of the privatised probation service (step forward, Chris Grayling), the ferry-with-no-ships saga (ditto), and, a new one on me, the lamentable story of the Emergency Services Network: years late, the new communications system for the emergency services is now forecast to cost £9.3bn, 50 per cent more than originally anticipated – and with much of the technology still not proven, the NAO doubts whether it will meet its ‘reset’ launch date of 2022. Doesn’t anyone learn anything?

Aside from mind-boggling financial ineptitude, what all these things, small and large, have in common is contempt for the human – most directly obvious in the public service examples, but equally present, in compounded form, in the NAO cases. Failed IT projects always grossly overestimate the relative importance of the technology versus that of the humans who use it, for example.

The private sector is no better – in fact it is often worse. As author Dan Lyons noted in a recent RSA presentation self-explanatorily entitled ‘Why modern work makes us miserable’, companies obsessively trying to make humans fit with what computers want rather than the other way round have given up even pretending that workers are assets. The result is not only hateful service (automated tills, chatbots, interactive voice systems) but also dehumanising work practices (gigs and precarity, or alternatively long hours under tight surveillance). It’s not even ‘efficient’: Gary Hamel estimates that excess bureaucracy costs the US private sector, supposedly the leanest in the world, trillions of dollars a year. Even Chinese workers are getting restive under the country’s ‘996’ (9 till 9, six days a week) work system. Jeff Pfeffer in his angry Dying for a Paycheck calculated that simply going to work was the fifth biggest killer in the US, with extra costs to the health system of $2bn a year.

Do we live to work or work to live? The balance has swung so far towards the former that it sometimes seems that far from advancing, we are being propelled back to Victorian levels of exploitation and inequality. Until there’s a sharp change of direction, and we start seriously talking about what management is doing to us, humanity may be in as much danger of collapse as the planet’s rainforests or the oceans.

Managing in the wild

At first sight, Wilding, by Isabella Tree, doesn’t have much to do with business and management. But bear with me. This compelling book tells the story of the extraordinary consequences of turning a large, long-established farming estate in West Sussex back to nature. The historic 3,500-acre Knepp estate had been in the same aristocratic family for centuries. Taking over in the 1980s, author Tree and her farmer husband, heir to the estate, enthusiastically did everything the farming textbooks and consultants told them: intensifying the inputs, investing in labour-saving machinery, betting the farm, literally, on scale. But with all the advantages of ownership, experience, and access to the best conventional advice that the owners could muster, on their marginal land (Sussex Wealden clay is a nightmare to work) they simply couldn’t make it pay.

So in 2000, at the end of their tether, they sold off their expensive farm machinery and dairy herd, stopped the losing struggle against the clay and began, amateurishly at first, the process of allowing the land to rest. At first, they just felt relief. But almost instantly, they started to see nature pushing back with astonishing speed. Starting with semi-domesticated parkland, then extending the experiment to other parts of the estate by bringing in ponies, cattle and pigs, all allowed to roam and breed free, the owners watched in amazement as soil fertility, plant and insect life, and bird and mammal diversity exploded, making the estate an island of exuberant fertility among the surrounding swathes of agri-industrial monoculture – calling into question some basic tenets of ecology and conservation science in the process.

And the relevance to management and business? Well, this year’s Global Peter Drucker Forum, management’s Davos (for which, disclosure, I do some editing), has taken as its theme ‘the power of business ecosystems’. This is an important milestone, for two reasons. It belatedly requires management to start thinking in terms of systems, which it has astonishingly managed to avoid up till now. And while business ecosystems are man-made rather than natural, they are based on the same kind of principles. Managers may not have realised it yet, but changing the ruling business metaphor from the machine to biology has the potential to alter management’s entire underlying philosophy. In short, it is a real evolutionary shift.

Although the idea has been around in the background for a decade or more, researchers are still exploring the concept of ecosystems, and there is plenty of interesting work still to do. In the meantime, however, and in no particular order, here are a few tentative thoughts on the implications for management that suggest themselves from Tree’s remarkable story.

Control…not. Evolution, as Francis Crick once said, ‘is cleverer than you are.’ You can’t control ecosystems. Apple may have been characteristically ahead of the game in thinking in ecosystem terms with the iPhone, but the platform only took off when Steve Jobs reluctantly allowed third-party developers on to the app store (they now number 500,000, contributing a total of 2.5m apps). By contrast, today’s intensive farming is Fordist, command-and-control agriculture. It sort of worked for a bit, but only at huge cost – requiring huge volumes of artificial inputs (aka incentives) and pesticides and herbicides (rules and sanctions), all of which destroy diversity and the ability to innovate and change – and, like command and control itself, has now become the problem. Recent concern over soil degradation only underlines the point. As Tree shows, agriculture is a classic case of the UK’s unerring ability to get the worst of both worlds – a supposedly market-based system that is so hedged around with the bureaucracy of regulation and subsidy that many farmers feel they have no choice about what to grow on, or even what to do with, their land. Ironically, Knepp had to struggle to be allowed to do nothing.

Balance. The first law of ecology is that everything is connected to everything else. So ‘maximising anything is fatal for the balance of the whole system’ (Charles Hampden-Turner). It’s when balance is lost that things go wrong. Meaning that in a business ecosystem maximising shareholder value (or short-term land yields, for that matter) was always going to be destructive, and, like command and control, with its panoply of targets, KPIs and other measures unrelated to purpose, has to be abandoned.

Change. For conventional management, change is a mystery and a nightmare. It takes for ever and most of the time (70 per cent, according to HBR) it doesn’t work. But under ecological rules, change is natural, happens all the time, and, as at Knepp, can take place at astonishing speed. It’s just that it’s emergent – so while you can predict the direction of travel, you can’t tell exactly where you will end up. The course wilding took, and continues to take, at Knepp has been a source of amazement and controversy to scientists, the owners and Natural England alike. Like evolution, of which it is obviously part, ecological change can’t be planned. But it can be observed, understood, learned from, and should be treated as normal.

Economy. Like any low-trust command-and-control operation, intensive farming is highly inefficient and costly as a system, both internally (bureaucratic, management intensive) and externally (obtrusive regulation, subsidies). One of the many surprises at Knepp has been the extent to which, the expensive artificial uppers and downers once removed, the ecosystem has become self-regulating. Without herbicides, ‘weeds’ have sprung up, to the great disapproval of townees who like their countryside trimmed and twee. But it turns out, from observation, that left to themselves farm and other animals use many of them for self-medication (another surprise for scientists). They already have a better, organic diet; together with the ability to dose themselves, Knepp’s cattle, ponies, pigs and deer are now largely vet- and thus expense-free, even when giving birth. The cherry on the cake, as it were, is that the solely pasture-fed meat is shown to be not only better tasting, but actually good for you, even the fat. Managing with rather than against the grain of the ecosystem is better, simpler, and lower cost, since there’s less of it.

Simpler; but – the sting in the tale – not necessarily easier. It’s already clear that ecosystems will involve managing a richer and much more nuanced range of relationships than before, which may include both competition and cooperation in different areas, formal or informal partnerships, cooperative networks in which products co-evolve, sometimes even co-creation with customers. This won’t suit managers who like things black and white, preferably in numerical form, and alpha-male CEOs who can’t bear not to be dominant. But, hey, dealing with ambiguities and interpreting relationships are a large part of what it is to be human. It’s called life. So they’ll just have to get over it and start managing as if it, and humans, mattered.

Telling tales: the difficulties of telling the truth

The ability to tell stories, it is said, is one of the qualities that differentiates humans from other animals. Stories have brought us extraordinary riches – Homer, the gods (or God), perhaps even evolution: would we have left Africa without a story of destiny, greener grass or just curiosity about what lies beyond the next hill? It’s through our ability to connect utterly different elements – a butterfly’s wings and a hurricane, say – to form a narrative that we make sense of the world and allow ourselves to feel that we are in control of our lives.

But storytelling also has a darker side, as an absorbed audience heard at a February forum organised by The Foundation devoted to that complicated subject. The human-ness that allows us to recognize – or invent – a good story also inflects the way we receive it. All too often stories lead us to terrible places. In a minor key, actor Mark Rylance relates how at the Globe theatre he was forced to throttle back the famous call to arms in Shakespeare’s Henry V when he realised the frenzy of anti-French hostility he was whipping up in the groundling audience. For the real-life consequences, think no further than the Inquisition, National Socialism, or Isis.

Yet the potency of storytelling takes on an added importance today, in an age that has been widely characterised as ‘post truth’ – an age of fake news and ‘alternative facts’, where our natural inventiveness on one side and gullibility on the other are sometimes supplemented by deliberate manipulation by ever more sophisticated technological means. To such a degree that, as one Forum speaker, satirist John Morton, put it, stories become all that we have, in the sense that, in the absence of absolute truth, ‘we think, okay, we live with a collection of competing narratives, that’s all we have to sustain us’. We no longer accept having ‘truth’ curated for us by the church, the mainstream press, or the political parties. We have had enough, Michael Gove said, of experts. So which, or whose, stories are we to privilege? How do we know which to trust and which to dismiss?

And here’s the rub. As humans, the speakers noted, we respond to stories not with Enlightenment-style logic and rationality, but with a very human logic in which ‘facts’ and ‘evidence’ are just the instruments that get us from A to B, B being the place that our emotions have already decided we are going. As story-telling coach (and comic) Tom Salinsky compellingly showed with the example of the first and last voyage of the Titanic, the irresistible appeal of stories is fuelled by the combination of a few relatively simple components: an unexpected event or paradox – an unsinkable ship that sinks, in this case compounded by the fact that it was on its maiden voyage (you couldn’t make it up); a dramatic immediate cause (the iceberg), hiding a deeper hidden one (hubris, or human folly); and a poignant human hook (the band played on as the ship sank). What changes the account from an encyclopedia entry to a story is, first, the human hook, which plays to the primacy of emotion in our responses, overriding reason in the process of decision-making almost every time. ‘The factual stuff is necessary to provide the context, but it’s that moment of emotional catharsis that we remember and that moves us,’ said Salinsky. ‘That’s what gives stories their power, and why also they’re potentially so dangerous’.

The second essential feature is cause and effect. ‘Cause and effect is what stories run on – without it there isn’t a story’, Salinsky noted. No accident, then, that identifying cause and effect is the central quest of much of literature, including the entire genre of detective fiction.

Directly causal connections are of course much harder to establish in social and human affairs than they are in the physical world. Hence the flourishing of fake news, poisonous rumour and conspiracy theories which in turn augment the violence of political or ideological arguments, or assertions – about Brexit or Trump, for example – that rapidly colonise the truth-free space and crowd out less extreme interpretations. Both of these are extraordinary illustrations of the power of a good narrative (‘Make America great again!’, ‘Take back control!’), whether real or imagined, to trump mountains of earnest but story-less facts and figures. Less obviously, both, as Morton pointed out, are classic examples of stories escaping control and developing lives of their own, independent of their makers. He cited the ‘humiliating’ authorial experience of having characters or story-lines refusing to follow the course allotted to them in the outline. But it wasn’t just that stories could take you to a destination you didn’t intend – nor could you control how they landed in and interacted with the real world, sometimes even altering it in their own image, as, arguably, in the case of political satire: what started out with the relatively benign Spitting Image ended up with the scabrous The Thick of It, in which all politicians are duplicitous, stupid or borderline criminal. ‘So I’m wondering,’ said Morton, ‘whether one unintended consequence of the satirical brilliance of The Thick Of It when it got out into the real world was that it was one of the causal factors in the kind of mad, terrible world we live in now’.

Some part of the problem may be epistemological. As the Danish philosopher Søren Kierkegaard famously noted, while life is understood backwards, it is lived forwards. The only way the realities of a present life can be force-fitted to a destination decided in advance is by doing violence either to one’s own beliefs or those of others. This may help explain why we live in what might be called, as in the title of a recent BBC Radio 4 series, ‘the age of denial’. Life contains so many unspeakable, awful things that we can’t individually do much about – climate change, plunging biodiversity, exclusion, slavery, child abuse – that the only way we can deal with them is to blot them out. Complicating matters, blotting out the unacceptable – the ‘optimism bias’ – may be evolutionarily essential: otherwise why go on living? While its complement, the ‘negativity bias’ (the salience of bad news over good), is also essential in keeping us alert to the constant threat of danger. So where’s the balance?

There’s little doubt that all today’s tendencies, but particularly denial, have been supercharged by social media and the internet, both of which radically expand the scope for group polarisation. As a respected journalist and commentator, Gavin Esler has watched with concern as fantasy and malevolence make it ever harder for balance and insight to be heard. Again, Trump is the telltale example here. Never mind all the other disqualifications: how is it, Esler demanded, that a president of the United States can get away with telling, on the Washington Post’s reckoning, 15 lies a day for a total of 6,420 in two years – many of them breathtakingly blatant untruths – and still retain the confidence of 40 per cent of Americans who would never allow the same latitude to his opponents? The answer, Esler suggested, was that there was no pretence about Trump. No one could doubt that what they saw was what they got. Trump was authentically himself – a liar – knew it, and acted it to the hilt. He didn’t need to be an earnest or careful denier – it simply wasn’t important. This puts Trump so far ahead of the curve that some have termed him a ‘post-denialist’ – someone who is so unconcerned about truth or fact that he doesn’t even bother to justify his lies.

Generalised post-denialism would be an internet-age dystopia beyond anything that Orwell or Huxley could have invented, with implications that scarcely bear thinking about. If we are not to go that way, at some stage the fightback has to start. ‘At some point it seems to me we have to reassert that facts do matter’, said Esler. ‘No matter how flat I feel the world is, it isn’t, and if the facts don’t matter, any of us in the journalism or communications business might as well pack up and go home.’

Part of the answer, it was suggested, was to get beyond the facile notion of authenticity that certain demagogues have learned to play so effectively: the commonly-used justification ‘I just say what I think’ doesn’t mean that what you are saying is right, clever or of any value at all. Sincerity, Esler proposed, was a better criterion to judge by. On the positive side, he added, belying today’s fashionable stereotypes, many politicians are good, intelligent people genuinely motivated by the desire to improve lives: their story deserves to be heard too. Another part of the answer is surely to oblige social media and tech companies to face up to their responsibilities by making them accountable for their content in the same way as the struggling traditional media, as they should have been from the beginning.

Finally, of course, the stories swirling around us are ours too, and it is up to us to handle them with as much care as we can. ‘Nowadays, anyone who wishes to combat lies and ignorance and to write the truth must overcome at least five difficulties. He must have the courage to write the truth when truth is everywhere opposed; the keenness to recognize it, although it is everywhere concealed; the skill to manipulate it as a weapon; the judgment to select those in whose hands it will be effective; and the cunning to spread the truth among such persons. These are formidable problems for writers living under Fascism, but they exist also for those writers who have fled or been exiled; they exist even for writers working in countries where civil liberty prevails.’ That text for today was penned by the great committed German poet, Bertolt Brecht. He wrote it in 1935.

The power of words

A couple of days ago, I came across a copy of Winston Churchill’s ‘We shall fight on the beaches’ speech of June 1940. It had languished unread in my study since it was reprinted by The Guardian in a series of ‘Great speeches of the 20th century’ in 2007. I’m well aware that ‘Churchill, hero or villain?’ has recently been at the epicentre of a ludicrously trumped-up controversy on Twitter. But whatever your opinion of the old bruiser (and I recommend a quick read of Simon Jenkins’ article in The Guardian to put the matter in perspective), Churchill’s words by both omission and commission contain some lessons that any of today’s politicians making dismal idiots of themselves over Brexit could profitably take to heart.

Reading the speech just now is an instructive experience. It is absolutely calculated with one end in view: to create unity. From the first word, you know you are not just in the presence of momentous events – you are a participant in them. Every paragraph is about ‘we’. I was familiar with the rousing final ‘We shall go on to the end…’ peroration, of course – but taking in the rest of what is quite a long and dense speech for the first time, the well-known ending is not the most remarkable thing about it. As Simon Schama notes in his short introduction, by far the most striking feature for today’s reader or listener, a companion to the ‘we’, is the startling rhetorical tactic from which it draws its persuasive force: honesty.

There is no fancy introduction. Churchill launches straight into a vivid description of the German May blitzkrieg and the desperate retreat of French and British troops to the Channel ports that had me, as no doubt listeners at the time, reaching for a map to follow the strategic sweep and grasp its implications. He makes no attempt to hide the losses and their consequences. While Dunkirk, which immediately preceded the speech, is a ‘miracle of deliverance’, he leaves no room for doubt about the height of the stakes (‘the whole root and core and brain of the British army… seemed about to perish upon the field’), nor that deliverance has been plucked at the last second from the jaws of disaster. ‘We must be very careful not to assign to this deliverance the attributes of a victory. Wars are not won by evacuations,’ he warns. Make no mistake, the whole episode has been ‘a colossal military disaster’ that has left the French army weakened, the ‘fine’ Belgian army lost, the channel ports and swathes of northern France including ‘valuable mining districts’ in enemy hands, and a daunting quantity of guns and equipment, all of which would have to be painfully built up again, simply abandoned.

Nor is this all. With ‘remorseless candour’ (the description of the reporter of the then Manchester Guardian), the Prime Minister goes on to set out the likely next developments. Hitler might be expected to follow up quickly with a strike at France. Or at Britain: ‘When we see the originality of malice, the ingenuity of aggression, which our enemy displays, we may certainly prepare ourselves for every kind of novel stratagem and every kind of brutal and treacherous manoeuvre.’

But although there is no mistaking the gravity of the situation, or the possibility of worse to come, the tone is above all one of facing down the adversity. When Churchill pays tribute to the bravery of the retreating troops (including French and Belgian) and the extraordinary efforts of the RAF (with a prescient nod to its future role in protecting our own shores), Navy and little ships that brought off 335,000 allied soldiers from the beaches, the statement of national unity in the defiance is uncompromising. We really are all in this together. Yet there is no sense of Little Englandism. Unlike today’s MPs with their excruciatingly emphasized ‘our country’, ‘the British people’, not to mention ‘the will of the British people’, Churchill doesn’t do the virtue-signalling patriotism – he sometimes uses ‘this island’ or ‘the nation’, but mostly simply ‘we’. He repeats the pronoun no less than 10 times in the first half of the famous peroration. Then in a brilliant final coup, he first glancingly evokes the possibility of a British defeat (‘even if… this island or a large part of it were subjugated and starving…’), before closing off the conditional by broadening the ‘us’ to include the British empire and the US, which would in that case carry on the fight until the job was done.

But first there’s another lesson for 2019’s MPs. Taking stock of the need, even after the recent losses, to balance home defence with ‘the largest possible potential of offensive effort’, Churchill proposes that the house discuss the subject in secret session – this partly to avoid giving useful information to the enemy, but mainly because the government ‘would benefit by views freely expressed in all parts of the house by members with their knowledge of so many different parts of the country.’

Let that sink in a bit.

The UK is not currently at war in the most literal sense (although to use the phrase is to be aware of the just-subterranean parallels). But contrast the inclusion, unity of purpose and clarity of vision set out in the 1940 speech – a perfect statement for the time – with today’s sorry equivalent at another moment of national crisis: ‘a jumble of jargon, jousting and gibberish, with everyone sucked into the vortex of confusion, to the exclusion of every other issue in the world’, in the words of the New York Times’ columnist Roger Cohen, in principle a friendly observer. Like all the key terms used in Parliament at the moment, the desperate protestations of clarity (‘Let me be clear that…’, ‘The Prime Minister has made it very clear that…’) only underline the reverse: the sole clarity on offer, as Cohen notes, is that no one has a clue what will happen next. The hero-worship of some Brexiters for Churchill (Richard Evans’ takedown of Boris Johnson’s ‘biography’ in the New Statesman is irresistible) in this context is deeply ironic, since the cacophony reproduced daily in the House of Commons displays all the qualities of the June 1940 speech in reverse. It is a monument to muddle, fudge, discord and dissembling that can only comfort enemies and dismay friends. The rhetoric is meretricious and sham, and – again unlike the Churchill example – nothing good can possibly come out of it.

Rogue state

Just now it’s hard to avoid the subject of Brexit, which – irrespective of your referendum choice – is a tale of government ineptitude so great that it inspires a kind of awe: misbegotten in concept, disastrously managed and losing no opportunity to make things worse as it went along. And behind Brexit lurks something even more worrying than our fate within or without Europe: an atrophy of the idea of the state that not only makes a mockery of the notion of taking back control of anything, but makes one fearful for our ability to preserve a functioning democracy.

In fact, the Brexit tale starts well before the referendum was even a mention in a Tory manifesto – and ironically, except in the fevered nationalist imaginings of the European Research Group, it has very little to do with Europe. In this version, the origins of Brexit are to be found in the backwash from the 2008 financial crisis.

The reasoning goes like this. The GFC – a banking, not a spending crisis, itself the result of policy failure to rein in the financial sector – severely dented the UK’s public finances. That triggered austerity, the brunt of which (89 per cent) fell on public expenditure, in particular local authorities in already deprived areas, whose citizens responded by enthusiastically voting UKIP in local elections. This in turn frightened Conservatives into promising a referendum in the election of 2015, with the results that we know. The correlation between votes for UKIP in elections and Leave in the referendum suggests that without austerity – and the sadistic form in which it was administered – the referendum result would have been different. So there is a straight line from the Great Financial Crash through austerity to Brexit that never touches Europe, except of course as collateral damage.

The story of Brexit post-referendum is just as hapless – as far-fetched, although not as funny, as a Gilbert and Sullivan plot (martial law and the evacuation of the royals after a crash exit, anyone?). Displaying what can only qualify as reckless negligence, the government appears to have done no due diligence on either the external or the internal consequences of its actions, underestimating the negative effects on its own integrity (Northern Ireland, Scotland) as much as it overestimated European willingness to satisfy our notorious appetite for cake both to scoff now and to hoard for future midnight feasts. There was an advisory referendum on the most important political decision for a couple of generations that somehow morphed into the will of the people despite possible manipulation and a majority that wouldn’t have sufficed to alter a local golf club’s membership dues. Then there were negotiations not with the EU but between different governmental factions, and when government members now disingenuously warn of social unrest the opposition is again not Europe but internal. Brexit has turned into a nightmare Catch-22: if we knew our history, including that of two world wars in the last century, we wouldn’t do it in a million years; so the only way to do it is by ignoring history, which of course guarantees that we shouldn’t do it. We’re reliving history without learning its lessons.

That, however, is not all. Writing in The Guardian, Fintan O’Toole, one of the sharpest commentators on the whole saga, noted: ‘Brexit is really just the vehicle that has delivered a fraught state to a place where it can no longer pretend to be a settled and functioning democracy…It is time to move on from the pretence that the problem with British democracy is the EU and to recognise that it is with itself.’ Part of the problem, as O’Toole notes, is about what to do with English nationalism. But that is compounded by the fact that the UK no longer believes in its ability to carry out many of the traditional roles of the state, which it has meekly abandoned to the market and business. It is a self-hating state which now finds itself almost completely bereft of its traditional defences and competencies at exactly the moment, with danger and turmoil swirling around, when statecraft is most needed.

In retrospect, the extent and urgency of these failings had been brought into sharp focus in a forthright presentation by the FT’s Martin Wolf at last November’s Global Peter Drucker Forum in Vienna. In a good session on ‘Beyond market failures: how the state creates value’ (you can watch it here), Wolf laid out some basic home truths. The state, he asserted, was ‘the most important institutional innovation in human history, as essential now as it’s ever been’ (the idea that we would all be in heaven if it only got out of the way, he added, was ‘only possible for people who are so used to strong and powerful states that they cannot imagine their disappearance’). Leaving aside taken-for-granted aspects like security, the justice and legal system, and the laws governing the roles, purpose and legitimate operations of business, all of which just happen to be ‘a total mess’ – ‘and if this isn’t important I don’t know what is’ – all our current priorities of broadly shared and sustainable prosperity, financial stability and environmental protection require the active intervention of a state that is ‘ambitious, effective and … under democratic control’. For many states, perhaps especially our own, that is a very big ask. Yet without it, and without states and governments getting better at cooperating with each other than they presently are, said Wolf, we face a crisis in which the brave new technologies that we set such store by ‘will in my view destroy us.’

Those, in the words of the FT’s chief economic commentator, are the stakes. It is little comfort to know that Brexit in that perspective is just a taster of bigger tests to come.

Managing as if the world matters

As the transmission belt between ideology and the behaviour of companies, the most important actors in the modern economy – and one moreover that transmits both ways via economic and political power – management has always been much more important than most people imagine. Never mind Brexit: on the way it develops from here may now depend the future of the world.

It has done so before. In the decades after WWII, the (mostly) virtuous circle of rising profits feeding into investment in new plant and, above all, good jobs fuelled a (mostly) balanced rise in prosperity across Western economies that could not be matched in the rigid Soviet bloc. But as Henry Mintzberg has often insisted, the fall of the Soviet Union wasn’t a case of capitalism defeating socialism. It was a case of balanced, plural societies, comprising vibrant private, public and civic sectors, proving more flexible and productive than unbalanced ones consisting only of a public sector. Now, having learned the wrong lesson, the West, particularly the Anglophone West, is making the same mistake as the Soviet Union but in reverse: encouraging a rapacious private sector to exploit or eat everything else in its path.

The upheavals of 2016 allow us to connect up the suddenly visible economic dots: these, together with other manifestations of populism and nationalism erupting like a plague of boils all over Europe, are the direct outcome of our own lack of balance, which, if left unchecked, will see our pretensions crumble as comprehensively as the Berlin Wall in 1989.

The irony is that what today’s malcontents, however crude their expression of it, actually want, is neither complicated nor outlandish. In a 2017 Legatum survey, respondents listed, in order, food, water, emergency services, healthcare, housing, jobs and education. A car and air travel came way down on the list, which probably hasn’t changed much since the 1950s, the main difference being that then everything on it was in the realm of the realistically attainable.

Now as then, the key ingredient is jobs, which are largely the creation of a vigorous private sector – ie companies. The corollary, as commentators such as the FT’s Martin Wolf never tire of pointing out, is that how companies are managed and to what ends – the ideology of management – is a macroeconomic issue of the highest importance. In case there is any doubt just how high, here is Colin Mayer, former dean of Oxford’s Saïd Business School, in his much-noticed new book on the corporation, Prosperity:

With the emergence of the mindful corporation we could therefore be on the edge of the most remarkable prosperity and creativity in the history of the world. On the other hand we could equally well be at the mercy of corporations that are the seeds of our destruction through growing inequality, national conflicts and environmental collapse on scales that are almost impossible to conceive of today. We are therefore on the border between creation and cataclysm, and the corporation is in large part the determinant of what way we will go.

Our very future, he goes on, ‘depends on reinventing the corporation’. The good news is that that is perfectly possible – Mayer points to the six previous ages the corporation has evolved through since Roman times, steadily extending its purview across public services, guilds, towns, universities, the church, hospitals, trading, manufacture, transport, safekeeping, lending, insurance and the trading of financial instruments, as evidence of its extraordinary protean ability to adapt to and embrace new purposes and functions according to the demands of the times. The bad news is that this animate, dynamic form has been captured by an abstract, mechanistic management and governance model that prevents further evolution by positing a corporate ‘end of history’ converged on a one-dimensional, shareholder-controlled, profit-oriented enterprise model.

Changing that means changing the underlying ideology of management. And that won’t be easy. Back in 1998 Sumantra Ghoshal and I wrote a piece for the FT about the spread of ‘asshole management’ – ‘an infernal cycle’, driven by the demands of the capital markets for ever greater returns, ‘in which managers, change agents and academics were all collaborating to make both work and leadership … a crippling and inhuman experience’. Twenty years later that ‘profoundly coercive system’ has tightened more than we could have imagined, as activist hedge funds and private equity have squeezed companies until the pips – pensions, careers, welfare and now jobs – squeaked and popped. Surveillance capitalism, allowing companies to manipulate and hack humans – to the extent of compromising free will, in the view of some observers – has added another vicious twist to the screw. In a self-fulfilling prophecy, this has now become the norm: we no longer imagine there is an alternative. Ruthlessness begets ruthlessness, so we now have asshole companies and asshole (‘hostile’) government departments, as management and corporations recast everything they come into contact with in their own reductive, inhuman image. It’s no accident that the US now has an asshole president, the perfect incarnation of a capitalism that makes assholes of us all.

Yet could we, just maybe, have reached – apologies – peak asshole? The idiocies in Trump’s self-contradictions are too gross to be treated with anything other than contempt. Big tech’s excesses have triggered a backlash that is gathering momentum. The Orwellian ‘reversifications’ (John Lanchester’s term) that current management systematically leads to – banks that make people poorer, hospitals that kill, welfare that immiserates, a Home Office that creates aliens not citizens – are beginning to provoke a similar outcry. The insulting contrast between the jobs, housing and public services that we want and the HS2, Heathrow expansion and Brexit that we get has come into focus like never before. Above all, the miasma of fascism drifting up from our fractured societies puts any idea that we can return to business as usual out of the question.

In their 1986 blockbuster Megatrends, John Naisbitt and Patricia Aburdene wrote: ‘Whenever a new technology is introduced into society, there must be a counterbalancing human response… We must learn to balance the material wonders of technology with the spiritual demands of our human nature’. That’s true. But just so there’s no misunderstanding, the technology that most urgently needs humanising is the meta-technology that underpins how all the others are used, management.

The building of a bullshit economy

In 2013, anthropologist David Graeber, now a professor at LSE, crashed the website of a small magazine with a short essay that struck a chord all over the world: ‘On the Phenomenon of Bullshit Jobs’.

Graeber, who later extended the article into a book, was struck by the number of jobs thought pointless even by those who did them. He pondered Keynes’ much-quoted prediction that we would all work 15 hours a week by the year 2000, and noted capitalists’ aversion to spending money on unnecessary jobs (or even necessary ones: ‘No – give me back my fucking money!’ Trump reportedly raged on finding he was supposed to employ a transition team when moving into the White House). So what was going on?

Graeber was acute in nailing the proliferation of non-jobs, but less acute in explaining it. In fact the situation is more insidious than his version, if admittedly duller. It is not, as he suggested, primarily the result of ‘managerial feudalism’ (employing flunkies to big up your status), nor a dark plot by the ruling class to keep workers out of mischief by insisting on the sanctity of work even when it is valueless, although that is one outcome. Instead it is the predictable consequence of our current destructive management beliefs and the work designs they lead to.

The reasons are fairly simple. Since companies put their own short-term interests above those of society, there is constant friction at the margins of what’s legal or at least acceptable. Pushing too far leads to scandal (Enron), crash (Lehman) or both (2008), and, as surely as night follows day, to regulation that bolts the door after the horse has gone. As John Kay wearily explains, ‘We have dysfunctional structures that give rise to behaviour that we don’t want. We respond to these structures by identifying the undesirable behaviour, and telling people to stop. We find the same problem emerges, in a slightly different guise. So we construct new rules. And so on. And on. And on.’

As regulation gets ever more complicated, it evolves into an industry in its own right, with its own vested interests and bureaucracy – a monstrously growing succubus symbiotic with the industries it is supposed to control. You can watch the process playing out again in Silicon Valley now. ‘Facebook puts profits above care for democracy’, proclaimed the FT in a recent article. Of course it does: that’s what managers have been taught to do. The demand for regulation is steadily building as a consequence.

Don’t get me wrong – Big Tech needs reining in as urgently as Big Finance. But both are manifestations of a bigger problem – the ‘dysfunctional structure’ that generates regulation which is simultaneously necessary and useless – and the only lasting solution is to reduce the need for regulation in the first place by placing a duty of care on companies for the society they form part of. In other words, regulatory jobs are net energy- and value-sapping jobs which shouldn’t exist – the creation of philosopher John Locke’s madman, ‘someone reasoning correctly from erroneous premises’. As Peter Drucker put it, ‘There is nothing quite so useless as doing with great efficiency something that should not be done at all’.

And here’s the thing. The dysfunctional structure is fractal, replicated at every level down through the organisation. Since it assumes at least some workers, including managers, will shirk and skive, management is geared for control rather than trust. Low-trust organisations run on rules, surveillance and performance management – which, through the process of self-fulfilling prophecy, actually makes untrustworthy, or at least unengaged, behaviour more likely. Look no further for the cause of the apparent paradox, noted by Graeber, that bureaucracy proliferates just as much in the supposedly lean and efficient private sector as in the public one. In effect, each company carries the burden of its own regulatory apparatus. In 2016 Gary Hamel estimated that excess bureaucracy was costing the US $3tr a year in lost productivity, or 17 per cent of GDP. Across the OECD, what we might call the ‘bullshit tax’ amounted to $5.4tr. ‘Bureaucracy must die!’ proclaims Hamel. Yet he concedes that, despite his campaign, it seems to get worse, not better.

Finally, with the ideology of Public Choice, the same pessimistic assumptions and stultifying management structures have been visited on the public sector in the form of New Public Management, with much the same results. Marketisation has added a further overlay of bullshit. Symptomatic is the experience of the university sector: compare the stagnant salaries and worsening conditions of academic staff with burgeoning jobs (and salary levels) in administration and management (especially at the top) and the creation of entirely new departments concerned with branding, PR and massaging the all-important student satisfaction figures – an enormous increase in pointless overhead on the real work of turning out critical citizens who can distinguish real value from hot air.

Putting all this together, it is hardly surprising that the US and UK, as the most extreme proponents of deregulation and privatisation, are, with delicious irony, more subject to this systemic bureaucratisation than other, less laissez-faire economies. So much so that it is tempting to characterise the UK in particular as a bullshit economy. Having largely abandoned manufacturing, it prides itself on being a purveyor of financial and professional services, selling advice and other products whose social value is dubious, to say the least. The extreme and paradigmatic case is advertising. ‘The UK advertising industry,’ a recent House of Lords report solemnly intoned, ‘is a success story. Advertising fuels the economy by helping businesses to grow and compete against one another. It is also a significant sector of the economy on its own. The UK, especially London, is a global centre for advertising, exporting services to clients around the world,’ and plenty more in the same vein.

Well, maybe. But in its own terms, as senior adman Dave Trott succinctly told a BBC Radio 4 audience recently, of the £23bn worth of ads purchased annually in the UK, ‘4 per cent are remembered positively, 7 per cent are remembered negatively, and 89 per cent are neither noticed nor remembered at all’. Let that sink in for a minute. £20bn of ads that might as well never have been created – that is bullshit of an awesome order.
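
For anyone who wants to see the arithmetic behind that £20bn, here is a minimal back-of-the-envelope sketch (in Python, purely for illustration); it uses only the figures quoted above and simply takes the ‘neither noticed nor remembered’ share of the £23bn total.

```python
# Back-of-the-envelope check of Trott's figures (all numbers as quoted above).
total_ad_spend_bn = 23.0  # annual UK ad spend, in £bn

shares = {
    "remembered positively": 0.04,
    "remembered negatively": 0.07,
    "neither noticed nor remembered": 0.89,
}

for outcome, share in shares.items():
    print(f"{outcome}: £{total_ad_spend_bn * share:.1f}bn")

# The last category alone comes to roughly £20bn, the figure cited above.
```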

Bullshit generates more bullshit. ‘The best minds of my generation are thinking about how to make people click on ads’, one Silicon Valley techie accurately noted. ‘And that sucks.’ Or about spin and fakery – another British ‘success story’ that bloats as newsrooms shrink. PR people now outnumber reporters five to one, compared with two to one 15 years ago. Which is why this kind of bullshit/bureaucracy is so hard to root out. It’s what happens when economic incentives are out of line with society’s interests. It’s not a bug in the system – it’s a feature. It won’t change, in other words, until everything changes.