See my article for LBS Strategy Review, ‘Adhocracy – a new management approach’, with Julian Birkinshaw and Jonas Ridderstrale, here
Technology doesn’t kill jobs: executives and companies do
Almost every article printed about robots and jobs starts and ends the same way. Here’s a recent example from the FT. The first hallmark of the genre, previewed in the headline ‘Poor education leaves emerging markets vulnerable to automation shock’, is a dire prediction of job losses – in this case in the developing world, where ‘the replacement of workers by machines threatens two-thirds of jobs’, according to a UN report. Then, as always – ‘As always, the only answer is education.’
Of course, prediction and ‘solution’ vary slightly. The looming job loss can be in particular sectors, countries or continents, and the answer can be training or other preparation, or, more frequently nowadays, some form of universal basic income. But both diagnosis and cure are characterised by the same infuriating mixture of fatalism and complacency.
Past technological surges have always ended up creating more jobs than they destroyed, albeit in unpredictable areas, the argument runs; all we can do now is to tempt employers by giving them more skilled, willing and flexible workers. There are better and worse variations on this argument – for instance, this by Tim Harford is fine. But what most share is the unquestioned assumption that the only half of the equation that can be operated on or influenced is the offer – the workers. So the FT article above is more about Indian and African education than employment. As for the demand for workers – well, it’s just what demand will be.
But this is pushing on a piece of string. What if companies don’t want to employ people? After all, it’s not ‘the whirlwind of automation’ or even ‘machines’ that create or eradicate jobs. It is investment decisions made by human beings in company boardrooms. And those decisions do not take place in a vacuum.
A recent White House report on AI, automation and the economy underlines that ‘Technology is not destiny… The direction of innovation is not a random shock to the economy, but the product of decisions made by firms, governments, and individuals’. As Brian Arthur shows in his excellent book on the nature of technology, the latter is an integral part of the evolving economy, both shaping and being shaped by it. In some areas, such as medicine, the avenues science pursues are already economically determined – cures for first-world conditions are more lucrative than those for poorer countries. In others, management motivations decide how discoveries, once made, are diffused. Technologies such as the internet, voice recognition, touchscreen, and GPS – all developed in the public sector – could have been combined in countless different ways, or none at all: it took an inspired Steve Jobs to bundle them together into the familiar shape of the iPhone. The platform economy the iPhone then made possible is a further techno-economic evolutionary twist.
How the platform economy evolved, and why it is now the go-to model for every budding start-up, is in turn in part the result of developments in other socio-technological areas, in this case the company and management. In 2014 Martin Wolf wrote in the FT: ‘Almost nothing in economics is more important than thinking through how companies should be managed and for what ends’. He went on: ‘Unfortunately, we have made a mess of this. That mess has a name: it is “shareholder value maximisation”. Operating companies in line with this belief not only leads to misbehaviour but also militates against their true social aim, which is to generate greater prosperity’.
This is the first time in history that one of the great technological spurts has taken place when companies are being operated in line with this anti-social belief: that is, under a regime where one stakeholder is supposed to maximise its returns at the expense of the others, including society, and where the most widely taught and practised version of strategy is largely about preventing other stakeholders from eating the shareholders’ lunch.
Now put today’s gig and task-based economy in perspective. It hasn’t suddenly popped up at random out of the blue. It is just the latest step in a process of corporate dis-employment which began in the 1980s. Simply put, under shareholder value employees are costs to be minimised like any other. So, responding to their new incentives, managers began to pass up employment-creating initiatives they would have undertaken in the past in favour of cost-minimising measures to benefit shareholders.
The downsizing and outsourcing trends initiated then have expanded steadily to the present day. Collateral damage was first lifetime employment, then defined-benefit pensions, then corporate responsibility for career. In recent years automation and AI have further eroded the full-time permanent employment bond, with a corresponding upturn in the growth of freelance, short-term and zero-hours contracting.
Enter, in 2007, the iPhone.
There are many ways the smartphone could have been used for economic and social gain – including the enabling of a real sharing economy. But in the labour-historical context there is an inevitability about executives driven by shareholder value deploying it to dismantle the last element in the traditional employment regime: the job. With labour a commodity to be contracted as easily as any other, one of the traditional justifications for the traditionally shaped company collapsed, and with it the last link between corporate growth and employment. At best new-generation companies are job-neutral (Instagram: $19bn in value, 19 employees); at worst, like Uber, they are job killers.
The blunt truth is that companies today have no intention of employing people unless they absolutely have to (which explains why the Silicon Valley titans are hypocritically rallying to the cause of the universal basic income). Next down the road: driverless cars. In this situation, expecting better qualifications to improve employment chances is a bit like hoping that faster, stronger horses would stave off the advance of the combustion engine.
So do we sit passively while employment continues to wither until it becomes the preserve of a privileged few?
Or do we decide to act on what we can still influence, the demand side of labour?
Here are some suggestions.
The first step is for governments to reinstate employment as a central plank of economic policy, as it was up to the 1980s when it was hastily abandoned as politicians let themselves be persuaded that markets know best. Tax policies should be adjusted accordingly. Companies that fail to pay a living wage should not be considered for state contracts.
Employment policy should go alongside a root-and-branch rethink of ‘how companies should be managed and for what ends’, to requote Martin Wolf. Far surpassing the timid and irrelevant tweaks envisaged by Theresa May, the aim would be to prise companies from the grasp of short-term shareholders (and executives) and restore them to their proper mission of generating greater prosperity for society.
Individuals also have their part to play. Strikingly, Gallup surveys show that globally what people want more than anything else – more than security, family and peace – is a full-time job with a pay cheque. OK: their responsibility then is to prepare themselves to engage with the employment they want as both workers and citizens – including in the responsible trade unions that governments (and companies) should foster as counterweight to the current corporate dominance.
Employment is one of the time-bombs left un-defused by the failure to reform business-as-usual after the financial crash of 2008. It would be better to act now than to wait for a chance detonation to set it off.
Dinosaurs weren’t replaced by better dinosaurs
Read my account of the latest Foundation forum here
University challenge
In C.P. Snow’s Strangers and Brothers novel sequence, several volumes of which are set in a fictional Cambridge college in the 1930s and 1940s, older dons could remember the time when college fellows weren’t allowed to marry. As late as the 1960s universities remained a world apart. There were just 22 in the UK, reserved for a privileged 5 per cent of the population – and in some of them students still had to be in by 12 at night.
Of all our enduring institutions (Oxbridge dates back to the 12th-13th century), over recent years the universities have perhaps travelled farthest in the shortest period of time. There are now 150 of them in the UK, and Tony Blair’s target of 50 per cent participation has pretty much been met. The all-important student satisfaction survey naturally ensures that no vestiges of restraint on night-time entertainment endure (today’s equivalent may be ‘safe spaces’, but that’s another story).
That is certainly an achievement, but as recent political headlines over tuition fees and teaching attest, it is far from an unalloyed or uncomplicated one. For the recent history of the university sector is an object lesson in the unexpected consequences of opportunistic policy-making, a number of which are now coming destructively home to roost.
The first is perhaps the most straightforward. One powerful justification for university expansion was the supply-side argument that boosting educational levels would respond to employers’ demands for a more capable workforce and thus benefit the economy as a whole. Fifty years on, employers are still whingeing about the wrong kind of qualifications – and these days they’d often rather not employ anyone at all, particularly expensive graduates. It was always the demand side too, stupid.
As commentators such as Alison Wolf have consistently argued, the reverse of the coin of privileging university education is the scandalous neglect of further and vocational education such as apprenticeships. In the resulting mismatch, overqualified graduates are being employed for jobs which previously would have gone to the less well qualified, compressing the latter’s employment and social chances, and everyone’s wage rates. This not only stokes the political pressures that are now all too evident, but also ensures that the expensive loans taken out to buy an income boost that hasn’t materialised will never be repaid.
There have been internal growing pains, too. As with much in Britain, universities, in the words of Andrew Adonis, head of Blair’s No 10 policy unit, have developed haphazardly, with ‘one thing leading to another, in a typically unplanned British way, on the part of successive governments’. As the university estate has grown, encouraged by successive governments, notions of choice, competition and latterly value for money have steadily come to the fore. Particularly consequential (and not in a good way) has been the regime of targets in the shape of the Research Assessment Exercise and its successors, instituted by the Thatcher government in the 1980s.
As with all such measures, the consequences were predictably unintended. Thus making a large chunk of university funding dependent on research quality provoked massive gaming on one hand (a burgeoning transfer market for prolific researchers, exclusion of weaker colleagues from assessment, huge expansion of conferences and learned journals to report on and in), together with reduced emphasis on teaching on the other. Teaching quality hasn’t suffered – competition for posts among a vastly increased supply of young PhDs has seen to that – but quantity has. Teaching is now to be subjected to its own assessment, with no doubt similar perverse effects, not to mention spiralling bureaucratic demands: university head of department is in practice now a full-time management job, with no time for either research or the teaching that the incumbent was taken on to perform.
Meanwhile, tuition fees and the insidious marketisation of education generally have increasingly displaced competition between institutions from the academic to factors such as facilities and ‘the student experience’. Universities now think of themselves as ‘brands’, with a massive increase in spending on marketing and related activities – and managerial types to do it. While lecturing salaries are subject to a 1.1 per cent annual growth cap, no such restriction applies to management. Adonis points to soaring vice-chancellors’ pay, which drags other managerial ranks up with it. Bath University employs 13 managers on more than £150,000 a year, and 67 on £100,000. Many universities actually make a ‘profit’ on tuition fees: dispiritingly, this is where the money goes instead.
Which brings us back to the bottom line: who pays? Adonis is right that fees set at £9,000 a year (a cynical and supremely opportunist move by George Osborne to fund tax cuts for the better off), soon to go up again, are a ‘Frankenstein’s monster’ and in the long term untenable. No one should have to begin adult life with debts of £50,000 hanging over them. But the effects ramify out from individuals to the entire macroeconomy. For a start, debt levels are now so high that even after 30 years, three-quarters of students won’t have paid off their loans, according to the IFS. On the government’s own estimates total unpaid loans will hit £100bn next year and double again in a decade.
That’s unconscionable enough. But there are huge knock-on effects too. There was always a sneaking suspicion that one intended side-effect of making young adults pay for their university education might be to curb student activism. At least in the US, that seems to have been the intention. But the indirect costs are now mounting vertiginously. Indebted graduates are delaying having families while they search for reasonably paying jobs. As for buying homes, forget it – as also the furniture and other stuff that go in them. In fact, impaired credit ratings make it quite hard for them to make any substantial purchases at all. Dampened spirits and high anxiety levels are being connected with health issues such as depression, and marital failure. Finally, the debt overhang is also reckoned to be a factor in the worrying fall-off in rates of entrepreneurship and new business formation.
It takes a special kind of management to transform a policy that was presented as having only upsides into something now characterised as ‘unsustainable’, ‘in tatters’ and a ‘substantial economic headwind’. It will take something equally special to unravel it, but the other way round. But that would require an ability to register and learn the lessons of past mistakes – so don’t hold your breath.
Why austerity doesn’t work
Austerity is a disaster economically, politically and socially.
There are at least three reasons why.
The first is the Keynesian one. Paradoxically, austerity (reducing the size of the state, cutting public services, freezing wages) is a luxury. It can only be done acceptably in export-led economies like Germany and Canada where businesses depend mainly on foreign demand. The UK, on the other hand, is a low-wage economy that relies on domestic demand from consumers many of whom are earning less than before the crash of 2008, who have no savings and mountains of debt. The less they have to spend, the harder it is for UK businesses to prosper. Duh. The government’s tax take shrinks too, so that the date for paying down the public deficit continues to recede. As economist Jonathan Portes tweeted, ‘Failing to borrow long-term at negative real rates to fix roofs (& other things) over last 7 years an act of deliberate economic self-harm’. This is austerity as suicide.
The second reason why austerity doesn’t work is that the institutional framework, or social contract, that supported its logic isn’t there any more. Post-war western economies were built on the understanding that production and consumption were interdependent – one person’s wages bought someone else’s output – as embodied in the famous diagram of the circular flow of income at the beginning of Paul Samuelson’s Economics.
That began to break down in the 1970s and 1980s, when a persistent trope was that the private sector was being ‘crowded out’ by the public sector, which needed to be pruned back to allow entrepreneurial animal spirits to flourish to rev up the sluggish economy. Hence the rounds of privatisation, agencification and outsourcing that still continue.
But wind forward to today. It is quite clear that the capital markets that call the tune over corporate resource-allocation decisions aren’t remotely interested in the circular flow. They don’t care a fig about job creation – in fact the only innovation and investment they applaud is automation that cuts jobs. So whereas in the past companies automatically created employment as they grew, they no longer do. Companies now only create well-paying full-time jobs as a last resort, recalling the celebrated (possibly apocryphal) stand-off between Henry Ford II and union boss Walter Reuther as they toured an advanced new manufacturing facility:
Ford: Walter, how are you going to get those robots to pay your union dues?
Reuther: Henry, how are you going to get them to buy your cars?
The internet aids and abets this fragmentation. In a classic bait and switch, using network effects and the massive cross-subsidies afforded by our meek acceptance of the devil’s bargain of ‘free’, the internet privileges us as consumers, but only by turning our jobs into gigs and micro-tasks, with equivalent effects on our wages.
This is austerity as reductio ad absurdum.
The final reason why austerity doesn’t work is the most fundamental and paradoxical of all. It is unknown to economics, and unfortunately to most managers too. It is that, as we have learned from systems thinkers and practitioners, managing by cost doesn’t cut costs – it increases them. This sounds counterintuitive. The conventional assumption, inherited from strategy and mass production, is that there is a trade-off between quality and cost. The better the service, the higher the cost; so the only way to cut the cost is to reduce the quality (or quantity, by rationing or otherwise restricting access). ‘Services for the poor are always poor services’, Richard Titmuss observed as long ago as the 1960s.
But this is not inevitable. Here is a rare case where the cake can be had and eaten too. At a recent event, the chairman of pioneering peer-to-peer lender Zopa noted that ‘businesses that win customer service awards [like Zopa] are those that people don’t have to speak to. They don’t have to speak to you because the service works as it should do, from the tin.’ Contact centres are expensive, but firms like Zopa don’t have them – why would they? What’s more, being efficient at what it does – basically credit – doesn’t only benefit Zopa’s borrowers and savers. ‘It’s more fundamental than that, because if you’re good at credit and you get a reputation for it your cost of capital goes down… By being better at credit than any UK bank our cost of funds has gone down dramatically – way lower than any of our competitors.’ In a sector with thin margins, that’s an existential advantage. Building on what it has learned, Zopa is about to launch ‘the best consumer bank in the UK’. Can’t wait.
Like profits, costs are properly thought of as a consequence of the way you do something for a customer – an effect, not a cause. Done well, as at Zopa, the consequence is that costs fall. Done poorly, as at other banks, at Grenfell Tower, and in countless other services, costs rise. As another speaker at the same event lamented, ‘Why is it that we never have the time or money to do something right, but we always do to do it twice?’
Just as in a coupled system efficiencies amplify each other, so do inefficiencies. Consider prisons. The easiest way to deal with criminals is to lock them up. So that’s what we do. But prison is expensive. So governments repeatedly try to cut its cost by reducing staff numbers, economising on rehabilitation, and keeping prisoners banged up 23 hours a day.
The outcome: overcrowded jails as dangerous powder kegs that instead of returning offenders to society as functioning citizens, send them back as professional villains and drug addicts. A shredded probation service offers little support for those discharged, with the predictable result that most of them are expensively back inside within months – and the pressure grows for yet more prisons. In other words, prison makes the presenting problem worse. Research in Manchester shows that 80 per cent of crime is committed by a relatively small number of individuals and families who are known to the police – and to many other public agencies. Straightening them out would reduce demand not just on the criminal justice system but across the entire public sector.
This is austerity as sheer management ineptitude.
As is being proved in right-wing US states such as Texas and Utah. Appalled by the soaring cost of dealing with reoffenders and the homeless (often the same individuals), they find it far cheaper to house each client and give them a tough minder until their life is stabilised than to leave them to clog up police cells and homeless shelters or return to prison. Believe it or not, Texas is closing prisons; in 2014 Utah claimed it had reduced homelessness by three-quarters and was on the way to eliminating it altogether.
You wouldn’t call this ‘austere management’ – it submits no one to hardship and it should not be something exceptional. To the contrary: it gives people what they need, neither more nor less, with no spare effort or waste. Unlike the economic version, it has equal appeal to right or left. Let’s call it instead ‘frugal’ – as all management should be, whether in the best or worst of times.
The case for better management
When the Chartered Management Institute (CMI) launched a manifesto for improving UK management earlier in June, it couldn’t have guessed how good its timing would be. In launching the manifesto, CMI CEO Ann Francke reasonably noted that the obsession with the art of the Brexit deal was obscuring a bigger and more fundamental prize: doing a better job of making the stuff that our trading partners, whatever their nationality, might want to buy. Closing the productivity gap with our European neighbours through better management and leadership, she pointed out, would offset the yearly penalty of leaving the EU with some change left over.
Francke wasn’t to know how vividly the post-, and indeed pre-election shambles would underline the inadvisability of relying on British political management and deal-making skills to secure economic advancement. At the same time, she seriously underestimated the extent of the potential gains to be made by better conduct of the UK’s corporate – and governmental – business.
The estimates below are necessarily imprecise and non-exhaustive, but they provide some pointers.
- In 2015 Investors in People, quoted by CMI, estimated that poor people management was costing UK firms £84bn a year in forgone productivity.
- According to McKinsey Global Institute (MGI), also quoted by CMI, bridging the UK gender gap at work could generate an extra £150 billion on top of business-as-usual GDP forecasts in 2025 and translate into 840,000 more jobs for women.
For comparison, the OFS puts the penalty to the economy of leaving Europe at £75bn annually. But why stop there? There’s lots more where that came from that CMI doesn’t mention. For example:
- Gary Hamel and Michele Zanini calculate that the UK has proportionally more surplus bureaucrats – managers and administrators doing non-value-adding tasks such as checking the work of others – than the US, where they think GDP could be boosted by $3tr if the drones were redeployed to more productive work. On that basis the equivalent figure for the UK would be around £400bn.
- In a recent discussion document, MGI estimated that short-termism (largely a consequence of faulty governance arrangements) could be costing the US 0.8 per cent of GDP a year. It concluded: ‘Our findings – that short-termism is rising, that it harms corporate performance, and that it has cost millions of jobs and trillions in GDP growth – are sobering. Companies and governments should begin to take proactive steps to overcome short-term pressure and focus on long-term value. The economic success of their companies and their countries depends on it’. All of the above of course applies in equal measure to the UK. 0.8 per cent of UK GDP is roughly £20bn.
- Regulation was meant to promote a more efficient welfare state. But the outsourcing and marketisation of public-sector service provision to a rent-seeking private sector (governance failure again) has led to a proliferating thicket of regulation which cost more than £1bn to administer at the turn of the millennium, according to the LSE, while imposing huge costs on auditees: an OFSTED inspection can cost a school directly and indirectly up to £20,000. As John Seddon points out, these totals pale into insignificance against the huge unknowable opportunity costs of prescribing wrong methods (e.g. back offices, call centres) and, even worse, halting systemic improvement dead in its tracks.
- Finally, as a knock-on consequence of all of the above (as I noted in my last post), terrible public services. As with regulation, an unholy multiplier compounds the problem: managing by cost, or rationing, doesn’t reduce demand – it fragments it and, a bit like Japanese knotweed, each splinter takes root and grows into a vigorous new demand of its own somewhere else in the system.
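Taken at face value, the figures quoted above can be tallied in a quick back-of-envelope sketch. This is purely illustrative: the UK GDP figure used for the percentage conversion is my assumption (roughly £2,000bn, a 2017-era order of magnitude), the categories overlap, and the £75bn Brexit penalty is the comparison figure cited earlier in the piece.

```python
# Back-of-envelope tally of the cost estimates quoted in the text.
# All figures are the article's own, in £bn a year; UK_GDP_BN is an
# assumed round number used only to convert the 0.8% estimate.

UK_GDP_BN = 2000  # assumption: approximate UK GDP in £bn, c. 2017

estimates = {
    "poor people management (Investors in People)": 84,
    "gender gap at work (MGI)": 150,
    "surplus bureaucracy (Hamel/Zanini, UK-scaled)": 400,
    "short-termism (MGI, 0.8% of GDP)": round(0.008 * UK_GDP_BN),
}

total = sum(estimates.values())
print(f"Rough combined upside: £{total}bn a year")
print(f"Multiple of the cited £75bn Brexit penalty: {total / 75:.1f}x")
```

Even allowing generously for double counting, the indicative total comes out several times larger than the cited annual cost of leaving the EU, which is the CMI's essential point.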
There’s plenty of overlap in these categories, of course – but that signifies that there are also de-multiplier effects in the opposite direction: better social services remove failure demand from the NHS and other agencies, better regulation reduces red tape and releases innovation, and above all better governance by knocking short-termism on the head and enabling a move from command and control potentially cuts the need for regulation and bureaucracy both internal and external, promotes better work design, does away with the need for separate ‘people management’ and provides a sharper focus on the customer.
In fact, the systemic ramifications spread even wider. If, as Edward Luce argues in a fine recent article in the FT, Brexit itself, like Trump, is the messy rebuke meted out to those who have overseen the glaring cumulative failure of Anglo-American capitalism, the only answer is to devise a better, more inclusive model: ie, one with better governance and more productive management.
Blinded by its own sense of entitlement, the UK has always had a shaky grasp of its own management needs and priorities. We console ourselves by saying we’re good at pomp and circumstance, and these days at the execution of big projects (the Olympics, HS1 and Crossrail), when we get round to them, and excluding, of course, anything to do with computers. But these are one-offs. We’re hopeless at most routine management, which is why many of ‘our’ top performing sectors, like the motor industry, are foreign-owned and/or managed, with, as the CMI notes, a long tail of underperformers trailing behind. Even our supposed jewel, the City, is largely foreign-owned. As I have noted elsewhere, this makes us highly dependent on foreign management, not to mention alarmingly vulnerable to the effects of Brexit.
Management failings, as the OECD has pointed out, are a brake on competitiveness, quite independent of our trading arrangements, and Brexit only makes them more significant, not less. An urgent programme to address these weaknesses isn’t glamorous – needless to say, the CMI manifesto has passed unnoticed by the press or politicians. But it’s practical, doable and, unlike Brexit, fully in our own control. There is nothing to lose. It perfectly accords with historian Linda Colley’s sober observation that in light of reality ‘[Britain’s] politicians need to talk and think and plan not in terms of a transformative, glowing Brexit or a new modern socialist millennium, but to put their minds together to establish what the least worst options are that they can feasibly and usefully pursue.’
What are they waiting for?
Inclusive growth and prosperity – for whom?
Read my article in EFMD Global Focus, June 2017, here
Britain through the looking glass
Brexit Britain is an unreal place, and even more unreal in an election. It has its share of problems none of which Brexit can fix, because they are not what it thinks they are. What it treats as problems, aren’t, while the real problems, which it assumes to be strengths, grow bigger as they are ignored. Having comprehensively misdiagnosed itself, it treats its non-problems with remedies that are actually the cause of its real ones, turning them into proper problems too. It may be in the process of making itself democratically ungovernable.
For years Britain has lived not so much beyond its means as beyond the looking glass. What it sees in the mirror is a buccaneering freebooter whose entrepreneurs have only to be freed of pettifogging EU rules to plunder the world like Francis Drake and Walter Raleigh of old. It could hardly be more telling that the big British films of summer 2017 are Churchill and Dunkirk. The reality is a second-ranking financier-trader that provides neither the social protection of Northern Europe nor the low taxes of the US that its inhabitants aspire to. Britain’s emblematic institution is the NHS, a once bold experiment now neither bold nor innovative but still venerated, that struggles to provide a half-decent service despite the constraints and reforms that are constantly visited on it.
How did we get to this pass? The answer, thrown into sharp relief by the shock of Brexit, is that we willed it. The British sickness is iatrogenic, caused by the market reforms of the 1980s, which had the opposite effects to those intended, and by the neglect of reforms in quite different areas that might have put it on the right track.
Speaking at a recent seminar at LSE, academic Dr Abby Innes noted an underlying parallel between Brexit and the 1979 election that brought Margaret Thatcher to power. Then as now, the major problem facing the economy was diagnosed not as deindustrialisation, the collapse of Bretton Woods, oil price rises or radical technological change (the realities of 1979), any more than globalisation, radical technological change or the backwash of the 2008 crash were seen as most urgent in 2016. Rather, the overriding problem was held to be a self-seeking bureaucratic state (or superstate in the case of Europe today) that supposedly paralyses growth and crowds out Britain’s native entrepreneurial spirits.
Having accepted this diagnosis, all succeeding British governments, whether tacitly or overtly, have made it their central project over the last four decades to transform the nature of the state. This they have done by letting the market in, through privatisation, outsourcing, managerialism (New Public Management), agencification, quasi-markets in health and welfare, and abandoning any attempt at industrial and employment policy. At the same time, permissive legislation allowed companies to opt out of their previous obligations (career, full-time jobs, employability, proper pensions, living wage) in the name of competitiveness. Similar dispensations have allowed firms to pursue a race to the bottom in taxes, standards and regulation, displacing the fiscal burden on to consumption and those at the bottom of the social heap.
This extreme focus on the supply side has certainly transformed the character of the state – but the consequences have been far from those intended. As sole buyer of complex, hard-to-value services, often under inherently flawed payment-by-results regimes, governments have been taken to the cleaners by a private sector that has little incentive to innovate or improve because, as it is well aware, the buyer is comprehensively locked in (a monster case of moral hazard). A barrage of complaints and failures has triggered the spread of a ‘Kafkaesque’ (Innes’ term) regulatory state that is both intrusive and ineffective.
One glaring result is that private-sector service provision is neither low cost nor high quality. Academic research confirms anecdotal and subjective evidence: according to one study, over the last 30 years UK administrative costs have increased by 40 per cent in real terms with 30 per cent fewer civil servants, while public spending has doubled. Running costs have gone up fastest in the outsourced areas. At the same time, the volume of complaints, challenges and failures has soared.
This of course is the opposite of the ‘more for less’ that was promised. Unfortunately, the same is true in the private sector. In one of the biggest market failures in capitalist history, the wave of innovation, investment and job creation that was supposed to be triggered by government ‘getting out of the way’ simply didn’t happen. Instead, the focus of the economic system has switched 180 degrees from wealth creation to wealth extraction through pervasive financialisation at every level – first, by (ironically) crowding out, and increasingly coopting the proceeds of, the productive sectors, creating an economy which is unbalanced even by British standards; second, by financialising non-financial firms through a regime of shareholder value maximisation in which profits, often gained through cost and job cutting, are extracted in pay-offs for executives and short-term shareholders; and third, by using debt of all kinds, now at record levels, to draw individuals into the same corrupted system, leading to massive increases in systemic vulnerability in periods of both boom and bust.
But even this may not be the full extent of the self-inflicted damage. Pessimists argue that to ineffective public services and a predatory, rent-seeking private sector should now be added the failings of an incompetent state, progressively stripped over the years by state-hating governments of the capability to exercise its overarching governance role. The more authority is devolved to the private sector, the harder it becomes for governments to change unpopular or even failing policies, let alone start to rebuild the capability of central institutions to oversee themselves in the name of all citizens. The Brexit referendum, apparently launched without thought for the constitutional consequences, is just the most striking example of this learned witlessness, unthinkable elsewhere.
As Innes notes, Brexit is likely to expose what we should have been talking about all along: the crisis of a disintegrating British political economy whose woes have absolutely nothing to do with Europe. Brexit has no answer to soaring inequality, stagnating productivity or threadbare public services, nor to the deep structural divide between those educated, mobile and confident enough to survive, if not prosper, with minimal or even failing state protection, and those with little hope of profiting from the emerging freelance contracting economy. Theresa May, like Donald Trump, is right to sense the social fracture, but shows no sign of understanding its nature or the extent of the changes needed to bridge it. Beside these, Brexit is a distracting sideshow.
The hackers hacked
Life in technology-land, it transpires, is a lot more complicated than Silicon Valley had us believe. The price of ‘free’ is going up each day. And it is not just financial. Every internet upside, it’s becoming clear, has a downside. Thus the drive for frictionless online experience runs up against the online Catch-22: if something is accessible it is not secure, and vice versa. The threat is fractal, from phishing and other attempted scams that most of us will have experienced at our Mac or PC, to ransomware attacks on organisations (the recent hack involving the NHS and many others around the world was just the biggest so far), and even the national level (subverting democracy, as alleged in the US elections and the Brexit vote). A corollary is that if you can hack, you can be hacked back: to add insult to injury, the malware used in the NHS attack was developed – at taxpayers’ expense – by, and then stolen from, the US National Security Agency.
Many observers fear that the trigger of the next financial meltdown in the works is not the regulators’ inability to model the system they are supposed to regulate, nor the spread of derivatives backed by new types of securitised asset classes, eg student or car-purchase loans (though both are serious enough). It’s cyber attacks on big banks. An audience at Davos in 2016 heard that the hacking of even a mid-sized US bank could have systemic consequences globally. The panel agreed that cyber security is a major issue of our time: as much a management as a technological problem. In a typically blunt piece in the Evening Standard, Anthony Hilton recently pointed out that the threat is made worse by cost-cutting among the biggest companies, which has been the source of much of their apparent financial improvement over the last two decades.
‘In terms of increased operational efficiency the results have been impressive,’ Hilton noted. ‘But too often it comes at the expense of resilience leaving businesses vulnerable to shocks and setbacks. [British business] is fragile to a far greater degree than ever before’. This at a time when fraud is becoming much more like a business, complete with professional specialisms and supply chains like any other industry. Criminologists suggest that the current downward trend in violent crime is nothing to do with deterrence, better policing or villains going straight. Rather it is the mirror image of the growth of online fraud: why take the physical risk of burgling a house when you can break into a bank account from your laptop without leaving the comfort of your own home?
Where technology, or anything else, is underpriced it is likely to be used indiscriminately and for its own sake rather than for its real, harder-to-achieve benefits. Technology can be used to make us smarter or dumber, to augment human intelligence or replace it. Moravec’s paradox says that what’s high-level for humans (go, chess, accountancy) is trivial for computers and vice versa (computers can’t unload a dishwasher or iron and fold clothes and put them in a drawer). Moravec in combination with artificially cheap tech tools perversely guarantees that computers will eat the high-level, high-paid jobs and leave humans with the low-level, low-paid ones (Uber gigs, the many varieties of personal care). Too many tech applications are about features, not benefits. Uber destroys more value than it creates, and appropriates by far the lion’s share of what it does create. Do we need more dating or disappearing photo apps? What is the social question to which driverless cars are the answer?
Now add to this an ad-based – more bluntly, surveillance – business model, whose currency is attention. An attention economy militates in favour of not only accessibility, with the attendant vulnerabilities outlined above, but also persuasion, seduction and manipulation of all kinds, whether through greed (betting), sensationalism (fake news), desire (porn) or fear, all designed to make users give up ever more of their personal details as they spend just an extra minute on each site. The costs for individuals are addiction, shortened attention span, solipsism, isolation. ‘The best minds of my generation are thinking about how to make people click ads’, one data guru famously lamented. This is technology that makes us dumber at a meta scale. ‘It’s not just ruining our attention, it’s ruining our minds,’ says Tristan Harris, an ex-Googler who founded the Time Well Spent movement. ‘The attention problem makes it harder to solve every other problem.’
All technology involves interference with ‘nature’, which creates winners and losers. As such it behoves us to use it not just ‘because we can’ but with forethought and a duty of care to people – in conjunction with humans, not against or instead of them. The heading on a recent FT Alphaville post warned: ‘More technology, more problems. At some point we may have to ask: is it worth it?’
Capitalism as a zero-sum game
In the aftermath of the great financial crash, Alan Greenspan, chairman of the US Federal Reserve, admitted to a Congress committee: ‘I made a mistake in presuming that the self-interests of organisations, specifically banks and others, were such that they were best capable of protecting their own shareholders and their equity in the firms… I discovered a flaw in the model that I perceived as the critical functioning structure that defines how the world works.’
This was a bit like a clergyman confessing that God had failed. Rational self-interest is at the heart of the neo-liberal programme that has dominated economics in the Anglosphere since the 1980s. If you can reliably count on individuals to use objective reason in protecting their own self-interest, queried the fundamentalists, why would you need regulation, social legislation, even government apart from defence and a few other essentials, or anything else very much except a functioning market? In a world of rational self-interest, strivers will get their deserts and so will skivers, to the benefit of an ever more rational, meritocratic society.
As it turns out, it’s a big ‘if’. Given Greenspan’s public recognition of the collapse of the central tenet of his laissez faire manifesto, you might imagine that the lesson had been learned: claims of the self-policing powers of self-interest should be taken with fast-food-industry doses of salt. As Michael Lewis put it so colourfully in his brilliant The Big Short, Howie Hubler, the Morgan Stanley bond trader who ran up a loss of $9bn, the largest in Wall Street history, ‘was smart enough to be cynical about his market, but not smart enough to realise how cynical he needed to be’.
But you’d be wrong. If the crash was the explosion of a bubble massively inflated by the non-self-policing self-interest of bankers and real estate vendors, today’s ugly rumblings of xenophobia and populism are the soundtrack of the slow-motion disaster that is capitalists in their own self-interest grinding capitalism to bits.
History doesn’t appear to have taught them anything. In previous great technological growth spurts, jobs and wages were at the heart of what worked (most of the time) as a virtuous circle. Investment in new technologies led to higher productivity and wages, feeding demand which fuelled further investment and new employment. Jobs were central to the cycle – effectively a vehicle not only for redistribution, but also social mobility and inclusion.
Yes, it was a system, and like any other system it was easily destabilised (to paraphrase Lewis) by people smart enough to spot the flaws but not smart enough to know that you can’t improve a system by optimising the parts – it’s the interaction of the components that counts, not their sum.
The ideological shifts of the 1970s and 1980s destroyed the previous balance. Full employment was abandoned as the object of economic policy, and ‘flexibility’ (aka deregulation and the weakening of worker organisations) was installed as the mantra for labour markets. Economists decreed that concentrating on the supply side and leaving the rest to the market would maximise the general welfare. Meanwhile, managers responding to the new emphasis on shareholder value reined in job-creating investment in favour of cost-cutting and distributing the proceeds to shareholders; and these trends combined with technological advance to launch a wave of global outsourcing and offshoring that activated a very different circle, one that quickly went vicious.
Its consequence is what we are in the grip of now: jobless growth that is not cyclical but structural. In theory, every element is in place to generate a new ‘Golden Age’ of progress, in the words of development economist Carlota Perez, based on job-creating green growth. The world is awash with capital. Self-evidently there is no shortage of human needs to be filled, and technology is advancing by leaps and bounds. And more than anything in the world – more than family, security and peace, according to Gallup polls – people want to work; they want jobs with regular hours and a pay packet. The world needs 1.9bn more of them, again according to Gallup.
Yet we’re stuck. Business as usual won’t clear the blockage – it’s business as usual that caused it. And as the hype fades the truth is dawning that far from presaging a new and different economy, the internet and Silicon Valley combo is actually just the old one on steroids.
So far from creating a sharing economy, the tech giants are ‘modern monopolists’, in the FT’s words, that leave few crumbs behind for anyone else: just 10 per cent of companies account for 80 per cent of all profits, according to McKinsey. The game of the superstars is value appropriation, not creation – like Hollywood they prefer to recycle old narratives jazzed up with CGI for the times. Viewing people solely as cost, they grudgingly create real jobs only when they can’t use technology to break them up and spit out the bits as gigs. Facebook’s Mark Zuckerberg famously said, ‘we move fast and break things’; he has since distanced himself from the line, but the motto’s mixture of brashness and complacency perfectly sums up the Valley’s careless arrogance.
This isn’t creative destruction à la Joseph Schumpeter, more straight rapaciousness à la Ayn Rand (unsurprisingly, a hero to many tech titans). But as even Greenspan came to realise, extreme self-interest and technology used for its own sake are self-defeating. In a job-poor world there is no ‘invisible hand’ that will conjure up demand for products or services when (as must be very close) consumers’ ability to take on new debt finally runs out. When the cost of shareholder (and executive) enrichment is the jobs of the less well off who can no longer afford to buy the products they create, capitalism is a zero-sum game as counterproductive as a shark consuming its own tail. It’s already happening: student-loan-laden millennials, unable to afford either housing or pensions, the assets crucial to upward mobility, are already a drag on the economy on both sides of the Atlantic. Quae non possunt non manent: things that can’t last don’t. When millennials or their children start to break real, rather than virtual, things in earnest, Donald Trump may look like a minor worry.