In the context of the future of work, the question ‘will robots take our jobs?’ has become so ubiquitous it has spawned its own website, entertaining and devastating in equal measure, depending entirely on your occupation.
School teacher? ‘Totally safe.’
Legal clerk? ‘Watch this space.’
Accountant? ‘You are doomed.’
Fibreglass laminator and fabricator? ‘Totally doomed.’
More optimistically, the current swing of semi-scientific popular opinion suggests intelligent technologies will augment more human work than they replace. A recent Accenture report forecasts that more than 50% of current human work is augmentable rather than automatable, freeing renewed capacity to focus on innovation and high-value tasks.
This confidence comes on the back of a recent boom in business investment in digital transformation, with seemingly everyone and their dog being reinvented by artificially intelligent, data-driven insights, coupled with hyper-personalised user experiences, underpinning a supercharged post-digital strategy primed to win the customer-experience battleground right along a disaggregated value chain. In other words: lots of extra good stuff for everyone, which IDC forecasts will create more than 30 million new jobs globally by 2022 and add up to $9 trillion of additional business revenue to global GDP.
Unfortunately, this positivity is tempered by a heavy dose of caution from those actually doing the work. Sixty per cent of 10,000 knowledge workers surveyed by PwC believe “few people will have stable, long-term employment in the future”. In more dramatic terms, the surge in populism currently afflicting the western world points to a general sense of malaise unprecedented in modern times.
So we must ask, why does this pessimism persist?
Perhaps the answer lies in the sense that people have been here before and they did not win. Through the technology boom of the nineties and early noughties, we were told to sacrifice ourselves at the altar of accelerating technologies, which alongside third-way social-democratic politics would herald a new era of supercharged productivity and shared prosperity for all. Yet it is an extraordinary fact that for the decade to 2017, the US, still the birthplace of the majority of the world’s technological giants, witnessed its most sluggish productivity growth since the 1970s.
There have been winners in this era, but they were not the middle and working classes of the developed world, which have endured an unprecedented period of stagnating wages and widening social inequality. So what needs to happen to ensure the productivity potential promised by this new era of technological innovation is delivered for the benefit of all?
As Mauricio Macri, President of Argentina, stated at the 2018 G20: “The future of work will be a race between education and technology.” Rarely in recent times has a statement from a global statesman resonated so profoundly.
The problem Macri’s statement points to is that as technological innovation accelerates at an ever-increasing rate, human and business capability to derive value from this productivity potential lags ever farther behind, because education and skills cannot keep pace with the rate of change.
If the future of work is a race between technology and education, then education is losing.
Accenture has called the result a “global skills crisis that could hold back the economic promise of intelligent technologies”, with $11.5 trillion of cumulative growth at risk over the next 10 years, or more than 1.1% of global GDP growth every year. It is this missed opportunity that will further stagnate wages, fail to create new jobs, and fail to arrest the downward spiral into divisive discourse promised by another two years of the Brexit crisis merry-go-round and US presidential elections.
These warnings must be a call to action. The response must start with a rejection of the idea that education culminates with the completion of university in our early 20s, producing graduates ready to jump onto a linear career ladder that runs right through to retirement. Universities should produce graduates ready for their first job, with current, relevant hard skills that enable them to engage with up-to-date tools and technologies. But students must also graduate with the mentality to adapt, upskill and reskill throughout their careers, equipped with the soft skills required to change contexts, innovate and create their own jobs when necessary.
This lifelong-learning mentality must be maintained into our 40s, 50s and 60s, as people prepare for 50-year careers spanning 10-15 different jobs. Learning opportunities have never been more accessible, with universities working directly with industry to build courses designed for the future of work, but proactive career management and strong self-motivation are critical to navigating this increasingly complex landscape. For those who successfully embrace continuous change, there is great opportunity to progress their careers and command significant salary premiums.
Businesses, for their part, need to accept that they cannot buy all their talent; they need to build it too. AT&T has shown the way, committing $1 billion to retrain more than half of its 250,000 staff in order to stay competitive and prolong its 100-plus years of existence. The companies that do this intelligently will retain staff, cultivate IP, save money and fulfil their fundamental obligation to contribute to a healthy and prosperous society.