For three decades, I have been documenting the lives of the Ju/’hoansi people of the north-western Kalahari, and their often traumatic encounter with modernity. The Ju/’hoansi are perhaps the best known of the handful of societies who still sustained themselves by hunting and gathering well into the 20th century. And to them, very little about the relentlessly expanding global economy makes sense. Why, they asked me, did government officials who sat in air-conditioned offices drinking coffee and chatting all day long get paid so much more than the young men they sent out to dig ditches? Why, when people were paid for their work, did they still go back the following day rather than enjoy the fruits of their labour? And why did people work so hard to acquire more wealth than they could ever possibly need or enjoy? It was hardly a surprise that the Ju/’hoansi asked these questions. By the time I started working with them, it was already widely accepted that they were the best modern exemplars of how all of our hunting and gathering ancestors must have lived. But the longer I stayed with them, the more I became convinced that understanding their economic approach not only offered insights into the past — it also provided clues as to how we in the industrialised world might organise ourselves in an increasingly automated future.
Seldom have these lessons seemed more urgent. As jobless numbers surge as a result of Covid-19’s spread, practices once seen as fringe are accepted as an almost inevitable part of the new world order. Governments are talking up their willingness to embrace revolutionary economic vaccines, from state-sponsored furlough schemes to giving us cash to eat in restaurants — anything to get people back to work. The same spirit infused pre-pandemic debates about the future of work, which focused mainly on concerns arising from the relentless cannibalisation of the employment market by ever more productive automated systems and artificial intelligence. It is easy to see why this generates such anxiety. The work we do defines who we are, determines our future prospects, dictates where and with whom we spend most of our time and moulds our values. So much so that we sing the praises of strivers and decry the laziness of shirkers, while the goal of universal employment remains a mantra for politicians of all stripes. But it wasn’t meant to be like this. Ever since the first stirrings of the industrial revolution, people have been tantalised by the prospect of a future in which automation progressively liberates ordinary folk from dreary work. In 1776, the founding father of modern economics, Adam Smith, sang the praises of the “very pretty machines” that he believed would in time “facilitate and abridge labour”; in the 20th century, Bertrand Russell described how, in a soon-to-be automated world, “ordinary men and women, having the opportunity of a happy life, will become more kindly and less persecuting” and even lose their “taste for war”. Russell was hopeful that this change would happen in his lifetime.
“The war showed conclusively that, by the scientific organisation of production,” he observed in 1932, “it is possible to keep modern populations in fair comfort on a small part of the working capacity of the modern world.” And from the turn of the 20th century to the onset of the second world war, weekly working hours in industrial countries did indeed reduce steadily.
The economist John Maynard Keynes, Russell’s contemporary, was of a similar mind. He predicted that by 2030, capital accumulation, improvements in productivity and technological advances would have solved the “economic problem” and ushered in an age in which no one besides a few “purposive moneymakers” worked more than 15 hours in a week. He also took the view that the metallic hum of automated production lines was the death-knell of orthodox economics. The institutions and structures that organise our formal economies are predicated squarely on the assumption of scarcity: that although people’s desires are limitless, the resources available to satisfy their needs and wants are not. In the automated future, he believed, absolute scarcity would be a thing of the past and as a result we would cheerfully discard our by then obsolete economic infrastructure and working culture.
Hindsight tells us they were wrong. We passed the thresholds Keynes argued would need to be met to achieve a “golden age of leisure” decades ago. Yet most of us now work longer hours than Keynes’s and Russell’s contemporaries did. And as automation and Covid-19 corrode the employment market, we remain fixated on finding new work for people to do — even if that work often seems to have no point other than to keep the wheels of commerce turning and push growth back into the black. Yet beyond the urgency of our current predicament, there are good reasons not to abandon these thinkers’ visions of a leisured future. For taking a far longer view of human history than economists typically do reveals not only that many of our ideas about work and scarcity have their roots planted firmly in the soils of the agricultural revolution, but also that for more than 95 per cent of Homo sapiens’ history, people enjoyed more leisure than we do now. In a very fundamental way, we are born to work. All living organisms seek, capture and expend energy on growing, staying alive and reproducing. Doing this elemental work is one of the things that distinguishes living organisms such as bacteria, trees and people from dead things, like rocks and stars. But even among living organisms, humans are conspicuous for the work they do. Most organisms are merely “purposive” when they expend energy: while an external observer can discern a purpose to their actions, there is little reason to believe they set about their work with a clear vision of what they want to achieve. Humans, by contrast, are uniquely purposeful. When we go to work, we usually do so for more reasons than just to capture energy.
Plotting our species’ evolutionary trajectory reveals that over thousands of generations our bodies and minds have been shaped progressively by the different kinds of work our various evolutionary ancestors did. It also shows that natural selection moulded us into master generalists, supremely adapted to acquiring an astonishing range of skills during our lifetimes. Charting our evolutionary history also suggests that for most of it, the more purposeful and accomplished at securing energy our evolutionary ancestors became — by virtue of the simple tools they made and eventually, perhaps half a million years ago, by their mastery of fire — the less time and energy they spent on the food quest. Instead, they spent time on other purposeful activities such as making music, exploring, decorating their bodies and socialising. Indeed, it is possible that our ancestors would never have developed language were it not for the free time won by fire and tools because, like our cousins the gorillas, they would otherwise have had to spend up to 11 hours a day laboriously foraging, chewing and processing fibrous, hard-to-digest foods. New genomic and archaeological data now suggest that Homo sapiens first emerged in Africa about 300,000 years ago. But it is a challenge to infer how they lived from these data alone. To reanimate the fragmented bones and broken stones that are the only evidence of how our ancestors lived, anthropologists began, in the 1960s, to work with remnant populations of foraging peoples: the closest living analogues of how our ancestors lived during the first 290,000 years of Homo sapiens’ history. The most famous of these studies dealt with the Ju/’hoansi, a society descended from a continuous line of hunter-gatherers who have lived, largely isolated, in southern Africa since the dawn of our species.
And it turned established ideas of social evolution on their head by showing that our hunter-gatherer ancestors almost certainly did not endure “nasty, brutish and short” lives. The Ju/’hoansi were revealed to be well fed, content and longer-lived than people in many agricultural societies, and, by rarely having to work more than 15 hours a week, had plenty of time and energy to devote to leisure.
Subsequent research produced a picture of how differently the Ju/’hoansi and other small-scale forager societies organised themselves economically. It revealed, for instance, the extent to which their economy sustained societies that were at once highly individualistic and fiercely egalitarian, and in which the principal redistributive mechanism was “demand sharing” — a system that gave everyone the absolute right to effectively tax any surplus that anyone else held. It also showed how in these societies individual attempts to accumulate or monopolise resources or power were met with derision and ridicule. Most importantly, though, it raised startling questions about how we organise our own economies, not least because it showed that, contrary to the assumptions about human nature that underwrite our economic institutions, foragers were neither perennially preoccupied with scarcity nor engaged in a perpetual competition for resources. For while the problem of scarcity assumes that we are doomed to live in a Sisyphean purgatory, always working to bridge the gap between our insatiable desires and our limited means, foragers worked so little because they had few wants, which they could almost always easily satisfy. Rather than being preoccupied with scarcity, they had faith in the providence of their desert environment and in their ability to exploit it. If we measure the success of a civilisation by its endurance over time, then the Ju/’hoansi — and other southern African foragers — are exponents of the most successful and sustainable economy in all of human history, by a huge margin.
These days, the Ju/’hoansi do not have much cause to celebrate this. Largely dispossessed of their lands over the past five decades, most scrape a living in shanties on the fringes of Namibian towns and in “resettlement areas”, where they do battle with hunger and poverty-related diseases. Unable to secure jobs in a capital-intensive economy where youth unemployment hovers just below 50 per cent, they depend on begging, casual labour — often in return for maize porridge or alcohol — and government aid. If our preoccupation with scarcity and hard work is not part of human nature but a cultural artefact, then where did it originate? There is now good empirical evidence to show that our embrace of agriculture, beginning a little over 10,000 years ago, was the genitor not just of our belief in the virtues of hard work but, alongside it, of the basic assumptions about human nature that underwrite the problem of scarcity and, in turn, the institutions, structures and norms that shape our economic — and social — lives today. It is no coincidence that our concepts of growth, interest and debt, as well as much of our economic vocabulary — including words such as “fee”, “capital” and “pecuniary” — have their roots in the soils of the first great agricultural civilisations. Farming was much more productive than foraging, but it placed an unprecedented premium on human labour. Rapidly growing agricultural populations tended to expand quickly to the maximum carrying capacity of their land, and so lived constantly a drought, blight, flood or infestation away from famine and disaster. And no matter how favourable the elements, farmers were subject to an unrelenting annual cycle that ensured that most of their efforts only ever yielded rewards in the future.
More than this, as any farmer will tell you, the fates will punish those who fail to do an urgent job like mending a fence or sowing a field in a timely fashion, and reward those who go the extra mile to make contingencies for the unexpected.
Were Russell still alive today, he would probably be happy to learn that there is good evidence that our attitudes to work are a cultural byproduct of the miseries endured in early agricultural societies. Such a recognition would not only make his Utopia eminently more realisable, but also give teeth to the view that automation could spell the end of scarcity and the demise of orthodox economics — along with the social institutions, structures and norms that have coalesced around it. But he might equally be discouraged by our reluctance to change our behaviour, even when confronted with the costs of endless growth. Yet there are many good reasons to revisit our working culture, not the least of which is that for most people work brings few rewards beyond a payslip. As the pollster Gallup showed in its landmark survey of working life in 155 countries, published in 2017, only one in 10 western Europeans described themselves as engaged by their jobs. This is perhaps unsurprising. After all, in another survey, conducted by YouGov in 2015, 37 per cent of working British adults said their jobs made no meaningful contribution to the world. Even putting these findings aside, there is a far more urgent reason to transform our approach to work. At its most fundamental, work is an energy transaction, and there is a close correspondence between how much work we collectively do and our energy footprint. On these grounds, working less — and consuming less — will not just be good for our souls but may also be essential to the sustainability of our habitat. The economic trauma induced by the pandemic has provided us with an opportunity to reimagine our relationship with work and to re-evaluate which jobs we consider really important.
Few now would be willing to stick their necks out to argue in favour of an economy that incentivises our best and brightest to aspire to be derivatives traders rather than epidemiologists or nurses, and once-fringe ideas such as universal basic income or a formal four-day week have flourished. More than all this, the pandemic has reminded us that, when it comes to how we work, we are far more adaptable than we often realise.