BLOG

When life and work are intertwined, what gives, what takes?

 

BY HEATHER ROBERTSON

WHEN I was appointed editor of The Herald in Port Elizabeth in 2010, as the first woman in the position, I felt pressured to prove that I could work as hard and smart as all my male predecessors (there had been only male editors since The Herald was launched in 1845).

I ended up pulling long hours day and night, behaving like the female version of the Man from La Mancha frenetically fighting the windmills of the global decline of newspaper circulation, breathing a sigh of relief when the ABC (Audit Bureau of Circulation) figures showed copy sales were stable and blaming myself when there was a dip.

I was in the newsroom fighting the windmill at about 7pm on June 17, 2014, when my older brother Michael called me to say my parents had been in a car accident outside Johannesburg, over 1000km away from Port Elizabeth. Dad was bruised and dazed and Mom’s ankles were broken and she had internal injuries. My younger brother Peter put his phone to my mother’s ear so I could speak to her, just before she was wheeled into theatre. That was the last time I heard my mother’s voice.

The thing about the absolute finality of death is that it makes you acutely aware of how precious every moment of life is. Until you lose someone close to you, you behave like an immortal. I did. I was sucked into the social media and print media maelstrom. A mediated world in cyberspace. I lived in front of a keyboard. Chasing likes and follows. Copy sales and unique browsers. I literally did not smell the coffee or see the beautiful wide open African sky outside the office window. The only time I inhaled fresh air was to smoke cigarette after cigarette as a form of stress relief on the shabby balcony on the top floor of Newspaper House. I did not eat supper with my partner and two sons every night. I arrived home when the boys were sleeping and collapsed in front of the television. Something had to give. First to go was my sponsorship of the tobacco industry’s damage to my lungs and heart. I quit smoking. Next was my job as editor at the end of 2015.

I escaped the ritual of running to my bolt-hole of fixed employment. My days are now both exhilarating and terrifying.

Changes are inevitable

I started off spending much more time with my sons and partner, going on walks to the park, riding bicycles, playing Lego, playing guitar, but when more work started streaming in from different clients, I threw myself head first into it, pushing long hours into the night to meet deadlines. I verged on beating myself up about this, but on reflection maybe that’s not such a bad thing. Doing one’s best to deliver the best possible service is a worthwhile pursuit as long as there are set boundaries and limits. I do enjoy spending time with my family and friends and I do love my work, especially now that I have the freedom to determine with whom, when, where and how I work.

I kicked off my first year of self-employment working three days a week facilitating change in newsrooms for a client until November. I also did social media training with Social Weaver, speaking face to face with teachers and lecturers at King Hintsa Technical College in Butterworth and at Walter Sisulu University in East London, helping them use social media tools as effective means of networking, curating content, researching, marketing and planning in their work. Facilitating these workshops has helped me rediscover my passion for teaching.

My passion for learning has also been reignited by studying for my master’s in digital journalism at Rhodes University, learning with inspiring fellow students in the media industry and an equally inspiring course leader, Prof Harry Dugmore. I also rediscovered my love of interviewing, research and writing through a book chapter project on “Women in MK”, which I was commissioned to write by Ryland Fisher, an old colleague I met when I started as a journalist at the now defunct South newspaper.

I do sometimes slide into old habits: I get totally absorbed in a project and switch off to everything else around me, like I did when developing this website and blog while on holiday, which led my ten-year-old son to say, “Mommy, you are working more on your laptop now that you are home with us.” A bit rich from the guy who spends the whole night watching DanTDM, PewDiePie and Thinknoodles on his laptop, but I humbly accept he has a point.

The trick, I guess, is to be flexible in one’s routine, to give and take. As life coach Lauren Laitin writes: “When you learn to set boundaries based on your various priorities and obligations, you’ll feel in control and at liberty to make decisions that work for you and the kind of life you want to lead—not the kind of life you’re supposed to lead.” If work is as integral a part of your life as eating healthily, going for a walk and playing Uno with your kids, then work-life balance is not what we seek, but rather the flexibility and common sense to prioritise what is important at a specific moment in time, to basically “go with the flow”, as my mother used to say.

Africa’s Beautiful Ones Are Born – We Just Need to Listen to Them

Too often the narratives about Africa, and my country South Africa in particular, are dominated by the corruption, greed and power struggles of the ruling elite. While it is good to expose the rot in governments and businesses, other stories, other narratives drawn from the experiences of the diverse millions of ordinary Africans doing interesting, courageous, extraordinary things, often do not get the same exposure. I attended a design thinking workshop with a group of UCT postgraduate students from a range of different African countries and my hope for the future of my country and continent was renewed. This article first appeared in the Mail & Guardian.

By Heather Robertson

President Zuma’s firing of Finance Minister Pravin Gordhan and Deputy Finance Minister Mcebisi Jonas, his latest cabinet reshuffle, the shameful SASSA debacle, Western Cape Premier Helen Zille’s insensitive ‘benefits of colonialism’ tweets and the recent outbreaks of xenophobia in Pretoria feed into a growing narrative of South Africa as a venal, racist, intolerant banana republic. It is the kind of basket-case place that acclaimed Ghanaian writer Ayi Kwei Armah depicted in his seminal 1968 novel on post-independence African countries’ descent into corruption, The Beautyful Ones Are Not Yet Born.

Any outsider watching our downward trajectory as a nation from the miracle of 1994 to the sad sagas of 2017 might well wonder whether there is any hope for us. Fortunately, the self-serving behaviour of our politicians is not the only South African narrative. There are other narratives, albeit less public, playing out in our daily lives which offer an alternative view of ourselves: a less arrogant, less egocentric, less brash, less materialistic, more open, curious and humane view.

I was recently privileged to experience and witness such an alternative as a participant in a day-long design thinking bootcamp at the Hasso Plattner Institute of Design Thinking (the d-school) at the University of Cape Town’s graduate school of business campus at the Victoria and Alfred Waterfront.

While the marketing industry has associated design thinking with customer journeys and product design, the concept in fact has a much broader social reach. Richard Perez, d-school director, says design thinking “is an enabler of innovation and new outcomes”. He defines design thinking as “a human-centred approach to understanding and solving problems; one which places people and their needs at the heart of any innovation.” Tim Brown, the President and CEO of American design company Ideo, says this approach uses the way designers think to “integrate the needs of people, the possibilities of technology, and the requirements for business success.”

Design thinking is playing a vital role in a worldwide movement that is shifting away from the take-make-dispose economy that pursues profit at the expense of the planet and its people. Brown and Ideo have worked with the Ellen MacArthur Foundation to come up with a design for an alternative global economy, which they have dubbed a circular economy: an economy that designs products that can be made again, powered with renewable energy that is good for people, the planet and business.

This kind of collaborative, creative, constructive and innovative thinking is being pioneered right here in South Africa by the recently launched d-school at UCT. The bootcamp I attended was the beginning of a 12-week design thinking foundation course offered free of charge to UCT postgraduate students. The students at the bootcamp ranged in age from 28 to 47 and came from South Africa, the Congo, Italy, Botswana, Uganda, Zimbabwe, Cameroon, Spain, Egypt and Ghana. They came from academic backgrounds in Educational Technology, City Planning, Law, Social Policy, Economic History, Digital Forensics, Politics, Health, Geomatics, Neuroscience, Architecture, Geology, Geography, Journalism, Business, Sport, Chemical Engineering, History and Marketing. A smorgasbord of cultures, ways of thinking and seeing.

Our challenge for the day was to improve the way-finding experience at UCT’s graduate school of business. In my group we had three men (an Egyptian, a Zimbabwean and an Italian) and three women (all South African, two black, one white). Our disciplines and skills included architecture, engineering, journalism, pharmacy, farming and social policy. How does such a diverse group of people get to see eye to eye or agree on anything? The answer is: with difficulty, laughter, a common goal, playfulness, empathy and good facilitation. We had two coaches, one a postgraduate student at UCT’s business school and the other an international tax postgraduate, both trained in the design thinking process by programme managers Dr Rael Futerman and Dr Keneilwe Munyai, who very firmly and pleasantly guided us through the day. We started off by playing a complicated clapping exercise which had everyone in peals of laughter, jolted out of our comfort zones by doing something we’d probably last encountered at the pre-school playground.

We then did an exercise in listening with empathy to our team members – who they are, what drives them and what they hope to achieve – followed by a presentation of the challenge. The term “way-finding” drew a variety of interpretations from our group, from the practical (how to create user-friendly signage to enable visitors to find the right venues at the business school) to the more expansive (how to enable anyone from any background to access the learning opportunities available at the UCT graduate school of business). After much debate, our group chose the latter. Our next exercise was to establish exactly who would benefit from a better way-finding experience, and we opted for two groups: students and traders.

Next we split into groups of three to go out into the field to interview students and traders about how they viewed the accessibility of the business school and how they thought it could better serve their needs. Each group member had a turn to ask questions while another member recorded the interview and the third observed the interviewee’s emotional responses.

Our foray into face-to-face research brought us rich responses from traders at the Waterfront and from students and lecturers at the UCT business school. From the real-life experience of the traders in jewellery and crafts, we deduced that the university community of students and lecturers could benefit from the traders’ practical learnings, and that the traders in turn could benefit from the networks and the more national and global teachings of the graduate school of business.

We presented the idea (inspired by both the students and traders we interviewed) that the Graduate School of Business invite traders to give guest lectures and that the university in turn provide traders with affordable management tools, developing the prototype with Lego blocks, drama and role play. This part of the design thinking exercise, which uses the perspective of the end user in the creation of prototypes, is the most revolutionary part of the entire process and might be anathema to rigid hierarchical organisations that rely on expert knowledge and authoritarian power structures to function. The problem with these kinds of closed authoritarian organisations is that they are not adaptive to necessary innovation and change.

The beauty of the training that this array of UCT postgraduate students is participating in is that they are learning how to co-create with end users, collaborate with people from different disciplines and cultures, accept failure as part of the learning process and stay open to innovating and iterating over and over again.

What I learnt most from the process was to shut up and listen. Listen to my teammates and listen to the traders at the Waterfront. By listening with empathy, we were all able to come to a solution that not one of us would have reached by ourselves. This is a systematic way of thinking and working together that, applied well, could help address our bigger political and social problems.

UCT Vice-Chancellor Max Price first encountered Design Thinking at Stanford University when he was part of a group of leaders from global universities who had come together to talk about trends in higher education. The question they grappled with was, “What will people be doing in their jobs 20-30 years from now?” Price saw the value of bringing Design Thinking to UCT to address the needs of the future. Upon his return to South Africa, Price set off for George, where Hasso Plattner, the German businessman and co-founder of the software company SAP SE, also has a residence, to ask him to sponsor a school at UCT. Plattner agreed, committing R50 million to the school, and his trust is now also on the verge of investing in a permanent d-school building on the UCT campus.

UCT’s d-school is one of only three Hasso Plattner Institute (HPI) schools of design globally, and the only one of its kind in the southern hemisphere. Its forerunners are based at Stanford University and in Potsdam, Germany. Professor Ulrich Weinberg, founding director of the HPI School of Design in Potsdam, Germany, who was in South Africa recently to attend the d-school launch, explains that Design Thinking offers a paradigm shift in the way problems can be approached. He argues that while information technology has created a networked global society, human culture has not caught up with the direction that new technologies are pushing us in. “New network technologies demand greater collaboration, but we humans continue to be individualistic and competitive. Design Thinking dismantles this behavioural trait by fostering greater collaboration.”

To watch the South African students hold their own with their fellow African and European colleagues, engaging with complex problems around the tables and whiteboards at the d-school, learning to understand each other’s cultures and personalities through developing empathy, hearing them laugh and clap when solutions and prototypes are created, is to know that the beautiful ones are indeed born. Their voices just need to be amplified across the country and continent as a counter-narrative to the moral bankruptcy that has plagued our political stage.

Why we should give free money to everyone. A great argument for basic income grants by Rutger Bregman

I love this piece. More than a decade ago the idea of a basic income grant in South Africa was mooted by the trade union federation COSATU. The idea petered out in the whirlwind of the Polokwane putsch, when the trade unions threw their weight behind a Jacob Zuma presidency, believing he would dismantle the neo-liberal machinery set up by his predecessor Thabo Mbeki. As a result of this terrible error and the 2008 recession, COSATU has lost members to retrenchment and rival trade unions, and the president they backed is more interested in his overinflated income for life than a basic income for the poor. Have a read of this piece and tell me if you think the time has arrived for money to be given directly to the poor as opposed to the pockets of a predatory elite, bureaucrats and banks. We tend to think that simply giving people money makes them lazy. Yet a wealth of scientific research proves the contrary: free money helps. It is time for a radical reform of the social grant system. RUTGER BREGMAN is a philosopher and the progress correspondent for Dutch site De Correspondent. This piece appeared in The Washington Post and De Correspondent. The article has since developed into a book, Utopia for Realists: The Case for a Universal Basic Income, Open Borders, and a 15-hour Workweek, published by De Correspondent.

London, May 2009. A small experiment involving thirteen homeless men takes off. They are street veterans. Some of them have been sleeping on the cold tiles of The Square Mile, the financial center of Europe, for more than forty years. Their presence is far from cheap. Police, legal services, healthcare: the thirteen cost taxpayers hundreds of thousands of pounds. Every year.

That spring, a local charity takes a radical decision. The street veterans are to become the beneficiaries of an innovative social experiment. No more food stamps, food kitchen dinners or sporadic shelter stays for them. The men will get a drastic bailout, financed by taxpayers. They’ll each receive 3,000 pounds, cash, with no strings attached. The men are free to decide what to spend it on; counseling services are completely optional. No requirements, no hard questions. The only question they have to answer is:

What do you think is good for you?

Gardening classes

‘I didn’t have enormous expectations,’ an aid worker recalls.

Yet the desires of the homeless men turned out to be quite modest. A phone, a passport, a dictionary – each participant had his own ideas about what would be best for him. None of the men wasted their money on alcohol, drugs or gambling. On the contrary, most of them were extremely frugal with the money they had received. On average, only 800 pounds had been spent at the end of the first year.

Simon’s life was turned upside down by the money. Having been addicted to heroin for twenty years, he finally got clean and began with gardening classes. ‘For the first time in my life everything just clicked, it feels like now I can do something’, he says. ‘I’m thinking of going back home. I’ve got two kids.’

A year after the experiment had started, eleven of the thirteen had a roof over their heads. They accepted accommodation, enrolled in education, learnt how to cook, got treatment for drug use, visited their families and made plans for the future. ‘I loved the cold weather,’ one of them remembers. ‘Now I hate it.’ After decades of authorities’ fruitless pushing, pulling, fines and persecution, eleven notorious vagrants finally moved off the streets.

Costs? 50,000 pounds a year, including the wages of the aid workers. In addition to giving eleven individuals another shot at life, the project had saved money by a factor of at least 7. Even The Economist concluded:

‘The most efficient way to spend money on the homeless might be to give it to them.’

Santa exists

We tend to presume that the poor are unable to handle money. If they had any, people reason, they would probably spend it on fast food and cheap beer, not on fruit or education. This kind of reasoning nourishes the myriad social programs, administrative jungles, armies of program coordinators and legions of supervising staff that make up the modern welfare state. Since the start of the crisis, the number of initiatives battling fraud with benefits and subsidies has surged.

People have to ‘work for their money,’ we like to think. In recent decades, social welfare has become geared toward a labor market that does not create enough jobs. The trend from ‘welfare’ to ‘workfare’ is international, with obligatory job applications, reintegration trajectories, mandatory participation in ‘voluntary’ work. The underlying message: Free money makes people lazy.

Except that it doesn’t.

Meet Bernard Omandi. For years he worked in a quarry, somewhere in the inhospitable west of Kenya. Bernard made $2 a day, until one morning he received a remarkable text message. ‘When I saw the message, I jumped up’, he later recalled. And with good reason: $500 had just been deposited into his account. For Bernard, the sum amounted to almost a year’s salary.

A couple of months later a New York Times reporter walked around his village. It was as if everyone had won the jackpot – but no one had wasted the money. People were repairing their homes and starting small businesses. Bernard was making $6 to $9 a day driving around on his new Bajaj Boxer, an Indian motorcycle which he used to provide transportation for local residents. ‘This puts the choice in the hands of the poor, and not me,’ said Michael Faye, co-founder of GiveDirectly, the coordinating organization. ‘The truth is, I don’t think I have a very good sense of what the poor need.’ When Google had a look at the data, the company immediately decided to donate $2.5 million.

Bernard and his fellow villagers are not the only ones who got lucky. In 2008, the Ugandan government gave about $400 to almost 12,000 youths between the ages of 16 and 35. Just money – no questions asked. And guess what? The results were astounding. A mere four years later, the youths’ educational and entrepreneurial investments had caused their incomes to increase by almost 50%. Their chances of being employed had increased by 60%.

Another Ugandan program awarded $150 to 1,800 poor women in the North of the country. Here, too, incomes went up significantly. The women who were supported by an aid worker were slightly better off, but later calculations proved that the program would have been even more effective had the aid workers’ salary simply been divided among the women as well.

Studies from all over the world drive home the exact same point: free money helps. Proven correlations exist between free money and a decrease in crime, lower inequality, less malnutrition, lower infant mortality and teenage pregnancy rates, less truancy, better school completion rates, higher economic growth and emancipation rates. ‘The big reason poor people are poor is because they don’t have enough money’, economist Charles Kenny, a fellow at the Center for Global Development, dryly remarked last June. ‘It shouldn’t come as a huge surprise that giving them money is a great way to reduce that problem.’

Free-money programs have flourished in the past decade
In the 2010 work Just Give Money to the Poor, researchers from the Brooks World Poverty Institute, an independent institute based at the University of Manchester, give numerous examples of money being scattered successfully. In Namibia, malnourishment, crime and truancy fell by 25 percent, 42 percent and nearly 40 percent respectively. In Malawi, school enrollment of girls and women rose 40 percent in both conditional and unconditional settings. From Brazil to India and from Mexico to South Africa, free-money programs have flourished in the past decade. While the Millennium Development Goals did not even mention the programs, by now more than 110 million families in at least 45 countries benefit from them.

Researchers sum up the programs’ advantages: (1) households make good use of the money, (2) poverty decreases, (3) long-term benefits in income, health, and tax income are remarkable, (4) there is no negative effect on labor supply – recipients do not work less, and (5) the programs save money. Why would we send well-paid foreigners in SUVs when we could just give cash? This would also diminish risk of corrupt officials taking their share. Free money stimulates the entire economy: consumption goes up, resulting in more jobs and higher incomes.

‘Poverty is fundamentally about a lack of cash. It’s not about stupidity,’ author Joseph Hanlon remarks. ‘You can’t pull yourself up by your bootstraps if you have no boots.’

An old idea

Free money: the idea has been propagated by some of history’s greatest minds. Thomas More dreamt of it in his famous Utopia (1516). Countless economists and philosophers, many of them Nobel laureates, would follow suit. Proponents cannot be pinned down on one side of the political spectrum: the idea appeals to both left- and right-wing thinkers. Even the founders of neoliberalism, Friedrich Hayek and Milton Friedman, supported it. Article 25 of the Universal Declaration of Human Rights (1948) directly refers to it.

The basic income.

And not just for a few years, in developing countries only, or merely for the poor – but free money as a basic human right for everyone. The philosopher Philippe van Parijs has called it ‘the capitalist road to communism.’ A monthly allowance, enough to live off, without any outside control on whether you spend it well or whether you even deserve it. No jungle of extra charges, benefits, rebates – all of which cost tons to implement. At most with some extras for the elderly, unemployed and disabled.

The basic income – it is an idea whose time has come.

Mincome, Canada

In an attic of a warehouse in Winnipeg, Canada, 1,800 boxes are accumulating dust. The boxes are filled with data – tables, graphs, reports, transcripts – from one of the most fascinating social experiments in postwar history: Mincome.

Evelyn Forget, professor at the University of Manitoba, heard about the experiment in 2004. For five years, she courted the Canadian National Archive to get access to the material. When she was finally allowed to enter the attic in 2009, she could hardly believe her eyes: this archive stored a wealth of information on the application of Thomas More’s age-old ideal.

One of the almost 1,000 interviews tucked away in boxes was with Hugh and Doreen Henderson. Thirty-five years earlier, when the experiment took off, he worked as a school janitor and she took care of their two kids. Life had not been easy for them. Doreen grew vegetables and they kept their own chickens in order to secure their daily food supply.

From that moment, money was no longer a problem
One day the doorbell rang. Two men wearing suits made an offer the Henderson family couldn’t refuse. ‘We filled out forms and they wanted to see our receipts’, Doreen remembers. From that moment, money was no longer a problem for the Henderson family. Hugh and Doreen entered Mincome – the first large-scale social experiment in Canada and the biggest experiment implementing a basic income ever conducted.

In March 1973 the governor of the province had decided to reserve $17 million for the project. The experiment was to take place in Dauphin, a small city with 13,000 inhabitants north of Winnipeg. The following spring researchers began to crowd the town to monitor the development of the pilot. Economists were keeping track of people’s working habits, sociologists looked into the experiment’s effects on family life and anthropologists engaged in close observation of people’s individual responses.

The basic income regulations had to ensure no one would drop below the poverty line. In practice this meant that about 1,000 families in Dauphin, covering 30% of the total population, received a monthly paycheck. For a family of five, the amount would come down to $18,000 a year today (figure corrected for inflation). No questions asked.

Four years passed until a round of elections threw a spanner in the works. The newly elected conservative government didn’t like the costly experiment, 75% of which was financed by the Canadian taxpayer. When it turned out that there was not even enough money to analyze the results, the initiators decided to pack the experiment away. In 1,800 boxes.

The Dauphin population was bitterly disappointed. At its start in 1974, Mincome was seen as a pilot project that might eventually go national. But now it seemed to be destined for oblivion. ‘Government officials opposed to Mincome didn’t want to spend more money to analyze the data and show what they already thought: that it didn’t work,’ one of the researchers remembers. ‘And the people who were in favor of Mincome were worried because if the analysis was done and the data wasn’t favorable then they would have just spent another million dollars on analysis and be even more embarrassed.’

When professor Forget first heard of Mincome, no one knew how the experiment had truly worked out. However, 1970 had also been the year Medicare, the national health insurance system, had been implemented. The Medicare archives provided Forget with a wealth of data allowing her to compare Dauphin to surrounding towns and other control groups. For three years, she analyzed and analyzed, consistently coming to the same conclusion:

Mincome had been a great success.

From experiment to law

‘Politicians feared that people would stop working, and that they would have lots of children to increase their income,’ professor Forget says. Yet the opposite happened: the average marital age went up while the birth rate went down. The Mincome cohort had better school completion records. The total amount of work hours decreased by only 13%. Breadwinners hardly cut down on their hours, women used the basic income for a couple of months of maternity leave and young people used it to do some extra studying.

Forget’s most remarkable discovery is that hospital visits went down by 8.5%. This amounted to huge savings (in the United States it would be more than $200 billion a year now). After a couple of years, domestic violence rates had fallen and mental health had improved. Mincome made the entire town healthier. The basic income continued to influence subsequent generations, in terms of both income and health.

Dauphin, the town with no poverty, was one of five North American basic income experiments. Four U.S. projects preceded it. Today, few people know how close the US came in the sixties to implementing a solid social welfare system that could stand comparison with those of most Western European countries. In 1964, President Lyndon B. Johnson declared a ‘war on poverty.’ Democrats and Republicans were united in their ambition to fundamentally reform social security. But first more testing was needed.

Several tens of millions were made available to test the effects of a basic income among 10,000 families in Pennsylvania, Indiana, North Carolina, Seattle and Denver. The pilots were the first large-scale social experiments differentiating between various test and control groups. The researchers were trying to find the answers to three questions. 1: Does a basic income make people work significantly less? 2: If so, will it make the program unaffordable? 3: And would it consequently become politically unattainable?

The answers: no, no and yes.

The decrease in working hours turned out to be limited. ‘The ‘laziness’ contention is just not supported by our findings’, the chief data analyst of the Denver experiment said. ‘There is not anywhere near the mass defection the prophets of doom predicted.’ On average, the decline in work hours amounted to 9 percent per household. Like in Dauphin, the majority of this drop was caused by young mothers and students in their twenties.

‘These declines in hours of paid work were undoubtedly compensated in part by other useful activities, such as search for better jobs or work in the home,’ an evaluative report of a Seattle project concluded. A mother who had never finished high school got a degree in psychology and went on to a career in research. Another woman took acting classes, while her husband started composing. ‘We’re now self-sufficient, income-earning artists’, they told the researchers. School results improved in all experiments: grades went up and dropout rates went down. Nutrition and health data were also positively affected – for example, the birth weight of newborn babies increased.

For a while, it seemed like the basic income would fare well in Washington.

WELFARE REFORM IS VOTED IN HOUSE, a NYT headline on April 17, 1970 read. An overwhelming majority had endorsed President Nixon’s proposal for a modest basic income. But once the proposal got to the Senate, doubts returned. ‘This bill represents the most extensive, expensive and expansive welfare legislation ever handled by the Committee on Finance,’ one of the senators said.

Then came that fatal discovery: the number of divorces in Seattle had gone up by more than 50%. This percentage made the other, positive results seem utterly uninteresting. It gave rise to the fear that a basic income would make women much too independent. For months, the law proposal was sent back and forth between the Senate and the White House, eventually ending in the dustbin of history.

Later analysis would show that the researchers had made a mistake – in reality the number of divorces had not changed.

Futile, dangerous and perverse

‘It Can Be Done! Conquering Poverty in the US by 1976’, James Tobin, who would go on to win a Nobel Prize, wrote in 1967. At that time, almost 80% of the American population was in favor of adopting a small basic income. Nevertheless, Ronald Reagan sneered years later: ‘In the sixties we waged a war on poverty, and poverty won.’

Almost 80% of the American population was in favor of adopting a small basic income
Milestones of civilization are often first considered impossible utopias. Albert Hirschman, one of the great sociologists of the previous century, wrote that utopian dreams are usually rebutted on three grounds: futility (it is impossible), danger (the risks are too big) and perversity (its realization will result in the opposite: a dystopia). Yet Hirschman also described how, once implemented, ideas previously considered utopian are quickly accepted as normal.

Not so long ago, democracy was a grand utopian ideal. From the radical philosopher Plato to the conservative aristocrat Joseph de Maistre, most intellectuals considered the masses too stupid for democracy. They thought that the general will of the people would quickly degenerate into some general’s will instead. Apply this kind of reasoning to the basic income: it would be futile because we would not be able to afford it, dangerous because people would stop working, and perverse because we would only have to work harder to clean up the mess it creates.

But wait a second.

Futile? For the first time in history we are rich enough to finance a robust basic income. It would allow us to cut most of the benefits and supervision programs that the current social welfare system necessitates. Many tax rebates would be redundant. Further financing could come from (higher) taxing of capital, pollution and consumption.

Eradicating poverty in the United States would cost $175 billion – a quarter of the country’s $700 billion military budget.
A quick calculation. The country I live in, Holland, has 16.8 million inhabitants. Its poverty line is set at $1,300 a month. This would make for a reasonable basic income. Some simple math would set the cost at 193.5 billion euro annually, about 30% of our national GDP. That’s an astronomically high figure. But remember: the government already controls more than half of our GDP. It does not keep the Netherlands from being one of the richest, most competitive and happiest countries in the world.
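For readers who want to check that figure, here is a rough reconstruction of the arithmetic. The dollar-euro exchange rate and the Dutch GDP figure are my own approximations for the period in which Bregman wrote the piece, not numbers given in the article:

\[
16.8\ \text{million people} \times \$1{,}300\ \text{per month} \times 12\ \text{months} \approx \$262\ \text{billion per year} \approx \text{EUR } 193.5\ \text{billion (at roughly \$1.35 per euro)}
\]
\[
\frac{\text{EUR } 193.5\ \text{billion}}{\text{EUR } 650\ \text{billion (approximate Dutch GDP at the time)}} \approx 30\%
\]

So the quoted 193.5 billion euro and the “about 30% of our national GDP” are consistent with each other once the monthly amount is converted from dollars to euros.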

The basic income that Canada experimented with – free money as a right for the poor – would be much cheaper. Eradicating poverty in the United States would cost $175 billion, economist Matt Bruenig recently calculated – a quarter of the country’s $700 billion military budget. Still, a system that only helps the poor confirms the divide with the well-to-do. ‘A policy for the poor is a poor policy,’ Richard Titmuss, the mastermind of the British welfare state, once wrote. A universal basic income, on the other hand, can count on broad support since everyone benefits.

Dangerous? Indeed, we would work a little less. But that’s a good thing, with the potential of working wonders for our personal and family lives. A small group of artists and writers (‘all those whom society despises while they are alive and honors when they are dead’ – Bertrand Russell) may actually stop doing paid work. Nevertheless, there is plenty of evidence that the great majority of people, regardless of what grants they would receive, want to work. Unemployment makes us very unhappy.

One of the perks of the basic income is that it stimulates the ‘working poor’ – who are, under the current system, more secure receiving welfare payments – to look for jobs. The basic income can only improve their situation; the grant would be unconditional. Minimum wage could be abolished, improving employment opportunities at the lower ends of the labor market. Age would no longer need to form an obstacle to finding and keeping employment (as older employees would not necessarily earn more) thereby boosting overall labor participation.

The welfare state was built to provide security but degenerated into a system of shame
Perverse? On the contrary, over the last decades our social security systems have degenerated into perverse systems of social control. Government officials spy on people receiving welfare to make sure they are not wasting their money. Inspectors spend their days coaching citizens to help them make sense of all the necessary paperwork. Thousands of government officials are kept busy keeping an eye on this fraud-sensitive bureaucracy. The welfare state was built to provide security but has degenerated into a system of distrust and shame.

Think different

It has been said before. Our welfare state is out of date, based on a time in which men were the sole breadwinners and employees stayed with one company for their entire careers. Our pension system and unemployment protection programs are still centered around those lucky enough to have steady employment. Social security is based on the wrong premise that the economy creates enough jobs. Welfare programs have become pitfalls instead of trampolines.

Never before has the time been so ripe to implement a universal and unconditional basic income. Our ageing societies are challenging us to keep the elderly economically active for as long as possible. An increasingly flexible labor market creates the need for more security. Globalization is eroding middle-class wages worldwide. Women’s emancipation will only be complete when greater financial independence is possible for all. The deepening divide between the low- and highly educated means that the former are in need of extra support. The rise of robots and the increasing automation of our economy could cost even those at the top of the ladder their jobs.

Legend has it that while Henry Ford II was giving a tour around a new, fully automatic factory to union leader Walter Reuther in the 1960s, Ford joked:

‘Walter, how are you going to get those robots to pay your union dues?’

Reuther is said to have replied:

‘Henry, how are you going to get them to buy your cars?’

A world where wages no longer rise still needs consumers. In the last decades, middle-class purchasing power has been maintained through loans, loans and more loans. The Calvinistic reflex that you have to work for your money has turned into a license for inequality.

No one is suggesting societies the world over should implement an expensive basic income system in one stroke. Each utopia needs to start small, with experiments that slowly turn our world upside down — like the one four years ago in the City of London. One of the aid workers later recalled: ‘It’s quite hard to just change overnight the way you’ve always approached this problem. These pilots give us the opportunity to talk differently, think differently, describe the problem differently.’

That is how all progress begins.

WHY CAPITALISM CREATES DEADBEAT BULLSHIT JOBS

SOCIAL ANTHROPOLOGIST DAVID GRAEBER ARGUES THAT TECHNOLOGY HAS MADE US WORK MORE IN SOUL-DESTROYING “BULLSHIT JOBS” RATHER THAN THE PROMISED UTOPIA OF WORKING LESS AND SPENDING MORE OF OUR TIME IN PRODUCTIVE, CARING OR CREATIVE PURSUITS

By David Graeber

In the year 1930, John Maynard Keynes predicted that technology would have advanced sufficiently by century’s end that countries like Great Britain or the United States would achieve a 15-hour work week. There’s every reason to believe he was right. In technological terms, we are quite capable of this. And yet it didn’t happen. Instead, technology has been marshaled, if anything, to figure out ways to make us all work more. In order to achieve this, jobs have had to be created that are, effectively, pointless. Huge swathes of people, in Europe and North America in particular, spend their entire working lives performing tasks they secretly believe do not really need to be performed. The moral and spiritual damage that comes from this situation is profound. It is a scar across our collective soul. Yet virtually no one talks about it.

Why did Keynes’ promised utopia – still being eagerly awaited in the ‘60s – never materialise? The standard line today is that he didn’t figure in the massive increase in consumerism. Given the choice between less hours and more toys and pleasures, we’ve collectively chosen the latter. This presents a nice morality tale, but even a moment’s reflection shows it can’t really be true. Yes, we have witnessed the creation of an endless variety of new jobs and industries since the ‘20s, but very few have anything to do with the production and distribution of sushi, iPhones, or fancy sneakers.

So what are these new jobs, precisely? A recent report comparing employment in the US between 1910 and 2000 gives us a clear picture (and I note, one pretty much exactly echoed in the UK). Over the course of the last century, the number of workers employed as domestic servants, in industry, and in the farm sector has collapsed dramatically. At the same time, “professional, managerial, clerical, sales, and service workers” tripled, growing “from one-quarter to three-quarters of total employment.” In other words, productive jobs have, just as predicted, been largely automated away (even if you count industrial workers globally, including the toiling masses in India and China, such workers are still not nearly so large a percentage of the world population as they used to be).

But rather than allowing a massive reduction of working hours to free the world’s population to pursue their own projects, pleasures, visions, and ideas, we have seen the ballooning not even so much of the “service” sector as of the administrative sector, up to and including the creation of whole new industries like financial services or telemarketing, or the unprecedented expansion of sectors like corporate law, academic and health administration, human resources, and public relations. And these numbers do not even reflect on all those people whose job is to provide administrative, technical, or security support for these industries, or for that matter the whole host of ancillary industries (dog-washers, all-night pizza deliverymen) that only exist because everyone else is spending so much of their time working in all the other ones.

These are what I propose to call “bullshit jobs.”

It’s as if someone were out there making up pointless jobs just for the sake of keeping us all working. And here, precisely, lies the mystery. In capitalism, this is exactly what is not supposed to happen. Sure, in the old inefficient socialist states like the Soviet Union, where employment was considered both a right and a sacred duty, the system made up as many jobs as they had to (this is why in Soviet department stores it took three clerks to sell a piece of meat). But, of course, this is the very sort of problem market competition is supposed to fix. According to economic theory, at least, the last thing a profit-seeking firm is going to do is shell out money to workers they don’t really need to employ. Still, somehow, it happens.

While corporations may engage in ruthless downsizing, the layoffs and speed-ups invariably fall on that class of people who are actually making, moving, fixing and maintaining things; through some strange alchemy no one can quite explain, the number of salaried paper-pushers ultimately seems to expand, and more and more employees find themselves, not unlike Soviet workers actually, working 40- or even 50-hour weeks on paper, but effectively working 15 hours just as Keynes predicted, since the rest of their time is spent organising or attending motivational seminars, updating their Facebook profiles or downloading TV box-sets.

The answer clearly isn’t economic: it’s moral and political. The ruling class has figured out that a happy and productive population with free time on their hands is a mortal danger (think of what started to happen when this even began to be approximated in the ‘60s). And, on the other hand, the feeling that work is a moral value in itself, and that anyone not willing to submit themselves to some kind of intense work discipline for most of their waking hours deserves nothing, is extraordinarily convenient for them.

Once, when contemplating the apparently endless growth of administrative responsibilities in British academic departments, I came up with one possible vision of hell. Hell is a collection of individuals who are spending the bulk of their time working on a task they don’t like and are not especially good at. Say they were hired because they were excellent cabinet-makers, and then discover they are expected to spend a great deal of their time frying fish. Neither does the task really need to be done – at least, there’s only a very limited number of fish that need to be fried. Yet somehow, they all become so obsessed with resentment at the thought that some of their co-workers might be spending more time making cabinets, and not doing their fair share of the fish-frying responsibilities, that before long there’s endless piles of useless badly cooked fish piling up all over the workshop and it’s all that anyone really does.

I think this is actually a pretty accurate description of the moral dynamics of our own economy. Now, I realise any such argument is going to run into immediate objections: “who are you to say what jobs are really ‘necessary’? What’s necessary anyway? You’re an anthropology professor, what’s the ‘need’ for that?” (And indeed a lot of tabloid readers would take the existence of my job as the very definition of wasteful social expenditure.) And on one level, this is obviously true. There can be no objective measure of social value.

I would not presume to tell someone who is convinced they are making a meaningful contribution to the world that, really, they are not. But what about those people who are themselves convinced their jobs are meaningless? Not long ago I got back in touch with a school friend who I hadn’t seen since I was 12. I was amazed to discover that in the interim, he had become first a poet, then the front man in an indie rock band. I’d heard some of his songs on the radio having no idea the singer was someone I actually knew. He was obviously brilliant, innovative, and his work had unquestionably brightened and improved the lives of people all over the world. Yet, after a couple of unsuccessful albums, he’d lost his contract, and plagued with debts and a newborn daughter, ended up, as he put it, “taking the default choice of so many directionless folk: law school.” Now he’s a corporate lawyer working in a prominent New York firm. He was the first to admit that his job was utterly meaningless, contributed nothing to the world, and, in his own estimation, should not really exist.

There’s a lot of questions one could ask here, starting with, what does it say about our society that it seems to generate an extremely limited demand for talented poet-musicians, but an apparently infinite demand for specialists in corporate law? (Answer: if 1% of the population controls most of the disposable wealth, what we call “the market” reflects what they think is useful or important, not anybody else.) But even more, it shows that most people in these jobs are ultimately aware of it. In fact, I’m not sure I’ve ever met a corporate lawyer who didn’t think their job was bullshit. The same goes for almost all the new industries outlined above. There is a whole class of salaried professionals that, should you meet them at parties and admit that you do something that might be considered interesting (an anthropologist, for example), will want to avoid even discussing their line of work entirely. Give them a few drinks, and they will launch into tirades about how pointless and stupid their job really is.

There is a profound psychological violence here. How can one even begin to speak of dignity in labour when one secretly feels one’s job should not exist? How can it not create a sense of deep rage and resentment? Yet it is the peculiar genius of our society that its rulers have figured out a way, as in the case of the fish-fryers, to ensure that rage is directed precisely against those who actually do get to do meaningful work. For instance: in our society, there seems to be a general rule that the more obviously one’s work benefits other people, the less one is likely to be paid for it. Again, an objective measure is hard to find, but one easy way to get a sense is to ask: what would happen were this entire class of people to simply disappear? Say what you like about nurses, garbage collectors, or mechanics, it’s obvious that were they to vanish in a puff of smoke, the results would be immediate and catastrophic. A world without teachers or dock-workers would soon be in trouble, and even one without science fiction writers or ska musicians would clearly be a lesser place. It’s not entirely clear how humanity would suffer were all private equity CEOs, lobbyists, PR researchers, actuaries, telemarketers, bailiffs or legal consultants to similarly vanish. (Many suspect it might markedly improve.) Yet apart from a handful of well-touted exceptions (doctors), the rule holds surprisingly well.

Even more perverse, there seems to be a broad sense that this is the way things should be. This is one of the secret strengths of right-wing populism. You can see it when tabloids whip up resentment against tube workers for paralysing London during contract disputes: the very fact that tube workers can paralyse London shows that their work is actually necessary, but this seems to be precisely what annoys people. It’s even clearer in the US, where Republicans have had remarkable success mobilizing resentment against school teachers, or auto workers (and not, significantly, against the school administrators or auto industry managers who actually cause the problems) for their supposedly bloated wages and benefits. It’s as if they are being told “but you get to teach children! Or make cars! You get to have real jobs! And on top of that you have the nerve to also expect middle-class pensions and health care?”

If someone had designed a work regime perfectly suited to maintaining the power of finance capital, it’s hard to see how they could have done a better job. Real, productive workers are relentlessly squeezed and exploited. The remainder are divided between a terrorised stratum of the – universally reviled – unemployed and a larger stratum who are basically paid to do nothing, in positions designed to make them identify with the perspectives and sensibilities of the ruling class (managers, administrators, etc) – and particularly its financial avatars – but, at the same time, foster a simmering resentment against anyone whose work has clear and undeniable social value. Clearly, the system was never consciously designed. It emerged from almost a century of trial and error. But it is the only explanation for why, despite our technological capacities, we are not all working 3-4 hour days.

Originally published on Strike!

David Graeber’s most recent book, The Utopia of Rules: On Technology, Stupidity, and the Secret Joys of Bureaucracy, is published by Melville House.

2016 September 27