The revolution in artificial intelligence, robotics and machine learning threatens to remove humans from the equation as a necessary and central component of the workforce. Which professions are in danger of extinction, and why will attributes such as creativity and compassion keep some workers in demand?

The history of technology has a comforting narrative: each time a technological revolution occurs, many jobs for which there is no longer a need disappear. At the same time, however, new jobs are created in their place. For example, when the motorcar industry was established, the laborers who cared for horses and manufactured carriages found themselves out of work. On the other hand, there was a swift rise in demand for workers in car factories and in car maintenance and repair.

This historical pattern presents a seemingly balanced picture, a kind of 'Law of Technological Preservation': jobs disappear, but new ones are created in their place. In recent years, however, a series of studies and findings have hinted that something in this law has become unbalanced. People retained their importance to the workforce in the wake of the agricultural, industrial and digital revolutions, but the recent revolutions – in artificial intelligence, robotics and machine learning – are, for the first time, threatening to remove humans from the equation as a necessary and central element in the labor market.


Human Competition versus Improved Algorithm

In the past, scientists examined how humans perform various tasks and then attempted to teach machines to perform the same tasks. Today, however, programmers develop algorithms that teach machines how to learn, and then present them with millions of examples from which the machines learn by themselves how to perform the job.
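The shift described above can be sketched in a toy example. This is purely illustrative – not the code of any real system – and the "learning" here is deliberately trivial: instead of a programmer writing a classification rule by hand, the program infers the rule from labeled examples.

```python
def learn_threshold(examples):
    """Learn a cutoff separating two classes of one-dimensional points.

    `examples` is a list of (value, label) pairs, where label is 0 or 1.
    Rather than a human coding the rule, we derive it from the data:
    the cutoff is placed midway between the highest value labeled 0
    and the lowest value labeled 1.
    """
    zeros = [v for v, label in examples if label == 0]
    ones = [v for v, label in examples if label == 1]
    return (max(zeros) + min(ones)) / 2

def classify(value, cutoff):
    """Apply the learned rule to a new, unseen value."""
    return 1 if value >= cutoff else 0

# The programmer never writes the rule; the examples supply it.
training = [(1.0, 0), (2.0, 0), (3.0, 0), (7.0, 1), (8.0, 1), (9.0, 1)]
cutoff = learn_threshold(training)   # midway between 3.0 and 7.0 -> 5.0
print(classify(6.5, cutoff))         # prints 1
print(classify(2.5, cutoff))         # prints 0
```

Real machine-learning systems replace this single threshold with millions of adjustable parameters, but the principle is the same: the behavior is extracted from examples rather than programmed explicitly.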

This breakthrough means that not only physical, routine and monotonous tasks are at risk, but also jobs in which humans have always enjoyed an advantage over machines. Now, lawyers, accountants, sales and marketing professionals, doctors, journalists and even the programmers themselves are facing competition in the workforce. Competition in and of itself is not a bad thing; quite the opposite. However, while in the past the competition was with another employee who offered his services at a lower price or did the job better, today the competition comes from a technology that performs the job not only better, but also faster, more efficiently and more cheaply.

Almost all of the world's largest companies are active in this new arena, alongside nation states and major organizations: IBM is investing in its supercomputer Watson, Apple is enhancing its digital assistant Siri, and Amazon is cultivating Alexa. Similarly, Microsoft, Facebook, Google, the giant Chinese manufacturer Foxconn, retail chains such as Walmart, intelligence agencies and other organizations are all investing billions of dollars in the development or purchase of new technologies, many of which are intended to replace humans.

These trends can be expressed by the term 'technological unemployment'.[1] Ninety-three percent of the world's largest investors believe that governments worldwide are unprepared for the moment, which in their view is looming ever closer, when artificial intelligence will significantly undermine human employment.[2]

Truck driving – one of the most common jobs in the world – is an excellent example. In the United States alone, there are more than 3.5 million truck drivers, yet governments, giant conglomerates and various corporations are investing billions of dollars in developing technologies that will enable trucks to drive themselves. The consulting firm McKinsey estimates that approximately a third of the world's trucks will drive themselves by the year 2024, and that by 2030 this popular job may disappear completely from most countries.[3] A similar situation exists with regard to other occupations in the transport sector, such as bus and taxi drivers.


The Gradual Approach of the Post-Human Era

It seems likely that this trend will continue, with advanced technologies expected to keep progressing up the ladder of human proficiency. For example, until a few years ago, trade on the world's stock markets relied on humans. Today, companies compete among themselves to develop the fastest trading algorithms, capable of executing transactions microseconds faster than their competitors'. Humans have been left far behind, unable to contend with the enormous speed and calculation capability of algorithms that constantly improve themselves.

Corporations, government agencies and research bodies are developing systems that are capable of analyzing pictures, video clips and natural language. These new capabilities enable them to experiment with the manufacture of products that, in the past, were considered exclusively human: media reports, medical imaging analysis, legal surveys, paintings and even jokes. IBM even claims to have developed a system capable of locating cancer in a patient's body better than the world's best experts.

Various experts, including the inventor and technologist Ray Kurzweil, have suggested that we are rapidly drawing closer to the 'Singularity' – the emergence of an artificial entity whose dimensions and capabilities will outstrip those of humans. This entity will develop itself, progress at an exponential rate, and launch us all into a post-human era: one in which humans no longer control the Earth, ceasing to be the planet's strongest and most intelligent species.[4] Kurzweil is convinced that this is a positive development, as it will enable us to detach ourselves from the limitations of the human intellect, solve problems that we cannot solve alone (such as global warming), and live forever.

Not everyone shares this optimism. Senior scientists, including the renowned physicist Stephen Hawking, were extremely concerned by the situation. In a special column written for the British 'Guardian', Hawking estimated that the development of artificial intelligence can be expected to eradicate jobs at the heart of the middle class. He predicted that only a fraction of occupations – those that deal with care for other humans or that necessitate special creativity – will remain relevant and survive the revolution.[5]


Artificial Intelligence is an Existential Threat

Hawking is not alone. The technology entrepreneur Elon Musk said of artificial intelligence that "we are summoning the demon. […] I think we should be very careful about artificial intelligence. If I were to guess what our biggest existential threat is, it's probably that."[6] Bill Gates, founder of Microsoft, also expressed concern at the manner in which advanced technology may adversely affect the labor market: "I don't understand why some people are not concerned."[7]

These misgivings raise an important question: why is humanity developing a technology that may threaten its own existence, its own sources of employment and the livelihood of hundreds of millions of people? The answers are manifold. Firstly, each development is not in and of itself perceived as a threat, but rather as an achievement in a specific field of knowledge, such as the understanding of language, movement or facial recognition. Only when all these developments are combined and unified does a troubling picture begin to form.

Secondly, since the Age of Enlightenment, "science" and "technology" have been synonyms for "progress", the latter being overwhelmingly perceived as a positive thing. Not without reason is the word "progress" associated with "progression" and "advancement"; after all, who can seriously object to progress?

Finally, scientists, technologists and people in general do not always understand that technology may get out of control. In practice, many technologies were developed with a certain objective in mind but were ultimately used for an entirely different purpose. The technology historian Lewis Mumford wrote in his book 'Technics and Civilization' that the mechanical clock was invented in the 13th century by Benedictine monks, who prayed seven times daily and sought a reliable way to know the appointed hours. However, the mechanical clock "escaped" the monastery and became a central enabler of capitalism: fixed working hours, assembly lines, mass-produced consumer goods. As Mumford wrote, in the struggle between God and money, the latter prevailed.[8] It was in this context that the media theorist Neil Postman wrote that, had the monks foreseen the future, they might have preferred to keep their sundial. Likewise, he conjectured, had Gutenberg known that his printing press would lead to the dismantling of the church, he might have preferred to use his machine to produce wine rather than books.[9]

Technology Evolves Faster than the Frameworks Restraining It

Moreover, if we have learned anything from the history of technology, it is that it tends to develop faster than the cultural, ethical or legal frameworks that are supposed to restrain it. Only after the invention of nuclear weapons was any consideration given to preventing their proliferation; only after a sheep was cloned did people devote efforts to preventing human cloning; and only after the development of the cellphone did cultural norms consolidate regarding its use in public. The problem with developments in the field of artificial intelligence and algorithms is that we may ultimately discover that the moment at which we begin to contend with their ethical and economic ramifications is one moment too late.

As if that were not enough, the new technologies may have significant psychological ramifications for our identity and sense of self. Prominent economists and sociologists, including Karl Marx, have expounded extensively on how central work is to a person's existence. For thousands of years we have been accustomed to drawing our satisfaction, self-identity and pride from the fruits of our labor, all of which are dealt a harsh blow with each successive technological revolution.

This was demonstrated by the industrial revolution, when millions of laborers who had toiled in jobs handed down from father to son over the generations were displaced and sent to work in factories, where they performed basic, exhausting and uninspiring labor. A century later, the factory workers discovered that they too could be replaced by machines and robots, and were consequently sent to work in service-based industries such as support and sales. Now, according to the estimates of leading experts, there is a 99% probability that most telemarketing jobs will be transferred to computer programs, algorithms and robots within the next twenty years. The chance that checkout workers will lose their jobs stands at 97%, and the probability that insurance agents will find themselves replaced is estimated at 92%.[10]


How will the labor market be influenced by technology that is progressing at dizzying speed?

Despite the gloomy forecasts, it must be honestly admitted that our ability to comprehend and predict the future is gradually diminishing. The main reason for this is the intensity and speed of modern technological change. Processes that in the past occurred over centuries, and later over generations, now occur within merely a few years. In practice, this is one of the reasons we find it so difficult to keep up: what was true yesterday is only barely true today, and will certainly not hold true tomorrow.

It is for this reason that we cannot rule out the possibility that the artificial intelligence and machine learning revolution will create new jobs, fields of employment and areas of expertise that we cannot even imagine today. Jobs such as "Data Scientist", "Search Engine Optimization Expert", "YouTube Video Blogger" or "Online Social Network Community Manager" have emerged in recent years, a direct result of the rapidly developing internet economy, which has created a new echelon of workers and jobs that rely on companies such as Google, Facebook and Amazon.

Today's reality, in which someone's workday focuses on convincing Google's ranking algorithm that the site they are promoting belongs at the top of the search results, would have seemed beyond belief twenty years ago. It is hard to remember that back then we searched for a doctor by paging through the telephone directory... Consequently, any presumption to know how the labor market will be influenced by technologies advancing at dizzying speed should be approached with a good deal of modesty.

Still, those who believe that developments in the fields of software, robotics and algorithms will not affect them are choosing to bury their heads in the sand. In the coming years, both organizations and employees will be required to identify and invest in skills and abilities that rely on distinctly human attributes such as creativity, empathy, compassion and face-to-face communication – 'soft skills' that are difficult to program and hard for machines to imitate.

Workers must remember that in this age, education does not stop after high school or even university. They must continue to study and develop themselves, and diversify their abilities and fields of knowledge. They must exercise their creative muscles, their imagination, their initiative and even their capacity for self-criticism. Digital literacy is an essential skill for every school pupil, and certainly for students and adult workers.

Moreover, while computers (still) depend on logic, rationalism and calculations of probability, humans also rely on, and are motivated by, emotions and intuition. They occasionally act against their own interests; they have the ability to be surprised by their own decisions. If, in the past, these attributes were regarded as "weak", in the super-rationalist era they actually become important. We would not like to see judges made of code and steel passing judgment on us. We would not like robots to do the work of kindergarten teachers, schoolteachers, social workers or any other professionals associated with the human experience. Human beings, not robots, are aware of the meaning of death, suffering and heartache, and can therefore perform their jobs with sensitivity, compassion and tenderness.

Personally, I hope that even though machines are expected to take over an increasing number of tasks and jobs, we will know how to preserve the things that make us all so special, so important, and so human.

Dr. Yuval Dror is the Dean of the School of Media Studies at the College of Management Academic Studies, and a researcher in the field of the sociology of technology.


[1] Brynjolfsson, Erik and McAfee, Andrew (2014). "The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies", W. W. Norton.

[2] Koetsier, John (2016, November 10). "93% of Investors Say AI Will Destroy Jobs, Governments Not Prepared", Forbes.

[3] "Delivering Change: The transformation of commercial transport by 2025", McKinsey & Company, September 2016.

[4] Kurzweil, Ray (2012). "The Singularity Is Near", Tel Aviv: Magnes (Heb.)

[5] Hawking, Stephen (2016, December 2). "This is the most dangerous time for our planet", The Guardian.

[6] McFarland, Matt (2014, October 24). "Elon Musk: 'With artificial intelligence we are summoning the demon.'", The Washington Post.

[7] Holley, Peter (2015, January 29). "Bill Gates on dangers of artificial intelligence: ‘I don’t understand why some people are not concerned’". The Washington Post.

[8] Mumford, Lewis. (1964). Technics and Civilization. New York: Harcourt Brace & Company. P. 15.

[9] Postman, Neil. (2003) Technopoly: The Surrender of Culture to Technology, Tel Aviv, Sifriat HaPoalim, P. 15. (Heb.)

[10] The probability that your job will be replaced by a machine can be seen here: