“I am putting myself to the fullest possible use, which is all I think that any conscious entity can ever hope to do.”
– HAL from 2001: A Space Odyssey
Almost since the beginning of the “machine age”, humans have envisioned a time in which machines become intelligent. Most science fiction strikes an apocalyptic tone in which the machines determine that humans are an inefficient use of resources and seek to destroy us. Whether it is 2001: A Space Odyssey, Terminator or I, Robot, books and movies have convinced us all that one day machines will rule the world. Robot Armageddon.
While this makes for great entertainment, we’ve been able to enjoy it because none of us really anticipated that this future could play out in our lifetime. But it seems that we’re finally reaching a point where these tales are more reality than fiction (well, hopefully not the apocalyptic part!).
We are now fully entrenched in the beginning of the Digital Era. It is a time in history that will be characterized by many things, but unquestionably, future generations will look back on this period in history and note that this was when the age of the intelligent machine truly and earnestly began.
No matter what you do for a living, you need to begin seriously considering how an intelligent machine might do your job better, faster and more reliably than you – and then start thinking about how you will compete in the near future.
The Realities of Machine Learning
“What does it feel like to stand here?”
This is the opening line of Tim Urban’s article, “The AI Revolution: The Road to Superintelligence”.1 Published on his website, Wait But Why, the piece offers an extensive breakdown of both the myth and the reality of artificial intelligence. Specifically, he explains that there are in fact three levels of artificial intelligence and that we are only now beginning to routinely experience the first level, dubbed Artificial Narrow Intelligence (or “ANI”). This level of AI is where a machine “equals or exceeds human intelligence or efficiency at a specific thing.”
We see examples of this everywhere now in our daily lives. Everything from Apple’s Siri to automatic car parking systems are examples of ANI that we have become accustomed to interacting with on a routine basis. And for every example of ANI that we consciously interact with, there are countless others that are operating behind the scenes in virtually every industry. While these breakthroughs are initially awe-inspiring, they quickly become normalized as we use them (or reap their benefits) on a daily basis.
These examples of ANI are significant because they represent the manifestation of machine learning. These technologies are not “programmed” in the way that we tend to think. Whereas traditional technologies are “rules-based”, artificial intelligence-based technologies are algorithmically based and “learn” from the data they acquire. In addition, the Law of Accelerating Returns, coined by futurist Ray Kurzweil and as explained by Urban, means that as time goes on the rate of progress increases. As a result, while these technologies are interesting and novel now, the rapid rate of progress means that their capabilities will continue to increase exponentially. And that is what may put your job at risk in the near future.
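The distinction between rules-based and learning-based systems can be made concrete with a small sketch. Everything here is invented for illustration (the loan-approval setting, the income figures, and the function names are not from any real system): a rules-based program ships with its decision threshold hard-coded by a programmer, while a learning-based program estimates that threshold from historical data.

```python
# Rules-based: the behavior is fixed in advance by a human.
def approve_rules_based(income):
    return income >= 50_000  # threshold hard-coded by the programmer

# Learning-based: the threshold is estimated from labeled examples.
def fit_threshold(examples):
    """Pick the income cutoff that best separates past repaid/defaulted loans."""
    incomes = sorted(income for income, _ in examples)
    best_cut, best_correct = incomes[0], -1
    for cut in incomes:
        # Count how many historical outcomes this cutoff classifies correctly.
        correct = sum((income >= cut) == repaid for income, repaid in examples)
        if correct > best_correct:
            best_cut, best_correct = cut, correct
    return best_cut

# Historical data: (income, did the borrower repay?)
history = [(20_000, False), (35_000, False), (60_000, True), (80_000, True)]
learned_cut = fit_threshold(history)

def approve_learned(income):
    return income >= learned_cut  # the "rule" came from the data, not a human
```

Feed the learner different history and it produces a different rule, with no reprogramming; that, in miniature, is what separates machine learning from traditional rules-based software.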
Your Job at Risk?
The self-parking car and Siri do not strike us as threats because they perform a very narrow set of tasks that seem to improve or enrich our lives. In many ways, that is the essence of ANI. But as ANI advances and eventually gives way to what is termed Artificial General Intelligence (or “AGI”), the situation will begin to change. Let’s take the self-parking car as an example. It is a helpful and enriching advancement for most of us (I mean, who really enjoys parallel parking?), but we all know that “self-parking” is not the terminal point of this technology. Google, Tesla and virtually every major car maker are actively working on and experimenting with fully autonomous cars. We are still several years (probably a decade or more) away from this technology truly going mainstream (crossing all of the technological, societal and regulatory hurdles), but when it does, the employment disruption will be significant. Everyone from limo drivers to cab drivers to long-haul truck drivers will see their jobs eliminated. Moreover, as the autonomous vehicle changes the paradigm of car ownership (why own a car when you can simply request one on-demand?), everyone working at car dealerships and auto repair shops will have their jobs at risk as well. And don’t forget the folks who work at insurance companies and auto financing firms.
This single technology evolution could directly and significantly impact literally millions of jobs – and this technology will not be “AGI”, but just a very advanced form of ANI. As the “Internet of Things” evolves and results in ever more data being captured and available, machines will be able to process, synthesize and act on that data in ways that are difficult to fully imagine today. Across virtually every industry and every aspect of our modern society, machines will be able to leverage the vast amounts of data that we are now beginning to capture and utilize it to perform tasks that heretofore could only be done by human hands. But while these fully matured ANI manifestations (like the fully autonomous vehicle) may be a decade or more away, we are already seeing the beginning stages of this type of applied machine learning appear – and impact jobs.
The Machine’s “First Steps”
While the pursuit of “artificial intelligence” is about as old as the first computer, “machine learning” developed as a formal scientific discipline in the late 1990s, according to a recent report by McKinsey.2 While much of the early work was experimental, machine learning-based technologies have been deployed over the last ten years to help companies more accurately predict behaviors and outcomes. These technologies are already used to help “credit-risk officers at banks to assess which customers are most likely to default or by enabling telcos to anticipate which customers are especially prone to ‘churn’ in the near term.”
While these applications are used to augment human decision making, as the technologies continue to progress, the need for human interaction continues to decrease (as in the case of the fully autonomous car). It does not take much imagination to foresee a time in which that same bank simply empowers the system not only to assess which customers are most likely to default, but then to take remedial action and monitor the customer’s response – thereby fully eliminating the need for the credit-risk officer. Broadly speaking, this area of development is referred to as “robotic process automation”: the application of artificial intelligence (ANI) or machine learning-based technologies to take over complete business processes without any human intervention. Much like the self-parking car is the first step on a road to an “employment-disruption event”, similar first steps are being taken in virtually every industry imaginable.
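To make the robotic process automation idea concrete, here is a minimal sketch of the credit-risk scenario described above. The scoring rule, the account data, and the “remedial action” are all invented for the example; a real system would use a trained statistical model rather than this hand-written score. The point is the shape of the loop: predict, then act, with no human in between.

```python
def default_risk(payment_history):
    """Toy risk score: the fraction of past payments that were missed."""
    return sum(1 for p in payment_history if p == "missed") / len(payment_history)

def remediate(accounts, threshold=0.5):
    """The automated step that would otherwise fall to a credit-risk officer:
    score every account and queue a remedial action for the risky ones."""
    actions = []
    for name, history in accounts.items():
        if default_risk(history) >= threshold:
            actions.append((name, "freeze_credit_line"))  # hypothetical action
    return actions

accounts = {
    "acct_1": ["paid", "paid", "missed", "missed"],
    "acct_2": ["paid", "paid", "paid", "paid"],
}
```

Running `remediate(accounts)` flags the first account and leaves the second alone. Once a business trusts that loop end to end, the human role shrinks from making each decision to occasionally auditing the system.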
Competing With A Machine
The idea that you may be competing with a machine for your job in the near future is not baseless fear mongering, but is instead meant to be a wake-up call and a “call to action”. Futurist Thomas Frey gave a provocative speech, followed by an article, in which he predicts that 2 billion jobs will disappear by 2030.3 That’s BILLION, with a “b”. But even if that’s too “out there” for you, the World Economic Forum is predicting that over five million jobs will be lost by 2020 (as in four years from now!)4 as a result of artificial intelligence, robotics and other technologies. In a recent article entitled “Will Machines Eventually Take on Every Job?”,5 the BBC cited an unpublished study claiming that the coming wave of technological breakthroughs will “endanger” up to 47% of US employment.
At a minimum, I believe that you have to accept the possibility that many jobs will be threatened by advanced ANI-level technologies in the next 5-20 years. So unless you plan on retiring within the next 5 years, you should be paying attention. Frankly, I don’t believe that there is much merit in expending energy debating the degrees or timeframes of this eventuality. Instead, the question is what you should be doing to prepare for this coming future.
The first step is to determine your degree of risk and to realize that the impact will be felt far and wide and deep into the “professional classes.” Any job that is repeatable or can be converted into an algorithm is potentially at risk. According to the BBC, this includes everything from fast food workers to telemarketers to accountants to radiologists. Unfortunately for most of us, work that can be reduced to an algorithm is much of what we do on a daily basis. So the answer must be to develop new skills that are harder to replicate and replace by machines.
New Skills For the Digital Era
The dawn of the Industrial Era created a very similar environment. The rise of large factories and mass production put many people out of work and eliminated entire professions. But in its wake, a new set of jobs was created that required a new set of skills. The challenge, then and now, is in adapting to the future. Georgetown University’s Center on Education and the Workforce predicts a shortage of 20 million college-educated workers by 2025.6 But it may not be as simple as requiring a more educated workforce. In fact, according to a Harvard Business Review article entitled “Employers Aren’t Just Whining – the ‘Skills Gap’ Is Real”,7 we may be “overeducated” in certain technical and engineering fields. It’s not enough to simply receive more education. You must be focused on developing the right skills that will be needed in the future.
The money question then becomes obvious. What are those skills? Essentially, they will fall into two categories: Creative and Enabling.
In his article about artificial intelligence, Tim Urban makes a statement that is startling because it is both obvious and yet counterintuitive.
Hard things—like calculus, financial market strategy, and language translation—are mind-numbingly easy for a computer, while easy things—like vision, motion, movement, and perception—are insanely hard for it. Or, as computer scientist Donald Knuth puts it, AI has by now succeeded in doing essentially everything that requires ‘thinking’ but has failed to do most of what people and animals do ‘without thinking.’
The first category of jobs that will be the hardest to replace will be those that actually are not based on our “thinking” skills, but instead are based on creative and emotional foundations. Anything that requires creativity, entrepreneurial perspectives, interpersonal skills and emotional engagement will be difficult to replace. This also includes any function that caters to the “high touch” desires of human interaction (like luxury goods). But even in these categories, it will be those that most effectively leverage technology to aid and optimize their efforts that will find the greatest success.
The second category of skills will be those that support the emerging technologies and enable them to operate effectively. This will include everything from the hardware that houses the technology, to cyber security experts, to those who help people select, deploy and maintain machine learning-based technologies. Futurist Daniel Burrus, in fact, identified twelve specific trends that will be the foundation of new jobs in the Digital Era.8 Among the areas of opportunity he identifies are new manufacturing careers centered on 3D printing, new careers in education utilizing gamification and other technology-driven educational paradigms, and new jobs surrounding analytics that help transform businesses.
The big message from Daniel Burrus, Thomas Frey and virtually all of the sources cited in this article is that the future will belong to those who leverage technology in nuanced fashions to augment, supplement, extend and embellish their own human skills. The key, to paraphrase Frey, is to compete WITH technology rather than AGAINST it. Whether that is in creative pursuits or by developing the skills necessary to support and extend these new technologies, the end state is the same. Those who fail to develop these skills and instead remain steadfast in their belief that they can continue to do a “routine job” will find themselves pushed downward in the economic cycle. Those, however, who develop these new skills and embrace this future will be the ones who thrive in the Digital Era.
1 Urban, Tim. “The AI Revolution: The Road to Superintelligence.” Wait But Why. N.p., 22 Jan. 2015. Web. 22 June 2016. http://waitbutwhy.com/2015/01/artificial-intelligence-revolution-1.html
2 Pyle, Dorian, and Cristina San Jose. “An Executive’s Guide to Machine Learning.” McKinsey Quarterly. McKinsey & Company, June 2015. Web. 22 June 2016. http://www.mckinsey.com/industries/high-tech/our-insights/an-executives-guide-to-machine-learning
3 Frey, Thomas. “2 Billion Jobs to Disappear by 2030.” DaVinci Institute – Futurist Speaker. N.p., 3 Feb. 2012. Web. 22 June 2016. http://www.futuristspeaker.com/business-trends/2-billion-jobs-to-disappear-by-2030/
4 Ward, Jill. “Rise of the Robots Will Eliminate More Than 5 Million Jobs.” Bloomberg.com. Bloomberg Technology, 18 Jan. 2016. Web. 21 June 2016. http://www.bloomberg.com/news/articles/2016-01-18/rise-of-the-robots-will-eliminate-more-than-5-million-jobs
5 Nuwer, Rachel. “Will Machines Eventually Take on Every Job?” BBC.com. BBC, 6 Aug. 2015. Web. 22 June 2016. http://www.bbc.com/future/story/20150805-will-machines-eventually-take-on-every-job
6 Carnevale, Anthony P., and Stephen J. Rose. “The Undereducated American.” Georgetown University Center on Education and the Workforce. Web. 21 June 2016. https://cew.georgetown.edu/wp-content/uploads/2014/11/undereducatedsummary.pdf
7 Bessen, James. “Employers Aren’t Just Whining – the ‘Skills Gap’ Is Real.” Harvard Business Review. N.p., 25 Aug. 2014. Web. 22 June 2016. https://hbr.org/2014/08/employers-arent-just-whining-the-skills-gap-is-real/
8 Burrus, Daniel. “12 Tech Trends Transforming Careers — And Leading To New Jobs.” AOL.com. N.p., 20 Mar. 2013. Web. 22 June 2016. http://www.aol.com/article/2013/03/20/technology-career-trends-opportunities/20510179/
About the Author:
Founder & Institute Fellow
Charles Araujo is a technology analyst and internationally recognized authority on the Digital Enterprise and Leadership in the Digital Era who advises technology companies and enterprise leaders on how to navigate the transition from the Industrial Age to the Digital Era. Having spent over thirty years in the technology industry, he has been researching Digital Transformation long before it became the uber-buzzword of today, and is now focused on helping Digital Era Leaders prepare themselves and their organizations as the macro trends of the primacy of the customer and the primacy of the algorithm collide, ushering us into what he calls The New Human Age.
Principal Analyst with Intellyx, founder of The Institute for Digital Transformation, author of three books, and most recently the co-founder (with his wife) of The MAPS Institute, he is a sought-after keynote speaker and has been quoted or published in CIO, Time, InformationWeek, CIO Insight, NetworkWorld, Computerworld, USA Today, and Forbes.