Could technology ultimately replace or displace many human workers? Historically, economists have tended to conclude that the effect of technology on employment is net positive. Recent evidence from McAfee and Brynjolfsson at M.I.T. suggests otherwise: they found that technology has been destroying jobs faster than it is creating them. Extracting cause and effect is very tricky, and some economists disagree with McAfee and Brynjolfsson’s conclusion. However, few would dispute that humans cannot compete with machines on routine tasks. For tasks that follow specific rules, computers are already more efficient than humans; many routine back-office jobs, for example, are already gone forever.

However, many tasks require judgment rather than relatively simple rules. The human brain, for example, can recognize a face in less than a second. Facial recognition used to be one of the most difficult tasks in computing; today, a workable version is already available and is improving at a rapid clip. Some predict that computers will soon be fast enough to match the human brain. I am not sure whether this is just an alarmist’s viewpoint, but it is evident that computers can be “specialized” in certain knowledge domains such that they can at least supplement, if not replace, humans. IBM’s Watson is learning oncology and is being used at hospitals. I don’t think computers will replace doctors in the next five years, as there is a significant trust barrier patients have to cross: moving from the bank teller to the ATM is a different transition than trusting a machine over a Harvard-trained physician. Consumer behavior, even in important life-and-death decisions, will change, albeit at a slower pace. Despite Watson’s recent setback at MD Anderson, I think the future of AI in medicine is bright. Medicine, requiring years of training and experience, has historically attracted the brightest and hardest-working students into the field.
If AI can change the landscape of medicine, the potential impact on other knowledge-intensive jobs could be even bigger.
As computers become ubiquitous, they generate enormous amounts of data across different systems. It is clear that the future of the computing profession is bright: the economy will need people to control the machines, dealing with different systems and making sense of terabytes of data. A century ago, the majority of the population worked in jobs related to agriculture; today, less than 5 percent of the U.S. population is employed in that sector. By analogy, it is feasible that a significant fraction, if not the majority, of the population will be engaged in the computing profession in the future. This, of course, will change when a computer learns to write code by itself, and researchers are already working on it. As such, the progress of AI poses important questions for students, professionals, and policymakers.