
How Machine Learning May Change Jobs

Machine learning computer systems, which get better with experience, are poised to transform the economy much as steam engines and electricity did in the past. They can outperform people in a number of tasks, though they are unlikely to replace people in all jobs.
So say Carnegie Mellon University’s Tom Mitchell and MIT’s Erik Brynjolfsson in a Policy Forum commentary to be published in the Dec. 22 edition of the journal Science. Mitchell, who founded the world’s first Machine Learning Department at CMU, and Brynjolfsson, director of the MIT Initiative on the Digital Economy in the Sloan School of Management, describe 21 criteria to evaluate whether a task or a job is amenable to machine learning (ML).
“Although the economic effects of ML are relatively limited today, and we are not facing the imminent ‘end of work’ as is sometimes proclaimed, the implications for the economy and the workforce going forward are profound,” they write. The skills people choose to develop and the investments businesses make will determine who thrives and who falters once ML is ingrained in everyday life, they argue.
ML is one element of what is known as artificial intelligence. Rapid advances in ML have yielded recent improvements in facial recognition, natural language understanding and computer vision. It already is widely used for credit card fraud detection, recommendation systems and financial market analysis, with new applications such as medical diagnosis on the horizon.
Predicting how ML will affect a particular job or profession can be difficult because ML tends to automate or semi-automate individual tasks rather than entire jobs, and jobs often involve multiple tasks, only some of which are amenable to ML approaches.
“We don’t know how all of this will play out,” acknowledged Mitchell, the E. Fredkin University Professor in CMU’s School of Computer Science. Earlier this year, for instance, researchers showed that an ML program could detect skin cancers better than a dermatologist. That doesn’t mean ML will replace dermatologists, who do many things other than evaluate lesions.
“I think what’s going to happen to dermatologists is they will become better dermatologists and will have more time to spend with patients,” Mitchell said. “People whose jobs involve human-to-human interaction are going to be more valuable because they can’t be automated.”
Tasks that are amenable to ML include those for which a lot of data is available, Mitchell and Brynjolfsson write. To learn how to detect skin cancer, for instance, ML programs were able to study more than 130,000 labeled examples of skin lesions. Likewise, credit card fraud detection programs can be trained with hundreds of millions of examples.
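The pattern behind both examples is supervised learning: a program generalizes from labeled examples. A minimal sketch of that idea, using an invented nearest-centroid classifier on toy fraud-detection data (the features and labels here are illustrative, not from any real system):

```python
# Supervised learning in miniature: average the feature vectors for each
# label during training, then label new inputs by the nearest centroid.

def train_centroids(examples):
    """Learn one centroid (mean feature vector) per label."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [s / counts[label] for s in acc]
            for label, acc in sums.items()}

def predict(centroids, features):
    """Assign the label whose centroid is closest (squared distance)."""
    def dist_to(centroid):
        return sum((a - b) ** 2 for a, b in zip(features, centroid))
    return min(centroids, key=lambda label: dist_to(centroids[label]))

# Toy labeled data: (feature vector, label) pairs.
labeled = [([0.9, 0.8], "fraud"), ([0.8, 0.9], "fraud"),
           ([0.1, 0.2], "legit"), ([0.2, 0.1], "legit")]
model = train_centroids(labeled)
print(predict(model, [0.85, 0.85]))  # → fraud
print(predict(model, [0.15, 0.15]))  # → legit
```

Real systems such as the skin-cancer detector use far more capable models, but the workflow is the same: more labeled examples generally mean better predictions, which is why data-rich tasks are the ones most amenable to ML.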
ML can be a game changer for tasks that already are online, such as scheduling. Jobs that don’t require dexterity, physical skills or mobility also are more suitable for ML. Tasks that involve making quick decisions based on data are a good fit for ML programs; not so if the decision depends on long chains of reasoning, diverse background knowledge or common sense.
ML is not a good option if the user needs a detailed explanation for how a decision was made, according to the authors. In other words, ML might be better than a physician at detecting skin cancers, but a dermatologist is better at explaining why a lesion is cancerous or not.
Work is underway, however, on “explainable” ML systems.
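One way to see the explainability gap is to contrast a black-box score with a rule-based model that can report which criterion drove its decision. The sketch below is purely illustrative; the lesion features, thresholds, and verdicts are invented, not a real diagnostic method:

```python
# A toy "explainable" classifier: each decision comes with a
# human-readable reason naming the rule that fired.

RULES = [
    # (feature name, threshold, verdict if exceeded) -- invented values
    ("asymmetry", 0.7, "suspicious"),
    ("diameter_mm", 6.0, "suspicious"),
]

def classify_with_explanation(lesion):
    """Return (verdict, reason) for a dict of lesion measurements."""
    for feature, threshold, verdict in RULES:
        value = lesion.get(feature, 0.0)
        if value > threshold:
            return verdict, f"{feature} = {value} exceeds {threshold}"
    return "benign", "no rule exceeded its threshold"

verdict, reason = classify_with_explanation(
    {"asymmetry": 0.9, "diameter_mm": 4.0})
print(verdict, "-", reason)  # → suspicious - asymmetry = 0.9 exceeds 0.7
```

High-accuracy ML models are rarely this transparent, which is exactly the trade-off the authors describe: the systems that predict best are often the hardest to explain.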
Understanding the precise applicability of ML in the workforce is critical for understanding its likely economic impact, the authors say. Earlier this year, a National Academies of Sciences, Engineering, and Medicine study on information technology and the workforce, co-chaired by Mitchell and Brynjolfsson, noted that information technology advances have contributed to growing wage inequality.
“Although there are many forces contributing to inequality, such as increased globalization, the potential for large and rapid changes due to ML, in many cases within a decade, suggests that the economic effects may be highly disruptive, creating both winners and losers,” they write. “This will require considerable attention among policy makers, business leaders, technologists and researchers.”
