Learning about machine learning

How machine learning unlocks new potential


It’s hard to overstate how deeply entrenched machine learning is in our everyday lives. It enables Facebook to recognize your face in photographs and Siri to understand what you say. The next time you order a rideshare and get a remarkably accurate ETA, you can thank machine learning. It also helps make near real-time credit card fraud detection possible.

These are just a few examples of how the technology is woven into everyday life.

Machine learning is a subset of artificial intelligence, which aims to simulate intelligent human behavior—such as problem-solving and decision-making—in computers. Just as learning is fundamental to human intelligence, machine learning is an integral part of artificial intelligence. However, while all machine learning is a form of artificial intelligence, not all artificial intelligence is machine learning.

Essentially, machine learning is the ability of a computer system to learn on its own without being explicitly programmed. Instead of manually programming a computer and telling it what to do, machine learning algorithms analyze data, identify patterns and learn to make decisions or perform tasks on their own. Using enormous amounts of data, a system can be trained to recognize images, speech patterns, or even what a fraudulent credit card transaction looks like.
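To make that idea concrete, here is a minimal sketch of what "training on data" looks like in practice, using Python and the scikit-learn library. The synthetic "transaction" data, the class balance and the choice of model are all illustrative assumptions, not anything described in this article; real fraud-detection systems are far larger and more complex.

```python
# A minimal, hypothetical sketch of supervised machine learning:
# instead of hand-writing rules for what "fraud" looks like, we show
# the algorithm labeled examples and let it find the patterns itself.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for historical transactions: each row is a
# transaction described by numeric features, labeled 0 (legitimate)
# or 1 (fraudulent). The 97/3 split mimics how rare fraud is.
X, y = make_classification(n_samples=5000, n_features=10,
                           weights=[0.97, 0.03], random_state=0)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Training" is simply showing the model the labeled examples.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# The trained model can now score transactions it has never seen.
print("accuracy on held-out transactions:", model.score(X_test, y_test))
```

The point of the sketch is the workflow, not the numbers: the program is never told what fraud looks like, only shown examples of it.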

“Machine learning is responsible for the revolution in computer vision. It’s responsible for the revolution in speech recognition. It’s responsible for self-driving cars,” says Tom Mitchell, computer science professor at Carnegie Mellon University and a leader in the field of machine learning.

“Over the last ten years, computers have gone from being blind basically—not being able to look at a photograph and recognize what objects are in it—to being able to do that today in a way that’s often at the same level as a human,” Mitchell said. “Likewise, computers were basically deaf just eleven years ago. When the iPhone came out, you couldn’t talk to it because speech recognition didn’t work. Today, speech recognition is highly advanced due to machine learning.”

Mitchell explains that it would be nearly impossible to write a program by hand to reliably identify an individual’s face. But with machine learning, it’s reasonably straightforward. By repeatedly showing the system different images of a person, the program detects the subtle features and patterns that distinguish that one face from all others. Eventually, the computer is able to identify that face in a photograph on its own.
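As a rough illustration of the pattern Mitchell describes, the classic "eigenfaces" recipe in scikit-learn learns to recognize faces purely from labeled photographs. The dataset and model choices below are assumptions made for the example; modern systems use deep neural networks rather than this simple pipeline.

```python
# Illustrative sketch: learning to recognize faces from labeled photos.
# Uses scikit-learn's Labeled Faces in the Wild dataset (downloaded on
# first run) and a simple PCA + SVM pipeline.
from sklearn.datasets import fetch_lfw_people
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

faces = fetch_lfw_people(min_faces_per_person=70, resize=0.4)
X_train, X_test, y_train, y_test = train_test_split(
    faces.data, faces.target, random_state=0)

# PCA distills each photo into a few hundred "face features"; the SVM
# then learns which combinations of features belong to which person.
model = make_pipeline(PCA(n_components=150, whiten=True),
                      SVC(kernel="rbf", C=10))
model.fit(X_train, y_train)

print("fraction of test photos identified correctly:",
      model.score(X_test, y_test))
```

Nobody writes down what makes each face distinctive; the distinguishing features fall out of the examples themselves.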

This same image recognition capability holds great promise in helping doctors diagnose and treat patients. Researchers have already trained a machine learning algorithm to look at a skin blemish and detect whether it’s skin cancer, and, if so, what type of cancer. They did this by exposing the program to hundreds of thousands of images of moles and other marks so the system learns how to spot key differences. “The program has proven competitive with the very best doctors in terms of the accuracy of its diagnosis,” Mitchell said.

Because computers learn from data, most businesses that need to make important decisions and have a lot of historical data to analyze are good candidates for machine learning. This technology is already proving essential in many industries including oil and gas, retail and financial services.

In transportation, machine learning is being used to train self-driving cars to identify a pedestrian crossing the road and to steer, brake and accelerate safely. Machine learning can also help make mass transit more efficient.

In telecommunications, machine learning may one day be used to analyze data traffic to decide how to best route information and where to add wires to a network to improve overall performance, Mitchell says.

To some people, the idea of machine learning and artificial intelligence may conjure images of machines taking away jobs and replacing humans in the workplace. But Mitchell, who has studied the implications of machine learning for the workforce, says the technology will enhance existing jobs more than replace them. “It’s going to assist workers with various tasks and thereby allow them more time to do other parts of their jobs,” Mitchell says.

Doctors, for instance, may be supported by a computer-based assistant that can screen a patient and act as a second pair of eyes. Using machine learning technology, doctors may be able to make faster, more accurate diagnoses and then have more time to actually spend with a patient discussing options and caring for him or her, Mitchell explains.

Today, many people rely on machine learning throughout the day far more often than they may realize—and this is really just the beginning. Machine learning is now being used to develop self-driving cars, prevent cyberattacks and even save lives with smarter weather forecasts.

For related media inquiries, please contact story.inquiry@one.verizon.com

For more on the Fourth Industrial Revolution, please visit this page.
