JULY 9, 2015
Machine Learning Changing the Economics of
Business, Industry, and Society
By Dick Slansky
Keywords
Machine Learning, AI, Deep Learning, Big Data, Predictive Analysis, Rules-Based Systems, Expert Systems, Data Mining, Cognitive Computing
Overview
The notion of teaching machines to think, or at least act like humans, has been around since machines became capable of replacing human work.
When Alan Turing developed the theory of computing and conceived the
notion of powerful computing machines in the middle of the 20th century, he
helped usher in the age of computing. He also posed the famous question,
“Can machines think?” and created the equally famous Turing Test of a machine’s ability to mimic human behavior.
At the same time, computer scientists, mathematicians, and psychologists were beginning to explore the emerging field of artificial intelligence (AI). They were looking for a way to program a machine that could make intelligent decisions and, moreover, learn in the same way a human brain would learn and develop knowledge.
AI has come a long way since Turing. It has spawned a number of fields of research like speech recognition, cybernetics, knowledge representation, neural networks, rules-based learning, and search engines, to name just a few. Many of these areas of research have produced real and useful applications that have made a significant impact on business, industry, and society. AI has been used in a number of fields, including medical diagnosis, stock trading, robotic control systems, remote sensing, character recognition, music composition, transportation and logistics, and even toys. AI has also had its ups and downs. It has been through cycles of great promise in which thinking machines were supposedly right around the corner, yet AI failed to deliver on many of these promises at the time.
During the 80s and early 90s, a few leading manufacturers in industries such as aerospace & defense, automotive, and medicine tried to develop expert systems, a branch of AI, that would capture the expert knowledge of human operators and technicians in computer algorithms that could duplicate the experts' methods and techniques. Companies like Boeing and GM tried to develop expert systems for machining and parts fabrication. The US Department of Defense developed some expert systems for flight simulation and intelligent cockpits to help pilots cope with very complex systems. While a few of these early efforts were somewhat successful, most failed to produce a true expert system that mimicked the human expert. The AI methods and programming tools of the time were simply not up to the task of capturing the complexity of human knowledge. Moreover, computing power had not yet advanced to the point where it could handle the massive computational requirements.
Today, we are experiencing another AI cycle, manifested in a new set of applications and technologies based on the continuing advancement of the field. With new generations of computing resources powering advanced AI algorithms, fields such as machine learning have emerged to usher in many new applications and solutions. These are giving new life to older IT sectors like data mining, data modeling, and statistical process control, and enabling current trends like big data, predictive analysis, and intelligent personal assistants.
Machine learning (ML) is based on algorithms that can learn from data without relying on rules-based programming. It emerged as a scientific discipline in the late 1990s, as steady advances in readily available computing power enabled data scientists to stop building finished models with rules-based programming and, instead, train computers to develop their own adaptive models. The unmanageable volume and complexity of the big data now flooding the world has increased both the potential for and the need for machine learning.
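To make that contrast concrete, here is a minimal sketch (a hypothetical illustration, not drawn from this Insight) in Python with scikit-learn: instead of an engineer hand-coding threshold rules, a decision tree infers its own rules from labeled examples. The fraud scenario, feature names, and thresholds are all invented.

```python
# Minimal sketch of learning from data instead of rules-based programming.
# Hypothetical scenario: flag suspicious transactions from labeled history
# rather than hand-writing "if amount > X" rules. All data is synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic training examples: [amount_usd, hour_of_day]
X = rng.uniform([1, 0], [5000, 24], size=(1000, 2))
# Invented ground truth: large late-night transactions are "suspicious"
y = ((X[:, 0] > 3000) & ((X[:, 1] < 6) | (X[:, 1] > 22))).astype(int)

# The model derives its own decision boundaries from the examples;
# no analyst ever writes the thresholds explicitly.
model = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)

print(model.predict([[4200.0, 2.0], [40.0, 13.0]]))  # likely: [1 0]
```

The point mirrors the paragraph above: the adaptive model comes from the data, and retraining on new data updates it without reprogramming.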
Machine Learning Encompasses Multiple Research Areas
and Applications
At a basic level, machine learning is the science of getting computers to act without being explicitly programmed; in other words, to act with a certain degree of autonomy. In the past decade, ML has given us self-driving cars, practical speech recognition, effective web search, and a significantly improved understanding of the human genome and pharmaceuticals. ML is a subfield of
computer science that evolved from the study of pattern recognition and
computational learning theory in AI. ML research and development has led
to the creation and study of algorithms that can learn from and make
predictions on data.
Machine learning is sometimes associated with data mining, an area that focuses more on exploratory data analysis. ML and data mining often employ the same methods and can overlap significantly. ML, however, primarily focuses on prediction, based on known information learned from training data, a set of data used to discover potentially predictive relationships. Data mining, on the other hand, focuses on the discovery of (previously) unknown properties in the data, which is essentially a knowledge discovery process and analysis step for databases.
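As a toy illustration of this distinction (my own sketch, not from this Insight), the snippet below runs both modes on the same synthetic data: a supervised classifier predicts a known label, while unsupervised clustering discovers group structure no one labeled in advance.

```python
# Toy contrast between ML-style prediction and data-mining-style discovery.
# The two latent customer groups and all numbers are synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)),   # group A
               rng.normal(5, 1, (50, 2))])  # group B
y = np.array([0] * 50 + [1] * 50)  # labels exist only in the supervised case

# Prediction: learn a known input-to-label relationship from training data
clf = KNeighborsClassifier().fit(X, y)
print("predicted label:", clf.predict([[4.5, 5.2]]))

# Discovery: no labels at all; clustering surfaces previously unknown groups
groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("discovered group sizes:", np.bincount(groups))
```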
Today, ML incorporates data analyses covering predictive analytics, data mining, pattern recognition, and multivariate statistics. With the emergence of big data methodologies and applications, which are projected to drive a large portion of IT spending in the near future, ML-based algorithms provide the engine to process big data and make predictions for business, industry, consumers, and society.
Machine Learning vs. Cognitive Computing
The fundamentals of ML have been around for decades. The idea is to avoid having to deal with the complexities of human thinking and the actual cognitive process and, instead, compare millions or even billions of pieces of information from available, but basically unstructured, data. This is very different from learning in the human sense, at least at this point in time. For example, Watson, IBM's Jeopardy-playing supercomputer, and Siri, Apple's iPhone assistant, have very little to do with intelligence or the cognitive reasoning of the human brain.
But what ML already does exceedingly well (and will get even better at) is relentlessly processing any amount of data and every combination of variables. Eventually, pattern matches are made and predictable outcomes emerge. The process is brilliantly straightforward and relatively simple, but requires both access to a lot of data and a lot of computing power, neither of which presents a problem today. Computer scientists discarded the rules-based approach used to develop the expert systems that failed to deal with the complexities of human reasoning and learning in favor of this machine learning approach, which could actually make use of the massive amount of data in which business, industry, and society are drowning.
The expert systems that had once been AI's "poster child" failed because of the brittleness and inflexibility of their rules-based algorithms. The approach was fundamentally broken. Consider machine translation from one language to another, long a holy grail of AI: the typical expert systems approach might involve putting linguists and translators into a room and trying to convert their expertise into rules for a program to follow. This failed for obvious reasons: no set of rules can realistically manage a human language; language is too complex and variable; and for every rule obeyed, another rule could be broken.
By contrast, Google Translate, based on the same machine learning principles as IBM's earlier Candide, is now the world's leading machine-translation system. Google basically took one of these simple machine-learning algorithms, one that academia had given up on, and used it to build Google Translate. The computer scientists and engineers at Google discovered that when you go from 10,000 training examples to 10 billion, it all starts to work. Data trumps everything.
The technique is so effective that the Google Translate team can be made up of people who don't speak most of the languages their application translates. They are computer scientists and engineers, and engineering is what counts in a world in which translation is an exercise in data mining on a massive scale. This is why the same ML approaches that cognitive computing researchers previously gave up on now work so outstandingly well on what we now refer to as Big Data. ML discards the first-order problem of emulating human thinking and learning and replaces the task of understanding the data with nuts-and-bolts engineering based on multivariate statistics, pattern matching, and various forms of data mining.
Current Research and Development Trends in ML
The interesting fact about ML is that while only about a dozen basic categories of ML tasks and approaches are used to develop algorithms -- regression, support vector machines (supervised learning), clustering, decision trees, Bayesian methods, etc. -- hundreds of versions of these basic categories are being applied to various solutions for business and industry.
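As a hedged illustration of those categories in practice (an invented example, not from this Insight), the sketch below applies off-the-shelf scikit-learn versions of four of the named families to one small public dataset through the library's shared fit/score interface; production solutions would use one of the hundreds of tuned variants.

```python
# Illustrative only: default versions of several basic ML algorithm
# families, each evaluated the same way on the classic Iris dataset.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

models = {
    "regression (logistic)":  LogisticRegression(max_iter=1000),
    "support vector machine": SVC(),
    "decision tree":          DecisionTreeClassifier(random_state=0),
    "Bayesian (naive Bayes)": GaussianNB(),
}

for name, model in models.items():
    # 5-fold cross-validated accuracy with default hyperparameters
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: {score:.2f}")
```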
One area of research currently receiving a lot of attention is deep learning. Deep learning itself is a revival of an even older idea in computing: neural networks. These systems, loosely inspired by the densely interconnected neurons of the brain, mimic human learning by changing the strength of simulated neural connections on the basis of experience (see ARC Insight: The Next Generation of Computing: Emulating the Human Brain). One characteristic of deep learning is that it gets better and better as you feed it more data. In this sense, deep learning achieves what computer scientists were trying to do with rules-based inferencing engines years ago. Since deep neural nets are very good at finding patterns in data sets, including unstructured data, the bigger the data in volume and density, the more the system will learn. One of the significant benefits of deep learning is that it can be applied generally to many industries and problems and does not require extensive domain expertise to create effective solutions.
For example, Google Brain, a deep learning project that used about one million simulated neurons and one billion simulated connections, was ten times
larger than any previous deep neural network. Project founder Andrew Ng,
now director of the Artificial Intelligence Laboratory at Stanford University
in California, has gone on to make deep learning systems ten times larger
again. In June 2012, a Google deep-learning system that had been shown ten million images from YouTube videos proved almost twice as good as any previous image recognition effort at identifying objects. Google also used its
deep learning algorithms to cut the error rate on speech recognition in its
latest Android-based mobile software.
In October 2012, Microsoft chief research officer Rick Rashid wowed attendees at a lecture in China with a demonstration of speech software that
transcribed his spoken words into English text with an error rate of only 7
percent, translated these into Chinese-language characters on an overhead
screen, and then simulated his own voice speaking to them in Mandarin.
Additionally, Apple and IBM have been busy acquiring startup companies
and researchers with deep-learning expertise. For everyday consumers, the
results include software better able to sort through photos, understand spoken commands, and translate text from foreign languages. For scientists and
industry, deep-learning computers can really begin to democratize the use
of advanced analytics, search for potential drug candidates, or map real neural networks in the brain.
Deep learning has already been put to use in services like the speech recognition behind Apple's Siri virtual personal assistant and in Google's Street View, which uses machine vision to identify specific addresses. Microsoft uses deep learning for its Cortana virtual assistant and is also applying it in Azure Machine Learning Studio, where users can build, test, and deploy predictive analytics solutions on their own data.
Real World Applications for ML in Business and Industry
The effects of ML technology will change the economics of virtually every industry. Although the market value of machine learning (and of the data scientists who develop new applications) is increasing rapidly, the value of human labor in industry is declining. This change marks a true technological disruption, and industry adoption of ML will be rapid. There are also tremendous social consequences to consider, and addressing them will require as much creativity and investment as the more immediately lucrative deep learning startups now emerging.
The overall ML landscape and providers of applications and solutions can be
segmented into roughly four market areas:
• Core technologies: AI, ML, deep learning, natural language platforms, predictive analysis APIs, image recognition, and speech recognition
• Enterprise-level solutions for sales, security/administration, fraud detection, HR/recruiting, marketing, personal assistants, and business intelligence tools
• Industry-focused solutions for manufacturing, energy, process, and utilities, medical, retail and consumer goods, infrastructure, and finance
• Human interaction: augmented reality, service robotics, gestural computing, and facial and emotional recognition
Machine learning technologies and applications have particular significance
for manufacturing. As manufacturers make the transition to the smart connected factory and extend Industrial Internet of Things (IIoT) ecosystems
across factories, plants, and supply chains, efficient and flexible data analysis
will be critical to the operations of production systems and plant assets. Advanced analytics powered by ML algorithms will access and analyze large volumes of production records and information to drive real and actionable continuous process improvement and complete the as-built to as-designed feedback loop. As manufacturers introduce next-generation materials into product design and build, understanding the manufacturing processes and discovering and adapting to new material characteristics and properties will be essential. ML will be able to make sense of the large volumes of production data and the many variables involved to help streamline production processes and improve product designs.
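As one hedged sketch of what such analysis might look like (an invented example; the sensor names, numbers, and model choice are assumptions, not drawn from this Insight), an unsupervised anomaly detector trained on normal production readings can flag drifting equipment for investigation.

```python
# Hypothetical sketch: flag unusual sensor readings from plant equipment
# using an unsupervised anomaly detector. All values are invented.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Synthetic history of normal operation: [spindle_temp_C, vibration_mm_s]
normal = rng.normal(loc=[60.0, 2.0], scale=[3.0, 0.4], size=(5000, 2))

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# New readings from the line; the last one drifts well out of range
readings = np.array([[61.2, 2.1], [59.4, 1.8], [78.5, 5.9]])
flags = detector.predict(readings)  # +1 = looks normal, -1 = anomalous
for reading, flag in zip(readings, flags):
    print(reading, "ANOMALY" if flag == -1 else "ok")
```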
Conclusions and Recommendations
Where IoT represents the concept of smart connected ecosystems for people, products, industry, business, infrastructure, and cities, machine learning represents a science and technology that will enable much of this IoT environment. ML will provide a solution to the thorny problem of too much data and what to do with it. Business and industry across
all sectors are rapidly embracing the solutions and applications powered by
ML technology. Many are already reaping benefits.
Machine learning technology is clearly providing a wide range of practical
applications for business, industry, government, and society and fulfilling
some significant portion of the initial promise of the broader field of AI. Currently, many ML startups are providing a rich range of applications and
solutions that address big data and many other business issues. Now is the
time for companies to take a serious look at the potential benefits of ML.
For further information or to provide feedback on this Insight, please contact your
account manager or the author at [email protected]. ARC Insights are published and copyrighted by ARC Advisory Group. The information is proprietary to
ARC and no part may be reproduced without prior permission from ARC.