June 25, 2013
There seems to be some confusion about exactly what the term “cognitive computing” means. S.E. Smith writes, “Cognitive computing refers to the development of computer systems modeled after the human brain. Originally referred to as artificial intelligence, researchers began to use the term cognitive computing instead in the 1990s, to indicate that the science was designed to teach computers to think like a human mind, rather than developing an artificial system.” [“What is Cognitive Computing?” wiseGEEK, 25 February 2013] On the other hand, Forrester analyst John Brand writes, “The term ‘cognitive computing’ emerged in response to the failings of what was once termed ‘artificial intelligence’.” [“Make No Mistake – IBM’s Watson (and Others) Provide the *Illusion* of Cognitive Computing,” John Brand’s Blog, 21 May 2013]
Dictionary.com offers two definitions of the adjective cognitive. First, it pertains “to the act or process of knowing, perceiving, remembering, etc.” Second, it pertains “to the mental processes of perception, memory, judgment, and reasoning, as contrasted with emotional and volitional processes.” When it comes to cognitive computing, both definitions seem to apply. To my mind, cognitive computing involves the processing of data in such a way that the system reasons about it, remembers it, and makes judgments (i.e., decisions) based on what it learns. All cognitive computer systems are learning systems. Whether the computer exactly mirrors human thought processes is irrelevant. Frankly, we don’t know enough about human thought processes to mimic them accurately. What researchers are finding is that computers can learn a lot given enough data and the right algorithms (see my post entitled Intelligence from Chaos).
Most analysts seem to agree that cognitive computing is a step forward into a new era. An article on cognitive computing, written by analysts at IBM Research, asserts that it’s necessary to change computing paradigms in order to progress. [“Cognitive systems: A new era of computing”] It states:
“Over the past few decades, Moore’s Law, processor speed and hardware scalability have been the driving factors enabling IT innovation and improved systems performance. But the von Neumann architecture—which established the basic structure for the way components of a computing system interact—has remained largely unchanged since the 1940s.”
If you are not familiar with the von Neumann architecture, you can watch the following short video on the subject. There are a few spelling errors in the presentation, but it is packed with information and is cleverly presented.
Given the rise of big data, the IBM article insists that the von Neumann architecture is “no longer good enough.” It explains:
“We now are entering the Cognitive Systems Era, in which a new generation of computing systems is emerging with embedded data analytics, automated management and data-centric architectures in which the storage, memory, switching and processing are moving ever closer to the data. Whereas in today’s programmable era, computers essentially process a series of ‘if then what’ equations, cognitive systems learn, adapt, and ultimately hypothesize and suggest answers. Delivering these capabilities will require a fundamental shift in the way computing progress has been achieved for decades.”
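The contrast IBM draws between programmed “if then what” logic and systems that are taught can be made concrete with a toy sketch. This is my own illustration, not IBM’s architecture: a hand-coded rule versus a rule whose decision boundary is derived from labeled examples (the fever-screening scenario and all numbers are invented for the example).

```python
# Toy contrast: a programmed "if-then" rule vs. a rule learned from data.

def rule_based(temp_f):
    """Programmed behavior: the threshold is hard-coded by a developer."""
    return "fever" if temp_f > 100.4 else "normal"

def learn_threshold(examples):
    """'Taught' behavior: derive the decision boundary from labeled data.
    examples is a list of (temperature, label) pairs."""
    fevers = [t for t, label in examples if label == "fever"]
    normals = [t for t, label in examples if label == "normal"]
    # Place the boundary midway between the means of the two classes.
    return (sum(fevers) / len(fevers) + sum(normals) / len(normals)) / 2

training = [(98.6, "normal"), (99.1, "normal"), (101.2, "fever"), (103.0, "fever")]
threshold = learn_threshold(training)

def learned(temp_f):
    """Same decision, but the threshold came from the data, not the code."""
    return "fever" if temp_f > threshold else "normal"

print(rule_based(102.0))  # fever
print(learned(102.0))     # fever
```

Both functions give the same answer here, but only the second one changes its behavior when fed different training data, which is the shift the article is describing.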
The article goes on to describe several characteristics of cognitive computer systems, including that they are data-centric (use big data) and designed for statistical analytics. Bernie Meyerson, IBM’s vice president of innovation, told Brian Deagan, “Cognitive computing is a completely different approach to drive performance in computers.” [“IBM Predicts Cognitive Systems As New Computing Wave,” Investors.com, 23 January 2013] He continued:
“These machines will perform better because they learn, they adapt, they sense — and by doing that you don’t program it so much as you can teach the system to learn. That is incredibly efficient, compared to what you can do today, where you literally type in millions of lines of code to get the machine to do what you want. This is a machine that can observe and follow.”
Deagan asked Meyerson if IBM’s Watson was a cognitive computing system. Meyerson answered, “Yes. Watson is the embodiment of cognitive computing. For example, it can be taught not only to recognize the right and probable answer to a medical diagnostic issue such as a cancer, but it can also learn from uncertain data, even if you have conflicting data. Watson, because it is probabilistic, might not know the exact answer, but if the odds favor or point to one answer it will assign a high probability to the correct answer.” Forrester’s Brand disagrees with Meyerson. He writes, “Let’s get real. Despite the fact that ‘Watson’ was trained to successfully win a game show (Jeopardy), IBM’s technology (and others to be fair) are not cognitive computing systems at all. That’s not to say they aren’t valuable – just that we shouldn’t overstate their value or capabilities.”
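Meyerson’s point about probabilistic answers can be sketched in a few lines. This is a toy illustration of the general idea, not Watson’s actual pipeline: score candidate answers against possibly conflicting evidence, normalize the scores into probabilities, and favor the most likely answer rather than asserting a single “exact” one. The candidate names and scores are invented for the example.

```python
# Toy sketch of probabilistic answer ranking over conflicting evidence.

def rank_candidates(evidence_scores):
    """evidence_scores maps each candidate answer to a list of evidence
    scores (positive values support it, negative values conflict with it).
    Returns (candidate, probability) pairs, most likely first."""
    totals = {c: max(sum(scores), 0.0) for c, scores in evidence_scores.items()}
    mass = sum(totals.values()) or 1.0  # avoid division by zero
    probs = {c: t / mass for c, t in totals.items()}
    return sorted(probs.items(), key=lambda kv: kv[1], reverse=True)

# Conflicting evidence: diagnosis A is mostly, but not unanimously, supported.
candidates = {
    "diagnosis A": [0.6, 0.3, -0.1],  # two supporting findings, one conflict
    "diagnosis B": [0.2, -0.1],
    "diagnosis C": [0.1],
}
ranked = rank_candidates(candidates)
print(ranked[0])  # diagnosis A, with the highest probability
```

The system never claims certainty; it simply assigns the bulk of the probability mass to the answer the evidence favors, which is the behavior Meyerson describes.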
Brand’s real quibble is that systems like Watson don’t really match human intelligence. Since he admits they are nevertheless valuable, does it really matter? What matters is that cognitive computer systems can learn (either through discovery or by being taught). Meyerson told Deagan, “With cognitive computing, it’s about providing the computer a richer set of data to make decisions. The idea behind cognitive machines is that you don’t program them, you teach them.” Cognitive computing systems are far enough along that Shweta Dubey believes they could be the next disruptive technology. [“Is Cognitive Computing the Next Disruptive Technology?” The Motley Fool, 7 January 2013] She writes:
“Cognitive computing will be a larger part of the future as an emerging field in which computers can operate more like a human brain. The computers will go beyond performing static operations and will begin using the five senses, just like a human brain.”
Bertrand Duperrin, a Consulting Director at Nextmodernity, agrees with Dubey.
“There are many chances,” he writes, “the next wave will be about cognitive computing.” [“Towards cognitive computing,” Bertrand Duperrin’s Notepad, 21 May 2013] He continues:
“A new era is starting, where small and big decisions will be made, at any level, by people having a comprehensive view of their environment and of the way it moves. New ways of doing things that will be supported by new platforms and a new approach to computing: Cognitive Computing. … If Big Data is about mass data processing, it’s only one side of Cognitive Computing, which is also about data analysis and tagging, pattern discovery and the ability the system has to learn from its own experience. Cognitive Computing also [has a] human side. … People will still be key for interpretation — provided the information they’re given is of quality, well targeted and they have the required knowledge and training. Cognitive Computing is what will help to move [organizations] from [having] data to [having] information. … If computing is a matter of data, social computing a matter of people, cognitive computing is more a convergence than a next step: the need of using people and data together.”
The most exciting potential of cognitive computing is using vast databases to discover new relationships. The Technical Committee on Cognitive Computing of the Systems, Man & Cybernetics Society asserts:
“Cognitive Computing breaks the traditional boundary between neuroscience and computer science, and paves the way for machines that will have reasoning abilities analogous to a human brain. It is an interdisciplinary research and application field, and uses methods from psychology, biology, signal processing, physics, information theory, mathematics, and statistics. The development of Cognitive Computing will cross fertilize these other research areas with which it interacts.”
My company, Enterra Solutions, is focused on the development and application of an advanced artificial intelligence system, the Enterra Cognitive Reasoning Platform™ (CRP), that analyzes both structured and unstructured data using ontologies and mathematical algorithms. The CRP is capable of addressing various commercial markets and disciplines using a generalized framework, yet it is designed so that it can be tailored to handle the disparate data sources and specific challenges found in individual industries and in different functional areas. Since most major universities have scientists conducting research in this area, I expect that the field of cognitive computing will mature rapidly. That’s good news for all of us.