
Dawn of the Cognitive Computing Era

September 7, 2017


More and more commentators are talking about the dawn of the Cognitive Computing Era. An “era” isn’t something one should discuss lightly. It is defined as “a long and distinct period of history with a particular feature or characteristic.” If the commentators are using the term correctly, cognitive computing will be the defining characteristic of IT in the years ahead. Ahmed K. Noor, an adjunct professor at Old Dominion University’s Center for Advanced Engineering Environments, is one of the pundits who believes we are entering such an era. He writes, “The history of computing can be divided into three eras. The first was the tabulating era, with the early-1900s calculators and tabulating machines made of mechanical systems, and later made of vacuum tubes. In the first era the numbers were fed in on punch cards, and there was no extraction of the data itself. The second era was the programmable era of computing, which started in the 1940s and ranged from vacuum tubes to microprocessors. It consisted of taking processes and putting them into the machine. Computing was completely controlled by the programming provided to the system. The third era is the cognitive computing era, where computing technology represents an intersection between neuroscience, supercomputing and nanotechnology.”[1]

It’s not just academics who believe we sit on the cusp of a new era. The Financial reports, “Cognitive computing has nearly endless possibilities to improve business processes and functions with 73 percent of surveyed CEOs in a recent IBM study citing it will play a key role in their organizations’ future and all executives in the study anticipating a 15 percent return on investment from their cognitive initiatives.”[2]

 

Defining Cognitive Computing

 

Even the most vigorous proponents of cognitive computing agree there is no universally accepted definition of the term. Some describe cognitive computing as “computers that think.” But this implies self-awareness, which computers don’t have. Others describe it as “computers that mimic the human brain.” That’s an overstatement today. At best, you can say cognitive computing mimics one powerful process of the human brain: decision-making. At Enterra Solutions®, cognitive computing refers to software that combines human-like reasoning with cutting-edge mathematics, wrapped in natural language, to solve complex problems. Our entry in the cognitive computing field is the Enterra Enterprise Cognitive System™ (ECS) — a system that can Sense, Think, Act, and Learn®. One of the things that distinguishes our system from others is the incorporation of an ontology to ensure queries have context. Lora Cecere (@lcecere), founder of Supply Chain Insights, believes an ontology is essential for gaining actionable insights. She explains, “In this evolution, machine learning using an ontology drives insights. (A view, or starting point, of complex interactions that are many-to-many.) This ontology uses structured and unstructured data. It is not limited to relational database technologies. Things no longer are force-fit into rows and columns. … The ontology is the beginning state for learning. As the computer learns, the ontology is updated. This enables new insights. The computer answers the questions that we do not know to ask through sense, learn and act workflows.”[3]
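
To make Cecere’s point concrete, consider a minimal Python sketch of an ontology as a graph of subject-predicate-object triples rather than rows and columns. This is an illustration of the general technique, not Enterra’s actual implementation; every class, relation, and fact name below is hypothetical. The graph supplies context for queries, and it serves as the “beginning state” for learning: as the system learns, the ontology itself is updated.

```python
# A minimal sketch of an ontology as the "beginning state" for learning.
# Facts are subject-predicate-object triples rather than rows and columns.
# All class, relation, and fact names here are illustrative, not Enterra's API.

class Ontology:
    def __init__(self, triples):
        self.triples = set(triples)  # each triple: (subject, predicate, object)

    def query(self, subject=None, predicate=None, obj=None):
        """Return every triple matching the non-None fields (context-aware lookup)."""
        return [t for t in self.triples
                if (subject is None or t[0] == subject)
                and (predicate is None or t[1] == predicate)
                and (obj is None or t[2] == obj)]

    def learn(self, triple):
        """Update the ontology with a newly inferred relationship."""
        self.triples.add(triple)

# Seed the starting state with structured knowledge.
onto = Ontology([
    ("pallet", "is_a", "shipping_unit"),
    ("shipping_unit", "moves_through", "distribution_center"),
])

# A query gains context from the graph: what moves through what?
for s, p, o in onto.query(predicate="moves_through"):
    print(f"{s} {p} {o}")

# As the system "learns", the ontology is updated, enabling new insights.
onto.learn(("distribution_center", "constrained_by", "dock_capacity"))
```

Because facts live in a graph rather than in fixed columns, adding the new “constrained_by” relationship requires no schema change, which is the flexibility Cecere describes when she says things are no longer force-fit into rows and columns.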

 

Colleen Balda, a business consultant for Avnet Technology Solutions’ Big Data and Analytics team, isn’t put off by the lack of a universal definition. “Definitions for cognitive computing may vary,” she writes, “but recognizing the driving forces for cognitive is more attainable than pinpointing a universal definition.”[4] The drivers she sees spurring the dawn of the cognitive computing era are: small-scale practical integration, data-driven programming, edge computing, and serverless computing. In Balda’s mind, cognitive computing is about augmented decision-making as opposed to autonomous decision-making. “Cognitive computing helps us make smarter decisions on our own leveraging the machines,” she writes, “while AI is rooted in the idea that machines can make better decisions on our behalf.”[5] I’m not sure I’m willing to make augmented versus autonomous decision-making the fine line dividing cognitive computing from AI. Although I agree cognitive computing systems can help decision-makers make better decisions, the fact is, cognitive computing systems can also be programmed to make autonomous decisions. In time-sensitive activities, autonomous decisions can prove crucial (the sketch following the list below illustrates the distinction). In fact, technologists from Maruti Tech Labs assert, “The purpose of cognitive computing is the creation of computing frameworks that can solve complicated problems without constant human intervention.”[6] They go on to note, “In order to implement cognitive function computing in commercial and widespread applications, the Cognitive Computing Consortium has recommended the following features for the computing systems”:

1. Adaptive. “This is the first step in making a machine-learning-based cognitive system. The solution should mimic the human brain’s ability to learn and adapt from its surroundings. The system can’t be programmed for an isolated task; it needs to be dynamic in data gathering and in understanding goals and requirements.”

2. Interactive. “Similar to the brain, the cognitive solution must interact with all elements in the system — processors, devices, cloud services, and users. Cognitive systems should interact bi-directionally: they should understand human input and provide relevant results using natural language processing and deep learning.”

3. Iterative and stateful. “The system should ‘remember’ previous interactions in a process and return information that is suitable for the specific application at that point in time. It should be able to define the problem by asking questions or finding additional sources. This feature requires a careful application of data quality and validation methodologies to ensure that the system is always provided with enough information and that the data sources it operates on deliver reliable and up-to-date input.”

4. Contextual. “They must understand, identify, and extract contextual elements such as meaning, syntax, time, location, appropriate domain, regulations, user’s profile, process, task, and goal. They may draw on multiple sources of information, including both structured and unstructured digital information, as well as sensory inputs (visual, gestural, auditory, or sensor-provided).”
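
The augmented-versus-autonomous distinction discussed above, together with the consortium’s call for iterative, stateful behavior, can be illustrated with a short, hypothetical Python sketch. The agent remembers prior interactions and acts on its own only when a decision is both time-sensitive and above a confidence threshold; otherwise it surfaces a recommendation for a human. The threshold, names, and logic are assumptions for illustration, not the consortium’s specification or any vendor’s product.

```python
# A hypothetical sketch of augmented vs. autonomous decision-making in a
# stateful decision loop. Thresholds, names, and logic are illustrative only.

from dataclasses import dataclass, field

@dataclass
class Decision:
    action: str
    confidence: float    # model's confidence in the recommendation, 0..1
    time_sensitive: bool

@dataclass
class CognitiveAgent:
    autonomy_threshold: float = 0.95
    history: list = field(default_factory=list)  # "iterative and stateful"

    def decide(self, decision: Decision) -> str:
        self.history.append(decision)  # remember previous interactions
        if decision.time_sensitive and decision.confidence >= self.autonomy_threshold:
            # Autonomous: act without waiting for a human in the loop.
            return f"EXECUTE {decision.action}"
        # Augmented: surface a recommendation for the human decision-maker.
        return f"RECOMMEND {decision.action} (confidence {decision.confidence:.0%})"

agent = CognitiveAgent()
print(agent.decide(Decision("reroute_shipment", 0.98, time_sensitive=True)))
print(agent.decide(Decision("renegotiate_contract", 0.80, time_sensitive=False)))
```

With these inputs, the first decision executes autonomously while the second is returned as a recommendation, mirroring the line Balda draws between AI and cognitive computing.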

The Financial concludes, “Cognitive computing is a next generation information system that can understand, reason, learn, and interact with humans in natural language. While traditional analytics can provide data-based insights, cognitive more easily turns these insights into actionable recommendations.”

 

Summary

 

The Maruti Tech Labs technologists conclude, “Cognitive computing is definitely the next step in computing started by automation. It sets a benchmark for computing systems to reach the level of the human brain. But it has some limitations which make AI difficult to apply in situations with a high level of uncertainty, rapid change or creative demands. The complexity of a problem grows with the number of data sources. It is challenging to aggregate, integrate and analyze such unstructured data. A complex cognitive solution should have many technologies that coexist to give deep domain insights.” The advances in cognitive computing technologies have been rapid. Does that mean we are witnessing the dawn of the cognitive computing era? Only time can answer that question. As Cecere observes, “We cannot confuse activity with progress. It takes both; and hopefully, we will not let the technology community overhype and shortchange one of the most promising shifts in technologies in over a decade.”

 

Footnotes
[1] Ahmed K. Noor, “Potential of Cognitive Computing and Cognitive Systems,” De Gruyter Open, 14 October 2016.
[2] Staff, “Half of Surveyed Chief Executive Officers Plan to Adopt Cognitive Computing by 2019,” The Financial, 7 July 2017.
[3] Lora Cecere, “Cognitive Computing: Getting Clear on Definitions,” Supply Chain Shaman, 7 August 2017.
[4] Colleen Balda, “De-mystifying Cognitive Computing Part One: What are the drivers?,” Avnet Advantage Blog, 29 June 2017.
[5] Colleen Balda, “De-mystifying Cognitive Computing Part Two: Artificial Intelligence vs. Cognitive Computing,” Avnet Advantage Blog, 5 July 2017.
[6] Staff, “What is Cognitive Computing? Features, Scope, & Limitations,” Maruti Tech Labs, 2017.
