Cognitive Computing and the Cognification of the Business World

Stephen DeAngelis

December 04, 2017

Futurists and visionaries, like Kevin Kelly (@kevin2kelly), founding Executive Editor of Wired magazine, are excited about cognitive computing. They believe this technology can confront challenges that have historically proven difficult. Kelly tweeted, “In the very near future you will cognify everything in your life that is already electrified.” That’s a bold statement considering he had to make up a verb (i.e., cognify) to describe how the business world is going to change. Jennifer Zaino (@Jenz514) agrees with Kelly. She writes, “Cognitive Computing increasingly will be put to work in practical, real-world applications. The industries that are adopting it are not all operating at the same maturity levels; there remain some challenges to conquer. The wheels are very much in motion to make cognitive-driven Artificial Intelligence (AI) applications a key piece of enterprise toolsets.”[1] Because cognitive computing systems are adaptable, Accenture analysts believe cognitive computing is “the ultimate long-term solution” for many of businesses’ most nagging challenges.[2]

Complexity and Ambiguity

I’m not sure life has ever been simple; but, I am sure life is getting more complex every day. One of the reasons analysts are excited about cognitive technologies is because they can help deal with complexity. Dale Walker explains, “Computing based on cognition, or the processes of the human brain, involves creating systems that are able to self-learn, recognize patterns and objects, understand language, and ultimately operate without the input of a human.”[3] Although cognitive systems can operate autonomously in certain situations, like many people involved in this field, I believe the greatest benefit of cognitive computing is enhancing human decision-making. IBM’s Ginni Rometty (@GinniRometty) asserts cognitive computing is a type of “augmented intelligence” — a system “to help you and I make better decisions amid cognitive overload.”[4] Cognitive systems can handle more variables, in more sophisticated ways, than previous analytics platforms, resulting in deeper understanding and more actionable insights. Bernard Marr (@BernardMarr) adds, “Just as the heroes of science fiction movies rely on their computers to make accurate predictions, gather data, and draw conclusions, so we will move into an era when computers can augment human knowledge and ingenuity in entirely new ways.”[5]

Just as important as dealing with complexity is dealing with ambiguity. The staff at the Cognitive Computing Consortium (CCC) asserts, “Cognitive computing makes a new class of problems computable. It addresses complex situations that are characterized by ambiguity and uncertainty; in other words it handles human kinds of problems.”[6] To accomplish this, explains Bob Violino (@BobViolino), “Cognitive computing uses technology and algorithms to automatically extract concepts and relationships from data, understand their meaning, and learn independently from data patterns and prior experience — extending what people or machines could do on their own.”[7]

Some pundits object to the term cognitive computing because they believe it implies sentience and, clearly, cognitive systems are not sentient. Cognition is defined as “the action or process of acquiring knowledge and understanding through thought, experience, and the senses.” Of course, that definition needs to be modified when applied to a “cognitive” machine. A cognitive system is a system that discovers knowledge, gains insights, and establishes relationships through analysis, machine learning, and sensing of data. I define cognitive computing as a combination of semantic reasoning (i.e., the use of machine learning, natural language processing, and ontologies) and computational intelligence (i.e., advanced analytics). None of this requires a system to be self-aware.
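To make that definition concrete, here is a deliberately tiny sketch of the two halves working together: a stand-in for the semantic reasoning step (extracting entities and relationships from text — real systems would use natural language processing and ontologies, not a hand-written pattern) feeding a stand-in for the computational intelligence step (a simple analytic over the extracted relationships). All names and data below are invented for illustration.

```python
import re
from collections import Counter

# Toy "semantic reasoning" step (illustrative only): pull
# subject-relation-object triples out of simple sentences with a
# hand-written pattern -- a stand-in for real NLP plus an ontology
# of business relationships.
TRIPLE = re.compile(r"(\w+) (supplies|ships to|buys from) (\w+)")

def extract_relationships(docs):
    """Return (subject, relation, object) triples found in the docs."""
    triples = []
    for doc in docs:
        triples.extend(TRIPLE.findall(doc.lower()))
    return triples

# Toy "computational intelligence" step: an analytic over the
# extracted relationships -- which entity is most connected?
def most_connected(triples):
    counts = Counter()
    for subject, _relation, obj in triples:
        counts[subject] += 1
        counts[obj] += 1
    return counts.most_common(1)[0][0]

# Hypothetical sample documents.
docs = [
    "Acme supplies Globex",
    "Globex ships to Initech",
    "Acme buys from Umbrella",
    "Initech buys from Acme",
]

triples = extract_relationships(docs)
print(most_connected(triples))  # prints "acme" (appears in three relationships)
```

The point of the sketch is the pipeline shape, not the components: discovery of relationships from unstructured data, followed by analytics over what was discovered, with neither step requiring the system to be self-aware.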

Cognitive Computing and the Enterprise

Jeanne Ross (@jrossCISR), principal research scientist at MIT Sloan’s Center for Information Systems Research, observes, “Given the hype around artificial intelligence, you might be worried that you’re missing the boat if you haven’t yet invested in cognitive computing applications in your business. Don’t panic! … Business applications of AI are still in the early stages.”[8] Some business executives aren’t sure how cognitive technologies might help their businesses. Analysts from Deloitte suggest there are three areas in which enterprises can leverage cognitive technologies.[9] They are:

Cognitive insights. “Machine intelligence can provide visibility into what is happening now and likely to happen next. For example, call center service representatives use multifunction customer support programs to answer product questions, take orders, investigate billing problems, and address other concerns. In many such systems, workers jump back and forth between screens to access information needed to answer specific queries. Machine intelligence can streamline this process and help business leaders prescribe actions to augment worker performance.”

Cognitive engagement. “An emerging field of cognitive applications will likely provide access to complex information, perform digital tasks such as admitting patients to the hospital, or recommend products and services. They may offer even greater business potential in customer service, where cognitive agents could replace some human agents by handling billing or account interactions, fielding tech support questions, and answering HR-related questions from employees.”

Cognitive automation. “Machine learning, RPA, and other cognitive tools could develop deep domain-specific expertise and then automate related tasks. For example, one health care startup is already applying deep learning technology to analyze radiology images. In testing, its system has been up to 50 percent better than expert human radiologists at judging malignant tumors. In education, machine intelligence capabilities embedded in online learning programs mimic the benefits of one-on-one tutoring by tracking the learner’s ‘mental steps’ during problem-solving tasks to diagnose misconceptions, then providing timely guidance, feedback, and explanations.”

As cognitive technologies mature, businesses will find cognitive enterprise applications are more extensive than first imagined. In fact, they will only be limited by our imaginations.

Summary

Analysts from Maruti Techlabs conclude, “Cognitive Computing is going to transform how we live, how we work and how we think, and that’s why Cognitive Computing will be a big deal. Cognitive computing is a powerful tool, but a tool nevertheless — and the humans having the tool must decide how to best use it.”[10] I agree that cognitive computing is going to be a big deal, which is why Kelly can safely predict the cognification of the business world.

Footnotes
[1] Jennifer Zaino, “Cognitive Computing, Artificial Intelligence Apps Have Big Future in the Enterprise,” Dataversity, 17 September 2015.
[2] “From Digitally Disrupted to Digital Disrupter,” Accenture, 2014.
[3] Dale Walker, “What is cognitive computing?” ITPRO, 26 October 2017.
[4] Megan Murphy, “Ginni Rometty on the End of Programming,” Bloomberg BusinessWeek, 20 September 2017.
[5] Bernard Marr, “What Everyone Should Know About Cognitive Computing,” Forbes, 23 March 2016.
[6] Staff, “Cognitive Computing Defined,” Cognitive Computing Consortium.
[7] Bob Violino, “Primer: Make sense of cognitive computing,” InfoWorld, 5 June 2017.
[8] Jeanne Ross, “Seeing past the hype around cognitive computing,” Information Management, 11 May 2017.
[9] Nitin Mittal, Peter Lowes, and Rajeev Ronanki, “Machine Intelligence Mimics Cognition,” The Wall Street Journal, 5 June 2017.
[10] Maruti Techlabs, “Cognitive Computing and Why You Need to Know About It,” Chatbots Magazine, 30 May 2017.