
Thinking Machines Going Mainstream

Radical change awaits as cognitive computing becomes the norm.

Searching Google for the phrase “how cognitive computing will change the world” returns millions of hits, reflecting the public’s strong appetite for information about the emerging technology. But some experts foresee a time when the extraordinary is ordinary.

“The most important thing I will predict is that we will stop talking about the technology of cognitive computing. It will be simply a behavior that will be built into any newer system,” says Sue Feldman, a co-founder of the Cognitive Computing Consortium, a forum for researchers, developers and practitioners of cognitive computing and associated technologies. “It will be a requirement. It will be an expectation of the users that this assistive, interactive, iterative role that it plays within decision making becomes the norm.”

This type of evolution happens with every technology. For the most part, people no longer talk about the fuel injection systems in their cars, Feldman points out. “Gradually, this one will become invisible as it becomes expected,” she says.

U.S. Army researchers make much the same point when they discuss the concept of pervasive computing. The idea is that sensors and computers will be so ubiquitous that people give them no more thought than they do the electrical wires inside walls—a notion that once seemed rather frightening.

While no one can predict all the changes cognitive computing will bring, one thing is certain, asserts Feldman, who is also CEO and founder of Synthexis, a consulting firm that provides business advisory services to vendors and buyers of cognitive computing, search and text analytics technologies. “The thing I am quite sure of is that the computing system of the future is going to be wholly different from what we’ve got now,” she offers. “We are in the early stages of a transition from the old stuff to systems that are much more interactive.”

More interactive technology, she continues, will be “able to return answers or graphs or whatever is necessary in an iterative manner with the people who are using it.” It also will be contextual, meaning that when the system sorts through vast amounts of data, it will provide information more tailored to a specific user than millions of Google hits. “Context becomes one of the main important pieces of cognitive computing in that the answers or the information returned by the system are appropriate to the person who was asking the question, to the job that he is trying to do. That will be very, very different,” Feldman elaborates.

Another big change on the horizon is the blurring of lines between technological devices and other objects, a trend that will only accelerate as cognitive computing advances, indicates Hadley Reynolds, who co-founded the Cognitive Computing Consortium with Feldman. “The boundaries between things that we are accustomed to today—for example, the boundary between my cellphone and my refrigerator—are going to begin to blur, and then they’re going to disappear,” Reynolds says.

Already, a Levi’s denim jacket integrates high-tech threads developed by Google to allow wearers to control their smartphones with simple hand gestures, he explains. Although it is not powered by cognitive computing capabilities, the jacket is an example of those blurring lines, Reynolds notes. “Now, all you have to do is scratch your sleeve to answer the phone or to get the weather. I’m not proposing you’ll have a cognitive blue jean jacket. I am proposing this is the way systems will go together with the advances in hardware and networking technology and also in the advances in the software and the sensors and the interactive styles,” he says.

Robotics is on the verge of making a variety of serious contributions to this overall trend, Reynolds indicates. He lists Amazon’s use of drones for delivery and robots in warehouses as two examples. A slew of advances is fueling the rise of robotic technologies. “The level of control is getting better. The kinds of applications underlying robotics have gotten better and better,” Reynolds says.

Computer vision alone has improved markedly. Graphics processing units and the ability to process millions of data points so that machines can move confidently in three-dimensional space have advanced dramatically in just the last four or five years, Reynolds reports.

Robotics is even reaching agriculture. A farmer’s fields will soon contain sensors that tell a self-propelled tractor when to distribute fertilizer or harvest crops, he says. “Now the field is exhibiting its own intelligence because of miniaturized communicating technologies. This is going to be the real impact of the Internet of Things,” Reynolds foretells. “When you start multiplying these boundary-shifting or boundary-blurring approaches and innovations, that’s going to have a tremendous impact on the way we all interact with our world in a five- or 10-year time frame.”

The technological advances do create some concerns. “One of the challenges is something like self-driving cars because as machines cross into the physical rather than the cyber world, you run into problems that are life or death, as opposed to right answer versus wrong answer. And that changes the ballgame,” Feldman declares.

Furthermore, robots could replace human labor, Reynolds says. “In many cases, they already have,” he notes.

The consortium co-founders indicate that big data has been, and will continue to be, a major driver of cognitive computing advances. The technology can be especially helpful for sorting through health industry data. For example, cognitive computers can sift through insurance information, previous treatments and other data to determine which treatments might be most effective. “Humans simply cannot process enough of the information that is pertinent to a big question like who a patient is or what would be the best way to treat him,” Feldman says.

For many other industries, big data has become big business. “Companies that are tens and even hundreds of thousands of employees large desperately need access to their information in order to reduce risks, prevent disasters, raise revenue and all of those things an enterprise has been facing forever and continues to face,” Feldman says. “It’s not just health care. It’s things like mergers and acquisitions in business. We’re seeing a lot of use in shopping applications—things that require a decision in a world that is not cut and dried.”

In fact, a whole new profession could arise around cognitive computing and artificial intelligence (AI). Feldman says that possibility was partly why she and Reynolds decided to create the consortium a few years ago. “We saw the emergence of a modified or even an entirely new kind of profession, which would be available to people who were going to work with these intelligent systems,” she says. “That is going to require an adjustment of traditional software development and application development roles in terms of skills and competencies and outlooks, or perhaps an entirely new suite of professionals who would be called the artificial intelligence team or the cognitive computing department.”

She notes that cognitive computing “seems to keep seeping into established research areas,” such as content management, knowledge management and data search. “It’s a leap into the next generation of computing,” she says.

Although cognitive computing is related to AI, experts distinguish between the two. The difference is mostly a matter of degree. Cognitive computing is sometimes referred to as weak or augmented AI, while strong or general AI is referred to as autonomous intelligence.

With cognitive computing, Reynolds says, “The human is still guiding the machine in some way, shape or form. The machine is an assistant to a human. Artificial intelligence is where the machine is essentially guiding the person, or the machine is creating its own instructions and acting autonomously.”

Feldman says cognitive machines can be helpful because they are never bashful. “If you want to discover things that are improbable, that you would never have thought of, getting a machine in the mix is actually a good idea because they don’t get embarrassed,” she states. “They just see a pattern, and the human then, as the other partner, can say, ‘That’s kind of ridiculous.’ But nevertheless it raises these questions.”

Also, machines do not carry the same prejudices that humans have. “People tend to be biased. There is simply no way around that. They cannot be purely unbiased. That has advantages and disadvantages,” Feldman says.

Machines, perhaps surprisingly, are not entirely unbiased either, but they can be less biased than humans. “Machines can be set up to be quite unbiased. I would not go so far as to say they are entirely unbiased because they rely on human-generated algorithms,” she asserts.

Reynolds credits the British company Autonomy Corporation with being the first to implement machine learning and neural network technologies. Now IBM, Apple, Amazon and others vie for dominance as Watson, Siri and Alexa become household names.

When asked about the remaining challenges for cognitive computing, Feldman and Reynolds share different but related opinions. Reynolds says ethics is the biggest challenge. “We don’t have any rules, regulations, standards, morals—anything about any of this at the moment. Legal and ethical matters are going to become huge in this area. Everybody is going to want to figure out what is healthy and what is not healthy in some of these developments,” he asserts.

Citing the flood of news about data breaches, Feldman offers trust as the primary challenge. “It’s not something you can build into the system. It’s something that is legal and ethical and a human problem. It’s big,” Feldman says.

Both experts stress that the approaching changes will not always be easy. “There’s definitely going to be a level of disruption in communities and careers and confidences and relationships among peoples of the world and peoples in a given society that are going to pose some very serious challenges to how all of this gets developed,” Reynolds warns. “There’s a social and political aspect as well as a legal and ethical and trust element to this.”