That will change in the next five years, says IBM. Computers at that time will be much more aware of the world around them, and be able to understand it. The company’s annual “5 in 5” list, in which IBM predicts the five trends in computing that will arrive in five years’ time, reads exactly like a list of the five human senses — predicting computers with sight, hearing, taste, smell and touch.
The five senses are really all part of one grand concept: cognitive computing, which involves machines experiencing the world more like a human would. For example, a cognizant computer wouldn’t see a painting as merely a set of data points describing color, pigment and brush stroke; rather, it would truly see the object holistically as a painting, and be able to know what that means.
“That’s a foundationally different way of thinking of computing,” Bernie Meyerson, IBM’s vice president of innovation, told Mashable in an interview. “You have to change how you think about absorbing data. You can’t just take a picture and file the picture. You have to treat the picture as an entity at a very high level, as opposed to just a bunch o’ bits.”
“[Cognitive computing] makes for some very interesting shifts in capability,” he adds. “That’s a rather profound sort of driver.”
One of the key differences between a cognizant computer and a traditional one is the idea of training. A cognitive system won’t just continue to give the same wrong or unhelpful answer; if it arrives at the wrong conclusion, it can change its approach and try again.
“In a cognitive machine, you set it up and run it, but it observes,” Meyerson says. “And that’s very different because it statistically calculates an end result. However, if that answer is incorrect and you tell it, it’ll actually re-weight those probabilities that led it to get the wrong answer and eventually get to the right answer.”
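The re-weighting loop Meyerson describes can be pictured with a toy example. The sketch below is our illustration, not IBM's system: a hypothetical classifier keeps a probability weight per candidate answer, and when a trainer flags its top guess as wrong, it cuts that answer's weight and tries again.

```python
# Toy sketch (not IBM's actual system): a classifier that "re-weights"
# the probabilities behind a wrong answer, in the spirit of the training
# loop Meyerson describes.

def reweight(weights, wrong_answer, penalty=0.5):
    """Halve the weight of the answer flagged as incorrect,
    then renormalize so the weights remain probabilities."""
    weights = dict(weights)
    weights[wrong_answer] *= penalty
    total = sum(weights.values())
    return {k: v / total for k, v in weights.items()}

def best_guess(weights):
    """Return the answer the system currently finds most probable."""
    return max(weights, key=weights.get)

# Start with no preference among three candidate answers.
beliefs = {"cat": 1 / 3, "dog": 1 / 3, "fox": 1 / 3}

# The trainer keeps saying the top guess is wrong until it reaches "fox";
# each round of feedback shifts probability away from the rejected answer.
while best_guess(beliefs) != "fox":
    beliefs = reweight(beliefs, best_guess(beliefs))
```

The point of the sketch is the contrast with traditional programming: nothing here hard-codes the right answer; the system converges on it only through repeated corrective feedback.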
Cognition Does Not Equal Intelligence
Attributing human senses to machines can’t help but conjure images of androids or self-aware computers capable of independent thought and action. Meyerson says there’s a massive chasm separating cognitive computing and true artificial intelligence.
“This is really an assistive technology,” he explains. “It can’t go off on its own. It’s not designed to do that. What it’s designed to do, in fact, is respond to a human in an assistive manner. But by providing a human-style of input, it’s freed us from the task of programming and moved to the task of training. It simply has — not more intelligence — but more bandwidth, and there’s a huge difference between the two.”
What’s your take on cognitive computing? Is IBM on to something with PCs that can taste, smell, touch, hear and see? How would you use the technology? Share your thoughts in the comments.

IBM Predicts the Rise of Cognitive Computing
In its annual “5 in 5” prediction, IBM says that within five years we will see the rise of cognitive computers — machines that experience the world in a way similar to how humans do, through the five senses. Instead of interpreting an object as a set of data points, a cognitive system would look at it holistically, as an entity.
For example, instead of seeing a painting as a canvas covered in various colors and brushstrokes, a cognitive computer could interpret it simply as da Vinci’s Mona Lisa, or a forgery of it. Cognitive systems hold advantages over traditional computers: they are more efficient, and they can learn from their mistakes.
In five years, you will be able to touch through your phone. IBM is working on bringing a sense of touch to mobile devices, merging virtual and real-world experiences for a number of industries, including retail. Shoppers will be able to “feel” the texture and weave of a fabric or product by brushing a finger over the item’s image on a device’s screen.
In five years, computers will not only be able to look at images, but understand them. Computers will be trained to turn pictures and videos into features, identifying things like color distribution, texture patterns, edge information and motion information. A pixel will be worth a thousand words.
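The prediction's phrase "turn pictures into features" can be made concrete with a small example. The sketch below is ours, not IBM's pipeline: it reduces an image to one of the features named above, a color distribution, using a tiny hand-built image so no imaging library is needed.

```python
# Illustrative sketch (our example, not IBM's pipeline): computing a coarse
# color-distribution feature -- one of the feature types the prediction
# names -- from a list of (r, g, b) pixels.

def color_histogram(pixels, bins=4):
    """Count how many pixels fall into each coarse (r, g, b) bucket."""
    step = 256 // bins
    hist = {}
    for r, g, b in pixels:
        bucket = (r // step, g // step, b // step)
        hist[bucket] = hist.get(bucket, 0) + 1
    return hist

# A 2x2 "image": three reddish pixels and one blueish pixel.
image = [(250, 10, 10), (240, 20, 5), (255, 0, 0), (10, 10, 250)]

features = color_histogram(image)
# The most common bucket summarizes the image's dominant color.
dominant = max(features, key=features.get)
```

A trained system would then attach meaning to such features — recognizing, say, that this red-dominated distribution is consistent with a sunset photo — which is the step beyond raw pixels that the prediction describes.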
In five years, computers will hear what matters. Hearing systems of the future will be trained by “listening” to sounds, and will use this input to detect patterns and build models that decompose sounds. Machines will be used to predict when a tree might fall, or to translate “baby talk” so parents understand whether a baby’s fussing indicates hunger, tiredness or pain.
In five years, a computer system will know what you like to eat better than you do. A machine that experiences flavor will determine the precise chemical structure of food and why people like it. Not only will it get you to eat healthier, but it will also surprise you with unusual food pairings designed to maximize your experience of taste and flavor. Digital taste buds will help you eat smarter.
In five years, computers will have a sense of smell. Sensors will be able to “smell” the markers of potential diseases and feed that data back into a cognitive system, which will alert us to possible health issues. Your phone will detect that you’re coming down with a cold or other illness before you do.