IBM Researchers Predict Future Computers Will See, Smell, Touch, Taste and Hear
IBM on Monday unveiled the seventh annual "IBM 5 in 5" - a list of innovations that have the potential to change the way people work, live and interact during the next five years.
The IBM 5 in 5 is based on market and societal trends as well as emerging technologies from IBM's R&D labs around the world that can make these transformations possible.
This year's IBM 5 in 5 explores innovations that will be the underpinnings of the next era of computing, which IBM describes as the era of cognitive systems. This new generation of machines will learn, adapt, sense and begin to experience the world as it really is. The predictions focus on one element of the new era, the ability of computers to mimic the human senses -- in their own way, to see, smell, touch, taste and hear.
"IBM scientists around the world are collaborating on advances that will help computers make sense of the world around them," said Bernie Meyerson, IBM Fellow and VP of Innovation. "Just as the human brain relies on interacting with the world using multiple senses, by bringing combinations of these breakthroughs together, cognitive systems will bring even greater value and insights, helping us solve some of the most complicated challenges."
IBM scientists are developing applications for the retail, healthcare and other sectors that use haptic, infrared and pressure-sensitive technologies to simulate touch -- for example, letting a shopper feel the texture and weave of a fabric as she brushes her finger over an image of the item on a device screen. "Utilizing the vibration capabilities of the phone, every object will have a unique set of vibration patterns that represents the touch experience: short fast patterns, or longer and stronger strings of vibrations. The vibration pattern will differentiate silk from linen or cotton, helping simulate the physical sensation of actually touching the material," IBM says.
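As a rough illustration of that idea, the sketch below maps each fabric to a distinctive sequence of vibration pulses. The materials, pulse values and the emit_pattern() hook are all invented for illustration; a real implementation would drive the device's haptic API rather than print.

```python
# Hypothetical sketch: each material maps to a vibration "signature" --
# a sequence of (duration_ms, intensity) pulses, per the article's idea.
VIBRATION_SIGNATURES = {
    # smooth fabric: short, fast, gentle pulses
    "silk":   [(20, 0.2), (20, 0.2), (20, 0.2)],
    # coarser weave: longer, stronger pulses
    "linen":  [(60, 0.6), (40, 0.5), (60, 0.6)],
    # in between: medium pulses with a regular rhythm
    "cotton": [(40, 0.4), (40, 0.4), (40, 0.4)],
}

def pattern_for(material: str) -> list[tuple[int, float]]:
    """Look up the touch signature for the material shown on screen."""
    return VIBRATION_SIGNATURES.get(material, [(30, 0.3)])  # default pulse

def emit_pattern(pattern: list[tuple[int, float]]) -> None:
    """Stand-in for a device haptics call; here we just print each pulse."""
    for duration_ms, intensity in pattern:
        print(f"vibrate {duration_ms} ms at intensity {intensity}")

emit_pattern(pattern_for("silk"))
```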
Computers today understand pictures only through the text we use to tag or title them; the majority of the information -- the actual content of the image -- remains a mystery.
In the next five years, IBM says, systems will not only look at and recognize the contents of images and other visual data, they will turn pixels into meaning, beginning to make sense of them much as a human views and interprets a photograph. In the future, "brain-like" capabilities will let computers analyze features such as color, texture patterns and edge information to extract insights from visual media. This will have a profound impact on industries such as healthcare, retail and agriculture.
Within five years, these capabilities will be put to work in healthcare, making sense of massive volumes of medical imagery -- MRIs, CT scans, X-rays and ultrasound images -- to capture information tailored to particular anatomies or pathologies. What is critical in these images can be subtle or invisible to the human eye and requires careful measurement. By being trained to discriminate what to look for in images -- such as differentiating healthy from diseased tissue -- and correlating that with patient records and scientific literature, systems that can "see" will help doctors detect medical problems with far greater speed and accuracy.
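To make the pipeline concrete, here is a deliberately toy sketch of the approach the prediction describes: extract simple visual features, then classify new images against labeled examples. The brightness-histogram and edge-density features, the nearest-centroid rule and the synthetic "tissue" data are illustrative stand-ins, not IBM's method.

```python
# Toy sketch: features -> per-class centroids -> nearest-centroid label.
import numpy as np

def features(img: np.ndarray) -> np.ndarray:
    """Intensity histogram plus a crude edge-density measure."""
    hist, _ = np.histogram(img, bins=8, range=(0.0, 1.0), density=True)
    edges = np.abs(np.diff(img, axis=1)).mean()  # mean horizontal gradient
    return np.append(hist, edges)

def train_centroids(examples: dict[str, list[np.ndarray]]) -> dict[str, np.ndarray]:
    """One mean feature vector per class label."""
    return {label: np.mean([features(im) for im in ims], axis=0)
            for label, ims in examples.items()}

def classify(img: np.ndarray, centroids: dict[str, np.ndarray]) -> str:
    f = features(img)
    return min(centroids, key=lambda lbl: np.linalg.norm(f - centroids[lbl]))

# Synthetic stand-ins: "healthy" tissue is smooth, "diseased" is noisy.
rng = np.random.default_rng(0)
healthy = [np.full((32, 32), 0.5) + rng.normal(0, 0.02, (32, 32)) for _ in range(5)]
diseased = [rng.random((32, 32)) for _ in range(5)]
centroids = train_centroids({"healthy": healthy, "diseased": diseased})
print(classify(rng.random((32, 32)), centroids))  # noisy test image -> "diseased"
```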
A distributed system of clever sensors will detect elements of sound such as sound pressure, vibrations and sound waves at different frequencies, IBM researchers say. It will interpret these inputs to predict when trees will fall in a forest or when a landslide is imminent. Such a system will "listen" to our surroundings and measure movements or the stress in a material to warn us if danger lies ahead.
Sensors will detect raw sounds, and the system receiving this data will -- much like the human brain -- take into account other "modalities," such as visual or tactile information, to classify and interpret the sounds based on what it has learned. When new sounds are detected, the system will form conclusions by combining previous knowledge with its ability to recognize patterns.
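A hedged sketch of that flow: each sensor reports a couple of acoustic measurements, the system matches them to sound signatures it has learned, and an alert fires when enough sensors agree. The signatures, labels and danger rule below are invented for illustration.

```python
# Illustrative sound classification over a small network of sensors.
import math

KNOWN_SOUNDS = {
    # (dominant frequency in Hz, sound pressure level in dB)
    "wind in canopy":  (300.0, 55.0),
    "trunk creaking":  (80.0, 70.0),
    "ground movement": (20.0, 65.0),
}

DANGER = {"trunk creaking", "ground movement"}

def classify(freq_hz: float, level_db: float) -> str:
    """Nearest known signature by Euclidean distance."""
    return min(KNOWN_SOUNDS,
               key=lambda name: math.dist((freq_hz, level_db), KNOWN_SOUNDS[name]))

def assess(readings: list[tuple[float, float]]) -> bool:
    """Alert if most sensors in the area hear a danger sound."""
    labels = [classify(f, db) for f, db in readings]
    return sum(lbl in DANGER for lbl in labels) > len(labels) / 2

print(assess([(75.0, 72.0), (85.0, 68.0), (310.0, 50.0)]))  # -> True
```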
In the next five years, by learning about emotion and sensing mood, systems will pinpoint aspects of a conversation, analyzing pitch, tone and hesitancy to help us hold more productive dialogues -- improving customer call center interactions, for example, or helping us interact across cultures.
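One way to picture those cues: from per-frame speech measurements, derive a hesitancy score (pause ratio) and pitch variability, then map them to a coarse mood hint. The thresholds and labels below are hypothetical, not IBM's models.

```python
# Hypothetical conversational-cue extraction from per-frame measurements.
from statistics import pstdev

def conversation_cues(frames: list[tuple[bool, float]]) -> dict[str, float]:
    """frames: (is_speech, pitch_hz) per short audio frame."""
    pauses = sum(1 for speaking, _ in frames if not speaking)
    pitches = [p for speaking, p in frames if speaking]
    return {
        "hesitancy": pauses / len(frames),
        "pitch_variability": pstdev(pitches) if len(pitches) > 1 else 0.0,
    }

def mood_hint(cues: dict[str, float]) -> str:
    if cues["hesitancy"] > 0.4:
        return "uncertain -- slow down and clarify"
    if cues["pitch_variability"] > 40.0:
        return "agitated -- acknowledge and de-escalate"
    return "neutral"

frames = [(True, 180.0), (False, 0.0), (True, 220.0), (False, 0.0), (False, 0.0)]
print(mood_hint(conversation_cues(frames)))  # -> "uncertain -- ..."
```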
Today, IBM scientists are beginning to capture underwater noise levels in Galway Bay, Ireland, using underwater sensors that capture sound waves and transmit them to a receiving system for analysis, in order to understand the sounds and vibrations of wave energy conversion machines and their impact on sea life.
IBM researchers are also developing a computing system that actually experiences flavor, to be used by chefs to create tastier and more novel recipes. It will break ingredients down to the molecular level and blend the chemistry of food compounds with the psychology behind the flavors and smells humans prefer. By comparing this with millions of recipes, the system will be able to create new flavor combinations that pair, for example, roasted chestnuts with other foods such as cooked beetroot, fresh caviar and dry-cured ham.
A system like this could also help us eat healthier, creating novel flavor combinations that make us crave a vegetable casserole instead of potato chips.
The computer will use algorithms to determine the precise chemical structure of food and why people like certain tastes. These algorithms will examine how chemicals interact with each other, the molecular complexity of flavor compounds and their bonding structure, and use that information, together with models of perception, to predict the taste appeal of flavors.
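One well-known pairing heuristic consistent with that description scores ingredient pairs by the flavor compounds they share, weighted by a model of how pleasant each compound is perceived to be. The compound lists and pleasantness values below are illustrative, not real chemistry data.

```python
# Illustrative compound-sharing heuristic for flavor pairing.
COMPOUNDS = {
    "roasted chestnut": {"furfural", "pyrazine", "maltol"},
    "cooked beetroot":  {"geosmin", "pyrazine", "furfural"},
    "caviar":           {"trimethylamine", "hexanal"},
    "dry-cured ham":    {"hexanal", "furfural", "maltol"},
}

# Invented perception model: 0 (unpleasant) to 1 (pleasant).
PLEASANTNESS = {"furfural": 0.8, "pyrazine": 0.7, "maltol": 0.9,
                "geosmin": 0.3, "trimethylamine": 0.2, "hexanal": 0.5}

def pairing_score(a: str, b: str) -> float:
    """Sum the modeled pleasantness of the compounds two ingredients share."""
    shared = COMPOUNDS[a] & COMPOUNDS[b]
    return sum(PLEASANTNESS[c] for c in shared)

base = "roasted chestnut"
ranked = sorted((pairing_score(base, other), other)
                for other in COMPOUNDS if other != base)
print(ranked[::-1])  # best-scoring partners for chestnut first
```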
During the next five years, tiny sensors embedded in your computer or cell phone will detect if you're coming down with a cold or other illness. By analyzing the odors, biomarkers and thousands of molecules in someone's breath, and detecting which odors are normal and which are not, doctors will have help diagnosing and monitoring the onset of ailments such as liver and kidney disorders, asthma, diabetes and epilepsy.
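In the spirit of that prediction, a minimal anomaly check might compare a breath sample's molecular readings against a personal baseline and flag large deviations. The molecule names, baseline levels and threshold here are made up for illustration.

```python
# Hypothetical breath-analysis anomaly check against a personal baseline.
BASELINE = {"acetone": 1.0, "ammonia": 0.5, "isoprene": 0.8}  # arbitrary units

def breath_alert(sample: dict[str, float], threshold: float = 2.0) -> list[str]:
    """Return molecules whose level is far outside the personal baseline."""
    return [m for m, level in sample.items()
            if m in BASELINE and level > threshold * BASELINE[m]]

# Elevated acetone simply trips the toy threshold here.
print(breath_alert({"acetone": 3.5, "ammonia": 0.6, "isoprene": 0.7}))  # -> ["acetone"]
```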
Today, IBM scientists are already sensing environmental conditions and gases to preserve works of art. This innovation is beginning to be applied to clinical hygiene, one of the biggest challenges in healthcare today. For example, antibiotic-resistant bacteria such as methicillin-resistant Staphylococcus aureus (MRSA), which in 2005 was associated with almost 19,000 hospital-stay-related deaths in the United States, are commonly found on the skin and can be easily transmitted wherever people are in close contact. One way of fighting MRSA exposure in healthcare institutions is ensuring that medical staff follow clinical hygiene guidelines. In the next five years, IBM technology will "smell" surfaces for disinfectants to determine whether rooms have been sanitized. Using novel wireless "mesh" networks, sensors will gather and measure data on various chemicals, continuously learning and adapting to new smells over time.
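A small sketch of that last scenario: sensor nodes on a mesh report disinfectant readings per room, and a room counts as sanitized only if every reporting node detects enough residue. The threshold and message format are hypothetical.

```python
# Hypothetical aggregation of mesh-network disinfectant readings by room.
from collections import defaultdict

MIN_DISINFECTANT_PPM = 50.0  # invented threshold

def sanitized_rooms(readings: list[tuple[str, float]]) -> dict[str, bool]:
    """readings: (room_id, disinfectant_ppm) messages from mesh nodes."""
    by_room: dict[str, list[float]] = defaultdict(list)
    for room, ppm in readings:
        by_room[room].append(ppm)
    # A room passes only if its weakest reading clears the threshold.
    return {room: min(vals) >= MIN_DISINFECTANT_PPM
            for room, vals in by_room.items()}

msgs = [("icu-3", 72.0), ("icu-3", 65.0), ("ward-b", 12.0), ("ward-b", 58.0)]
print(sanitized_rooms(msgs))  # -> {"icu-3": True, "ward-b": False}
```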