New User Interface Enables Innovative Handling of Data by Directly Touching Objects
Fujitsu Laboratories has developed a new user interface technology that enables fingertip operations. The technology accurately detects where the user's finger is and what it is touching based on data captured by off-the-shelf cameras and projectors.
The new technology uses a camera together with a projector to trace a finger touching a document on a table, translates that motion into digital data, and displays the result. This simple interaction with objects makes the technology an interface between people and ICT services.
In recent years, commercial applications have been developed for hand gesture-based screen control technologies and speech interaction technologies, and new user interfaces have begun to be employed as replacements for the mouse, keyboard and touchscreen. At the same time, to test the viability of enhanced user interfaces, actually touching and moving objects has been suggested as a potential area of development. Until now, however, this approach has required special sensors to be embedded into objects.
If it were possible to perform contactless detection of touch operations on real-world objects, this would obviate the need to embed objects with special sensors. Current commercial gesture technologies function predominantly on operations performed in the air without touching anything. However, when a user's hands draw near to background objects, these technologies have difficulty differentiating between the hands and the background. As a result, this approach has been unsuitable for detecting touch operations. In addition, technologies that use infrared and other special devices to measure distance have also begun to be employed in user interfaces. These, however, lack the resolution to detect subtle changes in finger movements, and they are bulky and costly.
Fujitsu claims that its new technology can accurately and rapidly detect finger operations on actual objects using only an ordinary camera. The system automatically measures irregularly shaped objects on a table and then adjusts the coordinate systems of the camera, the projector, and the objects themselves. This makes it possible for the software to take the user's finger movements and touches on objects and match them with the digital display projected onto those objects.
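The release does not describe how the coordinate systems are aligned; a common approach for a planar tabletop is to fit a transform from known corner correspondences between the camera image and the projector raster. The sketch below assumes a simple affine model and illustrative corner values; Fujitsu's actual calibration (which also handles irregularly shaped objects) may well use a full projective or 3-D model instead.

```python
import numpy as np

def fit_affine(src_pts, dst_pts):
    """Least-squares affine transform mapping src_pts -> dst_pts.

    src_pts, dst_pts: (N, 2) sequences of corresponding 2-D points.
    Returns a 2x3 matrix A such that dst ~= A @ [x, y, 1].
    """
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    ones = np.ones((len(src), 1))
    X = np.hstack([src, ones])           # (N, 3) homogeneous source points
    A, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return A.T                           # (2, 3)

def camera_to_projector(A, pt):
    """Map one camera-pixel coordinate into projector coordinates."""
    x, y = pt
    return A @ np.array([x, y, 1.0])

# Illustrative values: the camera sees the table corners at these pixels...
cam_corners = [(102, 84), (618, 80), (622, 410), (98, 415)]
# ...and the projector addresses the same corners at these coordinates.
proj_corners = [(0, 0), (1024, 0), (1024, 768), (0, 768)]

A = fit_affine(cam_corners, proj_corners)
finger_cam = (360, 250)  # a detected fingertip, in camera pixels
print(camera_to_projector(A, finger_cam))
```

Once such a transform is fitted, any fingertip position found in the camera image can be mapped into projector coordinates so that projected content lines up with the touch.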
The technology recognizes the shape of the user's fingers by extracting their color and contour features. Complementary techniques adjust the color and brightness of the camera image according to the surrounding ambient light and correct for differences among individual fingers. Together, these yield a stable extraction of the form of the user's fingers, one that is minimally affected by the environment or by individual variation.
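The color-based step can be pictured as masking pixels that fall inside a skin-tone range and then examining the masked region's outline. The toy sketch below uses illustrative RGB thresholds on a synthetic frame; it does not attempt the ambient-light compensation or per-user correction the release describes.

```python
import numpy as np

def skin_mask(rgb, lo=(95, 40, 20), hi=(255, 200, 170)):
    """Boolean mask of pixels whose RGB values lie in [lo, hi] (illustrative thresholds)."""
    lo = np.array(lo)
    hi = np.array(hi)
    return np.all((rgb >= lo) & (rgb <= hi), axis=-1)

# Tiny synthetic frame: a skin-toned patch on a dark tabletop.
frame = np.zeros((6, 6, 3), dtype=np.uint8)
frame[2:5, 1:4] = (200, 140, 110)  # the "finger" pixels

mask = skin_mask(frame)
ys, xs = np.nonzero(mask)
# Bounding box of the masked region, a crude stand-in for contour analysis.
print(int(mask.sum()), (int(ys.min()), int(xs.min())), (int(ys.max()), int(xs.max())))
```

A real pipeline would trace the mask's contour and look for the curvature signature of a fingertip rather than a plain bounding box, but the masking stage is the same in spirit.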
By enhancing images of the user's fingertips, the technology is able to obtain a level of precision that is sufficient for touch detection, even using low-resolution images that can be captured on a regular webcam. Moreover, the technology is capable of fingertip tracking speeds of 300 mm per second, thereby enabling it to follow natural finger movements.
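The quoted 300 mm/s figure has a practical consequence for a tracker: it bounds how far the fingertip can move between frames, and hence how large a search window is needed around the previous position. In the back-of-the-envelope sketch below, only the 300 mm/s figure comes from the release; the frame rate and pixel pitch are assumptions.

```python
FPS = 30                 # assumed webcam frame rate
MAX_SPEED_MM_S = 300     # fingertip tracking speed stated in the release
PIXELS_PER_MM = 2.0      # assumed camera resolution on the tabletop

# Worst-case fingertip displacement between consecutive frames.
max_step_mm = MAX_SPEED_MM_S / FPS
# Minimum search-window radius, in pixels, around the last known position.
search_radius_px = max_step_mm * PIXELS_PER_MM
print(max_step_mm, search_radius_px)  # 10.0 20.0
```

Under these assumptions a fingertip moves at most 10 mm (20 px) per frame, so a modest local search suffices to follow natural finger movements.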
A variety of operations are possible with the newly developed technology. One is the ability to copy important parts of a document in digital form simply by tracing a finger across the physical document while it lies on a table. Another is the ability to project the copied data onto the tabletop and then enlarge or shrink the projected image by tracing with a finger. The technology can also capture graphic data from handwritten sticky notes attached to the table and let the user slide these newly digitized notes around a physical surface with a finger, making it possible to group and rearrange them or perform other useful operations. By enabling such easy interaction with actual objects, the technology serves as an interface between people and ICT services, thereby helping to expand the ways in which ICT is employed.
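The trace-to-copy operation above can be pictured as collecting the fingertip positions sampled during the trace and cropping their enclosing rectangle from the camera's view of the document. This is a minimal sketch with made-up coordinates, not Fujitsu's implementation, which would also need to rectify and enhance the cropped region.

```python
import numpy as np

def crop_traced_region(image, trace):
    """Crop the axis-aligned bounding box of the traced (x, y) fingertip samples."""
    xs = [p[0] for p in trace]
    ys = [p[1] for p in trace]
    x0, x1 = min(xs), max(xs) + 1
    y0, y1 = min(ys), max(ys) + 1
    return image[y0:y1, x0:x1]

doc = np.arange(100).reshape(10, 10)      # stand-in for a camera frame of the document
trace = [(2, 3), (6, 3), (6, 7), (2, 7)]  # fingertip samples (x, y) during the trace
clip = crop_traced_region(doc, trace)
print(clip.shape)  # (5, 5)
```

The cropped array would then be handed to the projector side, where the calibration mapping places it back on the tabletop for resizing or rearranging.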
Fujitsu Laboratories plans to evaluate the new system and its software applications in real usage environments, with the aim of commercializing the technology in fiscal 2014.