In a joint effort between the Carnegie Mellon Human-Computer Interaction Institute and Microsoft, researchers have developed wearable hardware that turns any surface into a usable workspace. Similar in design to the Microsoft Kinect for the Xbox 360, the OmniTouch device uses a short-range depth-sensing camera, a small pico projector and a 3D modeling system to understand where the user is touching. The projector and the camera have to be calibrated to the user so that the touch system accurately matches the user’s actions. The device can sense when a user’s hand is hovering over the surface as well as detect the change in depth when a user taps a button on the main surface.
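The core idea of distinguishing a hover from a tap can be pictured as comparing a fingertip’s depth against the depth of the underlying surface. The Python sketch below is a minimal illustration of that concept only; the thresholds, function names, and fingertip-detection step are assumptions for demonstration and are not taken from the OmniTouch implementation.

```python
import numpy as np

# Hypothetical thresholds in millimeters; the actual OmniTouch values are not published here.
HOVER_MAX_MM = 40.0   # fingertip within ~4 cm of the surface counts as "hovering"
TOUCH_MAX_MM = 8.0    # fingertip within ~8 mm of the surface counts as a "tap"

def classify_fingertip(depth_frame: np.ndarray, surface_depth: np.ndarray,
                       fingertip_px: tuple) -> str:
    """Classify one fingertip as 'touch', 'hover', or 'away'.

    depth_frame   -- per-pixel depth from the short-range depth camera (mm)
    surface_depth -- per-pixel depth of the calibrated projection surface (mm)
    fingertip_px  -- (row, col) of an already-detected fingertip in the depth image
    """
    r, c = fingertip_px
    # Height of the fingertip above the surface at that pixel.
    height_above_surface = surface_depth[r, c] - depth_frame[r, c]
    if height_above_surface <= TOUCH_MAX_MM:
        return "touch"
    if height_above_surface <= HOVER_MAX_MM:
        return "hover"
    return "away"

# Example: a synthetic 240x320 frame where the surface sits 500 mm away
# and one fingertip hovers about 20 mm above it.
surface = np.full((240, 320), 500.0)
frame = surface.copy()
frame[120, 160] = 480.0
print(classify_fingertip(frame, surface, (120, 160)))  # -> "hover"
```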
The surface can be anything from a nearby table to something as simple as the human hand. The current device is approximately the same size as the Kinect, but researchers plan to shrink it as they continue to refine the hardware. A demonstration showed users dialing a number on a projected phone interface, likely tying into a future ability to place calls from a smartphone. Another example alerted the user to a new email; the user could then select a surface and define a region where the email would appear. Users could also zoom and pan within the application.
When projecting the main screen on a surface like a desk or table, the user can bring up a full QWERTY keyboard to type. The user can also use their hand to pull up a secondary pop-up menu to interact with the program on the main screen. The research team completed the OmniTouch prototype while working at Microsoft Research in Redmond, Washington. For more information and media on the OmniTouch, take a look at the personal Web page of one of the researchers.