With the announcement of Microsoft Surface, we found details on how the technology might work.
In its Popular Mechanics interview, Microsoft noted that the display uses a series of IR sensors to detect objects on the surface, allowing those objects to act as controls. A recently published Microsoft patent application provides further insight (and images) into how these sensors work.
Microsoft’s patent application, filed in June 2005, for an “Optical flow-based manipulation of graphical objects” covers what appears to be some of the core technology used in Surface. While touch screens alone allow for manipulation of objects, Microsoft is using IR sensors to recognize physical actions above the display (not touching it), adding an additional layer of input and direction.
From the application:
One aspect of this method processes the motion of points in a patch in an image of the display surface that represents at least a portion of a physical object that is being moved. In some applications of the subject matter discussed below, the movement of a physical object will only be applied in manipulating a graphical object if a portion of the physical object corresponding to the patch is actually touching the display surface. Alternatively, optical flow can be computed for points in one or more patches representing portions of the physical object that are proximate to the display surface…
Objects above display surface 64a include a “touch” object 76a that rests atop the display surface and a “hover” object 76b that is close to but not in actual contact with the display surface. As a result of using translucent layer 64b under the display surface to diffuse the IR light passing through the display surface, as an object approaches the top of display surface 64a, the amount of IR light that is reflected by the object increases to a maximum level that is achieved when the object is actually in contact with the display surface.
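To make the patent’s idea concrete, here is a minimal sketch (not Microsoft’s implementation) of the two mechanisms the excerpt describes: classifying a patch as “touch” or “hover” by how much IR light it reflects, and translating a graphical object only when a touching patch moves. The intensity thresholds and the centroid-shift motion estimate are assumptions for illustration; the patent describes per-point optical flow.

```python
# Hypothetical sketch of touch/hover sensing from an IR camera frame.
# A frame is a 2D list of reflected-IR intensities (0-255).

TOUCH_THRESHOLD = 200   # assumed: reflection at/above this = in contact
HOVER_THRESHOLD = 120   # assumed: in [HOVER, TOUCH) = near the surface

def classify(intensity):
    """Label a pixel by how much IR it reflects back to the sensor."""
    if intensity >= TOUCH_THRESHOLD:
        return "touch"
    if intensity >= HOVER_THRESHOLD:
        return "hover"
    return "none"

def patch_centroid(frame, label):
    """Mean (row, col) of all pixels carrying the given label, or None."""
    pts = [(r, c)
           for r, row in enumerate(frame)
           for c, v in enumerate(row)
           if classify(v) == label]
    if not pts:
        return None
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def move_object(obj_pos, prev_frame, cur_frame):
    """Apply the patch's motion to a graphical object, but only if it
    is actually touching -- hovering patches cause no manipulation."""
    before = patch_centroid(prev_frame, "touch")
    after = patch_centroid(cur_frame, "touch")
    if before is None or after is None:
        return obj_pos
    dr, dc = after[0] - before[0], after[1] - before[1]
    return (obj_pos[0] + dr, obj_pos[1] + dc)
```

For example, a bright (touching) patch that slides one pixel to the right between frames drags the object one pixel right, while a dimmer (hovering) patch leaves it in place, matching the patent’s distinction between objects 76a and 76b.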
In addition to the IR sensors, Microsoft smartly adopted multi-touch technology from Jeff Han’s research, presented at TED early last year:
Combining multi-touch sensing with IR sensing of the physical world above the display makes for an integrated experience. When announcing the iPhone, Steve Jobs claimed the finger is much better than a stylus; with Surface, Microsoft adds even more of the physical world to that list.