OmniTouch

OmniTouch can project multitouch interfaces onto everyday surfaces, including the skin.

OmniTouch is a wearable computer, depth-sensing camera, and projection system that enables interactive multitouch interfaces on everyday surfaces. Beyond the shoulder-worn unit, no instrumentation of the user or the environment is required. The shoulder-worn implementation allows users to manipulate interfaces projected onto the environment (e.g., walls, tables), held objects (e.g., notepads, books), and their own bodies (e.g., hands, lap). On such surfaces, without any calibration, OmniTouch provides capabilities similar to those of a touchscreen: X and Y location in 2D interfaces and whether fingers are “clicked” or hovering. This enables a wide variety of applications, similar to what one might find on a modern smartphone. A user study assessing the pointing accuracy of the system (user and system inaccuracies combined) suggested that buttons needed to be 2.3 cm (0.91 in) in diameter for reliable operation on the hand and 1.6 cm (0.63 in) on walls. This approaches the accuracy of capacitive touchscreens, such as those found in smartphones, but on arbitrary surfaces.
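
The “clicked” versus hovering distinction described above is derived from depth data: the system estimates how far a tracked fingertip sits from the surface beneath it. The following is a minimal illustrative sketch of that idea, not the published implementation; the 10 mm threshold is an assumed value chosen for illustration.

    # Illustrative sketch only: classify a tracked fingertip as "clicked" or
    # "hovering" by comparing its measured depth with the depth of the surface
    # directly beneath it. The 10 mm threshold is an assumption, not a
    # parameter taken from the OmniTouch paper.
    def classify_touch(finger_depth_mm, surface_depth_mm, click_threshold_mm=10.0):
        """Return 'clicked' if the fingertip lies within the threshold of the surface."""
        gap = surface_depth_mm - finger_depth_mm  # fingertip is nearer the camera than the surface
        return "clicked" if abs(gap) <= click_threshold_mm else "hovering"

    # Example: fingertip at 842 mm, surface at 850 mm -> "clicked"
    print(classify_touch(842.0, 850.0))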

OmniTouch was developed by researchers Chris Harrison, Hrvoje Benko and Andy Wilson at Microsoft Research in 2011. The work was accepted to and presented at the 2011 ACM Symposium on User Interface Software and Technology (UIST). Many major news outlets and online technology blogs covered the technology.[1][2][3][4][5][6]

It is conceptually similar to efforts such as Skinput and SixthSense. A central contribution of the work was a novel depth-driven, fuzzy template-matching approach to finger tracking and click registration. The system also detects and tracks surfaces suitable for projection, onto which interactive applications are rendered.
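
As a rough illustration of how finger-like shapes can be located in a depth image, the sketch below scans a single row for narrow “dips” that sit noticeably nearer the camera than the surrounding surface. This is only one simple approximation of finger detection; the width range and depth threshold are assumptions chosen for illustration, and the fuzzy template matching described in the OmniTouch work is more involved.

    # Illustrative sketch: find finger-like candidates in one row of a depth
    # image by looking for short runs of pixels that are noticeably closer to
    # the camera than the row's estimated surface depth. All numeric
    # parameters are assumed values for demonstration purposes.
    import numpy as np

    def find_finger_candidates(depth_row_mm, min_width=4, max_width=20, dip_mm=15.0):
        """Return (start, end) index pairs of finger-like dips in a depth row."""
        candidates = []
        baseline = np.median(depth_row_mm)           # rough surface depth for this row
        closer = depth_row_mm < (baseline - dip_mm)  # pixels noticeably nearer the camera
        start = None
        for i, flag in enumerate(closer):
            if flag and start is None:
                start = i
            elif not flag and start is not None:
                if min_width <= i - start <= max_width:
                    candidates.append((start, i))
                start = None
        return candidates

    # Example: a flat surface at 900 mm with an 8-pixel-wide finger dip at 880 mm
    row = np.full(64, 900.0)
    row[30:38] = 880.0
    print(find_finger_candidates(row))  # [(30, 38)]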

Citations

  1. ^ Mitroff, Sarah (19 October 2011). "OmniTouch Turns Everything Into a Touchscreen". PC World. Retrieved 6 November 2011.
  2. ^ Weir, Bill (3 November 2011). "The End of Keyboards & Monitors: the OmniTouch". ABC News. Retrieved 28 January 2012.
  3. ^ Tarantola, Andrew (17 October 2011). "The OmniTouch Makes Any Surface Interactive". Gizmodo. Retrieved 28 January 2012.
  4. ^ Smith, Mat (18 October 2011). "OmniTouch projection interface makes the world your touchscreen". Engadget. Retrieved 28 January 2012.
  5. ^ Graham-Rowe, Duncan (18 October 2011). "Kinect Turns Any Surface Into a Touch Screen". Technology Review. Retrieved 28 January 2012.
  6. ^ "Turn Any Surface Into a Touch Screen". Forbes. 19 October 2011. Archived from the original on 21 October 2011. Retrieved 28 January 2012.