Skinput

From Wikipedia, the free encyclopedia
The Skinput system rendering a series of buttons on the arm. Users can press the buttons directly with their fingers, much like a touch screen.

Skinput is an input technology that uses bio-acoustic sensing to localize finger taps on the skin. When augmented with a pico-projector, the device can provide a direct-manipulation graphical user interface on the body. The technology was developed by Chris Harrison, Desney Tan, and Dan Morris at Microsoft Research's Computational User Experiences Group.[1] Skinput represents one way to decouple input from electronic devices, with the aim of allowing devices to become smaller without simultaneously shrinking the surface area on which input can be performed. While other systems, such as SixthSense, have attempted this with computer vision, Skinput employs acoustics, taking advantage of the human body's natural sound-conducting properties (e.g., bone conduction).[2] This allows the body to be annexed as an input surface without the skin needing to be invasively instrumented with sensors, tracking markers, or other items.

Microsoft has not commented on the future of the project, other than to say it is under active development. It has been reported that the technology may not appear in commercial devices for at least two years.[3]

Operation

Ten channels of acoustic data generated by three finger taps on the forearm, followed by three taps on the wrist. The exponential average of the channels is shown in red. Segmented input windows are highlighted in green. Note how different sensing elements are actuated by the two locations.
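The segmentation the caption describes can be sketched in a few lines: an exponentially weighted average of the rectified multi-channel signal is thresholded to find input windows. This is an illustrative reconstruction, not the published implementation; the smoothing factor, threshold, and synthetic signals below are all assumptions.

```python
# Sketch of tap segmentation by exponential averaging. The smoothing factor
# (alpha), the threshold, and the synthetic "tap" data are illustrative
# assumptions, not values from the Skinput paper.

def exponential_average(samples, alpha=0.2):
    """Return the exponential moving average of the rectified samples."""
    avg, out = 0.0, []
    for s in samples:
        avg = alpha * abs(s) + (1 - alpha) * avg
        out.append(avg)
    return out

def segment_taps(channels, threshold=0.5):
    """Return (start, end) windows where the cross-channel mean of the
    exponential averages exceeds the threshold."""
    envelopes = [exponential_average(ch) for ch in channels]
    mean_env = [sum(vals) / len(vals) for vals in zip(*envelopes)]
    windows, start = [], None
    for i, v in enumerate(mean_env):
        if v > threshold and start is None:
            start = i                      # window opens
        elif v <= threshold and start is not None:
            windows.append((start, i))     # window closes
            start = None
    if start is not None:
        windows.append((start, len(mean_env)))
    return windows

# Synthetic example: two "taps" on a quiet baseline, seen on two channels.
quiet, tap = [0.0] * 20, [2.0] * 10
ch1 = quiet + tap + quiet + tap + quiet
ch2 = quiet + tap + quiet + tap + quiet
print(segment_taps([ch1, ch2]))  # two windows, one per tap
```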

Skinput has been publicly demonstrated as an armband, which sits on the biceps. This prototype contains ten small cantilevered piezo elements configured to be highly resonant, sensitive to frequencies between 25 and 78 Hz.[4] This configuration acts like a mechanical fast Fourier transform and provides extreme out-of-band noise suppression, allowing the system to function even while the user is in motion. From the upper arm, the sensors can localize finger taps to any part of the arm, all the way down to the fingertips, with accuracies in excess of 90% (as high as 96% for five input locations).[5] Classification is driven by a support vector machine using a series of time-independent acoustic features that act like a fingerprint. Like speech recognition systems, the Skinput recognition engine must be trained on the "sound" of each input location before use. After training, locations can be bound to interactive functions, such as pause/play song, increase/decrease music volume, speed dial, and menu navigation.
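The train-then-classify loop above can be sketched as follows. The real system uses a support vector machine over acoustic features from ten sensor channels; this self-contained stand-in substitutes a nearest-centroid classifier over a simple per-channel amplitude "fingerprint", and all data and location names are synthetic assumptions.

```python
# Illustrative stand-in for Skinput's classification stage: a per-channel
# amplitude "fingerprint" and a nearest-centroid classifier in place of the
# real system's SVM. Locations, channel counts, and data are hypothetical.

def fingerprint(channels):
    """Time-independent feature vector: mean absolute amplitude per channel."""
    return [sum(abs(s) for s in ch) / len(ch) for ch in channels]

class TapClassifier:
    def __init__(self):
        self.centroids = {}  # location name -> averaged feature vector

    def train(self, location, examples):
        """Average the fingerprints of several training taps for a location."""
        feats = [fingerprint(channels) for channels in examples]
        n = len(feats)
        self.centroids[location] = [sum(col) / n for col in zip(*feats)]

    def classify(self, channels):
        """Return the trained location whose centroid is nearest to this tap."""
        f = fingerprint(channels)
        def dist(centroid):
            return sum((a - b) ** 2 for a, b in zip(f, centroid))
        return min(self.centroids, key=lambda loc: dist(self.centroids[loc]))

# Two hypothetical locations that excite the two sensors differently.
clf = TapClassifier()
clf.train("forearm", [[[1.0] * 8, [0.1] * 8] for _ in range(3)])
clf.train("wrist",   [[[0.1] * 8, [1.0] * 8] for _ in range(3)])
print(clf.classify([[0.9] * 8, [0.2] * 8]))  # → forearm
```

As in the real system, each location must be trained before use; a tap is then assigned to whichever trained location its fingerprint most resembles.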

With the addition of a pico-projector to the armband, Skinput allows users to interact with a graphical user interface displayed directly on the skin. This enables several interaction modalities, including button-based hierarchical navigation, list-based sliding navigation (similar to an iPod/smartphone/MID), text/number entry (e.g., a telephone number keypad), and gaming (e.g., Tetris, Frogger).[6][7]
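Once a tap location has been recognized, binding it to an interactive function (pause/play, volume, navigation) is a straightforward dispatch. A minimal sketch, in which the location names and bound actions are hypothetical:

```python
# Minimal sketch of binding recognized tap locations to interactive
# functions, as the text describes. Location names and actions are
# illustrative assumptions.

actions = {
    "palm":    lambda state: state.update(playing=not state["playing"]),
    "forearm": lambda state: state.update(volume=state["volume"] + 1),
    "wrist":   lambda state: state.update(volume=state["volume"] - 1),
}

def handle_tap(location, state):
    """Dispatch a recognized tap location to its bound action."""
    if location in actions:
        actions[location](state)
    return state

player = {"playing": False, "volume": 5}
handle_tap("palm", player)     # toggle play/pause
handle_tap("forearm", player)  # volume up
print(player)  # {'playing': True, 'volume': 6}
```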

Demonstrations

Despite being a Microsoft Research internal project, Skinput has been demonstrated publicly several times. The first public appearance was at Microsoft's TechFest 2010, where the recognition model was trained live on stage, during the presentation, followed by an interactive walkthrough of a simple mobile application with four modes: music player, email inbox, Tetris, and voice mail.[8] A similar live demo was given at the ACM CHI 2010 conference, where the academic paper received a "Best Paper" award. Attendees were allowed to try the system. Numerous media outlets have covered the technology,[9][10][11][12][13] with several featuring live demos.[14]

References

  1. ^ "Skinput: Appropriating the Body as an Input Surface". Microsoft Research Computational User Experiences Group. Retrieved 26 May 2010. 
  2. ^ Harrison, Chris; Tan, Desney; Morris, Dan (10–15 April 2010). "Skinput: Appropriating the Body as an Input Surface" (PDF). Proceedings of the ACM CHI Conference 2010. 
  3. ^ Goode, Lauren (26 April 2010). "The Skinny on Touch Technology". Wall Street Journal. 
  4. ^ Sutter, John (19 April 2010). "Microsoft's Skinput turns hands, arms into buttons". CNN. 
  5. ^ Ward, Mark (26 March 2010). "Sensors turn skin into gadget control pad". BBC News. 
  6. ^ "Skinput: Appropriating the Body as an Input Surface" (blog). Chrisharrison.net. 
  7. ^ "Skinput: Appropriating the Body as an Input Surface". YouTube (from CHI 2010 conference). 
  8. ^ Dudley, Brier (1 March 2010). "A peek at where Microsoft thinks we're going tomorrow". Seattle Times. 
  9. ^ Hope, Dan (4 March 2010). "'Skinput' turns body into touchscreen interface". MSNBC. 
  10. ^ Hornyak, Tom (2 March 2010). "Turn your arm into a phone with Skinput". CNET. 
  11. ^ Marks, Paul (1 March 2010). "Body acoustics can turn your arm into a touchscreen". New Scientist. 
  12. ^ Dillow, Clay (3 March 2010). "Skinput Turns Any Bodily Surface Into a Touch Interface". Popular Science. 
  13. ^ "Technology: Skin Used As An Input Device" (interview transcript). National Public Radio. 4 March 2010. 
  14. ^ Savov, Vladislav (2 March 2010). "Skinput: because touchscreens never felt right anyway" (video). Engadget. 
