SixthSense

From Wikipedia, the free encyclopedia
[Image: Steve Mann wearing a camera+projector dome in 1998, which he used as one node of the collaborative Telepointer system]
[Image: Pranav Mistry wearing a similar device in 2012, which he and Maes and Chang named "WUW", for Wear yoUr World.[1]]
[Image: SixthSense HMD prototype by Pranav Mistry, MIT Media Lab]

SixthSense is a gestural interface device comprising a neckworn pendant that contains both a data projector and camera. Headworn versions were also built at MIT Media Lab in 1997 that combined cameras and illumination systems for interactive photographic art, and also included gesture recognition (e.g. finger-tracking using colored tape on the fingers).[2]

SixthSense is a name for extra information supplied by a wearable computer, such as the device called "WuW" (Wear yoUr World) by Pranav Mistry et al., building on the concept of the Telepointer, a neckworn projector and camera combination first proposed and reduced to practice by MIT Media Lab student Steve Mann.[3][4]

Origin of the "Sixth Sense" name

Sixth Sense technology (a camera combined with a light source) was developed in 1997 (headworn) and 1998 (neckworn), but the "Sixth Sense" name for this work was first coined and published in 2001. Mann referred to this wearable computing technology as affording a "Synthetic Synesthesia of the Sixth Sense", i.e. the idea that wearable computing and digital information can act as an additional (i.e. sixth) sense.[5] Pattie Maes (also with MIT Media Lab) later used the term "Sixth Sense" in this same context, in her 2009 TED talk.

Construction and workings

The SixthSense technology comprises a pocket projector, a mirror and a camera contained in a head-mounted, handheld or pendant-like wearable device. Both the projector and the camera are connected to a mobile computing device in the user's pocket. The projector projects visual information, enabling surfaces, walls and physical objects around the user to be used as interfaces, while the camera recognizes and tracks the user's hand gestures and physical objects using computer-vision techniques. The software processes the video stream captured by the camera and tracks the locations of the colored markers (visual tracking fiducials) at the tips of the user's fingers. The movements and arrangements of these fiducials are interpreted as gestures that act as interaction instructions for the projected application interfaces. SixthSense supports multi-touch and multi-user interaction.
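The gesture-interpretation step described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not the project's actual code: it assumes an upstream vision stage (not shown) has already detected the colored fingertip fiducials in each frame and reported their pixel centroids, and that two-finger spread maps to zoom while a rigid translation maps to pan, as in the demonstrated map application.

```python
# Hypothetical sketch: interpreting tracked fiducial positions as gestures.
# Assumes an upstream computer-vision stage (not shown) that detects the
# colored fingertip markers in each camera frame and yields their centroids.
import math

def interpret_two_finger_gesture(prev, curr, threshold=5.0):
    """Classify the movement of two fingertip fiducials between frames.

    prev, curr: ((x1, y1), (x2, y2)) marker centroids in pixels.
    Returns "zoom_in", "zoom_out", "pan", or "idle".
    """
    # Change in distance between the two fingers: spreading vs. pinching.
    spread = math.dist(curr[0], curr[1]) - math.dist(prev[0], prev[1])
    # Displacement of the midpoint: both fingers translating together.
    shift = math.dist(
        ((prev[0][0] + prev[1][0]) / 2, (prev[0][1] + prev[1][1]) / 2),
        ((curr[0][0] + curr[1][0]) / 2, (curr[0][1] + curr[1][1]) / 2),
    )
    if abs(spread) > threshold:
        return "zoom_in" if spread > 0 else "zoom_out"
    if shift > threshold:
        return "pan"
    return "idle"

# Fingers moving apart between two frames is read as a zoom-in.
print(interpret_two_finger_gesture(((100, 100), (120, 100)),
                                   ((90, 100), (130, 100))))  # zoom_in
```

A real implementation would smooth the centroids over several frames and calibrate the threshold to the camera resolution; the values here are illustrative.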

Example applications

Augmented reality newspaper
Gestural camera

During the TED talk given by Professor Pattie Maes,[6] she showed a video demonstrating a number of applications of the SixthSense system:

  1. (2:35) Four colored cursors are controlled by four fingers wearing different colored markers in real time. The projector displays video feedback to the user on a vertical wall.
  2. (3:15) The projector displays a map on the wall, and the user can control the map using zoom and pan gestures.
  3. (3:20) The user can make a frame gesture to instruct the camera to take a picture. It is hinted that the photo will be automatically cropped to remove the user's hands.
  4. (3:30) The system can project multiple photos on a wall, and the user can sort, re-size and organize them with gestures. This application was called Reality Window Manager (RWM) in Mann's headworn implementation of Sixth Sense.[7]
  5. (4:07) A number pad is projected onto the user's palm, and the user can dial a phone number by touching the palm with a finger. It was hinted that the system is able to pinpoint the location of the palm, and that the camera and projector can adjust themselves for surfaces that are not horizontal.
  6. (5:15) The user can pick up a product in a supermarket (e.g. a package of paper towels), and the system can display related information (e.g. the amount of bleach used) back on the product itself.
  7. (5:55) The system can recognize any book picked up by the user and display its Amazon rating on the book cover.
  8. (6:14) As the user opens a book, the system can display additional information such as reader's comments.
  9. (6:19) The system is able to recognize individual pages of a book and display annotations made by the user's friends. This demo also hinted at the system's ability to handle tilted surfaces.
  10. (6:35) The system is able to recognize newspaper articles and project the most recent video on the news event on a blank region of the newspaper.
  11. (6:46) The system is able to recognize people by their appearance and project a word cloud of related information, retrieved from the internet, onto the person's body.
  12. (7:10) The system is able to recognize a boarding pass and display related information such as flight delay and gate change.
  13. (7:18) The user can draw a circle on his wrist, and the system will project a clock on it. This demo hinted at the system's ability to accurately detect the location of the wrist.
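The palm number-pad demo (item 5) reduces to a hit-test: once the camera has located the projected keypad, a fingertip coordinate must be mapped to a key. The following is a hypothetical sketch under assumed geometry (a 3x4 telephone layout whose position and projected size the vision stage has already measured); it is not the project's actual code.

```python
# Hypothetical sketch of the palm number-pad demo: mapping a fingertip
# position inside a projected 3x4 telephone keypad to the touched key.
# The keypad geometry and all coordinates are illustrative assumptions.

KEYS = ["1", "2", "3",
        "4", "5", "6",
        "7", "8", "9",
        "*", "0", "#"]

def key_at(x, y, pad_x, pad_y, pad_w, pad_h):
    """Return the key under fingertip (x, y), or None if outside the pad.

    (pad_x, pad_y) is the keypad's top-left corner as located by the
    camera; pad_w x pad_h is its projected size in pixels.
    """
    if not (pad_x <= x < pad_x + pad_w and pad_y <= y < pad_y + pad_h):
        return None
    col = int((x - pad_x) * 3 / pad_w)  # 3 columns of keys
    row = int((y - pad_y) * 4 / pad_h)  # 4 rows of keys
    return KEYS[row * 3 + col]

# A touch in the middle of the top row lands on "2".
print(key_at(55, 15, 0, 0, 90, 120))  # 2
```

The talk also hints that the projection adapts to non-horizontal surfaces; in practice that would add a perspective correction (a homography) before this lookup, which is omitted here.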

Despite wearing the device during the presentation, Professor Maes did not give a live demonstration of the technology. During the talk she emphasized repeatedly that the SixthSense technology was a work in progress; however, it was never clarified whether the video demos showed real working prototypes or merely mock-ups illustrating the concept.

Current status

Although the SixthSense technology achieved wide press coverage in 2009, no commercial product had been released at that time. As of September 2013, the published open-source code had not been updated since October 2012,[8] and the Java development branch of the project was similarly stalled.[9] With many users encountering difficulties compiling and running the source code, the technology itself has not spread as widely as its media coverage. Pranav Mistry hinted at several reasons for the delay, including the need to incorporate newer hardware and to remove dependencies on proprietary Microsoft code libraries.[10]

References

  1. ^ "WUW - wear Ur world: a wearable gestural interface", Proceedings of CHI EA '09: Extended Abstracts on Human Factors in Computing Systems, pp. 4111–4116, ACM, New York, NY, USA.
  2. ^ "Wearable Computing: A First Step Toward Personal Imaging", IEEE Computer, Vol. 30, No. 2, February 1997, pp. 25–32.
  3. ^ "IEEE ISWC P. 177" (PDF). Retrieved 2013-10-07. 
  4. ^ "Cyborg: Digital Destiny and Human Possibility in the Age of the Wearable Computer", Steve Mann with Hal Niedzviecki, ISBN 0385658257 (Hardcover), Random House Inc, 304 pages, 2001.
  5. ^ An Anatomy of the New Bionic Senses, by James Geary, 2002, 214 pp.
  6. ^ "Pattie Maes + Pranav Mistry: Meet the SixthSense interaction", video on TED.com. Retrieved 2013-12-09.
  7. ^ Intelligent Image Processing, Wiley, 2001
  8. ^ "sixthsense/sixthsense · GitHub". Github.com. Retrieved 2013-09-29. 
  9. ^ "Poincare/sixthsense · GitHub". Github.com. Retrieved 2013-09-29.
  10. ^ Brown, Jesse (2011-02-25). "Stuck between invention and implementation - Jesse Brown, Science & Technology, Technology". Macleans.ca. Retrieved 2013-09-29. 

External links