Hands-free computing

From Wikipedia, the free encyclopedia

Hands-free computing is any computer configuration in which a user can interact without the use of the hands, which is otherwise a common requirement of human interface devices such as the mouse and keyboard. Hands-free computing is important because it is useful to both able-bodied and disabled users. Speech recognition systems can be trained to recognize specific commands, and upon confirmation of correctness, instructions can be given to a system without the use of the hands. This may be useful while driving, or to an inspector or engineer in a factory environment. Likewise, disabled persons may find hands-free computing important in their everyday lives, just as the visually impaired have found computers useful in theirs.[1]
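The command-and-confirm pattern described above can be sketched as follows. This is a minimal illustration, not any particular product's implementation: the command vocabulary and the confirmation step are assumptions, and a real system would obtain the recognized text from a speech recognition engine rather than from a string.

```python
# Illustrative sketch of a command-and-confirm loop for hands-free control.
# The command vocabulary below is hypothetical; a real system would receive
# `heard` from a speech recognition engine and prompt the user to confirm.

COMMANDS = {
    "open file": "OPEN_FILE",
    "scroll down": "SCROLL_DOWN",
    "click": "CLICK",
}

def interpret(heard: str):
    """Map a recognized utterance to a command, or None if unrecognized."""
    return COMMANDS.get(heard.strip().lower())

def execute(heard: str, confirmed: bool) -> str:
    """Run only commands the user has confirmed were correctly recognized."""
    command = interpret(heard)
    if command is None:
        return "UNRECOGNIZED"
    if not confirmed:
        return "CANCELLED"
    return command

print(execute("Scroll Down", confirmed=True))   # SCROLL_DOWN
print(execute("scroll down", confirmed=False))  # CANCELLED
print(execute("jump", confirmed=True))          # UNRECOGNIZED
```

Requiring confirmation before executing a command trades speed for safety, which matters when a misrecognized utterance could trigger a destructive action.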

Hands-free interfaces can range from those operated by the tongue, lips, mouth, or head movement to voice-activated interfaces that use speech recognition software with a microphone or Bluetooth technology.

Examples of available hands-free computing devices include mouth-operated joystick types such as the TetraMouse, the QuadJoy, the Jouse2 and the IntegraMouse; camera-based head tracking systems such as SmartNav, Tracker Pro, FreeTrack, HeadMouse Extreme, HeadMaster and KinesicMouse;[2] and speech recognition specialized for disabilities, such as Voice Finger. The joystick types require no physical connections to the user, which enhances the user's feeling of independence. Camera types require targets mounted on the user, usually with the help of a caregiver, which are sensed by the camera and its associated software. Camera types are sensitive to ambient lighting; the mouse pointer may drift, and inaccuracies can result from head movements not intended as mouse movements. Other examples of hands-free mice are units operated by switches that may be actuated by the feet or other parts of the body, such as the NoHands Mouse and the switch-adapted TetraMouse. Speech recognition specialized for disabilities and hands-free computing focuses more on low-level control of the keyboard and mouse than on usual areas such as dictation.
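The drift problem noted above for camera-based trackers is commonly mitigated with a dead zone, so that small head movements not intended as mouse movements are ignored. The following is a minimal sketch of that idea; the gain and dead-zone threshold are illustrative assumptions, as real systems calibrate such values per user and per lighting condition.

```python
# Minimal sketch of mapping a tracked head displacement to pointer motion.
# DEAD_ZONE and GAIN are hypothetical values chosen for illustration; real
# head-tracking software calibrates these per user and environment.

DEAD_ZONE = 2.0   # head displacement (pixels) ignored, to reduce drift
GAIN = 3.0        # pointer pixels moved per pixel of head displacement

def pointer_delta(head_dx: float, head_dy: float):
    """Convert a head displacement into a pointer displacement,
    suppressing small movements inside the dead zone."""
    def axis(d: float) -> float:
        if abs(d) < DEAD_ZONE:
            return 0.0  # jitter or unintended movement: no pointer motion
        sign = 1.0 if d > 0 else -1.0
        return (d - DEAD_ZONE * sign) * GAIN
    return axis(head_dx), axis(head_dy)

print(pointer_delta(1.0, -1.5))  # (0.0, 0.0): small jitter is filtered out
print(pointer_delta(5.0, 0.0))   # (9.0, 0.0): intentional motion passes
```

Subtracting the dead-zone width before applying the gain keeps the response continuous at the threshold, avoiding a sudden pointer jump when the head crosses it.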

References

  1. ^ Burgy, Christian (March 17–18, 2005). "Speech-Controlled Wearables: Are We There, Yet?". 2nd International Forum on Applied Wearable Computing. Switzerland: VDE VERLAG. pp. 17–27. Retrieved June 12, 2012. 
  2. ^ KinesicMouse: a Kinect-based camera mouse for hands-free computing detecting more than 50 facial expressions. http://kinesicmouse.com

See also