Scratch input
In computing, scratch input is an acoustic method of human–computer interaction (HCI) that takes advantage of the characteristic sound produced when a fingernail or other object is dragged over a surface, such as a table or wall. The technique is not limited to fingers; a stick or a writing implement (e.g., chalk or a pen) can also be used. The sound is often too faint to be heard by the unaided ear, but specialized microphones can digitize it for interactive purposes. Scratch input was invented by Mann et al. in 2007,[1][2][3] though the term was first used by Chris Harrison et al.[4]
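As an illustration of the kind of signal processing involved, the following is a minimal sketch, not any of the published implementations, of counting scratch "pulses" in a mono audio buffer by high-pass filtering and envelope thresholding. The cut-off frequency, threshold, and timing parameters are illustrative assumptions.

# Minimal sketch of scratch detection: high-pass filter a mono buffer,
# take a short-time RMS envelope, and count distinct threshold crossings.
# All parameter values below are assumed for illustration only.
import numpy as np
from scipy.signal import butter, sosfilt

def count_scratch_pulses(samples, sample_rate,
                         cutoff_hz=3000.0,   # scratches carry mostly high-frequency energy (assumption)
                         frame_ms=10.0,      # envelope frame length
                         threshold=0.05,     # normalized envelope threshold (assumption)
                         min_gap_ms=80.0):   # crossings closer than this count as one pulse
    """Return the number of distinct scratch pulses detected in `samples`."""
    # 1. High-pass filter to suppress voices, bumps, and other low-frequency noise.
    sos = butter(4, cutoff_hz, btype="highpass", fs=sample_rate, output="sos")
    filtered = sosfilt(sos, samples.astype(np.float64))

    # 2. Short-time RMS envelope, normalized to [0, 1].
    frame = max(1, int(sample_rate * frame_ms / 1000.0))
    n_frames = len(filtered) // frame
    env = np.sqrt(np.mean(
        filtered[:n_frames * frame].reshape(n_frames, frame) ** 2, axis=1))
    env /= env.max() + 1e-12

    # 3. Count rising edges above the threshold, merging edges that fall
    #    within min_gap_ms of each other (they belong to the same pulse).
    above = env > threshold
    rising = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    min_gap_frames = int(min_gap_ms / frame_ms)
    pulses = 0
    last = -min_gap_frames
    for idx in rising:
        if idx - last >= min_gap_frames:
            pulses += 1
            last = idx
    return pulses

if __name__ == "__main__":
    # Synthetic demo: two bursts of high-frequency noise in a quiet buffer,
    # standing in for two finger scratches picked up through a tabletop.
    fs = 44100
    rng = np.random.default_rng(0)
    buf = np.zeros(fs)                        # one second of silence
    for start in (0.2, 0.6):                  # two "scratches"
        i = int(start * fs)
        buf[i:i + fs // 20] = 0.5 * rng.standard_normal(fs // 20)
    print(count_scratch_pulses(buf, fs))      # expected output: 2

A real system would distinguish gestures by the number, timing, and amplitude profile of such pulses rather than by a single count.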
History
A natural interface for musical expression operating on scratch input principles was first published and presented in June 2007.[1] Later that year, it was extended to an implementation on a smartphone and to a wearable computer system.[2]
In 2008, the Scratch Input project[4] demonstrated a mobile device input system utilizing scratch input, simultaneously popularizing the term.[5] This system captured audio transmitted through a surface on which a mobile phone was placed, enabling the entire surface to be used as an input device.
Uses
Scratch input is an enabling technique used in a multitude of applications. The earliest application was a highly expressive musical instrument (Mann et al.) for use with mobile devices on natural objects, surfaces, or the like, as a non-synthesizing (i.e., idiophonic) musical instrument. Harrison et al.[4] proposed using it to create large, ad hoc gestural input areas when mobile devices are rested on tables.
Commercial potential
Microsoft has expressed interest in Scratch Input.[5]
With this we can start to think of every flat surface as a potential input area. If mass produced this sensor could cost less than a dollar. ... Despite the limitations, the technology holds enough promise to make it into the hands of consumers. It is exciting because it is so low cost. This idea has the potential to go beyond just a research project.
— Daniel Wigdor, User experience architect at Microsoft and curator of the emerging technology demos at SIGGRAPH
See also
- Scratch Input with ice skates
- Vision-assisted Scratch Input
- Scratch Input explanation and demonstration
References
1. Steve Mann, "Natural Interfaces for Musical Expression: Physiphones and a physics-based organology", Proceedings of the 2007 Conference on New Interfaces for Musical Expression (NIME07), June 6–10, New York, NY, USA.
2. Steve Mann, Ryan Janzen, Raymond Lo, and Chris Aimone, "Inventing new instruments based on a computational 'hack' to make a badly tuned or unpitched instrument play in perfect harmony", Proceedings of the 2007 International Computer Music Conference (ICMC 2007), August 27–31, Copenhagen.
3. Steve Mann, Ryan Janzen, and Raymond Lo, "Hyperacoustic instruments: Computer-controlled instruments that are not electrophones", Proceedings of the International IEEE Conference ICME 2008, Hannover, Germany, June 23–26, 2008.
4. Chris Harrison and Scott E. Hudson, "Scratch Input: Creating Large, Inexpensive, Unpowered and Mobile Finger Input Surfaces", Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology (UIST '08), ACM, New York, NY, pp. 205–208.
5. Priya Ganapati, "To Answer the Phone, Scratch Your Jeans", WIRED, 7 August 2009.