Tangible user interface

From Wikipedia, the free encyclopedia

A tangible user interface (TUI) is a user interface in which a person interacts with digital information through the physical environment. The initial name was Graspable User Interface, which is no longer used. The purpose of TUI development is to empower collaboration, learning, and design by giving physical forms to digital information, thus taking advantage of the human ability to grasp and manipulate physical objects and materials.[1]

One of the pioneers in tangible user interfaces is Hiroshi Ishii, a professor in the MIT Media Laboratory who heads the Tangible Media Group. His particular vision for tangible UIs, called Tangible Bits, is to give physical form to digital information, making bits directly manipulable and perceptible. Tangible Bits pursues a seamless coupling between these two very different worlds of bits and atoms.

Characteristics of tangible user interfaces

  1. Physical representations are computationally coupled to underlying digital information.
  2. Physical representations embody mechanisms for interactive control.
  3. Physical representations are perceptually coupled to actively mediated digital representations.
  4. Physical state of tangibles embodies key aspects of the digital state of a system.
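The coupling described in these characteristics can be sketched in a few lines of code. The following is an illustrative model only, not code from any real TUI toolkit; all class and method names are invented for this sketch. A physical token (say, a rotatable puck) is computationally coupled to a digital value, and every manipulation of the token is immediately reflected in a mediated digital representation such as a projected display:

```python
# Illustrative sketch (all names hypothetical): a physical token whose state
# is computationally coupled to underlying digital information, echoing
# characteristics 1 and 4 above.

class DigitalModel:
    """Underlying digital information (e.g., a simulation parameter)."""
    def __init__(self):
        self.value = 0.0
        self.observers = []          # mediated digital representations (displays)

    def set_value(self, value):
        self.value = value
        for observer in self.observers:
            observer(value)          # perceptual coupling: update every display


class TangibleToken:
    """A physical knob or puck; its pose embodies part of the digital state."""
    def __init__(self, model):
        self.model = model
        self.angle = 0.0             # physical state sensed by the system

    def rotate(self, degrees):
        # Physical manipulation acts as interactive control (characteristic 2).
        self.angle = (self.angle + degrees) % 360
        self.model.set_value(self.angle / 360.0)


model = DigitalModel()
readings = []
model.observers.append(readings.append)   # a stand-in for a projected display

token = TangibleToken(model)
token.rotate(90)
token.rotate(45)
print(readings)     # each rotation immediately updated the "display"
```

The key point of the sketch is that the token's physical pose is the authoritative state: the digital model never changes except through physical manipulation, and every change is immediately made perceptible again.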

According to Kim and Maher,[2] the five basic defining properties of tangible user interfaces are as follows:

  1. space-multiplex both input and output;
  2. concurrent access and manipulation of interface components;
  3. strong specific devices;
  4. spatially aware computational devices;
  5. spatial reconfigurability of devices.

Examples

A simple example of a tangible UI is the computer mouse: dragging the mouse across a flat surface moves a pointer on the screen accordingly. There is a very clear relationship between the movements of the mouse and the behavior of the system.

Another example of a tangible UI is the Marble Answering Machine by Durrell Bishop (1992). A marble represents a single message left on the answering machine. Dropping a marble into a dish plays back the associated message or calls back the caller.
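The mechanism behind the Marble Answering Machine can be captured in a short toy model. This is purely illustrative; the names below are invented for the sketch and do not come from Bishop's design. Each marble is a physical handle bound to exactly one digital recording, and dropping it into the dish triggers playback:

```python
# Toy model of the Marble Answering Machine: one marble per message,
# dropping a marble into the dish plays back the bound recording.
# All names are invented for this illustration.

class Marble:
    def __init__(self, message):
        self.message = message       # the digital recording bound to the marble


class AnsweringMachine:
    def __init__(self):
        self.marbles = []            # marbles waiting in the machine's tray

    def record(self, message):
        # An incoming call dispenses a new marble representing the message.
        marble = Marble(message)
        self.marbles.append(marble)
        return marble


class PlaybackDish:
    def play(self, marble):
        # Dropping the marble into the dish plays the associated message.
        return marble.message


machine = AnsweringMachine()
m1 = machine.record("Call me back about lunch.")
m2 = machine.record("Meeting moved to 3 pm.")

dish = PlaybackDish()
print(dish.play(m2))   # prints "Meeting moved to 3 pm."
```

The design point the sketch makes is that the user never addresses a message by number or menu; picking up the right marble is the act of selecting the message.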

Another example is the Topobo system. The blocks in Topobo are like LEGO blocks which can be snapped together, but can also move by themselves using motorized components. A person can push, pull, and twist these blocks, and the blocks can memorize these movements and replay them.
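Topobo's record-and-replay behavior can be sketched as follows. This is a behavioral sketch only (the real system is motorized hardware with embedded sensing), and the names are invented for this illustration. While recording, the block samples the angles the user imposes on it; on replay, its motor re-enacts the same sequence:

```python
# Illustrative record-and-replay sketch in the spirit of Topobo.
# All names are hypothetical; the real system is embedded hardware.

class ActiveBlock:
    def __init__(self):
        self.angle = 0.0
        self.recording = False
        self.tape = []               # the memorized sequence of angles

    def start_recording(self):
        self.recording = True
        self.tape = []

    def twist(self, angle):
        # The user physically twists the block; sampled while recording.
        self.angle = angle
        if self.recording:
            self.tape.append(angle)

    def stop_recording(self):
        self.recording = False

    def replay(self):
        # The motor re-enacts the recorded motion; here we just collect it.
        motions = []
        for angle in self.tape:
            self.angle = angle
            motions.append(angle)
        return motions


block = ActiveBlock()
block.start_recording()
for a in (10, 30, 60, 30, 10):       # the user demonstrates a wiggle
    block.twist(a)
block.stop_recording()
print(block.replay())                # the block repeats the demonstrated motion
```

Physical demonstration thus replaces programming: the user's own manipulation of the block is the program the block later executes.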

Another implementation allows the user to sketch a picture on the system's tabletop with a real, tangible pen. Using hand gestures, the user can clone the image and stretch it along the X and Y axes, just as one would in a paint program. The system integrates a video camera with a gesture recognition system.

Another example is jive. The implementation of a TUI helped make this product more accessible to elderly users. The 'friend' passes can also be used to activate different interactions with the product.

Several approaches have been made to establish a generic middleware for TUIs. They aim at independence from particular application domains as well as flexibility in terms of the deployed sensor technology. For example, Siftables provides an application platform in which small gesture-sensitive displays act together to form a human-computer interface.

To support collaboration, TUIs have to allow spatial distribution, asynchronous activities, and dynamic modification of the TUI infrastructure, to name the most prominent requirements. One approach meets these requirements with a framework based on the LINDA tuple space concept. The implemented TUIpist framework deploys arbitrary sensor technology for any type of application and actuators in distributed environments.

A further example of a type of TUI is a Projection Augmented model.

State of the art

Since the invention of Durrell Bishop's Marble Answering Machine (1992)[3] two decades ago, interest in tangible user interfaces (TUIs) has grown constantly, and more tangible systems appear every year. In 1999 Gary Zalewski patented a system of moveable children's blocks containing sensors and displays for teaching spelling and sentence composition.[4] A similar system is being marketed as "Siftables".

The MIT Tangible Media Group, headed by Hiroshi Ishii, is continuously developing and experimenting with TUIs, including many tabletop applications.[5]

The Urp[6] and the more advanced Augmented Urban Planning Workbench[7] allow digital simulations of air flow, shadows, reflections, and other data based on the positions and orientations of physical models of buildings on the table surface.

Newer developments go one step further and incorporate the third dimension by allowing a user to form landscapes with clay (Illuminating Clay[8]) or sand (SandScape[9]). Again, different simulations allow the analysis of shadows, height maps, slopes, and other characteristics of the interactively formable landmasses.

InfrActables[10] is a back-projection collaborative table that allows interaction through TUIs that incorporate state recognition. Adding different buttons to the TUIs enables additional functions associated with them. Newer versions of the technology can even be integrated into LC displays[11] by using infrared sensors behind the LC matrix.

The Tangible Disaster[12] allows the user to analyze disaster measures and simulate different kinds of disasters (fire, flood, tsunami, etc.) and evacuation scenarios during collaborative planning sessions. Physical objects ("pucks") allow positioning disasters by placing them on the interactive map; parameters (e.g., scale) can additionally be tuned using dials attached to them.

The commercial potential of TUIs has been identified recently. The repeatedly awarded Reactable,[13] an interactive tangible tabletop instrument, is now distributed commercially by Reactable Systems, a spinoff company of the Pompeu Fabra University, where it was developed. With the Reactable, users can set up their own instrument interactively by physically placing different objects (representing oscillators, filters, modulators, ...) and parametrising them by rotating them and using touch input.

Microsoft has been distributing its Windows-based platform Microsoft Surface[14] (now Microsoft PixelSense) since 2009. Besides multi-touch tracking of fingers, the platform supports the recognition of physical objects by their footprints. Several applications, mainly for use in commercial settings, have been presented. Examples range from designing an individual graphical layout for a snowboard or skateboard to studying the details of a wine in a restaurant by placing the bottle on the table and navigating through menus via touch input. Interactions such as the collaborative browsing of photographs from a handycam or cell phone, which connects seamlessly once placed on the table, are also supported.

Another notable interactive installation is instant city,[15] which combines gaming, music, architecture, and collaborative aspects. It allows the user to build three-dimensional structures and set up a city with rectangular building blocks, which simultaneously results in the interactive assembly of musical fragments of different composers.

The development of the Reactable and the subsequent release of its tracking technology reacTIVision[16] under the GNU/GPL, as well as the open specification of the TUIO protocol, have triggered a large number of developments based on this technology.
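To give a feel for the TUIO protocol, the following sketch handles the two most common message types of its "2Dobj" profile, with which reacTIVision reports tagged objects on the table: "set" messages carry a tracked object's session id, marker id, position, and rotation, and "alive" messages list the session ids currently on the surface. Real clients decode these from OSC packets using an OSC library; here the already-decoded argument lists are simulated, and the "set" arguments are truncated (the full profile also includes velocities and acceleration):

```python
# Minimal sketch of handling TUIO-style "2Dobj" messages. The decoded OSC
# argument lists are simulated; a real client would use an OSC library.

def handle_2dobj(state, args):
    """Update a {session_id: (class_id, x, y, angle)} table from one message."""
    kind = args[0]
    if kind == "alive":
        # "alive" lists the session ids currently on the surface;
        # anything missing from it has been lifted off and is removed.
        alive = set(args[1:])
        for sid in list(state):
            if sid not in alive:
                del state[sid]
    elif kind == "set":
        # "set" carries a tracked object's ids, position, and rotation.
        sid, class_id, x, y, angle = args[1:6]
        state[sid] = (class_id, x, y, angle)
    return state


state = {}
handle_2dobj(state, ["set", 7, 12, 0.25, 0.50, 1.57])   # fiducial 12 placed
handle_2dobj(state, ["set", 8, 3, 0.80, 0.10, 0.0])     # fiducial 3 placed
handle_2dobj(state, ["alive", 8])                        # object 7 lifted off
print(state)                                             # only session 8 remains
```

Because the protocol is this simple and openly specified, a tangible application can be written against any tracker that speaks TUIO, independent of the camera and vision software underneath.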

In the last few years, many amateur and semi-professional projects outside academia and commerce have also been started. Thanks to open-source tracking technologies (reacTIVision[16]) and the ever-increasing computational power available to end consumers, the required infrastructure is now accessible to almost everyone. A standard PC, a webcam, and some handicraft work allow individuals to set up tangible systems with minimal programming and material effort. This opens doors to novel ways of perceiving human-computer interaction and gives the broad public room for new forms of creativity to experiment and play with.

It is difficult to keep track of the rapidly growing number of these systems and tools. While many of them seem only to utilize available technologies and are limited to initial experiments and tests of basic ideas, or merely reproduce existing systems, a few of them open out into novel interfaces and interactions and are deployed in public space or embedded in art installations.[17][18]

The Tangible Factory Planning[19] is a tangible table based on reacTIVision[20] that allows users to collaboratively plan and visualize production processes in combination with plans for new factory buildings; it was developed within a diploma thesis.

Another of the many reacTIVision-based tabletops is the ImpulsBauhaus Interactive Table,[21] which was exhibited at the Bauhaus-University in Weimar to mark the 90th anniversary of the establishment of the Bauhaus. Visitors could browse and explore the biographies, complex relations, and social networks between members of the movement.

See also

References

  1. ^ Ishii, H. 2008. Tangible bits: beyond pixels. In Proceedings of the 2nd international Conference on Tangible and Embedded interaction (Bonn, Germany, February 18 - 20, 2008). TEI ‘08. ACM, New York, NY, xv-xxv.
  2. ^ Mi Jeong Kim, Mary Lou Maher.: The impact of tangible user interfaces on spatial cognition during collaborative design.In: Design Studies, Vol 29, No. 3, May 2008
  3. ^ G. C. Smith. The marble answering machine. In The Hand That Rocks the Cradle, pages 60–65, May/June 1995.
  4. ^ G. M. Zalewski, Wireless I/O Apparatus and Method of Computer Assisted Instruction, US Patent 5,991,693, 1999.
  5. ^ "Tangible Media". www.media.mit.edu. MIT Media Lab. Retrieved 10 December 2014. 
  6. ^ Urp: a luminous-tangible workbench for urban planning and design John Underkoffler, Hiroshi Ishii May 1999 CHI '99: Proceedings of the SIGCHI conference on Human factors in computing systems
  7. ^ Augmented Urban Planning Workbench: Overlaying Drawings, Physical Models and Digital Simulation Hiroshi Ishii, Eran Ben-Joseph, John Underkoffler, Luke Yeung, Dan Chak, Zahra Kanji, Ben Piper September 2002 ISMAR '02: Proceedings of the 1st International Symposium on Mixed and Augmented Reality
  8. ^ Illuminating clay: a 3-D tangible interface for landscape analysis Ben Piper, Carlo Ratti, Hiroshi Ishii April 2002 CHI '02: Proceedings of the SIGCHI conference on Human factors in computing systems: Changing our world, changing ourselves
  9. ^ The tangible user interface and its evolution, Hiroshi Ishii, June 2008, Communications of the ACM, Volume 51, Issue 6
  10. ^ InfrActables: Multi-User Tracking System for Interactive Surfaces, C. Ganser Schwab, A. Steinemann, A. Kunz, Proceedings of the IEEE conference on Virtual Reality (IEEE VR 2006), 2006, Alexandria, Virginia, USA. InfrActables: http://www.youtube.com/watch?v=2l59zCQ3JbE
  11. ^ MightyTrace: Multiuser Tracking Technology on LC-displays, R. Hofer, A. Kunz, P. Kaplan, Proceeding of the twenty-sixth annual SIGCHI conference on Human factors in computing systems, 2008, Florence, Italy. http://www.youtube.com/watch?v=Q7az0fMRBHQ&feature=related
  12. ^ Tangible user interface for supporting disaster education Kazue Kobayashi, Tatsuhito Kakizaki, Atsunobu Narita, Mitsunori Hirano, Ichiro Kase August 2007 SIGGRAPH '07: SIGGRAPH 2007 posters
  13. ^ The reacTable: exploring the synergy between live music performance and tabletop tangible interfaces Sergi Jordà, Günter Geiger, Marcos Alonso, Martin Kaltenbrunner February 2007 TEI '07: Proceedings of the 1st international conference on Tangible and embedded interaction
  14. ^ Demo I Microsoft Surface and the Single View Platform Josh Wall May 2009 CTS '09: Proceedings of the 2009 International Symposium on Collaborative Technologies and Systems
  15. ^ instant city: a music building game table Sibylle Hauert, Daniel Reichmuth, Volker Böhm June 2007 NIME '07: Proceedings of the 7th international conference on New interfaces for musical expression
  16. ^ a b reacTIVision: a computer-vision framework for table-based tangible interaction Martin Kaltenbrunner, Ross Bencina February 2007 TEI '07: Proceedings of the 1st international conference on Tangible and embedded interaction
  17. ^ reacTIVision on Vimeo, http://vimeo.com/channels/reactivision
  18. ^ Sourceforge TUIO User Exhibition, http://sourceforge.net/apps/phpbb/reactivision/viewforum.php?f=7&sid=a5ef9e8af6f70496591e4dd5552a7f5a
  19. ^ Tangible Factory Planning, Diploma Thesis, Daniel Guse, http://www.danielguse.de/tangibletable.php
  20. ^ Martin Kaltenbrunner and Ross Bencina. 2007. reacTIVision: a computer-vision framework for table-based tangible interaction. In Proceedings of the 1st international conference on Tangible and embedded interaction (TEI '07). ACM, New York, NY, USA, 69-74. DOI=10.1145/1226969.1226983
  21. ^ Interactive Table with reacTIVision : ImpulsBauhaus http://sourceforge.net/apps/phpbb/reactivision/viewtopic.php?f=7&t=59

External links