Tangible Computing: “Touch me if you can!”

In his thesis defence, Brygg Ullmer noted that computing began in the physical realm, whether in the first computers programmed with punch cards or in other early systems whose interaction was entirely physical [1]. As technology advanced, however, most of these physical aspects dissolved into the digital world: what was once controlled or represented by a tangible object came to be represented by pixels in a GUI. Professor Hiroshi Ishii was one of the pioneers who introduced Tangible User Interfaces to the world, and he has been leading this field with his team at the Tangible Media Group since the late 90s. A Tangible User Interface is one that allows the user to manipulate a digital system through physical artefacts that have properties of, or links to, that system. In other words, for the HCI world, the notion of Tangible User Interfaces amounts to bringing computing back into the real world.

I like how the Tangible Media Group at MIT describes the TUI and compares it to the GUI using the metaphor of an iceberg [2]:

A tangible user interface is like an iceberg: there is a portion of the digital that emerges beyond the surface of the water—into the physical realm—so that we may interact directly with it.
A graphical user interface only lets us see information and interact with it indirectly, as if we were looking through the surface of the water to interact with the forms below.

Tangible, physical, embodied computing stands out from other areas of HCI in that almost all of them are user-centric, whereas TUI focuses on the design of the interaction process itself. Moreover, ubiquitous computing, which Weiser described as the invisible technology of the future, seems to contradict TUI, which aims to make technology visible by stripping away the abstractions of the digital world and making interaction feel as real as sensing the physical world around us.

Finally, I would like to draw on one use of TUI from my personal experience, from an internship at the European Center for Virtual Reality (CERV) in France. Tangible interfaces are widely used in the field of virtual reality, especially when designing for augmented virtuality, where tools from the physical world are used to manipulate the virtual one. One example is a dentist training environment in which the trainee performs a procedure in the virtual environment while using real tools from the physical world. Here we can see the potential benefit of Tangible User Interfaces: by letting trainees work with the same real instruments they would use in an actual procedure, they eliminate the confusion of learning a different interface and thereby improve the efficiency and outcome of the training process.

References:

[1] Ullmer, B. (2002). ‘Brygg Ullmer Thesis Defense’. Retrieved 10 November 2015, from https://vimeo.com/136275460

[2] Tangible Media Group (2016). ‘Vision’. Retrieved 7 November 2016, from http://tangible.media.mit.edu/vision/


Related Papers:

[1]. Robert Jack, Tony Stockman, and Andrew McPherson. 2016. Navigation of Pitch Space on a Digital Musical Instrument with Dynamic Tactile Feedback. In Proceedings of the TEI ’16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI ’16). ACM, New York, NY, USA, 3-11. DOI: http://dx.doi.org/10.1145/2839462.2839503

I chose this paper as it directly relates to my interest in combining music with technology, and I believe it is a good example of a tangible music interface (TMI).

[2]. Oren Zuckerman, Tamar Gal, Tal Keren-Capelovitch, Tal Karsovsky, Ayelet Gal-Oz, and Patrice L. Tamar Weiss. 2016. DataSpoon: Overcoming Design Challenges in Tangible and Embedded Assistive Technologies. In Proceedings of the TEI ’16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI ’16). ACM, New York, NY, USA, 30-37. DOI: http://dx.doi.org/10.1145/2839462.2839505

This paper describes DataSpoon, a sensor-based spoon that unobtrusively assesses the self-feeding skills of children with cerebral palsy. Although it may not be a clear-cut example of a TUI, I found it a good example of how tangible embodied computing and ubiquitous computing can work together to produce better solutions to problems like this.


[3]. Danli Wang, Lan Zhang, Chao Xu, Haichen Hu, and Yunfeng Qi. 2016. A Tangible Embedded Programming System to Convey Event-Handling Concept. In Proceedings of the TEI ’16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI ’16). ACM, New York, NY, USA, 133-140. DOI: http://dx.doi.org/10.1145/2839462.2839491


The last paper is quite interesting as it proposes the use of TUI to teach young children programming skills, which is said to have a positive impact on their development.
