How is Tangible, Physical and Embodied Computing different to other areas of HCI research?

Tangible, physical and embodied computing is about using physical objects to interact with the digital. It can give ‘physical form to digital information’ [4], using physical objects both to represent and to control that information. Being entirely new to the field of tangible computing, I found a whole host of examples helped me to understand the concept [3, 5, 6], including metaDESK [1], in which a physical model of a building both represents the actual building and is used to control the associated information. When the model is placed on the interface, a map of the area in which the building is situated appears, and the digital information can be moved, rotated and scaled using the physical representation of the building.

Clear differences can be seen between tangible computing and other areas of HCI in this use of physical objects as both representations and controls. In ubiquitous computing and other areas of HCI, interaction with computers is generally through graphical user interfaces or input devices such as keyboards. Even when computation is embedded in everyday objects, the object itself is not responsible for both representation and control as it is in tangible computing.

In addition, Paul Dourish in ‘Where the Action Is’ [2] explains that the computer revolution has generally focused on overcoming the ‘inherent limitations of the everyday world’ by creating a virtual world into which information is inserted and distilled. Tangible computing ‘is not defined in opposition’ [2] to this notion, but it must be recognised that while the digital and the physical might hold similar information, it is not possible to interact with them in the same way.

To interact in the world we rely on knowledge of the physical: what we see and hear, what we feel, texture, colour and much more. Other areas of HCI can overlook this, but tangible computing relies on our knowledge of the physical for representation, control and, therefore, interaction. In this way, tangible computing can change how we interact with computation compared to the ‘norm’ of a computer.

Cool New Interfaces

The three papers I have chosen for this week are:

Nazzi, E., & Sokoler, T. (2015, July). Augmenting everyday artefacts to support social interaction among senior peers. In Proceedings of the 8th ACM International Conference on PErvasive Technologies Related to Assistive Environments (p. 11). ACM.

This paper sparked my interest through its combination of tangible computing with social computing to support older populations. It explores how such technologies can increase and improve social interaction.

Antle, A. N., Fan, M., & Cramer, E. S. (2015, January). PhonoBlocks: A Tangible System for Supporting Dyslexic Children Learning to Read. In Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction (pp. 533-538). ACM.

I was interested to learn more about how tangible interfaces could be used in a learning capacity, particularly to support a specific learning difficulty. Given that we all learn in different ways, whether visually or kinaesthetically, I was keen to see how a tangible representation could be used to support learning in this way.

Behrens, M., Valkanova, N., & Brumby, D. P. (2014, June). Smart Citizen Sentiment Dashboard: A Case Study Into Media Architectural Interfaces. In Proceedings of The International Symposium on Pervasive Displays (p. 19). ACM.

With tangible computing being a potentially very useful tool for urban planning, as shown by the example of Urp [5], I was keen to explore newer manifestations of tangible interfaces in the urban landscape. This paper uses a physical building as a projection screen and provides a tangible interface for interacting with it.


[1] Ullmer, B. (2002). Brygg Ullmer Thesis Defense. Retrieved 10 November, 2015, from https://vimeo.com/136275460

[2] Dourish, P. (2004). Where the Action Is: The Foundations of Embodied Interaction. MIT Press.

[3] Jordà, S. (2010, April). The reactable: tangible and tabletop music performance. In CHI’10 Extended Abstracts on Human Factors in Computing Systems (pp. 2989-2994). ACM.

[4] Ullmer, B., & Ishii, H. (2000). Emerging frameworks for tangible user interfaces. IBM Systems Journal, 39(3.4), 915-931.

[5] Underkoffler, J., & Ishii, H. (1999, May). Urp: a luminous-tangible workbench for urban planning and design. In Proceedings of the SIGCHI conference on Human Factors in Computing Systems (pp. 386-393). ACM.

[6] Wellner, P. (1993). Interacting with paper on the DigitalDesk. Communications of the ACM, 36(7), 87-96.
