Tangible, Physical & Embedded Computing; Touchy Feely Much-Appealy

Before I jump straight in to define what tangible and embodied computing is, I implore you to first join a whistle-stop tour of the intriguing notion of ‘embodiment’, in order to get closer to the aims of this interesting field. Embodiment, a central theme in European phenomenology (the study of the structures of individual experience and consciousness), holds that ‘things’ are situated in the world, and that the reality of those ‘things’ depends on the external effects of that setting. By focusing on the human activity of an interaction, and on how our understanding and use of particular artefacts can change depending on our environment, researchers in Tangible Computing pull lofty abstract phenomena down into concrete objects; as Ullmer puts it perfectly, “giving form to digital information” [1].

Embodiment does not just mean existing in a physical, visible manner, but also the way physical and social phenomena “unfold in real time and real space” [2] as part of the world in which we are situated. For Tangible Computing, two core aspects seem to be present: firstly, giving form to something abstract by ‘physically embedding’ it in our world; and secondly, that ‘things’, our mind, cognition, environment and bodily actions are far more intertwined than a Cartesian dualist model would suggest. That theory, which treats mind and body as distinct entities, can still be detected in other fields of HCI. This prompts an interesting new field of interaction within HCI: physically interacting with representations of digital information through artefacts – ‘things’ which a person can directly manipulate and grasp – but ‘embedded’ within our reality, whereby movement and gesture carry a greater importance than the interaction alone.

Figure 1: Tangible User Interface – inForm

A common misconception I’ve found is to pose the modern Graphical User Interfaces (GUIs) found on laptops and music players as opponents of Tangible User Interfaces (TUIs) such as Ishii’s (saucily-named) Tangible Bits [3] and Ullmer’s metaDESK [4], which incorporate more ‘physical’ artefacts in an interaction. Yet one of the best examples of a TUI is the computer mouse, dragged in the direction the user wishes to move the cursor on the screen – a friendly companion to GUI systems. Although it might be tempting to claim that tangible forms of interaction have their own special quality over and above traditional text-based or graphical forms, as argued in Hornecker and Buur’s work [5], I think Embedded Computing has its own unique value as a field without picking fights with pre-existing tech. As I understand it, the field explores how users can interact with physical ‘things’ in a way that supports users acting through the technology, rather than just on the technology, by shifting focus from the ‘thing’ that’s being acted on towards the body, action and the environment. As Ullmer says, “don’t be dogmatic” [6].
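The mouse makes the coupling between TUI and GUI concrete: a physical displacement of the artefact is mapped directly onto movement of a digital representation. A minimal sketch of that mapping might look like the following (all names and the screen size here are illustrative assumptions, not from any real windowing API):

```python
# Hypothetical sketch: the mouse as a tangible artefact coupled to a GUI.
# A physical displacement (dx, dy) of the object moves the on-screen cursor
# by the same amount, clamped to the screen bounds.

def move_cursor(cursor, delta, screen=(1920, 1080)):
    """Map a physical mouse displacement onto new cursor coordinates."""
    x = min(max(cursor[0] + delta[0], 0), screen[0] - 1)
    y = min(max(cursor[1] + delta[1], 0), screen[1] - 1)
    return (x, y)

# Dragging the mouse right and up moves the cursor right and up;
# pushing past the screen edge simply pins the cursor there.
```

The point is how thin the embodied layer is: the digital cursor simply shadows the physical object’s motion.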


As I was honoured with the position of co-chairing a committee for the theme ‘Theory and Philosophy’ at the prestigious ‘Tangible Computing: The Past, Present & Future’ conference, I thought it only fair to find papers that tickled my fancy on the subject.

In this paper, Van Dijk et al. call for a debate on the different conceptual paradigms underlying the TEI community, highlighting areas where essential conceptual debate is severely lacking. Have no fear, as Van Dijk brings the party with a brief summary of the practical, theoretical and ethical issues of the field, in a bold style of writing that is a refreshing change from the impersonal way researchers usually address ideas they disagree with.

This paper challenges the traditional model of arbitrary action-control mappings, which misses out on the great opportunities provided by tangibles. Through an exploration of conceptual metaphor theory, Macaranas investigates our predispositions towards understanding control pairings (UP IS MORE / DOWN IS LESS), with the hope of providing insight for future interface design work.
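To make the UP IS MORE / DOWN IS LESS metaphor concrete, one could sketch a tangible slider whose physical height is mapped linearly onto an abstract quantity, so that raising the object always means ‘more’. The function name, travel length and the choice of volume as the controlled quantity are my own illustrative assumptions:

```python
# Hypothetical sketch of the UP-IS-MORE / DOWN-IS-LESS conceptual metaphor:
# a physical slider's height is mapped linearly onto an abstract value
# (here, audio volume), so raising the tangible always increases it.

def slider_to_volume(height_mm, travel_mm=100, max_volume=100):
    """Higher physical position -> larger value (UP IS MORE)."""
    height_mm = min(max(height_mm, 0), travel_mm)  # clamp to the physical travel
    return round(max_volume * height_mm / travel_mm)
```

A designer who inverted this mapping (up means quieter) would be fighting exactly the predisposition Macaranas studies.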

Although I’m slightly cheating here with the inclusion of workshops: if this theoretical conference is to be a success (and get funding for next year), it might be an idea to get our audience involved in the same theories they’ll be listening to, especially the kinaesthetic learners among us. Aslan and van Dijk critically explore the theory of mind as a “resource for making sense of what is happening”, and Hemmert aims to contribute towards the complex definition of embodiment through a structural lens.


[1] Ullmer, B. (2002). ‘Brygg Ullmer Thesis Defense’. Retrieved 10 November, 2015, URL: https://vimeo.com/136275460

[2] Dourish, P (2003) ‘Where the Action Is: The Foundations of Embodied Interaction’, MIT Press, New Ed edition, pp. 99 – 100

[3] Ishii, H. and Ullmer, B. (1997) ‘Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms’, CHI ’97: Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems, pp. 234–241, URL: http://dl.acm.org/citation.cfm?id=258715

[4] Ullmer, B. and Ishii, H. (1997) ‘The metaDESK: Models and Prototypes for Tangible User Interfaces’, UIST ’97: Proceedings of the 10th Annual ACM Symposium on User Interface Software and Technology, pp. 223–232, URL: http://dl.acm.org/citation.cfm?id=263551

[5] Hornecker, E. and Buur, J. (2006) ‘Getting a Grip on Tangible Interaction: A Framework on Physical Space and Social Interaction’, CHI ’06: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 437–446, URL: http://dl.acm.org/citation.cfm?id=1124838

[6] Ullmer, B. (2002). ‘Brygg Ullmer Thesis Defense’. Retrieved 10 November, 2015, URL: https://vimeo.com/136275460
