Tangible Computing: interfaces of the present

Tangible user interfaces (TUIs) embody digital information in physical form, allowing it to be manipulated directly through touch and bodily interaction. TUIs were envisioned in the 1990s as a forward-thinking alternative to graphical user interfaces (GUIs), which were then (and still are) the most common form of human-computer interaction (HCI). Interaction with a GUI takes place through a single device and leaves most human senses and motor skills unused.

Other avenues of HCI, such as ubiquitous computing, gather data from users to improve their experience and interaction within their environment. There, the hardware is ideally invisible, both physically and mentally, and the interaction with data is hidden from the user. TUIs take the opposite approach in both regards: data is manipulated through physical interaction with visible “Tangible Bits” [1].

To survey current literature in tangible computing, I explored the annual proceedings (2014 and 2015) of the Tangible, Embedded and Embodied Interaction (TEI) conference. This approach led to discovering tangible computing HCI laboratories, including the founding Tangible Media Group (TMG). Three papers were chosen that exemplify both theoretical and practical TUIs.

MisTable: reach-through personal screens for tabletops [2]

This paper presents a novel interface in which data is displayed on a fog screen; users can reach through the fog to manipulate the data and drag it onto a conventional horizontal display. The paper was chosen for the novelty of its reach-through interactions and for its proof-of-concept representation of data within fog. A demonstration of this TUI is shown in the following video:

Kinetic Blocks: Actuated Constructive Assembly for Interaction and Display [3]

Like other TMG work, this paper explores the theoretical underpinnings of TUI design, extending the possible uses of pin-based shape displays and the ways they can be manipulated, as demonstrated in the corresponding published video:

PhonoBlocks: A Tangible System for Supporting Dyslexic Children Learning to Read [4]

This paper uses tangible letter blocks (called PhonoBlocks) to support the learning of children with dyslexia. Specifically, children rearrange the PhonoBlocks until the word (data) shown on the corresponding screen is correct, and the colours of the PhonoBlocks change to indicate how long each letter should be sounded, engaging multiple senses. This was the most application-oriented TUI in my paper selection.
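The feedback loop described above can be sketched in a few lines. This is a hypothetical illustration, not the paper's actual implementation: the colour scheme, the simplified "magic e" phonics rule, and the function names are all my own assumptions.

```python
# Hypothetical sketch of the PhonoBlocks-style feedback loop: the child
# arranges letter blocks, the system checks the spelling against a target
# word, and each block is assigned a colour cue for how its letter sounds.
# Colour mapping and phonics rule are illustrative assumptions only.

LONG_VOWEL_COLOUR = "red"
SHORT_VOWEL_COLOUR = "blue"
CONSONANT_COLOUR = "green"

VOWELS = set("aeiou")

def colour_for_letter(word: str, i: int) -> str:
    """Colour cue for the letter at position i, using a simplified
    phonics rule: a vowel followed by a consonant and a final 'e'
    is sounded long (e.g. the 'a' in 'cake')."""
    letter = word[i]
    if letter not in VOWELS:
        return CONSONANT_COLOUR
    if i + 2 < len(word) and word[i + 1] not in VOWELS and word[i + 2] == "e":
        return LONG_VOWEL_COLOUR
    return SHORT_VOWEL_COLOUR

def feedback(arranged_blocks: list[str], target_word: str):
    """Return (is_correct, per-letter colour cues) for an arrangement."""
    word = "".join(arranged_blocks)
    colours = [colour_for_letter(word, i) for i in range(len(word))]
    return word == target_word, colours
```

For example, arranging the blocks c-a-k-e against the target word "cake" reports a correct spelling and colours the 'a' as a long vowel.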

References

[1] H Ishii. “Tangible bits: beyond pixels”. 2008.
[2] D Plasencia, E Joye, S Subramanian. “MisTable: reach-through personal screens for tabletops”. 2014.
[3] P Schoessler, D Windham, D Leithinger, S Follmer, H Ishii. “Kinetic Blocks: Actuated Constructive Assembly for Interaction and Display”. 2015.
[4] A Antle, M Fan, E Cramer. “PhonoBlocks: A Tangible System for Supporting Dyslexic Children Learning to Read”. 2015.
