Lost in Translation: Tangibles and the Language of Interaction

As awesome as GUIs may be, they are so far removed from how our senses manipulate things that we all end up, inevitably, developing new senses and intuitions for their operation. Even though most GUIs started from metaphors for physical objects (e.g. desktops, sliders, files, etc.), this pseudo-physical environment can only be interacted with through a robot (a.k.a. the mouse pointer) that we control remotely with a device utilizing very little of our physical capabilities, and certainly none we would otherwise be used to. Instead of augmenting information into a representation we can interact with, we usually reduce our senses into a sort of nerve connecting to a prosthesis. Tangible Computing is a push in the opposite direction: trying to bring the digital domain closer to our senses instead of teaching us new interactions suited for it [1].

Tangible User Interface (TUI)

A Tangible User Interface is a physical representation of digital information that a user can interact with directly in an intuitive, natural way. This cuts out, or greatly reduces, the typical step of learning the language of the digital interface, as the interface itself is translated into the language of the human body instead. Far from being simply a neat gimmick, this is a very powerful idea: by removing the cognitive load of this ‘translation’, the user gains a fluency (and possibly a measure of creative freedom) that is largely absent from modern UIs.

While Tangible computing and Ubiquitous computing may overlap in some ways, the main objective of each differs greatly. Ubiquitous computing attempts to hide computers in plain sight, blending them in and extending the unconscious [2]. Tangible computing, by contrast, attempts to use what is in plain sight to interact with computers, appropriating the objects we use in our daily lives and the ways we use them.

Awesome New Interfaces

It is usually best to demonstrate, rather than describe. The following are some projects I’d be interested in having presented at the conference, under the theme of ‘Awesome New Interfaces’.

 

1) On the Other Hand: Embodied Metaphors for Interactions with Mnemonic Objects in Live Presentations

Figure: Objects held in hand, system responses and underlying embodied metaphors [3]

On the Other Hand is a presentation system in which the presentation display is controlled using RFID-tagged mnemonic objects while the presenter’s location is tracked on stage. The presenter wears DataTouch, an NFC-enabled ring that recognises the tagged objects; when the presenter picks up an object, the related presentation topic is brought on display. The project also employs a position tracker that lets the presenter literally “approach” and “walk through” different points in a topic. This allows for seamless, non-linear presenting, giving the presenter more room to improvise during live presentations. [3]
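
To make the interaction model concrete, here is a minimal sketch of how such a system might map tag reads and stage position to what is on display. The tag IDs, topics, stage dimensions and event format are all hypothetical assumptions of mine for illustration; the paper does not publish its implementation.

```python
# Hypothetical sketch of the On-the-Other-Hand interaction loop: an
# NFC ring reports tagged-object pickups, a position tracker reports
# the presenter's location on stage, and the two together select the
# slide on display. All names, IDs and values here are illustrative.

# Mnemonic objects: map each (hypothetical) tag ID to a topic.
TAG_TO_TOPIC = {
    "tag:coffee-mug": "morning-routine",
    "tag:toy-car":    "commute-data",
    "tag:globe":      "global-trends",
}

# Each topic is a sequence of slides; walking across the stage steps
# through them, which keeps the presentation non-linear and improvisable.
TOPIC_SLIDES = {
    "morning-routine": ["routine-1", "routine-2"],
    "commute-data":    ["commute-1", "commute-2", "commute-3"],
    "global-trends":   ["trends-1"],
}

STAGE_WIDTH_M = 6.0  # assumed stage width for mapping position to slide index


def slide_for(topic: str, stage_x: float) -> str:
    """Pick a slide within the current topic from the presenter's
    position: left edge = first slide, right edge = last slide."""
    slides = TOPIC_SLIDES[topic]
    frac = min(max(stage_x / STAGE_WIDTH_M, 0.0), 1.0)
    return slides[min(int(frac * len(slides)), len(slides) - 1)]


def handle_events(events):
    """Consume a stream of (kind, value) events from the ring and the
    tracker, yielding the slide that should currently be on display."""
    topic = None
    stage_x = 0.0
    for kind, value in events:
        if kind == "tag_read" and value in TAG_TO_TOPIC:
            topic = TAG_TO_TOPIC[value]   # picking up an object switches topic
        elif kind == "position":
            stage_x = value               # presenter moved on stage
        if topic is not None:
            yield slide_for(topic, stage_x)


if __name__ == "__main__":
    demo = [("tag_read", "tag:toy-car"), ("position", 1.0),
            ("position", 4.5), ("tag_read", "tag:globe")]
    for slide in handle_events(demo):
        print("showing:", slide)
```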

 

2) Materiable

Developed at the MIT Tangible Media Group, Materiable extends shape-changing interfaces beyond forming shapes for digital data to also rendering the formed shape’s physical and material properties. To simulate deformable materials, Materiable runs physics-based algorithms on pin-based shape displays. The system responds to users’ direct touch by computing and displaying changes in flexibility, elasticity and viscosity, offering the user a richer experience in perceiving what material is being rendered. [4]
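
As a rough illustration of the idea, the sketch below models one pin of such a display as a damped spring, where stiffness and damping stand in for the elasticity and viscosity of the rendered material. The pin model, parameter values and integration scheme are my own assumptions for this sketch, not the algorithms published in the paper.

```python
# Illustrative single-pin simulation in the spirit of Materiable: each
# pin behaves like a damped spring, so different (stiffness, damping)
# pairs feel to the touching hand like different materials.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Pin:
    stiffness: float       # higher -> springier, more "elastic" material
    damping: float         # higher -> slower recovery, more "viscous" material
    height: float = 0.0    # displacement from rest position (m)
    velocity: float = 0.0  # pin velocity (m/s)

    def step(self, pressed_to: Optional[float], dt: float = 0.01) -> float:
        """Advance the pin one time step. While the user presses, the
        pin follows the finger; on release it recovers according to
        the material parameters (damped spring with unit mass)."""
        if pressed_to is not None:
            self.height, self.velocity = pressed_to, 0.0
        else:
            accel = -self.stiffness * self.height - self.damping * self.velocity
            self.velocity += accel * dt       # explicit Euler integration
            self.height += self.velocity * dt
        return self.height


if __name__ == "__main__":
    rubber = Pin(stiffness=400.0, damping=4.0)  # springs back quickly
    clay = Pin(stiffness=5.0, damping=30.0)     # barely recovers
    for name, pin in [("rubber", rubber), ("clay", clay)]:
        pin.step(pressed_to=-0.02)              # finger presses pin down 2 cm
        trace = [round(pin.step(None), 4) for _ in range(5)]
        print(name, trace)
```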

 

3) AIREAL: Interactive Tactile Experiences in Free Air

Figure: A volume of air is pushed out of the enclosure and pinches off from the aperture of the nozzle, resulting in a ring of air directed at an object in 3D space [5]

AIREAL is yet another way for digital data to interact with users. It lets users feel 3D virtual objects in free space and receive haptic feedback without having to wear or touch any physical device. AIREAL works by generating air vortices aimed in specific directions, using inexpensive and scalable technology. [5]
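
As a rough sketch of the geometry involved, the code below computes the pan and tilt angles needed to point a vortex nozzle at a tracked 3D point (e.g. a user’s hand), plus a crude travel-time estimate. The coordinate convention, gimbal model and vortex speed are assumptions I made for illustration, not AIREAL’s actual actuation code.

```python
# Illustrative aiming computation in the spirit of AIREAL: given the
# nozzle's position and a tracked target point, compute the pan/tilt
# angles that direct an air vortex at the target. The coordinate frame
# (x right, y up, z forward) is an assumption of this sketch.

import math
from typing import Tuple

Vec3 = Tuple[float, float, float]


def aim_nozzle(nozzle: Vec3, target: Vec3) -> Tuple[float, float]:
    """Return (pan, tilt) in degrees that point the nozzle's z-axis at
    the target: pan rotates about the vertical (y) axis, tilt about
    the horizontal axis."""
    dx, dy, dz = (t - n for t, n in zip(target, nozzle))
    pan = math.degrees(math.atan2(dx, dz))                   # left/right
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))  # up/down
    return pan, tilt


def time_of_flight(nozzle: Vec3, target: Vec3, vortex_speed: float = 7.0) -> float:
    """Rough travel time of the vortex ring, assuming a constant speed
    (a real ring decelerates with distance)."""
    return math.dist(nozzle, target) / vortex_speed


if __name__ == "__main__":
    nozzle, hand = (0.0, 0.0, 0.0), (0.3, 0.2, 1.0)  # metres
    pan, tilt = aim_nozzle(nozzle, hand)
    print(f"pan {pan:.1f} deg, tilt {tilt:.1f} deg, "
          f"arrives in ~{time_of_flight(nozzle, hand):.2f} s")
```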


References:

[1] Ishii, H. and Ullmer, B., 1997, March. Tangible bits: towards seamless interfaces between people, bits and atoms. In Proceedings of the ACM SIGCHI Conference on Human factors in computing systems (pp. 234-241). ACM. http://dl.acm.org/citation.cfm?id=258715

[2] Weiser, M., 1996, November. Computer Science Challenges for the Next 10 Years. Talk at Rutgers University (video). YouTube. https://www.youtube.com/watch?v=7jwLWosmmjE

[3] Hemmert, F. and Joost, G., 2016, February. On the Other Hand: Embodied Metaphors for Interactions with Mnemonic Objects in Live Presentations. In Proceedings of the TEI’16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction (pp. 211-217). ACM. http://dl.acm.org/citation.cfm?id=2839470

[4] Nakagaki, K., Vink, L., Counts, J., Windham, D., Leithinger, D., Follmer, S. and Ishii, H., 2016, May. Materiable: Rendering Dynamic Material Properties in Response to Direct Physical Touch with Shape Changing Interfaces. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 2764-2772). ACM. http://dl.acm.org/citation.cfm?doid=2858036.2858104

[5] Sodhi, R., Poupyrev, I., Glisson, M. and Israr, A., 2013. AIREAL: interactive tactile experiences in free air. ACM Transactions on Graphics (TOG), 32(4), p.134. http://dl.acm.org/citation.cfm?id=2462007
