Music, HCI and ME

Introduction

In this blog post I would like to share and discuss some of my understanding of HCI and Music Interaction. First, I will give some brief background on the history of HCI, describe my research interests in music and technology, and consider how the two fields can work together to benefit the digital civics agenda and how this fits into the wider world of HCI. Then I will talk about Music Interaction: what it is, and how it relates to HCI.

After that, I will discuss in depth two main issues in HCI drawn from my previously assigned readings: Tangible Embedded and Embodied Interaction (TEI) and ambiguity.

My Background

 

Before coming to Newcastle University and joining Open Lab, I studied computer science and took my master's in computer and communication engineering. Alongside my university studies, I learned music at the Lebanese College of Music. Studying HCI stimulated my desire to combine my backgrounds in computer science and music, to see what role technology and music can play in key issues of digital civics such as health, democracy and education: for example, promoting participatory media, or using online social media as a resource to empower novice musicians and people who want to learn to play music but are unable to, helping them share their knowledge and expanding participation in music production and consumption.

What is HCI?

 

Human-Computer Interaction (HCI) is a field which joins research and practice in order to design interactive systems [1]. It goes back to the 1980s, when it was still a subarea of computer science dealing with cognitive science and human factors engineering. It soon expanded into a separate multidisciplinary field incorporating many other disciplines, from psychology and cognitive science to the social sciences and beyond.

HCI can be viewed as an umbrella of communities such as ubiquitous computing, tangible embedded and embodied interaction, human-data interaction and many more, all focusing attention on the twin goals of HCI: usefulness and usability [10]. The sub-communities of HCI are themselves umbrellas; UbiComp, for example, integrates subareas such as mobile computing, wearable devices, geo-spatial information systems and so on [4].

 

Music Interaction and How Music Relates to HCI

 

Another sub-community within HCI is Music and Human-Computer Interaction, referred to as “Music Interaction”. It has its own annual conference called NIME (New Interfaces for Musical Expression), where researchers and musicians gather to exchange knowledge and introduce their new musical interface designs.

Music Interaction may be viewed as a subarea of HCI, just as HCI may be seen as a subarea of computer science. Music Interaction borrows a lot from the HCI manifesto, yet the music community has always had its own perspective on interactive systems. Moreover, Music Interaction has always contributed back to the field of HCI. For example:

  • Zimmerman’s aspiration to hear himself playing the air guitar is believed to have inspired the invention of the data glove, hand-tracking technologies and VR systems [16].
  • Tangible interaction and touch-based tools can be traced back to the Reactable project [15], which was motivated by the challenges of Music Interaction. This is not to claim exclusive credit for Music Interaction, however [2,3].

Tangible Embedded and Embodied Interaction Beyond the GUI

 

One of the sub-communities of HCI is TEI. Within it, it is important to distinguish between interface design and interaction design, a distinction I return to below.

TUI

In HCI, traditional human interfaces defined human interaction in terms of “input” and “output”: the output is the digital representation (through a screen, speaker, etc.), and the input is the direct manipulation of that output through peripheral devices such as a mouse or keyboard.

In the 1990s, Hiroshi Ishii and his team introduced the TUI as a new form of human-computer interaction that allows digital information to be manipulated through physical, tangible objects.

I like how the Tangible Media Group at MIT describe the TUI, and how they compare it to the GUI, using the metaphor of an iceberg [17]:

A tangible user interface is like an iceberg: there is a portion of the digital that emerges beyond the surface of the water—into the physical realm—so that we may interact directly with it.
A graphical user interface only lets us see information and interact with it indirectly, as if we were looking through the surface of the water to interact with the forms below.

Ishii and his team at the Media Lab saw a limitation in the traditional GUI form of human-computer interaction: in a GUI, multiple users at the same machine must share one keyboard and mouse, whereas TUIs allow for collocated, collaborative work [6].

Tangibles can be useful in many different areas, as [6] describes, such as:

  • Information storage, retrieval, and manipulation
  • Information visualization
  • Modeling and simulation
  • Systems management, configuration, and control
  • Education, entertainment, and programming systems

Tangible Interactions

Computer science and HCI researchers were concerned with the role of tangible user interfaces back in the 1990s, while researchers from other disciplines, such as industrial design, were concerned with making technological devices more usable.

Another form of interaction is “physical computing”: designing software-controlled interactive objects that sense and act on the world through sensors and actuators. A good example of physical computing is Makey Makey (see video below, http://makeymakey.com).

Makey Makey

 

 

The term “Tangible Interaction” came to act as an umbrella for these multiple forms of interaction.
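The sense-decide-act loop at the heart of physical computing can be sketched in plain Python. This is a hypothetical simulation rather than any real device's firmware: the sensor readings are just numbers, the threshold is arbitrary, and the "actuator" output is a MIDI-style note number, loosely mirroring how a Makey Makey turns a closed circuit into a key press.

```python
# A hedged sketch of a physical-computing loop: sense -> decide -> act.
# No real hardware here; readings and the MIDI note are illustrative values.

def touch_to_note(reading, threshold=0.5, note=60):
    """Map a raw touch-sensor reading in [0.0, 1.0] to a note event.

    Returns the MIDI note number when the reading crosses the threshold,
    or None when no touch is detected.
    """
    return note if reading >= threshold else None

def run_loop(readings, threshold=0.5):
    """Run the loop over a stream of readings and collect note events."""
    return [touch_to_note(r, threshold) for r in readings]

# Simulated sensor stream: two touches separated by silence.
print(run_loop([0.1, 0.8, 0.2, 0.9]))  # [None, 60, None, 60]
```

On real hardware the readings would come from an analogue input and each note event would be sent over MIDI or trigger a sample, but the mapping logic stays the same.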

Hornecker and Buur outline tangible interaction through four major principles [8]:

 

  1. Tangibility and materiality
  2. Physical embodiment of data
  3. Bodily interaction
  4. Embeddedness in real spaces and contexts.

They also argue that the term “tangible interface” excludes many important aspects of product design. In addition, there is confusion over whether whole-body interactions such as haptic feedback and gesture interfaces count as tangibles; the term “tangible interaction” therefore seemed more appropriate and flexible.

Again, it is important to distinguish between interface design and interaction design: where interface design focuses on the design of the interface itself, interaction design brings in behavioural concerns and how people interact with systems [7].

Furthermore, tangible, physical and embodied computing stands out from other areas of HCI in that almost all other parts of HCI are user-centric, whereas TEI focuses on the design of the interaction process.

Relation to my interest in music interaction

As the previous section discussed tangible user interfaces and tangible interaction in HCI, I would like to argue that TEI is one of the main contributors to, and a prominent theme in, Music Interaction. Musical instruments are themselves artefacts; they are also interfaces that we manipulate to produce sound.

Music production and interaction use a large share of our cognitive resources, so when designing Music Interaction systems we should keep the focus on the task at hand, which is of course the music. In HCI, however, GUIs are known to demand attention, and many things become obscured when represented through a graphical user interface. This is why music engineers, for example, still use physical equaliser control surfaces and refuse to use the mouse and keyboard, which stand in the way of the production process and can affect creativity. Tangible computing, on the other hand, suits tasks that require high levels of concentration, because we can design objects that are physical in nature and familiar to us, and thus consume much less of our awareness than GUIs do [9]. Tangible interaction therefore makes a great candidate for contributing to the Music Interaction world, which is why the majority of novel contributions we see at NIME conferences are TEIs.

The area of tangible interaction is very wide and overlaps with other issues in HCI. Despite its unclear commercial value, it still attracts big companies such as Microsoft: TEI 2009 was held at Microsoft Research in Cambridge, UK.

One of the most successful examples of tangible interaction is the aforementioned Reactable project from Universitat Pompeu Fabra. It is essentially a table that plays music: the surface of the table is the graphical interface, and it can be manipulated through either tangible objects or multi-touch interaction. The Reactable is used in many museums and by many famous artists and DJs, and has won multiple awards.

 

An example of TEI in the music production context: the Haptic Wave, and how tangible interfaces can act as a bridge for people with different skills and capabilities

Another member of the tangible family is the haptic interface. I felt the urge to include it in this blog post after hearing about the Haptic Wave project in John’s last one-on-one session with me. The Haptic Wave is a project by Atau Tanaka and Adam Parkinson from Goldsmiths, University of London. I read their paper [11], which won a best paper award at the 2016 CHI Conference on Human Factors in Computing Systems, and the project is really inspiring.

Haptic Wave is a device that helps visually impaired audio engineers, musicians, podcast producers and others by mapping the visual representation of audio signals into the haptic domain (see video below). In other words, just as visual waveforms allow a user to see sound, Haptic Wave allows people with sight impairments to feel it. This work is a great example of participatory design, which serves the participation issue in the digital civics agenda. The development of the Haptic Wave spanned three workshops: scoping and brainstorming with the participants; testing and gathering feedback; and a final round of testing in response to the earlier feedback, followed by six studio trials. The idea stemmed from the difficulties visually impaired audio engineers face with off-the-shelf accessibility tools such as screen readers, and from the overwhelming, unnecessary information they are forced to listen to just to know what is on the screen. This made them opt for keyboard shortcuts to achieve certain tasks; the mouse was not even useful to them, as they would have to listen to a readout of the cursor position on the screen. Keyboard shortcuts are a working option, but they cannot substitute for the mouse when dragging or drawing shapes and curves, for example.

Interestingly, the paper stresses the importance of this design approach because the users, who were actual partners in the research process, are, with their visual impairment, experts in their own impairment.
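To make the cross-modal idea concrete, here is a minimal sketch under my own assumptions, not the authors' implementation: collapse raw audio samples into a coarse peak-amplitude envelope, the kind of summary a motorised fader could render as physical height while the user scrubs through time.

```python
# A hypothetical sketch of mapping a waveform into a coarse envelope that a
# haptic device could render; not the actual Haptic Wave code.

def amplitude_envelope(samples, n_bins):
    """Collapse raw audio samples into n_bins peak amplitudes scaled to [0, 1].

    Each bin's value could drive the height of a motorised fader, letting a
    user feel where the loud and quiet regions of a recording are.
    """
    if not samples or n_bins <= 0:
        return []
    size = max(1, len(samples) // n_bins)          # samples per time bin
    bins = [samples[i:i + size] for i in range(0, len(samples), size)][:n_bins]
    peak = max(abs(s) for s in samples) or 1.0     # avoid division by zero
    return [max(abs(s) for s in b) / peak for b in bins]

# Toy waveform: quiet start, loud middle, quieter end.
print(amplitude_envelope([0.1, -0.1, 0.9, -0.8, 0.2, -0.1], 3))
```

The loudest bin always maps to 1.0, so the mapping preserves relative loudness rather than absolute level; a real device would also need to handle scrubbing position and fader dynamics.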

Finally, this section has outlined the diversity of tangible interaction and the taxonomy discourse that constantly takes place in the HCI ecosystem. The research area is wide, and there are many domains where TEI can be applied; most projects so far have targeted education and the support of learning, in addition to domestic appliances, interactive musical instruments, and tools to support participation and decision making.

Ambiguity

 

Within the two main goals of HCI research, designing for usefulness and usability, the majority of the HCI community adopts the idea of designing for affordances.

Affordance is a concept coined by James Gibson in the discipline of psychology; he defines it as the action possibilities offered by the environment to the actor [12]. Norman introduced the same notion to the HCI field as a key guideline for interaction designers [13]. It denotes the designer’s concern to make the designed object self-descriptive and immediately obvious: for example, adding shadows to a button in a GUI to indicate that it is clickable.

Ambiguity, on the other hand, is more or less the nemesis of affordance. It has often been avoided by the HCI community, as it contradicts the main goals of HCI. Gaver, however, had a different perspective on ambiguity: he argues that it should be embraced and used in the design process [5]. He divides ambiguity into three main classes:

Ambiguity of Information

It urges participants to question the truth of the information presented and challenges them to apply their knowledge in various ways.

Ambiguity of Context

This form of ambiguity is very common in artwork, where things can be interpreted differently based on their context. For instance, companies produced ringtones for mobile phones aimed at young people, but discovered that mothers used the ringtones to soothe their babies.

Ambiguity of Relation

It impels people to think about their beliefs, values, and feelings toward a specific issue, by designing for an experience that is not necessarily pleasant or easy.

In the same paper, he also proposes a short manifesto for useful ambiguous designs, as follows:

“Enhancing Ambiguity of Information:

  • Use imprecise representations to emphasise uncertainty.
  • Over-interpret data to encourage speculation.
  • Expose inconsistencies to create a space of interpretation.
  • Cast doubt on sources to provoke independent assessment.

Creating Ambiguity of Context:

  • Implicate incompatible contexts to disrupt preconceptions.
  • Block expected functionality to comment on familiar products.

Provoking Ambiguity of Relationship

  • Offer unaccustomed roles to encourage imagination.
  • Point out things without explaining why.
  • Introduce disturbing side effects to question responsibility.”

Relatedly, ambiguity could also be used in designing for Music Interaction. In fact, many musical instruments are ambiguous in their design, and it is very rare for someone who has never seen an instrument played to be able to use it in the right way the first time. Designers could therefore study this ambiguity in order to develop better, easier-to-learn musical instruments.

The key point that Gaver is making is not to promote bad ambiguous design, of which there are many confusing, meaningless and useless examples. Instead, he suggests that we use ambiguous designs to uncover insights about users’ behaviour and to produce interactive designs that are provocative and engaging, in addition to challenging designers themselves to push beyond the limits of the technology.

Conclusion

In essence, this post started with a small background on my previous studies and my research interests within HCI and digital civics. I then presented my understanding of HCI as a massive network of disciplines that collaborate to produce knowledge, which in turn contributes to the development of useful and usable interactive systems, and introduced Music Interaction as a major contributor to the HCI community. After that, I discussed two main issues of relevance from the HCI community, based on my previously assigned readings throughout the module: “Tangible Embedded and Embodied Interaction (TEI)” and “Ambiguity as a design practice in HCI”.

Finally, in light of my research interests, Paul Marshall’s paper [14] makes a good point about how it is still unclear which types of tangibles are most effective for particular civic issues such as education.


References

  1. Stuart Reeves. 2015. Human-computer interaction as science. In Proceedings of The Fifth Decennial Aarhus Conference on Critical Alternatives (AA ’15). Aarhus University Press, 73-84. DOI=http://dx.doi.org/10.7146/aahcc.v1i1.21296
  2. Buxton, W. (2008). My vision isn’t my vision: Making a career out of getting back to where I started. In T. Erickson & D. McDonald (Eds.), HCI remixed: Reflections on works that have influenced the HCI community (pp. 7–12). Cambridge, MA: MIT Press.
  3. Holland, S., Wilkie, K., Mulholland, P. and Seago, A., 2013. Music interaction: understanding music and human-computer interaction. In Music and Human-Computer Interaction (pp. 1-28). Springer London.
  4. Carroll, J.M. (2010). Conceptualizing a possible discipline of Human-Computer Interaction. Interacting with Computers, 22, 3-12.
  5. William W. Gaver, Jacob Beaver, and Steve Benford. 2003. Ambiguity as a resource for design. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’03). ACM, New York, NY, USA, 233-240. DOI=http://dx.doi.org/10.1145/642611.642653
  6. B. Ullmer and H. Ishii. 2000. Emerging frameworks for tangible user interfaces. IBM Syst. J. 39, 3-4 (July 2000), 915-931. DOI=http://dx.doi.org/10.1147/sj.393.0915
  7. Mads Vedel Jensen, Jacob Buur, and Tom Djajadiningrat. 2005. Designing the user actions in tangible interaction. In Proceedings of the 4th decennial conference on Critical computing: between sense and sensibility (CC ’05), Olav W. Bertelsen, Niels Olof Bouvin, Peter G. Krogh, and Morten Kyng (Eds.). ACM, New York, NY, USA, 9-18. DOI=http://dx.doi.org/10.1145/1094562.1094565
  8. Eva Hornecker and Jacob Buur. 2006. Getting a grip on tangible interaction: a framework on physical space and social interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’06), Rebecca Grinter, Thomas Rodden, Paul Aoki, Ed Cutrell, Robin Jeffries, and Gary Olson (Eds.). ACM, New York, NY, USA, 437-446. DOI=http://dx.doi.org/10.1145/1124772.1124838
  9. Dourish, Paul (2001): Where the Action Is: The Foundations of Embodied Interaction. MIT Press.
  10. Nickerson, R. & Landauer, T. Human-computer interaction: Background and issues. In Helander, M.G., Landauer, T.K. and Prabhu, P. (eds.), Handbook of Human- Computer Interaction, 2nd edition. Amsterdam, The Netherlands: Elsevier Science.
  11. Atau Tanaka and Adam Parkinson. 2016. Haptic Wave: A Cross-Modal Interface for Visually Impaired Audio Producers. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI ’16). ACM, New York, NY, USA, 2150-2161. DOI: https://doi.org/10.1145/2858036.2858304
  12. Gibson, James J. (1979): The Ecological Approach to Visual Perception.
  13. Norman, Donald A. (1999): Affordances, Conventions, and Design. In Interactions, 6 (3) pp. 38-41.
  14. Paul Marshall. 2007. Do tangible interfaces enhance learning?. In Proceedings of the 1st international conference on Tangible and embedded interaction (TEI ’07). ACM, New York, NY, USA, 163-170. DOI=http://dx.doi.org/10.1145/1226969.1227004
  15. Jordà, Sergi, Geiger, Günter, Alonso, Marcos and Kaltenbrunner, Martin (2007): The reacTable: exploring the synergy between live music performance and tabletop tangible interfaces. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 139-146
  16. Zimmerman, T. G., Lanier, J., Blanchard, C., Bryson, S., & Harvill, Y. (1987). A hand gesture interface device. In J. M. Carroll & P. P. Tanner (Eds.), Proceedings of the SIGCHI conference on human factors in computing systems (CHI ’87) (pp. 189–192). New York: ACM.
  17. Tangible Media Group. 2016. Tangible Media Group. [ONLINE] Available at: http://tangible.media.mit.edu/vision/. [Accessed 12 December 2016].

 
