Exploring a Taxonomy of Interaction in Interactive Sonification Systems

  • Visda Goudarzi
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1253)


This paper explores a variety of existing interactive sonification systems in the context of interactive sound art. In the design of interactive sonification from a technological standpoint, the emphasis is placed on studying the usability and functionality of the systems. Here, we shift the focus towards the creative aspects of interaction in both the technology-development and sound-creation stages. In some artistic sonifications, control lies in the hands of the technology creators; in others, in the hands of the artists; and sometimes in the hands of the performers or the audience members. The numerous relations and interactions among performers, composers, technologists, data-domain scientists, environments, and audiences make the complex phenomenon of interactive sonification difficult to classify. Challenges in such systems include the ownership of technical and aesthetic components, balancing engagement and interaction among different stakeholders (domain scientists, designers, composers, spectators, etc.), and encouraging audience engagement.


Keywords: Auditory display · Sonification · Parameter mapping · Interactive sonification · Human-computer interaction
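As a brief illustration of the "parameter mapping" keyword above: in its simplest form, parameter-mapping sonification is a function from data values to a synthesis parameter such as pitch. The sketch below is illustrative only (the function name, data, and frequency ranges are assumptions, not taken from the paper); it shows the linear mapping at the core of the technique.

```python
# Minimal parameter-mapping sonification sketch (illustrative assumption):
# each data value is mapped linearly onto a pitch range, so rising data
# values are heard as rising pitch.

def map_to_frequency(value, data_min, data_max, freq_min=220.0, freq_max=880.0):
    """Linearly map a data value onto a frequency range in Hz."""
    if data_max == data_min:
        return freq_min  # avoid division by zero for constant data
    norm = (value - data_min) / (data_max - data_min)
    return freq_min + norm * (freq_max - freq_min)

# Example: sonify a small (hypothetical) temperature series as pitches.
data = [12.0, 15.5, 19.0, 22.5, 26.0]
freqs = [map_to_frequency(v, min(data), max(data)) for v in data]
# The frequencies rise from 220 Hz to 880 Hz as the data values rise.
```

In a real system the resulting frequencies would drive a synthesis engine (e.g. SuperCollider, cited in the paper's references); who controls the mapping, the data, and the playback is exactly the locus of the interaction questions the paper raises.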



Copyright information

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2021

Authors and Affiliations

  1. Audio Arts and Acoustics Department, Columbia College Chicago, Chicago, USA
