ABSTRACT
Interactive technologies enable dancers to control music in real time with their movement. This paper presents the design and development of a model that uses machine learning techniques to take a dancer's movement as input and output music that is structurally related to the dance. Both the technical and artistic aspects of the model's development are described in detail. In particular, the paper compares the use of machine learning techniques with traditional coding in interactive dance and music applications. Moreover, it describes the important distinction between movement sonification and dance musification and explains why the model presented here falls into the latter category. Special focus is given to the effects of the COVID-19 restrictions on the collaboration established with the dancer.
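The model itself is not detailed in this excerpt. As a purely illustrative sketch of the general idea of a learned movement-to-music mapping (all feature names, labels, and values below are hypothetical, not taken from the paper), a minimal nearest-neighbour mapper from movement features to a musical-texture label might look like this:

```python
import math

# Hypothetical training examples: movement feature vectors
# (e.g. mean acceleration, speed, spatial openness) paired
# with a musical-texture label chosen by the composer.
TRAINING = [
    ((0.1, 0.2, 0.1), "sparse_low"),
    ((0.9, 0.8, 0.7), "dense_high"),
    ((0.5, 0.4, 0.6), "mid_texture"),
]

def nearest_music_label(features, k=1):
    """Return the musical label of the k nearest training
    examples (majority vote) to a movement feature vector."""
    dists = sorted(
        (math.dist(features, x), label) for x, label in TRAINING
    )
    top = [label for _, label in dists[:k]]
    return max(set(top), key=top.count)

# A fast, expansive movement maps to a dense, high texture.
print(nearest_music_label((0.85, 0.9, 0.75)))  # → dense_high
```

In an interactive-ML workflow of this kind, the training pairs are typically recorded live: the dancer performs a movement quality while the composer holds the desired musical setting, and the mapping is then generalised to unseen movement.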
Index Terms
- The development of a dance-musification model with the use of machine learning techniques under COVID-19 restrictions