Brain Computer Interface, Visual Tracker and Artificial Intelligence for a Music Polyphony Generation System

  • Conference paper
  • Human-Computer Interaction – INTERACT 2021 (INTERACT 2021)

Abstract

In the Brain-Computer Interface (BCI) domain, EEG studies constitute a large and active field of interest, and interactive systems that exploit low-cost electroencephalographs to control machines are gaining momentum. Such technologies can be useful in music and assisted composition. In this paper, a system that generates four-part polyphonies is proposed. An artificial intelligence algorithm generates the polyphonies according to N. Slonimsky's theory by processing data from a Leap Motion device, which tracks the user's hand movements, and a five-channel EEG acquisition device.
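The implementation itself is not reproduced on this page. As a rough illustration of the kind of mapping such a system needs, the following Python sketch turns a normalized hand-height value (as a hand tracker might provide) and a normalized EEG engagement value into simple polyphony parameters. All names, value ranges, and mapping rules here are hypothetical assumptions, not the authors' algorithm.

```python
# Illustrative sketch only: the variable names, the 0..1 normalization and the
# mapping rules below are assumptions, not the method described in the paper.
# In a real system, the values would come from the Leap Motion and EEG streams.

from dataclasses import dataclass


@dataclass
class ControlState:
    hand_height: float   # normalized vertical hand position from the tracker (0..1)
    engagement: float    # normalized EEG-derived engagement level (0..1)


def to_music_parameters(state: ControlState) -> dict:
    """Map the two control signals to parameters of the generated polyphony:
    hand height chooses the register, engagement chooses the rhythmic density."""
    base_pitch = 48 + int(state.hand_height * 24)   # MIDI pitch between C3 and C5
    notes_per_beat = 1 + int(state.engagement * 3)  # 1..4 notes per beat
    return {"base_pitch": base_pitch, "notes_per_beat": notes_per_beat}


def four_voices(melody_note: int) -> tuple[int, int, int, int]:
    """Naive four-part voicing: double the melody note at the octave,
    the twelfth and the double octave below."""
    return (melody_note, melody_note - 12, melody_note - 19, melody_note - 24)


if __name__ == "__main__":
    state = ControlState(hand_height=0.6, engagement=0.3)  # hypothetical readings
    params = to_music_parameters(state)
    print(params, four_voices(params["base_pitch"]))
```

A real pipeline would replace the hard-coded readings with streaming values and feed the resulting parameters to the pattern generator and the synthesis environment.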


Notes

  1. In Fig. 1, the octave between the C note below the staff and the C note in the second space from the top has been chosen and is represented in the Principal Tones section (see the sketch below).
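As a companion to this note, here is a minimal sketch of how the principal tones of such an octave could be obtained by equal division, in the spirit of Slonimsky's framework. The MIDI numbering (C4 = 60 to C5 = 72) and the chosen division values are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch: principal tones of the chosen octave (the C below the staff
# to the C one octave above), obtained by dividing the octave into equal parts
# as in Slonimsky's framework. MIDI numbers are used for convenience; the exact
# octave of Fig. 1 is assumed to be C4-C5.

C_LOW, C_HIGH = 60, 72  # MIDI C4 and C5


def principal_tones(divisions: int) -> list[int]:
    """Divide the octave into `divisions` equal parts and return the resulting
    principal tones, rounded to the nearest semitone."""
    step = (C_HIGH - C_LOW) / divisions
    return [round(C_LOW + i * step) for i in range(divisions)] + [C_HIGH]


if __name__ == "__main__":
    for d in (2, 3, 4, 6):  # tritone, major-third, minor-third, whole-tone divisions
        print(d, principal_tones(d))
```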

References

  1. Slonimsky, N.: Thesaurus of Scales and Melodic Patterns. C. Scribner, New York (1947). Schirmer Books, a division of Macmillan Publishing Co., Inc., New York (1984)


  2. Ultraleap. Leap Motion Controller. https://www.ultraleap.com/product/leap-motion-controller/. Accessed April 2021

  3. Emotiv Insight 5-channel mobile EEG. https://www.emotiv.com/product/emotiv-insight-5-channel-mobile-eeg/. Accessed April 2021

  4. Emotiv Cortex API. https://emotiv.gitbook.io/cortex-api/. Accessed April 2021

  5. Keras Framework. https://keras.io/. Accessed April 2021

  6. Max/MSP. https://cycling74.com/. Accessed April 2021


Acknowledgments

The authors acknowledge partial support of the following projects: PON Casa delle tecnologie emergenti di Matera “CTEMT” (CUP I14E20000020001), Servizi Locali 2.0, PON ARS01_00876 Bio-D, PON ARS01_00821 FLET4.0, PON ARS01_00917 OK-INSAID, H2020 PASSPARTOUT.

Author information

Corresponding author

Correspondence to Carmelo Ardito.


Copyright information

© 2021 IFIP International Federation for Information Processing

About this paper


Cite this paper

Ardito, C., Colafiglio, T., Di Noia, T., Di Sciascio, E. (2021). Brain Computer Interface, Visual Tracker and Artificial Intelligence for a Music Polyphony Generation System. In: Ardito, C., et al. (eds.) Human-Computer Interaction – INTERACT 2021. INTERACT 2021. Lecture Notes in Computer Science, vol. 12936. Springer, Cham. https://doi.org/10.1007/978-3-030-85607-6_39


  • DOI: https://doi.org/10.1007/978-3-030-85607-6_39

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-85606-9

  • Online ISBN: 978-3-030-85607-6

  • eBook Packages: Computer Science, Computer Science (R0)
