Exploring Pitch and Timbre through 3D Spaces: Embodied Models in Virtual Reality as a Basis for Performance Systems Design

Richard Graham, Brian Bridges, Christopher Manzione, William Brent

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Our paper builds on an ongoing collaboration between theorists and practitioners within the computer music community, with a specific focus on three-dimensional environments as an incubator for performance systems design. In particular, we are concerned with providing accessible means of controlling spatialization and timbral shaping in an integrated manner, based on performance data collected across various modalities of an electric guitar with a multichannel audio output. This paper focuses specifically on combining pitch data treated within tonal models with the detection of physical performance gestures using timbral feature extraction algorithms. We discuss how these tracked gestures may be connected to concepts and dynamic relationships from embodied cognition, expanding on performative models for pitch and timbre spaces. Finally, we explore how these ideas support connections between sonic, formal, and performative dimensions, including instrumental technique detection scenes and mapping strategies aimed at bridging music performance gestures across physical and conceptual planes.
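The published record contains no code, but as a rough illustration of the kind of pipeline the abstract describes (per-frame timbral feature extraction from one channel of a multichannel guitar signal, driving a spatial mapping informed by an embodied metaphor), the following is a minimal Python sketch. The function names, frequency bounds, and the brightness-to-elevation mapping are illustrative assumptions made here, not the authors' implementation or parameter choices.

```python
import numpy as np

def spectral_centroid(frame, sample_rate):
    """Spectral centroid of one windowed audio frame -- a common
    'brightness' feature used in timbral feature extraction."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    total = spectrum.sum()
    return float((freqs * spectrum).sum() / total) if total > 0 else 0.0

def brightness_to_elevation(centroid_hz, lo_hz=100.0, hi_hz=5000.0):
    """Illustrative mapping of brightness onto the vertical axis of a 3D
    scene (brighter -> higher), echoing a verticality metaphor from
    embodied cognition. The bounds are assumptions, not paper values."""
    norm = (np.log10(max(centroid_hz, 1e-9)) - np.log10(lo_hz)) / \
           (np.log10(hi_hz) - np.log10(lo_hz))
    return float(np.clip(norm, 0.0, 1.0) * 90.0)  # elevation in degrees

# Example: one synthetic frame standing in for a single string's signal.
if __name__ == "__main__":
    sr = 44100
    t = np.arange(1024) / sr
    frame = np.sin(2 * np.pi * 330 * t) + 0.3 * np.sin(2 * np.pi * 1320 * t)
    c = spectral_centroid(frame, sr)
    print(f"centroid: {c:.1f} Hz -> elevation: {brightness_to_elevation(c):.1f} deg")
```

In a real-time system this feature-to-parameter mapping would run per analysis frame and per string, with the resulting coordinates handed to the spatialization and rendering layer.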
Language: English
Title of host publication: Unknown Host Publication
Pages: 157-162
Publication status: E-pub ahead of print - 15 May 2017
Event: International Conference on New Interfaces for Musical Expression (NIME) 2017 - Aalborg University, Denmark
Duration: 4 Apr 2017 → …

Publication series

Name: Archive of NIME Proceedings
ISSN (Print): 2220-4806

Conference

Conference: International Conference on New Interfaces for Musical Expression (NIME) 2017
Period: 4/04/17 → …

Fingerprint

  • Virtual reality
  • Systems analysis
  • Computer music
  • Feature extraction

Keywords

  • Gesture
  • embodied
  • schemas
  • mapping
  • metaphor
  • spatialization
  • timbre
  • feature
  • tracking

Cite this

Graham, R., Bridges, B., Manzione, C., & Brent, W. (2017). Exploring Pitch and Timbre through 3D Spaces: Embodied Models in Virtual Reality as a Basis for Performance Systems Design. In Unknown Host Publication (pp. 157-162). (Archive of NIME Proceedings).
