Accessibility and dimensionality: enhanced real time creative independence for digital musicians with quadriplegic cerebral palsy

Frank Lyons, Brian Bridges, Brendan McCloskey

Research output: Contribution to journal › Article

Abstract

Inclusive music activities for people with physical disabilities commonly emphasise facilitated processes [1], based both on constrained gestural capabilities and on the simplicity of the available interfaces. Inclusive music processes employ consumer controllers, computer access tools and/or specialised digital musical instruments (DMIs). The first category reveals a design ethos identified by the authors as artefact multiplication – many sliders, buttons, dials and menu layers; the latter types offer ergonomic accessibility through artefact magnification. We present a prototype DMI that eschews artefact multiplication in pursuit of enhanced real-time creative independence. We reconceptualise the universal click-drag interaction model via a single sensor type, which affords both binary and continuous performance control. Accessibility is optimised via a familiar interaction model and through customised ergonomics, but it is the mapping strategy that emphasises transparency and sophistication in the hierarchical correspondences between the available gesture dimensions and expressive musical cues. Through a participatory and progressive methodology we identify an ostensibly simple targeting gesture rich in dynamic and reliable features: (1) contact location; (2) contact duration; (3) momentary force; (4) continuous force; and (5) dyad orientation. These features are mapped onto dynamic musical cues, most notably via new mappings for vibrato and arpeggio execution.
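
The one-sensor mapping idea described above — five extracted gesture features driving both binary (note selection) and continuous (vibrato, arpeggio) musical cues — can be sketched roughly as follows. All names, scales, thresholds and feature-to-cue pairings in this sketch are illustrative assumptions for exposition, not the instrument's actual implementation.

```python
# Illustrative sketch: mapping five gesture features from a single
# force-sensing surface onto musical cues. Hypothetical values throughout.
from dataclasses import dataclass

@dataclass
class Touch:
    x: float              # (1) contact location, normalised 0..1
    duration_ms: float    # (2) contact duration
    peak_force: float     # (3) momentary force at onset, 0..1
    force: float          # (4) continuous force while held, 0..1
    dyad_angle: float     # (5) orientation of a two-finger contact, degrees

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # C major, MIDI note numbers

def map_touch(t: Touch) -> dict:
    """Map the five gesture dimensions onto binary and continuous cues."""
    note = SCALE[min(int(t.x * len(SCALE)), len(SCALE) - 1)]
    return {
        "note": note,                               # location -> pitch (binary selection)
        "velocity": int(t.peak_force * 127),        # momentary force -> loudness
        "vibrato_depth": t.force * 0.5,             # continuous force -> vibrato, semitones
        "arpeggiate": t.duration_ms > 500,          # sustained contact triggers arpeggio
        "arpeggio_spread": abs(t.dyad_angle) / 90,  # dyad orientation -> arpeggio spread
    }

cues = map_touch(Touch(x=0.4, duration_ms=800, peak_force=0.9,
                       force=0.3, dyad_angle=45.0))
print(cues["note"], cues["arpeggiate"])  # 65 True
```

The point of the sketch is that one sensor type yields a layered mapping: discrete onset data behave like a click, while the continuous force and orientation channels behave like a drag, modulating expressive parameters after the note begins.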
Language: English
Pages: 24-27
Journal: Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), Louisiana, USA, June 2015
Volume: 1
Publication status: Published - 1 Jun 2015


Keywords

  • Accessibility
  • bespoke design
  • cerebral palsy
  • customized mappings
  • dimensionality
  • expressivity
  • feature extraction

References

[1] T. Anderson, C. Smith. Composability: widening participation in music making for people with disabilities via music software and controller solutions. In: Proceedings of the 2nd International Conference on Assistive Technologies, April 11-12, 1996. New York: ACM, pp. 110-116.
[2] A. Aziz, B. Hayden, C. Warren, S. Follmer. The flote: an instrument for people with limited mobility. In: Proceedings of the 10th ACM Conference on Computers and Accessibility, Halifax, Nova Scotia, 2008, p. 295.
[3] B. Cappelen, A-P. Andersson. Musicking tangibles for empowerment. In: Proceedings of the 13th International Conference on Computers Helping People with Special Needs, ICCHP 2012. LNCS, p. 255.
[4] G. Doherty, T. Anderson, M. Wilson, G. Faconti. A control centred approach to designing interaction with novel devices. In: Proceedings of HCI International Conference on Universal Access in Human-Computer Interaction, New Orleans, 2001. Lawrence Erlbaum Associates, p. 287.
[5] J. Goodman, P. Langdon, J. Clarkson. Formats for user data in inclusive design. In: C. Stephanidis (ed.) Universal Access in HCI, Part 1, LNCS 4554. Springer-Verlag, 2007, pp. 117-126.
[6] D. Hall. Musical Acoustics (2nd ed.). California: Brooks-Cole, 1991.
[7] A. Hunt, M. Wanderley, R. Kirk. Towards a model for instrumental mapping in expert musical interaction. In: Proceedings of the International Computer Music Conference, ICMA, 2000, pp. 209-212.
[8] A. Hunt, R. Kirk, M. Neighbour. Multiple media interfaces for music therapy. IEEE Multimedia, 11(3), July-September 2004, pp. 50-58.
[9] F. Hwang, S. Keates, P. Langdon, J. Clarkson. Mouse movements of motion-impaired users: a sub-movement analysis. In: Proceedings of the 6th International Conference on Computers and Accessibility. New York: ACM, 2004, p. 102.
[10] P. Juslin. Communicating emotion in music performance: a review and theoretical framework. In: Music and Emotion: Theory and Research. New York: OUP, 2001, p. 317.
[11] L. Kessous, D. Arfib. Bimanuality in alternate musical instruments. In: Proceedings of the International Conference on New Interfaces for Musical Expression, NIME 2003, p. 141.
[12] D. Meckin, N. Bryan-Kinns. MoosikMasheens: music, motion and narrative with young people who have complex needs. NIME 2014 Workshop on Accessibility.
[13] K. Price, A. Sears. Performance-based functional assessment: an algorithm for measuring physical capabilities. In: Proceedings of the 10th ACM Conference on Computers and Accessibility. New York: ACM, 2008, pp. 217-224.
[14] J. Rovan, M. Wanderley, S. Dubnov, P. Depalle. Instrumental gesture mapping strategies as expressivity determinants in computer music performance. In: Proceedings of the International Kansei Technology of Emotion Workshop [online]. Paris: IRCAM, 1997.
[15] T. Swingler. The invisible keyboard in the air: an overview of the educational, therapeutic and creative applications of the EMS Soundbeam. In: Proceedings of the 2nd European Conference on Disability, Virtual Reality and Associated Technology, Sweden, 1998. University of Reading: ECDVRAT, p. 255.
[16] T. Ungvary, R. Vertegaal. Cognition and physicality in musical cyberinstruments. In: M. Wanderley and M. Battier (eds.) Trends in Gestural Control of Music. Paris: IRCAM, 2000, pp. 371-386.
[17] D. van Nort, M. Wanderley, P. Depalle. Mapping control structures for sound synthesis: functional and topological perspectives. Computer Music Journal, 38(3), pp. 6-22. MIT Press, 2014.
[18] M. Wanderley, J. Viollet, F. Isart, X. Rodet. On the choice of transducer technologies for specific musical functions. In: Proceedings of the International Computer Music Conference, Berlin, 2000.
[19] A. Wennerstrom. The Music of Everyday Speech: Prosody and Discourse Analysis. New York: OUP, 2001, p. vii.
[20] D. Wessel, M. Wright. Problems and prospects for intimate musical control of computers. Computer Music Journal, 26(3), Fall 2002. Cambridge, MA: MIT Press, pp. 11-22.
[21] C. Williams. Unintentional intrusive participation in multimedia interactive environments. In: Proceedings of the 7th International Conference on Disability, Virtual Reality and Associated Technology, Portugal, 2008. University of Reading: ICDVRAT, p. 205.
[22] L. Wyse, D. Nguyen. Instrumentalising synthesis models. In: Proceedings of the International Conference on New Interfaces for Musical Expression, NIME 2010, pp. 140-143.
