Abstract
This paper describes an auditory display system for smart city data from Dublin City, Ireland. It introduces the different layers of the system and outlines how they operate individually and interact with one another. The system uses a deep learning model, a variational autoencoder, to generate musical content that represents data points. Further data-to-sound mappings are introduced via parameter mapping sonification techniques during sound synthesis and post-processing. Conceptual blending and music theory provide the frameworks that govern the design of the system. The paper ends with a discussion of the design process that contextualizes the contribution, highlighting the interdisciplinary nature of the project, which spans data analytics, music composition and human-computer interaction.
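To illustrate the parameter mapping sonification technique the abstract refers to, the sketch below maps a hypothetical stream of sensor readings onto musical pitch. This is an illustrative example only, not the paper's implementation; the value ranges and the noise-level data are invented for demonstration.

```python
# Minimal parameter-mapping sonification sketch: a data stream is
# rescaled linearly onto a MIDI pitch range, then converted to the
# frequencies an oscillator could play.

def map_value(value, lo, hi, out_lo, out_hi):
    """Linearly rescale value from [lo, hi] to [out_lo, out_hi]."""
    t = (value - lo) / (hi - lo)
    return out_lo + t * (out_hi - out_lo)

def midi_to_hz(note):
    """Convert a MIDI note number to frequency in Hz (A4 = 69 = 440 Hz)."""
    return 440.0 * 2 ** ((note - 69) / 12)

# Hypothetical smart-city readings (e.g. hourly noise levels in dB).
readings = [48.2, 55.0, 61.7, 70.3]

# Map each reading from its assumed data range (40-80 dB) onto a
# three-octave MIDI pitch range (C3 = 48 to C6 = 84).
pitches = [round(map_value(v, 40.0, 80.0, 48, 84)) for v in readings]
freqs = [midi_to_hz(p) for p in pitches]
```

In a Web Audio setting, each frequency in `freqs` could then be assigned to an `OscillatorNode`; the same rescaling pattern extends to other sound parameters such as amplitude or filter cutoff.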
| Field | Value |
|---|---|
| Original language | English |
| Title of host publication | International Conference on Auditory Display, 25-28 June 2021 |
| Pages | 105-110 |
| Number of pages | 6 |
| DOIs | |
| Publication status | Published (in print/issue) - 25 Jun 2021 |
UN SDGs
This output contributes to the following UN Sustainable Development Goals (SDGs):
- SDG 11 Sustainable Cities and Communities
Keywords
- sonification
- auditory display
- smart city
- deep learning
- conceptual blending
- mapping
- IoT
Fingerprint
Dive into the research topics of 'The Design of a Smart City Sonification System Using a Conceptual Blending and Musical Framework, Web Audio and Deep Learning Techniques'. Together they form a unique fingerprint.
- Mapping for meaning: the embodied sonification listening model and its implications for the mapping problem in sonic information design. Roddy, S. & Bridges, B., 30 Jun 2020, In: Journal on Multimodal User Interfaces, 14(2), p. 143-151. Research output: Contribution to journal › Article › peer-review. Open Access.
- Addressing the Mapping Problem in Sonic Information Design through Embodied Image Schemata, Conceptual Metaphors, and Conceptual Blending. Roddy, S. & Bridges, B., 10 Oct 2018, In: Journal of Sonic Studies, 17. Research output: Contribution to journal › Article › peer-review. Open Access.
- Sound, Ecological Affordances and Embodied Mappings in Auditory Display. Roddy, S. & Bridges, B., 17 Jul 2018, In: New Directions in Third Wave Human–Computer Interaction, Filimowicz, M. & Tzankova, V. (eds.), 1st ed., Basel, Switzerland: Springer, Vol. 2, p. 231-258 (Human–Computer Interaction Series). Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review.