Atau Tanaka
Designing Gestures for Continuous Sonic Interaction
Authors
Tanaka, Atau; Di Donato, Balandino; Zbyszynski, Michael; Roks, Geert
Contributors
Marcelo Queiroz (Editor)
Anna Xambó Sedó (Editor)
Abstract
This paper presents a system that allows users to quickly try different ways of training neural networks and temporal modeling techniques to associate arm gestures with time-varying sound. We created a software framework for this and designed three interactive sounds, which we presented to participants in a workshop-based study. Building on previous work in sound-tracing and mapping-by-demonstration, we asked participants to design gestures with which to perform the given sounds using a multimodal device combining inertial measurement (IMU) and muscle (EMG) sensing. We presented users with four techniques for associating sensor input with synthesizer parameter output. Two were classical techniques from the literature, and two proposed different ways of capturing dynamic gesture in a neural network. The four techniques were: 1) a Static Position regression training procedure; 2) a hidden Markov model-based temporal modeler; 3) Whole Gesture capture to a neural network; and 4) a Windowed method that applies the position-based procedure on the fly during the performance of a dynamic gesture. Our results show trade-offs between accurate, predictable reproduction of the source sounds and exploration of the gesture-sound space. Several users were attracted to our new Windowed method, which captures gesture anchor points on the fly as training data for neural network-based regression. This paper will be of interest to musicians moving from sound design to gesture design, and it offers a workflow for quickly trying different mapping-by-demonstration techniques.
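The windowed capture idea in the abstract, pairing every sensor frame inside a training window with the synthesizer parameters currently being auditioned, then fitting a regression from input to output, can be sketched as follows. This is a minimal illustration, not the authors' system: the paper uses neural network regression on IMU/EMG input, while here a linear least-squares map and simulated sensor frames stand in, and all names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensor frame: 8 EMG channels + 3 IMU axes = 11 features.
def sensor_frame():
    return rng.normal(size=11)

# Windowed capture: each frame recorded during a training window is paired
# with the synth parameters the designer is auditioning at that moment.
training_X, training_y = [], []
for params in ([0.2, 0.9], [0.5, 0.4], [0.8, 0.1]):  # 2 synth parameters
    for _ in range(50):                               # frames per window
        training_X.append(sensor_frame())
        training_y.append(params)

X = np.asarray(training_X)
y = np.asarray(training_y)

# Fit the input-to-parameter map (least squares with a bias column).
Xb = np.hstack([X, np.ones((len(X), 1))])
W, *_ = np.linalg.lstsq(Xb, y, rcond=None)

# At performance time, each incoming frame yields continuous parameters.
frame = sensor_frame()
out = np.hstack([frame, 1.0]) @ W
print(out.shape)  # (2,)
```

A neural network regressor (as in the paper) would replace the least-squares fit, but the capture loop, windows of frames labeled with anchor-point parameters, is the part the Windowed method contributes.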
Citation
Tanaka, A., Di Donato, B., Zbyszynski, M., & Roks, G. (2019, June). Designing Gestures for Continuous Sonic Interaction. Presented at International Conference on New Interfaces for Musical Expression, Porto Alegre, Brazil
| Presentation Conference Type | Conference Paper (Published) |
|---|---|
| Conference Name | International Conference on New Interfaces for Musical Expression |
| Start Date | Jun 3, 2019 |
| End Date | Jun 6, 2019 |
| Acceptance Date | Mar 15, 2019 |
| Online Publication Date | Jun 4, 2019 |
| Publication Date | 2019 |
| Deposit Date | Aug 9, 2021 |
| Publicly Available Date | Aug 9, 2021 |
| Pages | 180-185 |
| Series Title | NIME |
| Book Title | Proceedings of the International Conference on New Interfaces for Musical Expression |
| DOI | https://doi.org/10.5281/zenodo.3672916 |
| Public URL | http://researchrepository.napier.ac.uk/Output/2791942 |
| Publisher URL | http://www.nime.org/proceedings/2019/nime2019_paper036.pdf |
Files
Designing Gestures For Continuous Sonic Interaction (PDF, 381 Kb)
Publisher Licence URL
http://creativecommons.org/licenses/by/4.0/
Copyright Statement
Published under the Creative Commons Attribution 4.0 International License (CC BY 4.0).