Dr Balandino Di Donato, Lecturer (B.DiDonato@napier.ac.uk)
Editors: Douglas Bowman, Luke Dahl
Myo Mapper is a free, open-source, cross-platform application that maps data from the Myo armband gestural device into Open Sound Control (OSC) messages. It offers a ‘quick and easy’ way to explore the Myo’s potential for realising new interfaces for musical expression. Alongside details of the software, this paper reports several applications in which Myo Mapper has been successfully used, together with a qualitative evaluation. We then propose guidelines for using Myo data in interactive artworks, based on insights gained from the works described and from the evaluation. Findings show that Myo Mapper empowers artists and non-expert developers to easily take advantage of high-level features of Myo data when realising interactive artistic works. It also facilitates the recognition of poses and gestures beyond those included with the product, through third-party interactive machine learning software.
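Because Myo Mapper emits plain OSC messages, any OSC-capable environment can consume its output. As a minimal sketch of what travels over the wire, the snippet below hand-encodes a single-float OSC 1.0 message using only the Python standard library. The address pattern `/myo/emg` and the sample value are illustrative assumptions, not Myo Mapper's documented address names; consult the application's documentation for the actual patterns it sends.

```python
import struct

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC 1.0 message carrying float32 arguments.

    Per the OSC 1.0 spec: strings are ASCII, NUL-terminated, and padded
    to a 4-byte boundary; floats are big-endian IEEE 754 float32.
    """
    def pad_string(s: str) -> bytes:
        b = s.encode("ascii") + b"\x00"
        return b + b"\x00" * ((4 - len(b) % 4) % 4)

    type_tags = "," + "f" * len(floats)  # e.g. ",f" for one float argument
    return (pad_string(address)
            + pad_string(type_tags)
            + b"".join(struct.pack(">f", v) for v in floats))

# Hypothetical address pattern and value for illustration only.
msg = osc_message("/myo/emg", 0.25)
```

Sending `msg` over UDP (e.g. with `socket.sendto`) to an OSC receiver such as Pure Data, Max, or SuperCollider is then a one-liner, which is what makes the OSC transport attractive for this kind of mapping tool.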
| Field | Value |
| --- | --- |
| Presentation Conference Type | Conference Paper (Published) |
| Conference Name | International Conference on New Interfaces for Musical Expression |
| Start Date | Jun 3, 2018 |
| End Date | Jun 6, 2018 |
| Acceptance Date | Mar 15, 2018 |
| Online Publication Date | Jun 3, 2018 |
| Publication Date | 2018 |
| Deposit Date | Aug 9, 2021 |
| Publicly Available Date | Aug 9, 2021 |
| Pages | 138-143 |
| Book Title | Proceedings of the International Conference on New Interfaces for Musical Expression |
| ISBN | 978-1-949373-99-8 |
| DOI | https://doi.org/10.5281/zenodo.1302705 |
| Keywords | Myo armband, mapping, feature extraction, EMG, hand gestures recognition, interactive machine learning |
| Public URL | http://researchrepository.napier.ac.uk/Output/2791907 |
| Publisher URL | http://www.nime.org/proceedings/2018/nime2018_paper0030.pdf |
Myo Mapper: A Myo Armband To OSC Mapper (PDF, 767 KB)
Publisher Licence URL: http://creativecommons.org/licenses/by/4.0/
Copyright Statement: Licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0).
Human-Sound Interaction: Towards a Human-Centred Sonic Interaction Design approach (2020), Presentation / Conference Contribution
Improvising through the senses: a performance approach with the indirect use of technology (2018), Journal Article
Gesture-Timbre Space: Multidimensional Feature Mapping Using Machine Learning and Concatenative Synthesis (2021), Presentation / Conference Contribution
HarpCI, Empowering Performers to Control and Transform Harp Sounds in Live Performance (2019), Journal Article