Article

FeelMusic: Enriching Our Emotive Experience of Music through Audio-Tactile Mappings

1 Department of Engineering Mathematics, University of Bristol, Bristol BS8 1TH, UK
2 School of Psychological Science, University of Bristol, Bristol BS8 1TH, UK
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2021, 5(6), 29; https://doi.org/10.3390/mti5060029
Submission received: 5 March 2021 / Revised: 21 May 2021 / Accepted: 24 May 2021 / Published: 31 May 2021
(This article belongs to the Special Issue Musical Interactions)

Abstract

We present the concept of FeelMusic, an augmentation of music through the haptic translation of core musical elements, and evaluate an implementation of it. Music and touch are intrinsic modes of affective communication that are physically sensed. By projecting musical features such as rhythm and melody into the haptic domain, we can explore and enrich this embodied sensation; hence, we investigated audio-tactile mappings that successfully render emotive qualities. We began by investigating the affective qualities of vibrotactile stimuli through a psychophysical study with 20 participants using the circumplex model of affect. We found positive correlations between vibration frequency and arousal across participants, but correlations with valence were specific to the individual. We then developed novel FeelMusic mappings by translating key features of music samples and implementing them with “Pump-and-Vibe”, a wearable interface utilising fluidic actuation and vibration to generate dynamic haptic sensations. We conducted a preliminary investigation to evaluate the FeelMusic mappings by gathering 20 participants’ responses to the musical, tactile and combined stimuli, using valence ratings and descriptive words from Hevner’s adjective circle to measure affect. These mappings, and new tactile compositions, showed that FeelMusic interfaces have the potential to enrich musical experiences and to be a means of affective communication in their own right. FeelMusic is a tangible realisation of the expression “feel the music”, enriching our musical experiences.
Keywords: haptics and haptic interfaces; audio-tactile mapping; affective tactile stimulation

Share and Cite

MDPI and ACS Style

Haynes, A.; Lawry, J.; Kent, C.; Rossiter, J. FeelMusic: Enriching Our Emotive Experience of Music through Audio-Tactile Mappings. Multimodal Technol. Interact. 2021, 5, 29. https://doi.org/10.3390/mti5060029

AMA Style

Haynes A, Lawry J, Kent C, Rossiter J. FeelMusic: Enriching Our Emotive Experience of Music through Audio-Tactile Mappings. Multimodal Technologies and Interaction. 2021; 5(6):29. https://doi.org/10.3390/mti5060029

Chicago/Turabian Style

Haynes, Alice, Jonathan Lawry, Christopher Kent, and Jonathan Rossiter. 2021. "FeelMusic: Enriching Our Emotive Experience of Music through Audio-Tactile Mappings" Multimodal Technologies and Interaction 5, no. 6: 29. https://doi.org/10.3390/mti5060029

APA Style

Haynes, A., Lawry, J., Kent, C., & Rossiter, J. (2021). FeelMusic: Enriching Our Emotive Experience of Music through Audio-Tactile Mappings. Multimodal Technologies and Interaction, 5(6), 29. https://doi.org/10.3390/mti5060029
