Bodily Expression Support for Creative Dance Education by Grasping-Type Musical Interface with Embedded Motion and Grasp Sensors †
Abstract
1. Introduction
2. Related Works
3. Proposed System
3.1. Design
3.2. Sound Generation Mechanism
4. Experiments
4.1. Creative Activity Experiment
4.1.1. Objective
4.1.2. Setting
- Step 1: Explanation. First, we explained the specifications of TwinkleBall. MIDI provides 128 program numbers, each defining an instrument sound. Each subject selected a program number for the sound to be output with his/her motions (a minimal sketch of such a program-change message appears after this list).
- Step 2: Creative dance theme. Each subject considered a theme for the creative dance.
- Step 3: Dance performance. Each subject danced twice: (1) grasping TwinkleBall without sound (i.e., mute TwinkleBall) and (2) grasping TwinkleBall with sound. We observed one minute per dance, and each subject used the same theme (from Step 2) in both conditions. To reduce order effects, we divided the subjects into two groups of five: one group performed condition (1) first and then condition (2); the other performed condition (2) first and then condition (1).
- Step 4: Completing questionnaires. Finally, the subjects answered a questionnaire orally for a qualitative evaluation. The questions were as follows:
- Q.1 Do you think you could dance along to the theme without sound?
- Q.2 Do you think you could dance along to the theme with sound?
- Q.3 Did your motion and sound correspond when there was sound?
- Q.4 Was the sound by TwinkleBall useful to determine successive motions while dancing?
- Q.5 Did you feel TwinkleBall restrained your dance?
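The paper does not include implementation code, but as a rough illustration of the General MIDI program numbers mentioned in Step 1, the following Python sketch sends a program-change message followed by a short note. It assumes the third-party mido library and an available MIDI output port; program number 40 (violin in the General MIDI map) is an arbitrary example, not the sound used in the experiments.

```python
import time
import mido

# Open the default MIDI output port (mido.get_output_names() lists the available ports).
out = mido.open_output()

# General MIDI defines 128 program numbers (0-127), each mapped to an instrument sound.
# Program 40 corresponds to a violin; in Step 1 each subject chose his/her own number.
out.send(mido.Message('program_change', program=40, channel=0))

# Trigger a note with the selected timbre, then release it after half a second.
out.send(mido.Message('note_on', note=60, velocity=100, channel=0))
time.sleep(0.5)
out.send(mido.Message('note_off', note=60, velocity=0, channel=0))
out.close()
```

In the actual system, the embedded motion and grasp sensor values presumably drive the note parameters; this sketch covers only the timbre-selection step described in Step 1.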
4.1.3. Evaluation Methods
4.2. Movement Accuracy Experiment
4.2.1. Objective
4.2.2. Setting
- Step 1: Adjustment to target speed. The subjects performed circular hand motions and adjusted them until reaching the target acceleration (low, 10 m/s²; middle, 14 m/s²; high, 18 m/s²).
- Step 2: Maintaining target speed. After confirming that the target acceleration was reached, the experimenter asked the subjects to keep the hand motion constant for five seconds. The subjects performed the hand motion twice at each of the three accelerations: (1) grasping TwinkleBall without sound (i.e., mute TwinkleBall) and (2) grasping TwinkleBall with sound. To reduce order effects, we divided the subjects into two groups of five: one group performed condition (1) first and then condition (2); the other performed condition (2) first and then condition (1). (A minimal sketch of how adherence to the target acceleration can be checked is given after this list.)
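The paper specifies only the three target acceleration levels; as a rough sketch of how adherence to a target could be checked over the five-second window, the following Python code computes the mean acceleration magnitude from accelerometer samples and compares it with the target. The 100 Hz sampling rate and the ±1 m/s² tolerance are assumptions for illustration, not values from the experiment.

```python
import numpy as np

def check_target_acceleration(samples, target, tolerance=1.0):
    """Return the mean acceleration magnitude over a window of
    (N, 3) accelerometer samples in m/s^2, and whether it lies
    within `tolerance` of the target value."""
    magnitudes = np.linalg.norm(samples, axis=1)  # per-sample |a|
    mean_mag = float(magnitudes.mean())
    return mean_mag, abs(mean_mag - target) <= tolerance

# Hypothetical 5 s window at an assumed 100 Hz: noisy readings around 14 m/s^2.
rng = np.random.default_rng(0)
window = rng.normal(0.0, 0.5, size=(500, 3)) + np.array([0.0, 0.0, 14.0])

mean_mag, on_target = check_target_acceleration(window, target=14.0)
print(f"mean |a| = {mean_mag:.2f} m/s^2, within tolerance: {on_target}")
```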
4.2.3. Evaluation Method
5. Results and Discussions
5.1. Creative Activity Experiment
5.2. Movement Accuracy Experiment
6. Conclusions
Supplementary Materials
Acknowledgments
Author Contributions
Conflicts of Interest
References
| Themes | | | | |
|---|---|---|---|---|
| My neighbor Totoro | Bamboo shoot | Sleepy, but I cannot sleep | Storm | Feeling when it rains |
| Fun | I want relaxation | Gorilla | Deadline | Running women |
© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).