Article

Tangible User Interface and Mu Rhythm Suppression: The Effect of User Interface on the Brain Activity in Its Operator and Observer

1 Advanced Business Center, Dai Nippon Printing Co., Ltd., Tokyo 1628001, Japan
2 Graduate School of Integrated Frontier Science, Kyushu University, Fukuoka 8128581, Japan
3 Research Fellow of Japan Society for the Promotion of Science, Tokyo 1020083, Japan
4 Department of Multimedia, Cultural Bureau, Musée du Louvre, Paris 75058, France
5 Department of Human Science, Faculty of Design, Kyushu University, Fukuoka 8158540, Japan
* Author to whom correspondence should be addressed.
Appl. Sci. 2017, 7(4), 347; https://doi.org/10.3390/app7040347
Submission received: 20 February 2017 / Revised: 20 March 2017 / Accepted: 28 March 2017 / Published: 31 March 2017
(This article belongs to the Special Issue Human Activity Recognition)

Abstract

The intuitiveness of a tangible user interface (TUI) is not limited to its operator. It is quite possible that this type of user interface (UI) also affects the experience and learning of observers who are merely watching the operator use it. To understand these possible effects of TUI, the present study focused on mu rhythm suppression in the sensorimotor area, which reflects both the execution and the observation of action, and investigated brain activity in both the operator and the observer. In the observer experiment, the effect of TUI on observers was demonstrated through brain activity. Although the effect of the grasping action itself was uncertain, the unpredictability of the result of an action appeared to affect brain activity related to the mirror neuron system (MNS). In the operator experiment, despite the grasping action being identical, activity in the sensorimotor area increased when UI functions were included (TUI). No such activation was found with a graphical user interface (GUI), which has UI functions without a grasping action. These results suggest that MNS-related brain activity is involved in the effect of TUI, indicating the possibility of UI evaluation based on brain activity.

1. Introduction

A tangible user interface (TUI) is a type of UI that allows a person to interact with digital information through the physical environment [1]. This type of UI involves tangible objects that bridge the gap between digital information and physical space, and is characterized by a UI function based on the manipulation (e.g., grasping and moving) of those objects [2]. Because a TUI accepts specific user actions on the objects as inputs, the result of each action can be easily predicted, which is thought to make it more intuitive than other UIs. By combining physical and digital representations, TUI provides fun as well as intuitiveness to its users. Its applications in the design community include those focused on user experience (e.g., [3,4]) and learning at museums [5].
The intuitiveness of the various TUIs provided at museums is likely to affect the experience and learning not only of those who operate them, but also of those who are merely watching. Because the user action on the tangible object is visible, a TUI can make observers understand the purpose of each user action more easily.
Such characteristics have led to applications in education (e.g., [6,7,8]), providing a more fertile environment for learning than a conventional GUI [9]. For example, as proposed by Hornecker et al. [10], TUI allows a variety of interaction styles and also has known social aspects. Its application to collaboration was proposed early on (e.g., [11,12]). TUI is also known to have an effect on its observers. When people want to use a UI for the first time, they often start by watching someone actually operating it. Through such observation, they can more easily find an interest in the operation and become more motivated to operate the UI themselves. It is therefore important to evaluate the intuitiveness of a UI for the observer as well.
Many UI evaluation methods exist, including those based on subjective, behavioral, and psychological approaches (e.g., [13,14,15]). However, most studies to date have focused on the effect on the user. The effect on observers has been studied in terms of their behaviors and experiences, but physiological responses have been investigated in only a few cases [16,17,18].
In the present study, we take a neuroscientific approach based on a brain function that is activated by observing the actions of others. A set of neurons fire both when performing an action and when observing the same action. These are referred to as mirror neurons, first identified in specific areas of the macaque monkey brain [19].
It has been reported that mirror neurons play an important role in understanding the intention behind the actions of others [20]. In humans, fMRI studies have shown that an MNS-like function involves a plurality of brain areas. The MNS in humans is far more complex than that in the macaque monkey, and has been reported to be involved in imitation and empathy [21] as well as in understanding the purpose of the actions of others [22].
EEG has been proposed as a convenient means to monitor the activity of the MNS [23]. Performing and observing an action both suppress the alpha band rhythm over the central sulcus [24]. This alpha band rhythm around the central sulcus is called the mu rhythm. Mu rhythm suppression is enhanced by observing a goal-directed action or performing a social task [25,26]. In particular, the band power at 10–12 Hz has been reported to be a relatively sensitive indicator of motor cortex activity [25,27].
One characteristic of TUI is that it involves physical actions by the user, such as grasping a tangible object, which are readily visible to observers (e.g., [9,28]). The MNS, in turn, is known to be activated by observing the actions of others. It also responds to the movement of hands and tools shown in an introductory video for a museum [29], and shows a particularly pronounced response to the action of grasping an object [30]. With regard to the operator's actions, such a grasping action is what distinguishes TUI from GUI, the more common type of UI [2]. What effect, then, does watching others operate a TUI have on the MNS, in comparison with a GUI?
By monitoring MNS-related brain activity during the operation of different UIs (i.e., different information acquisition processes), it should be possible to evaluate the effect of each UI in terms of empathy, understanding of intentions, and the like. Because a UI is primarily designed for its users, there is a non-negligible effect on the person actually operating it, in addition to the observer. Therefore, in the present study, we conducted two experiments to understand the effect of TUI on both operator and observer in terms of MNS-related brain activity. First, we examined the effect of observing UI operation on MNS-related brain activity in the observer (Experiment 1). Next, we investigated the same brain activity in the operator (Experiment 2). Based on the results of both experiments, the effects on brain activity characteristic of TUI could be addressed.

2. Materials and Methods

2.1. Experiment 1

In Experiment 1, the brain activity in the observer watching the UI operation from behind was monitored to investigate the effect on the observer.

2.1.1. Subjects

The subjects were 15 right-handed students (21.9 ± 1.2 years old). All subjects signed the informed consent form. The present study was approved by the Ethical Committee of Kyushu University. No subject had a history of a psychiatric or neurological disorder.

2.1.2. Experimental Conditions

In the present study, each subject sat behind and watched an assistant (hereafter, the actor) operating a TUI.
One of the art appreciation systems provided at the Louvre-DNP Museum Lab was simplified into three experimental UIs, each composed of a screen and either tangible objects or a touch panel with object thumbnails. The effect of TUI was investigated under three conditions using these different types of UI (two TUIs and one GUI; Figure 1).
We adopted two TUI conditions to compare the presence/absence of a correspondence between the selected object and the result, and a third GUI condition to compare the presence/absence of the grasping motion.
In the first condition, a set of small porcelain models (eight different porcelains in total) was adopted as the UI (TUI/OBJECT condition). In each task, the actor grasped one model and moved it onto a holder in front of the screen; as a result, the screen displayed the porcelain picture that corresponded to the model. The actor then hid the porcelain picture by returning the model to its initial position. The actor conducted this task for all eight models (i.e., a total of eight tasks of displaying/hiding a porcelain picture).
In the second condition, eight identical can models were used as the UI (TUI/CAN condition). The actor conducted a total of eight displaying/hiding tasks as in the first condition, except that the appearance of each model was not predictive of the corresponding porcelain picture.
In the third condition, thumbnails of the eight different porcelains on a touch panel served as the UI (GUI condition). In each task, the actor touched one thumbnail with a finger, upon which the screen displayed the porcelain picture that corresponded to the thumbnail. The actor then moved only the hand onto the holder in front of the screen. After that, the actor moved the hand back and touched the same thumbnail to hide the porcelain picture, and then returned the hand to the holder. The actor conducted a total of eight displaying/hiding tasks as in the other conditions.

2.1.3. Experimental Procedure

Each subject put on an EEG electrode cap and sat 2 m behind the actor (Figure 2). During the first 70 s, the subject stayed at rest looking at a fixation point displayed on the screen. Then, the actor started to operate the UI. The operation time for each picture was about 8.7 s, and a total of eight operation tasks were conducted covering all eight pictures in random order. This set of tasks (about 70 s) was then repeated once. EEG was recorded while the subject was at rest and while watching the actor's manipulation; data from the resting state and the second set were used for the subsequent analysis. We carried out this process for each of the three conditions, with the order of conditions counterbalanced between subjects. Each subject was instructed to sit still throughout the EEG measurement, watching the actor's hand during the manipulation and the screen while each picture was displayed.

2.1.4. EEG Measurement and Analysis

A 64-channel Ag-AgCl electrode cap, an EGI NetAmps 200 amplifier, and NetStation acquisition software (Electrical Geodesics, Inc., Eugene, OR, USA) were used for the EEG measurement. The average of all channels was used as the reference, and sampling was done at 250 Hz with a band-pass filter of 0.3–100 Hz. FFT was applied to EEG segments (4.09 s, 1024 points, Hanning window) overlapped by 50% (EMSE Data Editor 5.3, Source Signal Imaging, Inc., La Mesa, CA, USA). Data from the first 10 s were excluded from the analysis because of transients in attention. The average power at 10–12 Hz was adopted as the mu rhythm power, and each power value was log-transformed to ensure the normality of the data. EMSE Suite Data Editor 5.3 Release Candidate 3 (Source Signal Imaging, Inc.) was used for data analysis. Mu rhythm suppression in the observer watching the actor's action was adopted as the index of MNS activity.
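The spectral analysis described above can be sketched as follows. This is a minimal illustration for a single channel, not the EMSE implementation; the function name and the synthetic input are our own.

```python
import numpy as np

def mu_band_power(eeg, fs=250, seg_len=1024, band=(10.0, 12.0)):
    """Average band power from 50%-overlapping, Hanning-windowed FFT segments.

    Mirrors the settings above: 1024-point (about 4.09 s) segments at 250 Hz,
    averaged over the 10-12 Hz bins, then log-transformed for normality.
    """
    window = np.hanning(seg_len)
    step = seg_len // 2                       # 50% overlap
    freqs = np.fft.rfftfreq(seg_len, d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    powers = []
    for start in range(0, len(eeg) - seg_len + 1, step):
        seg = eeg[start:start + seg_len] * window
        spec = np.abs(np.fft.rfft(seg)) ** 2  # power spectrum of the segment
        powers.append(spec[in_band].mean())
    return np.log10(np.mean(powers))          # log power, as in the analysis
```

Applied to a test sinusoid inside the 10–12 Hz band, the function returns a much larger value than for a signal outside the band, as expected.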
Mu rhythm suppression in each condition was calculated using the resting-state data as the reference. Because the sensorimotor area showed mu rhythm suppression, EEG was analyzed in two regions of interest (ROIs): left central (LC; 4 electrodes around C3) and right central (RC; 4 electrodes around C4).
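A sketch of such a suppression index, under the common definition of suppression as the log ratio of task to resting power (negative when power drops); the helper names and the channel grouping are hypothetical:

```python
import numpy as np

def mu_suppression(task_power, rest_power):
    """Log ratio of task to resting mu power per channel.

    Negative values indicate suppression (less mu power during the task
    than at rest); zero indicates no change.
    """
    return np.log10(np.asarray(task_power) / np.asarray(rest_power))

def roi_suppression(task_power, rest_power, roi_channels):
    """Average the per-channel index over an ROI, e.g., 4 electrodes around C3."""
    return float(np.mean(mu_suppression(task_power, rest_power)[roi_channels]))
```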
The main effects of condition (TUI/OBJECT, TUI/CAN, and GUI) and brain area (LC and RC), and their interaction, were tested by two-way ANOVA. For multiple comparisons, we used the modified sequentially rejective Bonferroni procedure. Before significance testing, outliers for each subject were identified by the Smirnov-Grubbs test (p < 0.01), and all outlier channels were excluded from the subsequent statistical analyses.
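A minimal sketch of the Smirnov-Grubbs screening (two-sided, testing the single most extreme value; SciPy's t distribution supplies the critical value). This is our own helper under those assumptions, not the tool used in the study:

```python
import numpy as np
from scipy import stats

def grubbs_outlier(x, alpha=0.01):
    """Two-sided Smirnov-Grubbs test for a single outlier.

    Returns the index of the most extreme value if it is a significant
    outlier at level alpha, otherwise None.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    g = np.abs(x - x.mean()) / x.std(ddof=1)  # Grubbs statistic per point
    i = int(np.argmax(g))
    t_crit = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t_crit**2 / (n - 2 + t_crit**2))
    return i if g[i] > g_crit else None
```

In practice the test is applied iteratively: remove the detected outlier and re-test until no value exceeds the critical bound.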

2.2. Experiment 2

In Experiment 2, the brain activity in the operator actually operating UI was monitored to study the effect on the operator.

2.2.1. Subjects

The subjects were 18 right-handed students (22.1 ± 1.57 years old). Other details were the same as described above for Experiment 1.

2.2.2. Experimental Conditions

An ACTION condition, which involved the user action but no visual information as a result of each action, was included in addition to the three conditions of Experiment 1 (i.e., four conditions in total). This fourth condition was adopted to compare the presence/absence of visual feedback of the operation result. The eight identical can models from the TUI/CAN condition were used in this condition as well (Figure 3). The operator conducted a total of eight tasks as in the TUI/CAN condition, except that no porcelain picture was displayed on the screen.

2.2.3. Experimental Procedure

Each subject put on an EEG electrode cap and sat in front of the screen. The operator stayed at rest for the first 60 s and then started to operate the UI. The operation time for each picture was about 13 s, and a total of eight operation tasks were conducted covering all eight pictures in random order. This set of tasks (about 104 s) was then repeated once. EEG of the operator at rest and during the two sets of tasks was recorded, and data from the resting state and the second set were used for the subsequent analysis. We carried out this process for each of the four conditions, with the order of conditions counterbalanced between subjects.
For the sake of consistency of movement, the subject was instructed to operate in time with a sound stimulus presented at a constant rhythm. The subject was also instructed to sit as still as possible during the EEG measurement and to keep looking at the manipulating hand during the manipulation and at the screen while a picture was displayed, so that the eye movements would be in line with the operation process. Each subject received sufficient explanation and practice before the experiment.

2.2.4. EEG Measurement and Analysis

For comparison with Experiment 1, the 10–12 Hz component of the EEG was analyzed as in Experiment 1.
Mu rhythm suppression in the operator was adopted as the index of brain activity. The main effects of condition (TUI/OBJECT, TUI/CAN, GUI, and ACTION) and brain area (LC and RC), and their interaction, were tested by two-way ANOVA. The rest of the analysis procedure was the same as in Experiment 1.

3. Results

3.1. Experiment 1

Mu rhythm suppression in each brain area due to the UI was examined by comparing the mu band power values in the resting state and during the operation, using one-sample t-tests (Figure 4). RC in the TUI/CAN condition showed a significant difference (t(14) = −2.46; p = 0.028). In contrast, no brain area showed a significant difference in the TUI/OBJECT condition or the GUI condition. Based on the resting-state and operation data, mu rhythm suppression was determined for LC and RC in each condition (TUI/OBJECT, TUI/CAN, and GUI; Figure 4).
Next, the main effects of condition (TUI/OBJECT, TUI/CAN, and GUI) and brain area (LC and RC), and their interaction, were tested by two-way ANOVA. The results confirmed an interaction between condition and brain area (F(2, 28) = 4.245; p = 0.025; η2p = 0.233) and a main effect of condition in RC (F(2, 28) = 4.234; p = 0.025; η2p = 0.232). By t-test, there was a tendency for the TUI/CAN condition to show stronger suppression than the other two conditions (t(14) = 2.50; p = 0.077 for TUI/CAN vs. TUI/OBJECT; t(14) = 2.02; p = 0.077 for TUI/CAN vs. GUI).
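Two different t statistics yielding identical corrected p-values (0.077) is characteristic of a sequentially rejective Bonferroni (Holm) adjustment with monotonicity enforced. A minimal sketch of that adjustment, as our own illustrative helper:

```python
def holm_adjust(pvals):
    """Holm's sequentially rejective Bonferroni adjustment.

    Sorts the p-values, multiplies the value at 0-based rank k by (m - k),
    enforces that adjusted values are non-decreasing along the sorted order,
    and returns them in the original input order.
    """
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        p = min(1.0, (m - rank) * pvals[i])
        running_max = max(running_max, p)  # monotonicity step
        adjusted[i] = running_max
    return adjusted
```

With two comparisons, doubling the smaller raw p-value can overtake the larger one, at which point the monotonicity step makes both adjusted values equal.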

3.2. Experiment 2

As in Experiment 1, mu rhythm suppression in each brain area due to the UI was examined by comparing the mu band power values in the resting state and during the operation, using one-sample t-tests (Figure 5). There were significant differences (t(17) = −3.30; p = 0.004 for TUI/CAN at RC; t(17) = −2.50; p = 0.023 for TUI/CAN at LC; t(17) = −2.32; p = 0.033 for TUI/OBJECT at LC; t(17) = −2.79; p = 0.013 for TUI/OBJECT at RC). In contrast, no brain area showed a significant difference in the GUI condition or the ACTION condition. Based on the resting-state and operation data, mu rhythm suppression was determined for LC and RC in each condition (TUI/OBJECT, TUI/CAN, GUI, and ACTION; Figure 5).
Then, the main effects of condition (TUI/OBJECT, TUI/CAN, GUI, and ACTION) and brain area (LC and RC) were tested by two-way ANOVA. The results confirmed a main effect of condition (F(2.3, 39.12) = 6.572; p = 0.002; η2p = 0.279). By t-test, the TUI/OBJECT and TUI/CAN conditions showed significantly stronger suppression than the GUI and ACTION conditions (t(17) = 3.66; p = 0.012 for TUI/OBJECT vs. GUI; t(17) = 3.65; p = 0.012 for TUI/OBJECT vs. ACTION; t(17) = 3.12; p = 0.019 for TUI/CAN vs. GUI; t(17) = 2.74; p = 0.042 for TUI/CAN vs. ACTION).

4. Discussion

In Experiment 1, we found that the brain activity in the observer varied depending on the type of UI (Figure 4).
In the TUI/CAN condition, watching the UI operation resulted in elevated activity in RC of the observer in comparison with the resting state. This is consistent with reports that watching the actions of others results in mu rhythm suppression over the somatosensory cortex, which roughly corresponds to RC [23,31]. This suggests that activity was induced in this MNS-related brain area. Only RC showed this activity in the experiment. We suppose this was because the user action took place in the left visual hemifield of the observer [29,32], rather than because the observer imagined imitating the operator's right-hand action while watching it. The activity tended to be higher in the TUI/CAN condition than in the TUI/OBJECT and GUI conditions, indicating that it was characteristic of the TUI/CAN condition. MNS activity reflects the immediate goal of an action [22]; we had therefore expected mu rhythm suppression in the TUI/OBJECT condition as well, but found it only in the TUI/CAN condition. It has been reported that observing a highly unfamiliar action results in elevated activity in the motor cortex [33,34]. In the TUI/CAN condition of the present study, the result of each action was unpredictable, which is not the case with ordinary UIs. We suppose this caused the elevated activity in the sensorimotor area.
On the other hand, the response of the observer did not seem to differ between the TUI/OBJECT and GUI conditions, despite the presence/absence of grasping. In this experiment, neither condition showed activation in the relevant brain area, indicating that neither activated the MNS. It has been reported that observing a familiar task results in lower activity in the inferior frontal gyrus and the premotor cortex than observing an unfamiliar task [35], and that repeated grasping of the same object leads to weaker mu rhythm suppression [36]. We suppose that the repeated stimulus presentation in the present study prevented the effect of the grasping action from being detected.
In Experiment 2, we found that the brain activity in the operator also varied depending on the type of UI (Figure 5).
Areas around the somatosensory cortex were active in the TUI/OBJECT and TUI/CAN conditions, suggesting the possible induction of brain activity related to that seen in Experiment 1 [23,31]. The brain areas active in this experiment included not only LC, an area activated by performing a right-hand action, but also RC. The visual information of each user action was presented primarily in the left visual hemifield, whereas the visual information of the output resulting from the action was presented primarily in the right visual hemifield; processing of this visual information possibly had some effect [29,32]. These activities tended to be higher in the TUI/OBJECT condition than in the GUI and ACTION conditions. The TUI/OBJECT and GUI conditions differed in that only the former involved the grasping action. In an fMRI study, reaching for an object and grasping it, compared with only reaching for it, resulted in elevated activity in the anterior intraparietal sulcus (aIPS) [37]. In our experiment, the aIPS activity due to grasping probably affected the sensorimotor area. The TUI/OBJECT and ACTION conditions involved the same user action and differed only in that the former displayed a porcelain picture on the screen as the result of each action. The elevated activity in LC and RC in the TUI/OBJECT condition relative to the ACTION condition suggests that the visual output resulting from each action affected the activity of the sensorimotor area.
Similar results were also obtained in the TUI/CAN condition, except that RC showed a particularly pronounced activity in that condition, as in Experiment 1. It is possible that the unpredictability of the result of each action enhanced the operator's attention [33,34].
In the present study, we showed that the MNS of the observer was more active in the TUI/CAN condition than in the GUI condition; i.e., the effect of TUI on its observers was demonstrated based on brain activity.
We had also expected to detect the effect of the grasping action [2] as MNS activity in the TUI/OBJECT condition relative to the GUI condition, but did not obtain such a result. One possibility is that the observer recognized the TUI as a UI from the movement of the objects, but failed to pay attention to the action of "grasping" them. In the TUI/CAN condition, on the other hand, the unpredictability of the result of each action seemed to affect MNS-related brain activity. We suppose that this unfamiliarity attracted the attention and interest of the observer.
However, brain activity in the operator did not differ between the TUI/CAN and TUI/OBJECT conditions. These conditions involved the same action as the ACTION condition, but activated the somatosensory cortex because they provided a screen displaying the results. In comparison with the GUI condition, the TUI/CAN and TUI/OBJECT conditions resulted in elevated activity in the same brain area because they involved the "grasping" action.
This suggests that the same brain activity, defined by mu rhythm suppression in this area, is activated in the observer not only by a hand action but also by adding an unpredictable element to the UI, reflecting the degree of interest.
Altogether, the mu rhythm suppression reported here may represent brain activity that is activated by the combination of UI function, grasping action, and interest in the UI, but not by any of them alone. The mu rhythm suppression in LC and RC can therefore be used as an index in the evaluation of TUIs based on the "grasping" action.

5. Conclusions

In this research, we took two approaches to TUI that have hardly been studied so far: one from brain activity, and the other from the influence on surrounding observers.
To understand the possible effects of TUI, the present study focused on mu rhythm suppression in the sensorimotor area, which reflects both the execution and the observation of action. By investigating brain activity, we showed that a TUI involving a goal-directed action can activate a sensorimotor area reflecting the MNS, in both the operator and the observer. We consider this to indicate part of the effectiveness of TUI in terms of its influence on brain activity, and one argument in favor of actively adopting TUI.
Moreover, we also showed that monitoring MNS-related brain activity in the observer can serve as a novel UI evaluation method.

Acknowledgments

The study described in this paper was supported by the Louvre-DNP Museum Lab project.

Author Contributions

Kazuo Isoda, Kana Sueyoshi, Ryo Miyamoto, Ichiro Hisanaga and Shigekazu Higuchi conceived study conception and design; Kazuo Isoda, Kana Sueyoshi, Ryo Miyamoto, Ichiro Hisanaga, Stéphanie Orlic, and Shigekazu Higuchi contributed acquisition of data; Kazuo Isoda, Kana Sueyoshi, Ryo Miyamoto, Yeon-kyu Kim, Yuki Nishimura, Yuki Ikeda and Shigekazu Higuchi analyzed and interpreted data; Kazuo Isoda, Kana Sueyoshi, Ryo Miyamoto and Shigekazu Higuchi wrote the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Ishii, H.; Ullmer, B. Tangible bits: Towards seamless interfaces between people, bits and atoms. In Proceedings of the Conference on Human Factors in Computing Systems, Atlanta, GA, USA, 22–27 March 1997; pp. 234–241.
2. Fitzmaurice, G.W. Graspable User Interfaces. Ph.D. Thesis, University of Toronto, Toronto, ON, Canada, 1996.
3. Baskinger, M.; Gross, M. Tangible interaction = form + computing. Interactions 2010, 17, 6–11.
4. Van Den Hoven, E.; Frens, J.; Aliakseyeu, D.; Martens, J.B.; Overbeeke, K.; Peters, P. Design research & tangible interaction. In Proceedings of the First International Conference on Tangible and Embedded Interaction, Baton Rouge, LA, USA, 15–17 February 2007; pp. 109–115.
5. Wakkary, R.; Muise, K.; Tanenbaum, K.; Hatala, M.; Kornfeld, L. Situating approaches to interactive museum guides. Mus. Manag. Curatorship 2008, 23, 367–383.
6. Stanton, D.; Bayon, V.; Neale, H.; Ghali, A.; Benford, S.; Cobb, S.; Ingram, R.; O'Malley, C.; Wilson, J.; Pridmore, T. Classroom collaboration in the design of tangible interfaces for storytelling. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Seattle, WA, USA, 31 March–5 April 2001; pp. 482–489.
7. Antle, A.N. The CTI framework: Informing the design of tangible systems for children. In Proceedings of the First International Conference on Tangible and Embedded Interaction, TEI'07, Baton Rouge, LA, USA, 15–17 February 2007; pp. 195–202.
8. O'Malley, C.; Fraser, D.S. Literature Review in Learning with Tangible Technologies; NESTA Futurelab: Bristol, UK, 2004; pp. 1–48.
9. Shaer, O.; Hornecker, E. Tangible user interfaces: Past, present, and future directions. Found. Trends Hum. Comput. Interact. 2009, 3, 1–137.
10. Hornecker, E.; Buur, J. Getting a grip on tangible interaction: A framework on physical space and social interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, QC, Canada, 22–27 April 2006; pp. 437–446.
11. Arias, E.; Eden, H.; Fischer, G. Enhancing communication, facilitating shared understanding, and creating better artifacts by integrating physical and computational media for design. In Proceedings of the Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques, DIS, New York, NY, USA, 18–20 August 1997; pp. 1–12.
12. Suzuki, H.; Kato, H. Interaction-level support for collaborative learning: AlgoBlock—An open programming language. In Proceedings of the First International Conference on Computer Support for Collaborative Learning, Bloomington, IN, USA, 17–20 October 1995.
13. Fitzmaurice, G.W.; Buxton, W. An empirical evaluation of graspable user interfaces: Towards specialized, space-multiplexed input. In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems, Atlanta, GA, USA, 22–27 March 1997; pp. 43–50.
14. Patten, J.; Ishii, H. A comparison of spatial organization strategies in graphical and tangible user interfaces. In Proceedings of DARE 2000 on Designing Augmented Reality Environments, Elsinore, Denmark, 12–14 April 2000; pp. 41–50.
15. Zuckerman, O.; Gal-Oz, A. To TUI or not to TUI: Evaluating performance and preference in tangible vs. graphical user interfaces. Int. J. Hum. Comput. Stud. 2013, 71, 803–820.
16. Reeves, S.; Benford, S.; O'Malley, C.; Fraser, M. Designing the spectator experience. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Portland, OR, USA, 2–7 April 2005; pp. 741–750.
17. Peltonen, P.; Kurvinen, E.; Salovaara, A.; Jacucci, G.; Ilmonen, T.; Evans, J.; Oulasvirta, A.; Saarikko, P. It's mine, don't touch!: Interactions at a large multi-touch display in a city centre. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Florence, Italy, 5–10 April 2008; pp. 1285–1294.
18. Ichino, J.; Isoda, K.; Ueda, T.; Satoh, R. Effects of the display angle on social behaviors of the people around the display: A field study at a museum. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing, San Francisco, CA, USA, 27 February–2 March 2016; pp. 26–37.
19. Gallese, V.; Fadiga, L.; Fogassi, L.; Rizzolatti, G. Action recognition in the premotor cortex. Brain 1996, 119 Pt 2, 593–609.
20. Rizzolatti, G.; Fogassi, L.; Gallese, V. Neurophysiological mechanisms underlying the understanding and imitation of action. Nat. Rev. Neurosci. 2001, 2, 661–670.
21. Iacoboni, M. Imitation, empathy, and mirror neurons. Annu. Rev. Psychol. 2009, 60, 653–670.
22. Iacoboni, M. Neural mechanisms of imitation. Curr. Opin. Neurobiol. 2005, 15, 632–637.
23. Pineda, J.A. The functional significance of mu rhythms: Translating "seeing" and "hearing" into "doing". Brain Res. Brain Res. Rev. 2005, 50, 57–68.
24. Gastaut, H.J.; Bert, J. EEG changes during cinematographic presentation; moving picture activation of the EEG. Electroencephalogr. Clin. Neurophysiol. 1954, 6, 433–444.
25. Muthukumaraswamy, S.D.; Johnson, B.W.; McNair, N.A. Mu rhythm modulation during observation of an object-directed grasp. Brain Res. Cognit. Brain Res. 2004, 19, 195–201.
26. Oberman, L.M.; McCleery, J.P.; Ramachandran, V.S.; Pineda, J.A. EEG evidence for mirror neuron activity during the observation of human and robot actions: Toward an analysis of the human qualities of interactive robots. Neurocomputing 2007, 70, 2194–2203.
27. Pfurtscheller, G.; Neuper, C.; Krausz, G. Functional dissociation of lower and upper frequency mu rhythms in relation to voluntary limb movement. Clin. Neurophysiol. 2000, 111, 1873–1879.
28. Ishii, H. The tangible user interface and its evolution. Commun. ACM 2008, 51, 32–36.
29. Isoda, K.; Sueyoshi, K.; Ikeda, Y.; Nishimura, Y.; Hisanaga, I.; Orlic, S.; Kim, Y.K.; Higuchi, S. Effect of the hand-omitted tool motion on mu rhythm suppression. Front. Hum. Neurosci. 2016, 10.
30. Rizzolatti, G.; Matelli, M. Two different streams form the dorsal visual system: Anatomy and functions. Exp. Brain Res. 2003, 153, 146–157.
31. Oberman, L.M.; Pineda, J.A.; Ramachandran, V.S. The human mirror neuron system: A link between action observation and social skills. Soc. Cogn. Affect. Neurosci. 2007, 2, 62–66.
32. Shmuelof, L.; Zohary, E. Dissociation between ventral and dorsal fMRI activation during object and action recognition. Neuron 2005, 47, 457–470.
33. Beilock, S.L.; Lyons, I.M.; Mattarella-Micke, A.; Nusbaum, H.C.; Small, S.L. Sports experience changes the neural processing of action language. PNAS 2008, 105, 13269–13273.
34. Van Elk, M.; Van Schie, H.T.; Zwaan, R.A.; Bekkering, H. The functional role of motor activation in language processing: Motor cortical oscillations support lexical-semantic retrieval. Neuroimage 2010, 50, 665–677.
  35. Vogt, S.; Buccino, G.; Wohlschlager, A.M.; Canessa, N.; Shah, N.J.; Zilles, K.; Eickhoff, S.B.; Freund, H.J.; Rizzolatti, G.; Fink, G.R. Prefrontal involvement in imitation learning of hand actions: Effects of practice and expertise. Neuroimage 2007, 37, 1371–1383. [Google Scholar] [CrossRef] [PubMed]
  36. Perry, A.; Bentin, S. Mirror activity in the human brain while observing hand movements: A comparison between EEG desynchronization in the mu-range and previous fmri results. Brain Res. 2009, 1282, 126–132. [Google Scholar] [CrossRef] [PubMed]
  37. Frey, S.H.; Vinton, D.; Norlund, R.; Grafton, S.T. Cortical topography of human anterior intraparietal cortex active during visually guided grasping. Cognit. Brain Res. 2005, 23, 397–405. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Settings of different conditions in Experiment 1: (a) tangible user interface (TUI)/OBJECT condition; (b) TUI/CAN condition; (c) graphical user interface (GUI) condition.
Figure 2. Two regions of interest (ROIs) were defined and respective electrode sites were pooled: left central (16, 20, 21, 22) and right central (41, 49, 50, 51).
Figure 3. Settings of different conditions in Experiment 2: (a) TUI/OBJECT condition; (b) TUI/CAN condition and ACTION condition (in the ACTION condition, no porcelain picture was displayed on the screen); (c) GUI condition.
Figure 4. Mu rhythm suppression (LC and RC) in different conditions in Experiment 1. The asterisk (* p < 0.05) indicates significant mu rhythm suppression from baseline during observation of each action. Daggers (+ p < 0.10) indicate a trend toward a difference from the other conditions at RC.
Figure 5. Mu rhythm suppression (LC and RC) in different conditions in Experiment 2. Asterisks (** p < 0.01, * p < 0.05) indicate significant mu rhythm suppression from baseline during observation of each action. Hash marks (# p < 0.05) indicate significant differences between conditions.

Share and Cite

MDPI and ACS Style

Isoda, K.; Sueyoshi, K.; Miyamoto, R.; Nishimura, Y.; Ikeda, Y.; Hisanaga, I.; Orlic, S.; Kim, Y.-k.; Higuchi, S. Tangible User Interface and Mu Rhythm Suppression: The Effect of User Interface on the Brain Activity in Its Operator and Observer. Appl. Sci. 2017, 7, 347. https://doi.org/10.3390/app7040347


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
