Article

Comparing Physiological Synchrony and User Copresent Experience in Virtual Reality: A Quantitative–Qualitative Gap

by Daojun Gong 1,2,3,4, Haoming Yan 5, Ming Wu 2, Yimin Wang 2,6,7, Yifu Lei 1,3,4, Xuewen Wang 1,3,4 and Ruowei Xiao 2,*
1 State Key Laboratory of Flexible Electronics (LOFE) and Institute of Flexible Electronics (IFE), Northwestern Polytechnical University, 127 West Youyi Road, Xi’an 710072, China
2 School of Design, Southern University of Science and Technology, Shenzhen 518055, China
3 Shaanxi Key Laboratory of Flexible Electronics, Northwestern Polytechnical University, 127 West Youyi Road, Xi’an 710072, China
4 MIIT Key Laboratory of Flexible Electronics (KLoFE), Northwestern Polytechnical University, 127 West Youyi Road, Xi’an 710072, China
5 United Imaging Healthcare Co., Ltd., Shanghai 201807, China
6 Faculty of Humanities and Arts, Macau University of Science and Technology, Macau SAR, China
7 School of Animation and Digital Arts, Communication University of China, Beijing 100024, China
* Author to whom correspondence should be addressed.
Electronics 2025, 14(6), 1129; https://doi.org/10.3390/electronics14061129
Submission received: 11 February 2025 / Revised: 9 March 2025 / Accepted: 11 March 2025 / Published: 13 March 2025

Abstract

As technology-mediated social interaction in virtual environments prevails, recent Human–Computer Interaction (HCI) studies have suggested incorporating biosensory information cues that reveal users’ inner states to facilitate social information sharing and augment copresent experience. Physiological synchrony is believed to be engaged in several important processes of copresent experience. However, what impact different biosensory cues have on physiological synchrony and users’ copresent experience remains underinvestigated. This study selected a virtual reality (VR) electronic dance music setting and integrated five different biosignals, namely, power of electromyography (pEMG), galvanic skin response (GSR), heart rate (HR), respiration effort (RE), and oxyhemoglobin saturation by pulse oximetry (SpO2). A non-randomized controlled experiment with 67 valid participants and five baseline data providers revealed that GSR enhanced physiological synchrony significantly. However, semi-structured interviews with 10 participants indicated that RE and HR provided the strongest user-perceived copresence, which implies an intriguing gap between quantitative and qualitative analysis results. Five design implications were further generated and discussed in detail for the future design and development of virtual copresent experience based on biosensory information cues.

1. Introduction

The COVID-19 pandemic has accelerated the prevalence of technology-mediated social interactions across a variety of application scenarios like office, education, entertainment, and so forth [1,2]. Large-scale virtual and hybrid copresent events have become widespread, e.g., the social virtual reality platforms SteamVR, RecRoom, and VRchat [3], the COLDPLAY VR concert [4], the Online IEEE VR Conference 2021 [5], etc. As supportive technology, virtual reality (VR) allows for hyper-spatiotemporal and multi-sensory communication, thus resulting in higher social presence, motivation, and enjoyment [6]. However, current virtual social interaction relies largely on avatar-based paradigms that emphasize one-to-one replication of explicit social cues such as verbal communication, facial expressions, body postures, etc. Thus, it results in potential risks such as the uncanny valley effect [7,8], stereotyping [9], and various forms of social abuse [2,10,11]. To this end, the HCI research community has proposed alternative paradigms and solutions by incorporating biosensory cues that instead reveal users’ inner states [9,12]. These signals, derived from physiological data, are transformed into user-perceived social information, which can enhance interpersonal communication and interoceptive awareness [12,13,14]. Empirical evidence suggests explicit social cues such as scripted gestures elevate cognitive load [15], whereas bio-sensory cues like synchronized heart rates implicitly enhance authenticity through subconscious emotional alignment [16].
Physiological synchrony is defined as the co-occurrence and interdependence of physiological status between individuals, which encompasses the association of their behavioral, physiological, and emotional activities over time [17,18]. This phenomenon can result from the primitive process that incorporates the mirror neuron system to mimic the physiological states of others [13,19]. It has emerged as a crucial factor in technology-mediated social experiences and served as one of the key physiological indicators reflecting the sense of copresence, empathy and sympathy [9,16,19]. However, most current HCI research relies solely on subjective user feedback, lacking comparative studies that explore the relationship between objective physiological synchrony and subjective user evaluation [9,12,16,20].
This study aimed to acquire empirical knowledge and preliminary insights for leveraging multi-source biosensory information as social cues to enhance technology-mediated copresent experience. We grounded on and extended existing research [9,12] by collecting multiple types of factual biosensory data and comparing their effects on users’ objective physiological synchrony and subjective copresent experience. Specifically, we selected the VR electronic dance music (EDM) scenario and conducted a non-randomized controlled experiment together with semi-structured interviews to measure and compare users’ physiological synchrony and their subjective feedback in response to different biosensory cues. Explicitly, we intended to answer the following two research questions (RQs): First, can one’s physiological synchrony be enhanced by presenting another person’s biosignals in virtual copresent activities? (RQ1) And if yes, can enhanced physiological synchrony affect subjective copresent experience, i.e., user-perceived copresence and user-perceived synchrony? (RQ2). As a manipulation check, RQ1 evaluates whether biosensor integration effectively modulates physiological synchrony prior to analyzing its downstream effects on copresence, i.e., RQ2. The overall research structure and flow are shown in Figure 1 below.
By addressing the two RQs above, our contributions are twofold:
  • We collected and compared five different biosensory cues (pEMG, GSR, HR, RE, SpO2) and their respective effects on users’ copresent experience in a virtual environment, all based on real biodata.
  • We combined both quantitative measurement of physiological synchrony and qualitative evaluation of user subjective feedback to form a more comprehensive understanding for utilizing biosensory information as social cues, resulting in five design implications.

2. Background

Copresence is fundamental to all social interactions. The concept of social presence, introduced by Short, Williams, and Christie [21], was initially developed to compare face-to-face interactions with various communication media. They defined social presence as “the degree of salience of the other person in the interaction and the consequent salience of the interpersonal relationships” (p. 65) [21]. Social presence arises from copresence, which serves as its essential foundation, as Oh et al. noted: “Social presence requires a copresent entity that appears to be sentient” [22]. Moving beyond traditional notions of presence as merely “being there”, researchers have further refined copresence as the experience of “being together”, emphasizing a deeper psychological connection between minds [23]. In social interactions in VR, copresence, defined as the perceived sense of sharing an environment with others, is indispensable [21,24]. This shared experience forms the basis for numerous positive social outcomes, including the development of empathy, by enabling people to understand and respond to others’ emotional states and intentions [22,25,26,27,28].
In virtual environments, users interact mainly through digital avatars that replicate explicit social cues like verbal communication, facial expressions, postures, and gestures to make virtual interactions feel more engaging and socially immersive [11]. Although high-precision animation improves visual realism and cognitive-behavioral reactions [29,30,31,32], the impact of avatars on the copresent experience remains debated. Hyper-realism may evoke aversive responses, potentially hindering accurate social judgments [7,8]. Compared to explicit social cues, implicit social cues that reveal and reflect people’s internal psychophysiological states offer an alternative paradigm for technology-mediated social interaction and information sharing [33]. As one major source of implicit cues, biosensory information has the potential to enhance interpersonal communication and interoceptive awareness [12,14,34]. This approach is believed to be able to mitigate biases related to race, gender, and class, thus safeguarding social norms and equality [1,9].
Although fundamental to the study of social behavior, assessing copresent experiences affected by biosensory cues is challenged by the difficulty of quantification [35]. One of the main approaches is synchrony measurement, which has been proposed as a robust assessment of social connectedness and copresent experience in digital interactions. In general, physiological synchrony refers to the temporal coordination of physiological processes between individuals, typically measured through autonomic nervous system (ANS) activity. It reflects emotional empathy, wherein individuals’ ANS activities tend to synchronize when experiencing the same emotional state [35,36,37]. Therefore, as defined, physiological synchrony can be calculated by analyzing the temporal alignment of physiological signals between interacting individuals. It serves as a cornerstone for understanding interpersonal dynamics.
Synchrony plays a pivotal role in establishing social connectedness and fostering group cohesion. This phenomenon has been explored extensively across contexts such as dyadic interactions, group collaborations, and parent–infant relationships. For example, brain-to-brain synchrony has been linked to mutual gaze and positive affect among affiliative partners [38], while physiological synchrony significantly enhances group cohesion during collaborative tasks [39]. In parent–infant interactions, synchrony is fundamental to emotional regulation and the development of social bonds through rhythmic exchanges such as touch and vocalization [40]. The dynamics of synchrony are shaped by social motives and contextual influences. Prosocial individuals demonstrate heightened synchrony in cooperative environments [41], whereas competitive or playful interactions yield diverse synchrony patterns tied to emotional states [42]. This study aligns with existing research by focusing on dyadic interactions rather than group copresence, enabling a precise analysis of physiological synchrony.
Current research identifies five major biosignals actively entangled with social interactions, thus rendering themselves particularly suitable for measuring physiological synchrony. Heart rate (HR), a reliable indicator of ANS activity, has been shown to reflect stress and emotional fluctuations during interactions, with studies demonstrating significant associations between HR synchrony and perceived social presence in mediated communication [16,19]. Galvanic skin response (GSR), measuring skin conductance, provides insights into emotional arousal and the intensity of interactions, with higher GSR synchrony observed in emotionally charged shared experiences [16]. Respiration effort (RE) offers data on relaxation or tension levels, with research indicating higher RE synchrony in cooperative tasks, suggesting its role as a proxy for shared emotional and cognitive states [43]. Oxyhemoglobin saturation by pulse oximetry (SpO2) has been utilized to gauge physiological engagement, with studies in virtual reality settings showing associations between SpO2 synchrony and elevated levels of copresent experience, particularly in activities demanding high concentration and interaction [13]. Lastly, power of electromyography (pEMG) captures muscle activity, especially facial expressions, with research in dyadic gaming and virtual environments demonstrating correlations between pEMG synchrony and perceived emotional alignment [44]. Collectively, these findings suggest that physiological synchrony, across various biosignal channels, provides multi-faceted perspectives for gauging the quality of technology-mediated copresent experience.
As for analytical methods of physiological synchrony, windowed cross-correlation (WCC) has significantly enhanced the ability to measure temporal coordination, distinguishing genuine synchrony from coincidental alignments [45]. Compared to static cross-correlation or global synchrony measures, WCC’s sliding-window approach allows for a dynamic examination of synchrony fluctuations over time [45]. This is of particular importance in interactive contexts, where synchrony often shifts in response to momentary changes in emotional or behavioral states. By providing detailed temporal analysis, WCC reveals dynamic patterns of synchrony that traditional methods fail to capture. This research builds on these advancements, employing WCC for a rigorous quantitative analysis while integrating qualitative insights gained from user interviews. By aligning with recent computational and behavioral approaches [46], the study investigates physiological synchrony across varied biosignal channels under the same virtual copresent context to enhance ecological validity [47]. We believe that this integrated approach provides a comprehensive perspective on the role of physiological synchrony in dyadic interactions, where copresence is mediated through biosensory cues.

3. Methods

3.1. Overview

To investigate the impact of biosensory cues, we first invited five baseline participants (all healthy university students with an average age of 22; two females, three males) to watch a VR EDM video, during which their physiological data were collected via five different channels, i.e., pEMG, GSR, HR, RE, and SpO2. These data were then visualized and integrated into the same VR EDM video. Subsequently, we recruited 76 volunteers through advertisements and outreach programs, ensuring diversity in age, gender, and background, and then conducted the main experiment. The main experiment exposed the participants to both the experimental and control conditions in a counterbalanced order, by asking them to watch the EDM video with and without visualized biodata, respectively. The non-randomized crossover design (Condition A → B vs. B → A) was strategically adopted to balance operational feasibility and ecological validity. Given the time-intensive biosensor/VR setup procedures, full randomization risked introducing fatigue-related confounders from prolonged waiting periods, while the fixed counterbalanced order preserved naturalistic interaction continuity. This approach aligns with established guidelines for pragmatic trials under resource constraints while maintaining methodological rigor. We collected the same five types of physiological signals in both conditions and measured the physiological synchrony between each participant’s signals and the baseline participants’ data. Based on the quantitative analysis results, 10 participants were further selected for follow-up semi-structured interviews to gain deeper insights into the effects of biosensory cues on user-perceived copresence and synchrony. Each participant completed the study in approximately 30 min, including both the physiological experiment and the semi-structured interview. The physiological experiment was conducted over a span of five days to accommodate all 76 participants. This study was conducted in accordance with the Declaration of Helsinki and approved by the Institutional Review Board (IRB) of the Southern University of Science and Technology (decision number 20240200, approval date 7 May 2024). All participants provided written informed consent prior to participation, and their rights and privacy were fully protected.

3.2. Participants

Prior to the experiment, each participant was provided with a detailed account of the experiment’s purpose, the use of the collected biodata, and other privacy- and security-related explanations. The participants were informed that they could withdraw from the experiment at any time. Written informed consent was obtained before data collection to ensure that participants fully understood the study procedures, potential risks, and benefits. Upon completion of the experiment, compensation in the form of CNY 50 was provided (for child participants, an additional gift was included).
A diverse sample of participants was recruited to ensure the generalizability of the results and to minimize sample selection bias. From the 76 participants in the main experiment, valid data were collected from 67 subjects (9 were excluded due to inaccuracy). The validated participants ranged in age from 8 to 60 (mean = 25.09, SD = 9.82), with an almost equal gender distribution: 33 males (49.25%), 33 females (49.25%), and 1 identifying as other (1.49%). Their educational backgrounds were diverse: 8 had primary education (11.94%), 2 secondary (2.99%), 28 undergraduate or associate degrees (41.79%), and 29 master’s degrees or higher (43.28%). Regarding VR experience, 24 had no prior experience (35.82%), 36 had limited use (53.73%), 6 used VR occasionally (8.96%), and 1 participant, a professional VR developer, used VR frequently (1.49%). Participants were screened based on the following criteria: (1) no history of motion sickness or epilepsy; (2) no consumption of stimulants (e.g., caffeine, alcohol) within 24 h prior to the experiment.

3.3. VR EDM Setting

The overall EDM VR experience consisted of a resting state induction and an EDM session. The resting state induction leveraged the “Eyes On a Crosshair” (EO-F) method to standardize participants’ attention and emotions [48,49], during which a 50 s animation featured a central black cross simulating breathing before fading (Figure 2A). For the following EDM session, a 300 s 360-degree YouTube video clip (https://www.youtube.com/watch?v=DbUrKkb4JJ4 (accessed on 2 December 2023); Abel Meyer’s 360-degree VR performance at Minimal Attack in Buenos Aires) was used as our media source. This high-arousal context was strategically selected because electronic dance music (EDM) induces rapid autonomic nervous system fluctuations [50], enabling reliable detection of multi-source physiological synchrony, a critical requirement for addressing RQ1’s manipulation check while creating ecologically valid conditions for RQ2’s copresence evaluation [51,52]. We combined the two sessions using Adobe Premiere Pro v24.6 (Adobe Inc., San Jose, CA, USA) and built a 6:50 min video for VR viewing via the PICO Developer platform v1.3.0 (PICO Interactive, Qingdao, China).
Compared with calming activities such as meditation, EDM provides greater sensory stimulation that can evoke more salient emotional changes in participants immediately and dynamically [50,53,54]. This allowed us to quickly immerse users in an engaging virtual copresent environment and assess physiological synchrony more effectively. The EDM video was first utilized for collecting physiological data from five baseline participants during their viewing. We further visualized these data and synchronized them with the video’s timeline. As a result, the original video was used in the control condition of the main experiment, as shown in Figure 2B, while the video overlaid with the visualized biosensory cues was used in the experimental condition, as shown in Figure 2C.
We specifically note that we did not use real-time biosensory data in our main experiment but pre-recorded data instead, mainly to avoid possible bias introduced by excessive variables. This operational definition of copresence focused on participants’ perception of mutual physiological engagement with another human agent through biosignal exchange, rather than visual co-location with entities in the 360° video. Data collection methods are explained in the next subsection. Below, we further detail how we visualized the data collected from baseline participants and rendered them into biosensory cues.
As shown in Figure 3, we referred to established biosensory representations and decided to present pEMG and GSR as line graphs [55,56,57], HR as a beating heart symbol, RE as concentric circles [58], and SpO2 as a changing number in a ring [59,60]. These basic chart-based visualizations were adopted primarily to minimize the extra impact that complex data presentations could potentially have on our participants. In addition, we also refined our data visualization approach to enhance representational fidelity. For instance, the variations in SpO2 concentration were minimal, making it challenging to discern these differences using conventional line graphs or dynamic plots. Consequently, we adjusted the visual precision to ensure clearer interpretation. Last but not least, we opted not to use heart rate variability (HRV) as a visualization form because its complex fluctuations, such as the RR interval, which measures the time between consecutive R-wave peaks in the electrocardiogram (ECG), and the amplitude shifts between successive cardiac cycles, are not easily perceptible through visual analysis.
The aforementioned biosensory cues were displayed sequentially for 60 s each in the following order: RE, pEMG, GSR, HR, and SpO2. The transparency of the overlay layer was reduced to 85%, allowing the participants to recognize the numbers and labels clearly while not interfering with their ability to watch the video. We displayed the signals in the lower middle part of the viewer’s screen, so as to ensure that the participants would not overlook these signals, which could compromise the validity of the measurements.

3.4. Biosensory Data Collection and Analysis

Before proceeding with the experiment, each participant was given a brief introduction about the five biosensory signals and their corresponding meaning [61,62,63]. They were then told that there would be a remote viewer watching the VR EDM alongside them, represented by the biosensory data. Before starting the formal experiment session, participants were allowed to ask the experimenter questions about any aspects of the experimental setting they found confusing.
We used the same set of non-invasive surface physiological sensors to collect both the baseline and the main experiment participants’ biosignals during the VR EDM experience. The sensor set included (1) a strap-on respiration sensor (ElasTech Ltd., Ningbo, China), (2) a dual-channel pEMG sensor module kit (Sichiray Technologies, Wuxi, China), (3) a GSR module kit (Sichiray Technologies, Wuxi, China), and (4) a fingertip digital SpO2 and HR detection module (Tonglian Chuangke Medical Equipment Co., Shenzhen, China). All were modified for optimal performance. As shown in Figure 4A, GSR was collected from the right index and middle fingers, pEMG from the right biceps brachii, HR and SpO2 from the left fingertip, and RE from a sensor placed between the chest and abdomen. The overall setup is shown in Figure 4B.
To enhance data comparability across participants and minimize the influence of individual physiological differences, a preprocessing step was employed to normalize the collected data before analysis. Given that the absolute values of RE, pEMG and GSR varied significantly among subjects, and that relative changes in these signals more accurately reflected psychophysiological states, normalization was performed by scaling these three signals to a range of 0 to 1. This preprocessing step ensured that inter-subject variations in signal amplitude did not introduce bias into subsequent analyses.
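For clarity, below is a minimal sketch of the per-participant min-max scaling described above; the function name and the assumption that each signal arrives as a one-dimensional NumPy array are ours, not part of the original analysis pipeline.

```python
import numpy as np

def minmax_normalize(signal: np.ndarray) -> np.ndarray:
    """Scale a 1-D physiological signal to the range [0, 1].

    Applied (as in the text) to RE, pEMG, and GSR so that inter-subject
    differences in absolute amplitude do not bias the synchrony analysis.
    """
    smin, smax = float(np.min(signal)), float(np.max(signal))
    if smax == smin:
        # Constant signal: return zeros to avoid division by zero.
        return np.zeros_like(signal, dtype=float)
    return (signal - smin) / (smax - smin)
```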
Following data normalization, we examined the impact of biosensory cues on physiological synchronization. To quantify synchronization patterns, we computed correlations between participants’ physiological responses and the baseline signals, assessing whether the presence of biosensory cues led to significant increases in synchronization. Given the potential presence of time lags in physiological responses and their dynamic variations, we employed the windowed cross-correlation (WCC) analysis method [64]. As stated in Section 2, compared with traditional correlation coefficients or a full-segment cross-correlation analysis, WCC enables a more refined assessment of synchronization dynamics across different time windows.
The fundamental principle of WCC involves segmenting two signals, $X = \{x_1, x_2, \ldots, x_N\}$ and $Y = \{y_1, y_2, \ldots, y_N\}$, both of length $N$ and sampled at intervals of $s$, into overlapping sub-windows, each of length $w_{max}$ and with a time interval of $s$. These sub-windows, denoted as $W_x(k)$ and $W_y(k)$ for $k$ ranging from $\tau_{max}+1$ to $N-\tau_{max}-w_{max}$, are then paired in the following way:

$$W_x(k) = \begin{cases} \{x_k, x_{k+1}, x_{k+2}, \ldots, x_{k+w_{max}}\}, & \text{if } \tau \le 0,\\ \{x_{k-\tau}, x_{k+1-\tau}, x_{k+2-\tau}, \ldots, x_{k+w_{max}-\tau}\}, & \text{if } \tau > 0, \end{cases}$$

$$W_y(k) = \begin{cases} \{y_{k-\tau}, y_{k+1-\tau}, y_{k+2-\tau}, \ldots, y_{k+w_{max}-\tau}\}, & \text{if } \tau \le 0,\\ \{y_k, y_{k+1}, y_{k+2}, \ldots, y_{k+w_{max}}\}, & \text{if } \tau > 0, \end{cases}$$

and the cross-correlation between $W_x(k)$ and $W_y(k)$ is expressed as

$$r_{\tau}(k) = \frac{1}{w_{max}} \sum_{i=1}^{w_{max}} \frac{\big(W_{x,i}(k) - \mu(W_x(k))\big)\big(W_{y,i}(k) - \mu(W_y(k))\big)}{\mathrm{sd}(W_x(k))\,\mathrm{sd}(W_y(k))},$$

where $\mu(\cdot)$ and $\mathrm{sd}(\cdot)$ represent the mean value and standard deviation, respectively.

In this work, for each $k$ ranging from $\tau_{max}+1$ to $N-\tau_{max}-w_{max}$, we calculated the maximum of the cross-correlation over all lags:

$$r_{max}(k) = \max_{\tau} \{ r_{\tau}(k) \},$$

and then the mean value of every $r_{max}(k)$ was calculated as

$$r^{*} = \frac{1}{N - 2\tau_{max} - w_{max}} \sum_{k} r_{max}(k),$$

which is a scalar representing the correlation between $X$ and $Y$.
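To make the computation above concrete, the following is a minimal Python sketch of the windowed cross-correlation summary statistic $r^*$; the variable names, edge handling, and brute-force looping are our own illustrative choices rather than the authors’ implementation.

```python
import numpy as np

def wcc_r_star(x: np.ndarray, y: np.ndarray, w_max: int, tau_max: int) -> float:
    """Mean over windows of the maximum lagged Pearson correlation (r*).

    x, y    : equal-length 1-D signals, already resampled to a common rate
    w_max   : window length in samples
    tau_max : maximum lag in samples
    """
    n = len(x)
    assert len(y) == n, "signals must have equal length"
    r_max_values = []
    # k indexes the window start, kept away from the edges so every lag fits
    for k in range(tau_max, n - tau_max - w_max):
        r_best = -1.0
        for tau in range(-tau_max, tau_max + 1):
            # Shift x backwards for positive lags, y backwards for non-positive lags
            if tau > 0:
                wx = x[k - tau : k - tau + w_max]
                wy = y[k : k + w_max]
            else:
                wx = x[k : k + w_max]
                wy = y[k - tau : k - tau + w_max]
            if np.std(wx) == 0 or np.std(wy) == 0:
                continue  # correlation undefined for flat windows
            r = np.corrcoef(wx, wy)[0, 1]
            r_best = max(r_best, r)
        if r_best > -1.0:
            r_max_values.append(r_best)
    return float(np.mean(r_max_values)) if r_max_values else float("nan")
```

For instance, with the parameters reported in Section 4.1 ($w_{max}$ = 10 s, $\tau_{max}$ = 2 s) and signals resampled to an assumed rate of 10 Hz, one would call wcc_r_star(x, y, w_max=100, tau_max=20).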
For each participant, the $r^*$ value between each of their five physiological responses and the corresponding baseline signal was calculated both in the presence and absence of biosensory cues. A statistical analysis was then conducted to examine the differences in these $r^*$ values. If a significant difference in the $r^*$ of a particular physiological response was observed after introducing biosensory cues, it suggested that such cues may have influenced physiological synchrony among participants for that specific response.
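The text does not name the specific statistical test, so the sketch below illustrates one plausible choice, a paired Wilcoxon signed-rank test on per-participant $r^*$ values with and without biosensory cues; the test selection and the example arrays are our assumptions, not the reported analysis.

```python
import numpy as np
from scipy.stats import wilcoxon

# r_star_without[i] and r_star_with[i]: r* for participant i in the control
# and experimental conditions, respectively (hypothetical example values).
r_star_without = np.array([0.31, 0.28, 0.35, 0.30, 0.33])
r_star_with    = np.array([0.34, 0.30, 0.36, 0.33, 0.35])

stat, p_value = wilcoxon(r_star_with, r_star_without)  # paired, non-parametric
mean_improvement = float(np.mean(r_star_with - r_star_without))
print(f"Wilcoxon W = {stat:.3f}, p = {p_value:.4f}, "
      f"mean r* improvement = {mean_improvement:.3f}")
```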

3.5. Semi-Structured Interview

One month after the physiological experiment, we conducted follow-up semi-structured interviews to supplement and further explain the quantitative analysis results. This time interval was primarily due to the need to first analyze the collected physiological data in order to identify characteristic users with particularly high or low levels of synchrony. We ultimately invited 10 users out of the 67 valid participants for semi-structured interviews: 4 expert users (P1, P2, P3, P4) and 6 characteristic users (P5, P6, P7, P8 identified with high physiological synchrony, and P9, P10 with low physiological synchrony). Expert users included 2 university professors (P1 specializes in sound engineering and DJ performance, P4 in interaction design) and 2 doctoral students (P2 specializes in VR marketing and P3 in technology-mediated communication for the elderly).
Prior to each interview, we carried out a dedicated review session by presenting the previously used physiological visualizations again, to ensure the participants could recall their experiences from the previous experiment session. The semi-structured interview comprised four main questions: (1) General Experience: How do your subjective experience and feelings differ with and without the presence of biosensory signals? Why? (2) User-Perceived Copresence: Among the five biosensory signals (i.e., pEMG, GSR, SpO2, RE, HR), which one(s) best conveyed the presence of the other (simulated) viewer? Why? (3) User-Perceived Synchrony: Through which biosensory signal(s) did you feel most synchronized with the other viewer? Why? (4) User Preferences: What representations of the biosensory information do you prefer or recommend in order to improve the sense of copresence, and why? All interviews were fully video- and audio-recorded after obtaining the participants’ consent.
We transcribed the recorded contents into text, which was then proofread by one author before the thematic analysis. MAXQDA v24 software (https://www.maxqda.com) (accessed on 6 July 2024) was used to assist our qualitative analysis, and we established a preliminary coding framework after independently coding the first three interviews. Following Braun and Clarke’s guidelines [65], we conducted independent coding to enhance inter-rater reliability. Discrepancies were later resolved through group discussions, eventually settling at a consensus rate exceeding 99%.

4. Results

4.1. Quantitative Results

According to its definition, the calculation of WCC eliminates modal differences between channels, and it reflects the temporal synchronization of signal changes. With $w_{max} = 10$ s and $\tau_{max} = 2$ s [64], the WCC analysis results for these five signals, compared to the baseline signal, are presented in Figure 5. The WCC of the RE signal exhibited clear periodicity in the lag direction, consistent with its characteristic oscillatory pattern. In contrast, the pEMG signal displayed irregularities, resulting in a degree of randomness in its WCC outcomes. Most of the other signals were generally smooth, with responses occurring only at specific moments. As a result, these signals remained relatively stable in the lag direction, while showing noticeable variations in the elapsed time direction.
Based on the WCC analysis, the $r^*$ values between each signal and the baseline signal were calculated to assess the overall correlation, accounting for potential slight misalignments in time. Figure 6 presents the frequency distribution histograms of $r^*$ for the five physiological responses. Notably, the introduction of biosensory cues led to a significant increase in the $r^*$ between participants’ GSR and the baseline signal, with an average improvement exceeding 0.02. We found that GSR, which is highly sensitive to emotional arousal and changes in the sympathetic nervous system, effectively captured the physiological synchrony enhanced by biosensory cues. In contrast, RE, HR, and SpO2 remained relatively stable, being more influenced by an individual’s basal metabolic rate and overall autonomic nervous system state, making them less responsive to short-term changes in biosensory cues. The pEMG signals, which are influenced by subtle facial muscle activities, exhibited complex variations during social or emotional interactions, resulting in irregular synchrony patterns due to significant inter-individual differences.
Using WCC analysis to address RQ1, we found that presenting others’ GSR signals could increase participants’ physiological synchrony. However, HR, RE, pEMG, and SpO2 presentations did not consistently reflect synchrony, with pEMG particularly exhibiting irregularity and randomness. The impact of biosensory cues on different physiological signals varied considerably, likely due to differences in sensitivity and response mechanisms to external cues.

4.2. Qualitative Results

Our open-coding process yielded 26 distinct codes in total, clustered into five major themes through affinity diagramming as shown in Figure 7. In addition, a network analysis (Figure 8) examined code relationships within themes, with line thickness indicating co-occurrence frequency (F). Central nodes like “Signal Meaning”, “Intuitiveness” and “Interactability” were the most frequently mentioned codes. The five themes, ranked by mention frequency, were (1) Biosignals (F = 55), (2) Experience Quality (F = 27), (3) User Perception (F = 19), (4) Potential Integration (F = 15), and (5) Individual Differences (F = 13).
The Biosignals theme, which emerged as the most dominant (F = 55), highlighted key physiological signals such as respiration (15) and heart rate (14). These two signals were repeatedly referenced by participants as the most preferred types of biosensory cues. The subcategories of Signal Representation and Signal Meaning within the broader Biosignals theme illustrated how participants interpreted and made sense of biosensory data, emphasizing their representational and interpretive role in shaping their perceptions of copresent experiences.
The Experience Quality theme (F = 27) featured experiential perspectives of how participants captured and valued the overall copresent experiences, e.g., intuitiveness, interactability, sense of rhythm, and relatedness, among which intuitiveness stood out as a critical factor. Participants frequently mentioned that signal feedback and interfaces must be “intuitive” to comprehend and respond to.
User Perception (F = 19) focused on the affective and cognitive responses elicited during the experience. The coexistence of positive reactions, like novelty and curiosity, alongside negative reactions, such as confusion and tension, indicated that user perception was nuanced and situationally dependent.
The Potential Integration theme (F = 15) encompassed multimodal feedback, sonification, AI agents, and wearable devices. They were mentioned as potential means to enhance the representation of biosignals, improve perceived synchrony and enrich user interaction. Although this theme was referenced less frequently, its presence indicated a forward-looking interest in expanding interaction modalities. Some participants suggested that avatars could be used to visualize biosignals, making their representation more intuitive and interactive. However, others raised concerns regarding their applicability in virtual social settings.
Individual Differences (F = 13) referred to factors such as user preference, video content, and attention allocation, reflecting how personal traits and context influenced subjective feedback towards the copresent experience.
Social Network Analysis, grounded in Graph Theory, served as a robust statistical tool for evaluating information flow and uncovering relationships within data [2]. Nodes in the diagram symbolize individual codes, while edges depict the frequency of co-occurrence between them, with line thickness visually indicating the strength of these associations. The network analysis diagram in Figure 8 offers a structured representation of co-occurrences among codes, mapping the underlying relationships observed in user responses.
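As a rough illustration of how such a code co-occurrence network can be assembled, the snippet below uses networkx to build a weighted graph from per-segment code sets; the example codes and the counting scheme are ours and simplify the actual MAXQDA-based analysis.

```python
from itertools import combinations
from collections import Counter
import networkx as nx

# Hypothetical example: codes assigned to each interview segment
segments = [
    {"Signal Meaning", "Intuitiveness", "Heart Rate"},
    {"Signal Meaning", "Confusion", "Attention Allocation"},
    {"Sense of Rhythm", "Respiration", "Interactability"},
    {"Signal Meaning", "Sense of Rhythm", "Multimodal Feedback"},
]

# Count how often each pair of codes appears in the same segment
co_occurrence = Counter()
for codes in segments:
    for a, b in combinations(sorted(codes), 2):
        co_occurrence[(a, b)] += 1

# Weighted, undirected graph: nodes are codes, edge weight = co-occurrence frequency F
G = nx.Graph()
for (a, b), f in co_occurrence.items():
    G.add_edge(a, b, weight=f)

# Edge thickness in Figure 8 corresponds to this weight; centrality highlights hub codes
centrality = nx.degree_centrality(G)
print(sorted(centrality.items(), key=lambda kv: -kv[1])[:3])
```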
The network analysis revealed that Signal Meaning, Sense of Rhythm, and Confusion served as central nodes within the overall network structure, highlighting their pivotal role in shaping users’ understanding of biosensory cues. These concepts exhibited strong interconnections with other thematic categories, indicating that users’ perception of biosignals was heavily influenced by their intuitiveness and representation format. Specifically, the magnitude of signal change was closely linked to users’ comprehension of its meaning, with larger fluctuations typically perceived as more informative and enhancing the perceived synchronization with a remote viewer. Furthermore, rhythmic presentation emerged as a critical factor, exhibiting strong associations with Signal Meaning, Interactability, and Multimodal Feedback, suggesting that rhythmic signal presentation facilitates the establishment of synchrony.
Conversely, Confusion formed a crucial connecting pathway within the network, linking Signal Meaning while also being influenced by Video Content, Novelty, and Attention Allocation. This suggests that when the presentation of biosensory cues lacks a coherent structure or deviates from users’ expectations, it can induce confusion, ultimately diminishing the copresence experience. Additionally, Individual Differences played a significant role in users’ interpretation of biosensory cues. This finding underscores substantial individual variability and context-dependency in the perception of physiological signals, emphasizing the need for future designs to incorporate more flexible presentation strategies that accommodate diverse user preferences and application scenarios.
Addressing RQ2, our qualitative analysis revealed that while GSR significantly enhanced physiological synchrony (RQ1), user-perceived copresent experience and synchrony relied more on RE and HR signals, which were oftentimes described as “intuitive” by the participants. This suggests enhanced physiological synchrony does not necessarily translate directly to higher user-perceived copresence and synchrony (further explained in Section 5.1). Our subsequent discussion presents major findings synthesized from both quantitative and qualitative analyses, extracting five design implications examined in relation to existing work and established theories.

5. Discussion

5.1. Quantitative–Qualitative Response Gap

As our most important observation, we identified an inconsistency between the collected biosignals and subjective user feedback. The quantitative analysis of participants’ biodata indicated that GSR outperformed the other four types of biosignals regarding physiological synchrony, while in our semi-structured interviews, all participants listed RE and HR as the top two indicators of user-perceived copresence and synchronization. This intriguing gap between the quantitative and the qualitative user responses contradicted our initial premise and some existing research [14,43,66]. Counter-intuitively, there may not necessarily exist a positive correlation between physiological synchrony and user-perceived copresence. However, it is yet too early to assert that they are two independent factors, as they incorporate a rather complex, multilayered process that traverses both consciousness and the subconscious.
How should biosensory information be presented and conveyed to users? How will users’ interpretation of the information affect the resulting copresent experience? And more essentially, must the sense of copresence be built upon a full understanding of the meaning of biosensory information, or does it merely require a moderate level of emotional perception or sensory experience? Biosensory cues and their representations still call for sophisticated research and further exploration. Specifically, this research echoed previous insights from Curran et al. [9] regarding biosensory cue representations: although preferences depend on the specific signal types and application contexts, a visual number, graph, or icon may not be the preferable presentation. This phenomenon may be explained through neurocognitive mechanisms of implicit social processing. Implicit biosensory cues (e.g., ambient lighting linked to galvanic skin responses) facilitate subconscious social bonding through physiological entrainment in the insular cortex [67], bypassing explicit cognitive appraisal of social stereotypes. This aligns with predictive coding theories suggesting low-level physiological alignment precedes conscious social perception [68], effectively circumventing the uncanny valley effect inherent in avatar-based interactions while maintaining affective resonance. This also led us to our next two design implications.

5.2. Multimodal, Hybrid Data Representations

As for the representation of biosensory cues, seven out of ten participants (for brevity, we use the notation 7/10; the same applies throughout the text) suggested multimodal stimuli, ranging from audio (P1, P2, P7), electrical stimulation (P1), and haptic feedback (P8) to smart devices, e.g., mood lights, airbags, etc. (P10). This implies both opportunities and challenges in integrating a wide range of external technologies such as sonification, wearable computing, the Internet of Things, etc. The pairing between a biosignal and its representation often depends on the following: (1) Will it impose extra physical or cognitive loads on users? In particular, the visual representation should be cautiously designed, as porting a conventional WIMP (windows, icons, menus, pointers)-based GUI into VR has often been proved unfavorable by previous studies [69,70,71]. (2) Does it contextualize the use of the biosignal within its target scenario in a non-intrusive and consistent way? Different copresent activities may exhibit varying social characteristics and demands (e.g., familiar social interactions, interactions with strangers, or mixed interactions), often requiring the selection of appropriate embodied cue expressions based on application-specific needs. For instance, additional auditory information might distract users in contexts such as a symphony concert, whereas in other scenarios, it could have the opposite effect. One such example is “jogging over a distance” [70]: instead of visual cues, Souchet et al. leveraged spatialized audio to express heart rate data from a geographically distant jogger, thus enhancing the feeling that someone else was jogging alongside the user.
Once again, we would like to stress that we remain cautious about the use of avatars and similar paradigms, such as emojis. Although they are considered easy to combine with biosensory cues and convey explicit social cues like facial expressions and body postures, one of our expert interviewees (P4) expressed his concern that an avatar with a numerical display floating above or around its head may “look more like an NPC in a game than a real human being” and thus increase the risk of depersonalization in VR social interaction. Fundamentally, the effects of VR on dissociative experiences appear to be exacerbated by an individual’s higher propensity for immersion in the virtual environment [72]; certain personal traits, such as social anxiety, can also increase the likelihood of depersonalization. If, on a social level, the user is burdened with the “unrealness” of the experience, feeling an overwhelming sense of virtuality caused by the “as-if” quality and uncanniness (for example, perceiving social partners as fake humans), the negative effects of depersonalization can be amplified, further weakening the sense of presence [73].
Of specific note, a few participants (P3, P6, P10) also emphasized that the representation should provide perceivable changes in data magnitude, regardless of the sensory channel used. This may restrict the types of biosignals to those that react to stimuli more sensitively.

5.3. Cognitive Empathy and Emotional Relatedness

A significant portion (4/7) of user-perceived confusion resulted from the unclear signal meaning. Compared with raw physiological data, the participants clearly preferred some high-level, meaningful interpretations of the data. This somehow explained why all participants voted for RE and HR, the meanings of which were the most familiar and understandable. Another important aspect is the underlying personal identity associated with the signals. As P2 said, “It triggered my curiosity, making me wonder whom these signals belong to, or even it might be a dog”. P10’s opinion further stressed this: “It would interest me more if the data came from my friend, my family or someone I know, than from a sheer stranger”. The participants could not make sense of the biodata or would even “question about its authenticity” (P8, P7), if they could not relate it to themselves or other real people. Previous research by Lee et al. [12] suggested that users were intrigued by the biodata of strangers in VR social networking but were reluctant to disclose their own biodata, reflecting a general curiosity about others’ biosignals alongside privacy concerns regarding their own. However, in our observation, no participants raised comments related to data privacy. Cultural and regional differences in attitudes toward privacy may act as a big contributor and influence how biosensory data are perceived and shared within social VR environments. We believe this may also be due to the nature of our setting, where, although biosensory cues can render additional social information, they do not inherently disclose individual identity. Moreover, the deliberate use of raw data from unfamiliar physiological signals may, at times, be interpreted as a form of data obfuscation.
How accurate an understanding of others’ psychophysiological states does the sense of copresence require? Or does it still, on some occasions, allow for a certain degree of ambiguity? Similar to our study, Curran et al. leveraged visualized electrodermal activity (EDA) data as a biosensory cue for digitally mediated empathy [9]. Their results suggested that the introduction of biosensory cues did not always guarantee improved accuracy in people’s evaluations of others’ emotions; however, “Accuracy is neither wholly representative of real-life empathic processes, nor is it the sole important measure to track for improved mediated social interaction and understanding” [9]. It can be implied that accurately grasping the meaning of others’ physiological signals is not necessarily a prerequisite for people to experience a sense of copresence through biosensory cues. Psychologists have argued that cognitive empathy and emotional empathy are two intertwined processes that both play an irreplaceable role in prosocial behaviors [74]. On the one hand, this underpins our observation that copresence entails a minimum level of both rational understanding and emotional relatedness. On the other hand, it is worth reconsidering information accuracy as a spectrum that ranges from ambiguous raw data to precise information when designing biosensory cue representations.

5.4. Bidirectional Regulation Between Users and Biosignals

We identified from many participants (P1, P2, P3, P4, P5, P9; 6/10) the need for increased interactability. As P9 put it, “I once paid attention to it (referring to the biosignals), and tried to control it; but as long as I realized that I could not, I just gave it up”. Our observation reinforced the significance of user-perceived autonomy [75] and the sense of control [76], as P1 suggested, “at least I need to be able to choose whose signals I am looking at”. Drawing from self-determination theory (SDT) [77], our findings indicate that when individuals are unable to fully believe their choices depend on their autonomous decision-making (i.e., the degree to which an individual perceives their actions as a result of their own free will, without external interference), it becomes very difficult for them to feel psychologically free and intrinsically motivated [77]. Similarly, our stance on user control, resonating with Shneiderman’s early HCI principles [78], advocates that if technological systems are unable to confer a strong sense of ownership to users and empower them as autonomous initiators with full control over their user interface interactions, it can pose significant barriers to feature utilization and decrease overall interface quality and user experience [76].
On the other hand, biosensory cues like RE were said to be able to regulate users via their sense of rhythm and increase physiological synchrony (P3, P4, P5, P8). Biofeedback has been practiced and has already proven its potential in many VR mindfulness training applications [16,79,80,81]. Some studies have also used biosensory cues different from those in our study and obtained positive results. For example, Salminen et al. [16] pointed out that user responses on empathy-related electroencephalography (EEG) frontal asymmetry evoked higher self-reported empathy towards the other user than responses on respiratory activation. Compared to biofeedback, some HCI studies focus rather on utilizing biosensory cues as an “interactive probe” in social contexts, so as to evoke awareness of others’ situations and states [9,12]. Interestingly, P4 suggested that physiological synchrony may not be limited to the conventional digital biosignals like the ones used in this study. Other biochemical mechanisms such as semiochemicals, e.g., menstrual synchrony attributed to pheromones (a disputed but widely perceived phenomenon [82]), should also be taken into account as design alternatives for enhancing synchronization in VR copresent experience.

5.5. Individual Bias in Cognition and Media Use

Participants’ personal preferences in media contents and use styles also had an impact on their willingness to socialize as well as how they would allocate their attention. As P10 put it: “I would like to interact with other audience in an idol EDM as we share a similar hobby and taste of music. But techno is not my thing”. Also, two participants (P6 and P10) mentioned that they preferred visual information to multimodal representation of biodata, as other information channels may be “distracting and adding extra cognitive load”. We conclude that there may be no “one-size-fits-all” solution to address this particularity that varies from individual to individual. Therefore, it can be important to provide users with options to personalize their data representations and user interfaces.

6. Limitations and Future Work

One limitation of this study stems from our efforts to rule out potential bias introduced by the extra representation of biosensory cues. To achieve this, we employed simplified visualizations of the raw biosignal data during the experiment. However, presenting the data in a basic chart-based visualization form may have hindered participants’ comprehension of the biosensory information, which was reflected in the predominantly negative feedback during user interviews, particularly for signals other than HR and RE. Additionally, the current limitations of sensor technology required both the baseline biosignal provider and participants to be in a resting state when recording physiological data. This restriction limited our ability to capture realistic physiological responses, especially for pEMG, potentially explaining its weaker performance in the physiological synchrony analysis. As for the effects of different music and scene styles on participants’ physiological signals, we plan to analyze user preferences through intergroup experiments. Furthermore, while our sample encompassed diverse educational backgrounds, the higher proportion of undergraduate students may limit generalizability to populations with homogeneous educational experiences.
We plan to extend this work by integrating multi-channel biosensory cues that combine HR, RE, and GSR. By comparing scenarios with and without avatar-based representations of these cues, we aim to explore how different biosensory cue representations can enhance technology-mediated copresence experiences. In forthcoming experiments, we intend to employ a variety of musical genres and environmental settings, and we will conduct a between-group experiment to assess the influence of user preferences on the research outcomes. Additionally, future studies could explore biosensory integration in contexts requiring low arousal but high social coordination (e.g., virtual classrooms or telehealth consultations), where adaptive cue modulation may enhance user engagement without overwhelming cognitive load. Furthermore, we will focus on conducting user-centered evaluations in real-world application environments to assess the practical effect of these cues, refining our approach to better support natural, immersive social interactions. Future studies could also adopt stratified sampling strategies to better represent varied educational demographics.

7. Conclusions

Recently, the HCI community has shown strong interest in integrating biosensory information with virtual environments. This sort of technology-mediated social interaction and sharing is believed to alleviate some of the drawbacks of conventional avatar- and verbal-cue-based communication and offers an alternative paradigm for less (socially or physically) expressive users to connect with other people. Despite some ongoing attempts, this area still calls for more empirical research efforts to systematically establish actionable guidelines and evaluation metrics.
To pave the way, this research studied and compared the effects of five different biosensory cues on users’ copresent experience by combining both quantitative and qualitative evaluations. Our findings revealed an intriguing gap between objective physiological synchrony and subjective user feedback. Explicitly, highly synchronized physiological signals did not always guarantee enhanced user-perceived copresence, highlighting the complex interplay between quantifiable data and perceived emotional states. Based on this observation, we further proposed five design implications to provide evidence-based insight into the better design and full-fledged application of biosensory cues for future social sharing experiences.

Author Contributions

Conceptualization, D.G., Y.W., X.W. and R.X.; methodology, H.Y., Y.W., Y.L., X.W. and R.X.; formal analysis, H.Y.; investigation, D.G., Y.W. and R.X.; data curation, H.Y. and R.X.; writing—original draft, D.G., H.Y. and Y.W.; writing—review and editing, D.G., M.W., Y.W., X.W. and R.X.; visualization, Y.W.; supervision, X.W. and R.X.; project administration, R.X.; funding acquisition, R.X. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by Ruowei Xiao’s personal research start-up funds from the Southern University of Science and Technology (grant No. 65/Y01656112). This work was also supported by the National Natural Science Foundation of China (62288102, 62371397) and the Fundamental Research Funds for the Central Universities.

Institutional Review Board Statement

This study was conducted in accordance with the Declaration of Helsinki and approved by the Institutional Review Board (IRB) of the Southern University of Science and Technology (decision number 20240200, approval date 7 May 2024). All participants provided written informed consent prior to participation, and their rights and privacy were fully protected.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors on request.

Acknowledgments

Due to the requirements of the double-blind review process, the information of the grant provider and institution was anonymized.

Conflicts of Interest

Author Haoming Yan was employed by the company United Imaging Healthcare Co., Ltd. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

  1. Wang, Y.; Dai, Y.; Chen, S.; Wang, L.; Hoorn, J.F. Multiplayer online battle arena (MOBA) games: Improving negative atmosphere with social robots and AI teammates. Systems 2023, 11, 425. [Google Scholar] [CrossRef]
  2. Wang, Y.; Gong, D.; Xiao, R.; Wu, X.; Zhang, H. A Systematic Review on Extended Reality-Mediated Multi-User Social Engagement. Systems 2024, 12, 396. [Google Scholar] [CrossRef]
  3. Breese, J.L.; Fox, M.A.; Vaidyanathan, G. Live Music Performances and the Internet of Things. Issues Inf. Syst. 2020, 21, 179–188. [Google Scholar] [CrossRef]
  4. Andrews, T.M. Concerts are canceled, so Coldplay, John Legend and Keith Urban are playing right in your living rooms. The Washington Post, 17 March 2020. [Google Scholar]
  5. Moreira, C.; Simões, F.P.; Lee, M.J.; Zorzal, E.R.; Lindeman, R.W.; Pereira, J.M.; Johnsen, K.; Jorge, J. Toward VR in VR: Assessing engagement and social interaction in a virtual conference. IEEE Access 2022, 11, 1906–1922. [Google Scholar] [CrossRef]
  6. Mystakidis, S. Metaverse. Encyclopedia 2022, 2, 486–497. [Google Scholar] [CrossRef]
  7. Shin, M.; Kim, S.J.; Biocca, F. The uncanny valley: No need for any further judgments when an avatar looks eerie. Comput. Hum. Behav. 2019, 94, 100–109. [Google Scholar] [CrossRef]
  8. Hepperle, D.; Purps, C.F.; Deuchler, J.; Wölfel, M. Aspects of visual avatar appearance: Self-representation, display type, and uncanny valley. Vis. Comput. 2022, 38, 1227–1244. [Google Scholar] [CrossRef] [PubMed]
  9. Curran, M.T.; Gordon, J.R.; Lin, L.; Sridhar, P.K.; Chuang, J. Understanding digitally-mediated empathy: An exploration of visual, narrative, and biosensory informational cues. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, New York, NY, USA, 4–9 May 2019; pp. 1–13. [Google Scholar] [CrossRef]
  10. Maloney, D.; Freeman, G.; Robb, A. A Virtual Space for All: Exploring Children’s Experience in Social Virtual Reality. In Proceedings of the CHI PLAY’20: The Annual Symposium on Computer-Human Interaction in Play, Virtual Event, 2–4 November 2020. [Google Scholar] [CrossRef]
  11. Hennig-Thurau, T.; Aliman, D.N.; Herting, A.M.; Cziehso, G.P.; Linder, M.; Kübler, R.V. Social interactions in the metaverse: Framework, initial evidence, and research roadmap. J. Acad. Mark. Sci. 2022, 50, 1295–1317. [Google Scholar] [CrossRef]
  12. Lee, S.; El Ali, A.; Wijntjes, M.; Cesar, P. Understanding and designing avatar biosignal visualizations for social virtual reality entertainment. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, New York, NY, USA, 29 April–5 May 2022; pp. 1–15. [Google Scholar] [CrossRef]
  13. Järvelä, S. Physiological Synchrony and Affective Protosocial Dynamics. Ph.D. Thesis, University of Helsinki, Helsinki, Finland, 2020. [Google Scholar] [CrossRef]
  14. Järvelä, S.; Cowley, B.; Salminen, M.; Jacucci, G.; Hamari, J.; Ravaja, N. Augmented virtual reality meditation: Shared dyadic biofeedback increases social presence via respiratory synchrony. ACM Trans. Soc. Comput. 2021, 4, 1–19. [Google Scholar] [CrossRef]
  15. Dar, S.; Bernardet, U. When Agents Become Partners: A Review of the Role the Implicit Plays in the Interaction with Artificial Social Agents. Multimodal Technol. Interact. 2020, 4, 81. [Google Scholar] [CrossRef]
  16. Salminen, M.; Järvelä, S.; Ruonala, A.; Harjunen, V.J.; Hamari, J.; Jacucci, G. Evoking Physiological Synchrony and Empathy Using Social VR with Biofeedback. IEEE Trans. Affect. Comput. 2019, 13, 746–755. [Google Scholar] [CrossRef]
  17. Behrens, F.; Snijdewint, J.; Moulder, R.; Prochazkova, E.; Sjak-Shie, E.; Boker, S.; Kret, M. Physiological synchrony is associated with cooperative success in real-life interactions. Sci. Rep. 2020, 10, 19609. [Google Scholar] [CrossRef] [PubMed]
  18. Algumaei, M.; Hettiarachchi, I.; Veerabhadrappa, R.; Bhatti, A. Physiological synchrony predict task performance and negative emotional state during a three-member collaborative task. Sensors 2023, 23, 2268. [Google Scholar] [CrossRef] [PubMed]
  19. Chanel, G.; Mühl, C. Connecting Brains and Bodies: Applying Physiological Computing to Support Social Interaction. Interact. Comput. 2015, 27, 534–550. [Google Scholar] [CrossRef]
  20. Robinson, R.B.; Rheeder, R.; Klarkowski, M.; Mandryk, R.L. “Chat Has No Chill”: A Novel Physiological Interaction for Engaging Live Streaming Audiences. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, New York, NY, USA, 29 April–5 May 2022; pp. 1–18. [Google Scholar] [CrossRef]
  21. Bulu, S.T. Place presence, social presence, co-presence, and satisfaction in virtual worlds. Comput. Educ. 2012, 58, 154–161. [Google Scholar] [CrossRef]
  22. Oh, H.J.; Bailenson, J.N.; Welch, G.F. A Systematic Review of Social Presence: Definition, Antecedents, and Implications. J. Comput.-Mediat. Commun. 2018, 23, 329–346. [Google Scholar] [CrossRef]
  23. Nowak, K. Defining and differentiating copresence, social presence and presence as transportation. In Proceedings of the Presence 2001 Conference, Philadelphia, PA, USA, 21–23 May 2001; Volume 2, pp. 686–710. [Google Scholar]
  24. Kreijns, K.; Xu, K.; Weidlich, J. Social Presence: Conceptualization and Measurement. Educ. Psychol. Rev. 2022, 34, 139–170. [Google Scholar] [CrossRef]
  25. Hétu, S.; Taschereau-Dumouchel, V.; Jackson, P.L. Stimulating the brain to study social interactions and empathy. Brain Stimul. 2012, 5, 95–102. [Google Scholar] [CrossRef]
  26. Felnhofer, A.; Kothgassner, O.D.; Hauk, N.; Beutl, L.; Hlavacs, H.; Kryspin-Exner, I. Physical and social presence in collaborative virtual environments: Exploring age and gender differences with respect to empathy. Comput. Hum. Behav. 2014, 31, 272–279. [Google Scholar] [CrossRef]
  27. Pimentel, D.; Kalyanaraman, S.; Lee, Y.H.; Halan, S. Voices of the unsung: The role of social presence and interactivity in building empathy in 360 video. New Media Soc. 2021, 23, 2230–2254. [Google Scholar] [CrossRef]
  28. Bailenson, J.N.; Blascovich, J.; Beall, A.C.; Loomis, J.M. Interpersonal Distance in Immersive Virtual Environments. Personal. Soc. Psychol. Bull. 2003, 29, 819–833. [Google Scholar] [CrossRef] [PubMed]
  29. James, T.W.; Potter, R.F.; Lee, S.; Kim, S.; Stevenson, R.A.; Lang, A. How Realistic Should Avatars Be? J. Media Psychol. 2015, 27, 109–117. [Google Scholar] [CrossRef]
  30. Roth, D.; Lugrin, J.L.; Galakhov, D.; Hofmann, A.; Bente, G.; Latoschik, M.E.; Fuhrmann, A. Avatar realism and social interaction quality in virtual reality. In Proceedings of the 2016 IEEE Virtual Reality (VR), Los Angeles, CA, USA, 19–23 March 2016; pp. 277–278. [Google Scholar]
  31. Steed, A.; Schroeder, R. Collaboration in immersive and non-immersive virtual environments. In Immersed in Media: Telepresence Theory, Measurement & Technology; Schroeder, R., Ed.; Springer: London, UK, 2015; pp. 263–282. [Google Scholar] [CrossRef]
  32. Appel, J.; von der Pütten, A.; Krämer, N.C.; Gratch, J. Does humanity matter? Analyzing the importance of social cues and perceived agency of a computer system for the emergence of social reactions during human-computer interaction. Adv. Hum.-Comput. Interact. 2012, 2012, 324694. [Google Scholar] [CrossRef]
  33. Morgan, C.T. Physiological Psychology; McGraw-Hill: New York, NY, USA, 1943. [Google Scholar]
  34. Bachrach, A.; Fontbonne, Y.; Joufflineau, C.; Ulloa, J.L. Audience entrainment during live contemporary dance performance: Physiological and cognitive measures. Front. Hum. Neurosci. 2015, 9, 179. [Google Scholar] [CrossRef]
  35. Bizzego, A.; Gabrieli, G.; Azhari, A.; Setoh, P.; Esposito, G. Computational methods for the assessment of empathic synchrony. In Progresses in Artificial Intelligence and Neural Systems; Springer: Singapore, 2020; pp. 555–564. [Google Scholar] [CrossRef]
  36. Mayo, O.; Lavidor, M.; Gordon, I. Interpersonal autonomic nervous system synchrony and its association to relationship and performance—A systematic review and meta-analysis. Physiol. Behav. 2021, 235, 113391. [Google Scholar] [CrossRef] [PubMed]
  37. Palumbo, R.V.; Marraccini, M.E.; Weyandt, L.L.; Wilder-Smith, O.; McGee, H.A.; Liu, S.; Goodwin, M.S. Interpersonal autonomic physiology: A systematic review of the literature. Personal. Soc. Psychol. Rev. 2017, 21, 99–141. [Google Scholar] [CrossRef]
  38. Kinreich, S.; Djalovski, A.; Kraus, L.; Louzoun, Y.; Feldman, R. Brain-to-brain synchrony during naturalistic social interactions. Sci. Rep. 2017, 7, 17060. [Google Scholar] [CrossRef]
  39. Tomashin, L.; Smith, J.; Brown, M. Physiological synchrony predicts group cohesion during collaborative tasks. J. Soc. Psychol. 2022, 35, 150–165. [Google Scholar]
  40. Markova, G.; Nguyen, T.; Hoehl, S. The role of synchrony in parent-infant bonding. Dev. Psychol. 2019, 55, 1075–1083. [Google Scholar] [CrossRef]
  41. Lumsden, J.; Miles, L.K.; Richardson, M.J. Prosocial orientation and interpersonal synchrony. J. Exp. Soc. Psychol. 2012, 48, 746–751. [Google Scholar] [CrossRef]
  42. Tschacher, W.; Rees, G.M.; Ramseyer, F. Synchrony during competitive and cooperative interactions. Front. Psychol. 2014, 5, 1323. [Google Scholar] [CrossRef]
  43. Järvelä, S.; Kätsyri, J.; Ravaja, N.; Chanel, G.; Henttonen, P. Intragroup emotions: Physiological linkage and social presence. Front. Psychol. 2016, 7, 105. [Google Scholar] [CrossRef] [PubMed]
  44. Järvelä, S.; Kivikangas, J.M.; Kätsyri, J.; Ravaja, N. Physiological linkage of dyadic gaming experience. Simul. Gaming 2013, 45, 24–40. [Google Scholar] [CrossRef]
  45. Moulder, R.G.; Boker, S.M. Windowed cross-correlation for the analysis of synchrony in time series data. J. Math. Psychol. 2018, 85, 89–99. [Google Scholar] [CrossRef]
  46. Delaherche, E.; Chetouani, M.; Mahdhaoui, A.; Saint-Georges, C.; Cohen, D. Interpersonal synchrony: A survey of evaluation methods across disciplines. IEEE Trans. Affect. Comput. 2012, 3, 349–365. [Google Scholar] [CrossRef]
  47. Jun, E.; McDuff, D.; Czerwinski, M. Circadian rhythms and physiological synchrony: Evidence of the impact on group creativity. Proc. ACM Hum.-Comput. Interact. 2019, 3, 1–18. [Google Scholar] [CrossRef]
  48. Raimondo, L.; Oliveira, L.A.F.; Heij, J.; Priovoulos, N.; Kundu, P.; Leoni, R.F.; van der Zwaag, W. Advances in resting state fMRI acquisitions for functional connectomics. NeuroImage 2021, 243, 118503. [Google Scholar] [CrossRef]
  49. Smitha, K.; Akhil Raja, K.; Arun, K.; Rajesh, P.; Thomas, B.; Kapilamoorthy, T.; Kesavadas, C. Resting state fMRI: A review on methods in resting state connectivity analysis and resting state networks. Neuroradiol. J. 2017, 30, 305–317. [Google Scholar] [CrossRef]
  50. Solberg, R.T.; Dibben, N. Peak experiences with electronic dance music: Subjective experiences, physiological responses, and musical characteristics of the break routine. Music Percept. Interdiscip. J. 2019, 36, 371–389. [Google Scholar] [CrossRef]
  51. Wikström, V.; Falcon, M.; Martikainen, S.; Pejoska, J.; Durall, E.; Bauters, M.; Saarikivi, K. Heart Rate Sharing at the Workplace. Multimodal Technol. Interact. 2021, 5, 60. [Google Scholar] [CrossRef]
  52. Oh, S.; Lee, J.-Y.; Kim, D.K. The Design of CNN Architectures for Optimal Six Basic Emotion Classification Using Multiple Physiological Signals. Sensors 2020, 20, 866. [Google Scholar] [CrossRef] [PubMed]
  53. Krumhansl, C.L. An exploratory study of musical emotions and psychophysiology. Can. J. Exp. Psychol. Can. Psychol. Exp. 1997, 51, 336. [Google Scholar] [CrossRef]
  54. Van Kerrebroeck, B.; Caruso, G.; Maes, P.J. A methodological framework for assessing social presence in music interactions in virtual reality. Front. Psychol. 2021, 12, 663725. [Google Scholar] [CrossRef]
  55. Curmi, F.; Ferrario, M.A.; Southern, J.; Whittle, J. HeartLink: Open broadcast of live biometric data to social networks. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Paris, France, 27 April–2 May 2013; pp. 1749–1758. [Google Scholar]
  56. Merrill, N.; Cheshire, C. Trust your heart: Assessing cooperation and trust with biosignals in computer-mediated interactions. In Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing, Portland, OR, USA, 25 February–1 March 2017; pp. 2–12. [Google Scholar]
  57. Schnädelbach, H.; Rennick Egglestone, S.; Reeves, S.; Benford, S.; Walker, B.; Wright, M. Performing thrill: Designing telemetry systems and spectator interfaces for amusement rides. In Proceedings of the Sigchi Conference on Human Factors in Computing Systems, Florence, Italy, 5–10 April 2008; pp. 1167–1176. [Google Scholar]
  58. Tan, C.S.S.; Schöning, J.; Luyten, K.; Coninx, K. Investigating the effects of using biofeedback as visual stress indicator during video-mediated collaboration. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Toronto, ON, Canada, 26 April–1 May 2014; pp. 71–80. [Google Scholar]
  59. Slovák, P.; Janssen, J.; Fitzpatrick, G. Understanding heart rate sharing: Towards unpacking physiosocial space. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Austin, TX, USA, 5–10 May 2012; pp. 859–868. [Google Scholar]
  60. Walmink, W.; Wilde, D.; Mueller, F. Displaying heart rate data on a bicycle helmet to support social exertion experiences. In Proceedings of the 8th International Conference on Tangible, Embedded and Embodied Interaction, Munich, Germany, 16–19 February 2014; pp. 97–104. [Google Scholar]
  61. Lee, K.H.; Min, J.Y.; Byun, S. Electromyogram-based classification of hand and finger gestures using artificial neural networks. Sensors 2021, 22, 225. [Google Scholar] [CrossRef]
  62. Jaramillo-Yánez, A.; Benalcázar, M.E.; Mena-Maldonado, E. Real-time hand gesture recognition using surface electromyography and machine learning: A systematic literature review. Sensors 2020, 20, 2467. [Google Scholar] [CrossRef] [PubMed]
  63. Goshvarpour, A.; Abbasi, A.; Goshvarpour, A. An accurate emotion recognition system using ECG and GSR signals and matching pursuit method. Biomed. J. 2017, 40, 355–368. [Google Scholar] [CrossRef] [PubMed]
  64. Boker, S.M.; Xu, M. Windowed Cross-Correlation and Peak Picking for the Analysis of Variability in the Association Between Behavioral Time Series. Psychol. Methods 2002, 7, 338–355. [Google Scholar] [CrossRef]
  65. Braun, V.; Clarke, V. One size fits all? What counts as quality practice in (reflexive) thematic analysis? Qual. Res. Psychol. 2021, 18, 328–352. [Google Scholar] [CrossRef]
  66. Ylätalo, H. Empathy and EDA Synchrony in Virtual Reality Collaboration. Ph.D. Thesis, Helsinki University, Helsinki, Finland, 2022. [Google Scholar]
  67. Liu, F.; Kaufman, G.; Dabbish, L. The Effect of Expressive Biosignals on Empathy and Closeness for a Stigmatized Group Member. Proc. ACM Hum.-Comput. Interact. CSCW 2019, 3, 1–27. [Google Scholar] [CrossRef]
  68. Coan, J.A.; Sbarra, D.A. Social Baseline Theory: The Social Regulation of Risk and Effort. Curr. Opin. Psychol. 2015, 1, 87–91. [Google Scholar] [CrossRef]
  69. Hopstaken, J.F.; Van Der Linden, D.; Bakker, A.B.; Kompier, M.A.; Leung, Y.K. Shifts in attention during mental fatigue: Evidence from subjective, behavioral, physiological, and eye-tracking data. J. Exp. Psychol. Hum. Percept. Perform. 2016, 42, 878. [Google Scholar] [CrossRef]
  70. Souchet, A.D.; Philippe, S.; Lourdeaux, D.; Leroy, L. Measuring visual fatigue and cognitive load via eye tracking while learning with virtual reality head-mounted displays: A review. Int. J. Hum.-Comput. Interact. 2022, 38, 801–824. [Google Scholar] [CrossRef]
  71. Bækgaard, P.; Hansen, J.P.; Minakata, K.; MacKenzie, I.S. A Fitts' law study of pupil dilations in a head-mounted display. In Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, New York, NY, USA, 25–28 June 2019; pp. 1–5. [Google Scholar] [CrossRef]
  72. Aardema, F.; O’Connor, K.; Côté, S.; Taillon, A. Virtual reality induces dissociation and lowers sense of presence in objective reality. Cyberpsychol. Behav. Soc. Netw. 2010, 13, 429–435. [Google Scholar] [CrossRef]
  73. Garvey, G. Dissociation: A natural state of mind? J. Conscious. Stud. 2010, 17, 139–155. [Google Scholar]
  74. Smith, A. Cognitive empathy and emotional empathy in human behavior and evolution. Psychol. Rec. 2006, 56, 3–21. [Google Scholar] [CrossRef]
  75. Jung, Y. Understanding the role of sense of presence and perceived autonomy in users’ continued use of social virtual worlds. J. Comput.-Mediat. Commun. 2011, 16, 492–510. [Google Scholar] [CrossRef]
  76. Bergström, J.; Knibbe, J.; Pohl, H.; Hornbæk, K. Sense of agency and user experience: Is there a link? ACM Trans. Comput.-Hum. Interact. TOCHI 2022, 29, 1–22. [Google Scholar] [CrossRef]
  77. Ryan, R.M.; Deci, E.L. Intrinsic and extrinsic motivations: Classic definitions and new directions. Contemp. Educ. Psychol. 2000, 25, 54–67. [Google Scholar] [CrossRef] [PubMed]
  78. Zeng, L. Designing the User Interface: Strategies for Effective Human-Computer Interaction by B. Shneiderman and C. Plaisant. Int. J. Hum.-Comput. Interact. 2009, 25, 707–708. [Google Scholar] [CrossRef]
  79. Tan, F.F.Y.; Ram, A.; Haigh, C.; Zhao, S. Mindful Moments: Exploring On-the-go Mindfulness Practice On Smart-glasses. In Proceedings of the 2023 ACM Designing Interactive Systems Conference, New York, NY, USA, 10–14 July 2023; pp. 476–492. [Google Scholar] [CrossRef]
  80. D’Errico, F.; Leone, G.; Schmid, M.; D’Anna, C. Prosocial virtual reality, empathy, and EEG measures: A pilot study aimed at monitoring emotional processes in intergroup helping behaviors. Appl. Sci. 2020, 10, 1196. [Google Scholar] [CrossRef]
  81. Parsons, T.D.; Gaggioli, A.; Riva, G. Extended reality for the clinical, affective, and social neurosciences. Brain Sci. 2020, 10, 922. [Google Scholar] [CrossRef] [PubMed]
  82. Arden, M.A.; Dye, L.; Walker, A. Menstrual synchrony: Awareness and subjective experiences. J. Reprod. Infant Psychol. 1999, 17, 255–265. [Google Scholar] [CrossRef]
Figure 1. Research framework and flow.
Figure 2. Virtual environment. (A) Fixation cross, (B) control condition (without biosensory cues), (C) experimental condition (with biosensory cues; GSR signal as an example).
Figure 3. Biosensory cues of (A) pEMG, (B) GSR, (C) HR, (D) RE, and (E) SpO2.
Figure 4. Physiological data collection. (A) Biosensors used, (B) overall experimental settings.
Figure 5. WCC analysis results of each biosignal: (A) RE, (B) pEMG, (C) GSR, (D) HR, (E) SpO2.
Figure 6. Histograms of the r* corresponding to each biosignal: (A) RE, (B) pEMG, (C) GSR, (D) HR, (E) SpO2.
Figure 7. Affinity diagram showing codes and major themes.
Figure 8. Network analysis diagram showing the strongest thematic connections.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
