Article

Evaluating Pupillometry as a Tool for Assessing Facial and Emotional Processing in Nonhuman Primates

1 Shenzhen Technological Research Center for Primate Translational Medicine, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China
2 CAS Key Laboratory of Brain Connectome and Manipulation, The Brain Cognition and Brain Disease Institute, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China
3 Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China
4 University of Chinese Academy of Sciences, Beijing 100049, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(6), 3022; https://doi.org/10.3390/app15063022
Submission received: 31 October 2024 / Revised: 27 January 2025 / Accepted: 28 January 2025 / Published: 11 March 2025
(This article belongs to the Special Issue Latest Research on Eye Tracking Applications)

Abstract

Non-human primates (NHPs) are extensively utilized to investigate the neural mechanisms underlying face processing; however, measuring their brain activity necessitates a diverse array of technologies. Pupillometry is a convenient, cost-effective, and non-invasive alternative for indirectly assessing brain activity. To evaluate the efficacy of pupillometry in assessing facial and emotional processing in NHPs, this study designed a face fixation task for experimental monkeys (rhesus macaques) and recorded variations in their pupil size in response to face images with differing characteristics, such as species, emotional expression, viewing angle, and orientation (upright vs. inverted). All face images were balanced for luminance and spatial frequency. An eye-tracking system (EyeLink 1000 Plus) was employed to observe the pupils and track the viewing trajectories of the monkeys as they examined the face images. Our findings reveal that monkeys exhibited larger pupil sizes in response to carnivore faces (versus human faces, p = 0.035), negative conspecific faces (versus negative human faces, p = 0.018), and profile viewing angles (versus frontal views, p = 0.010). Notably, pupil size recorded during the 500–1000 ms post-stimulus interval was negatively correlated with gaze durations directed at those images (r = −0.357, p = 0.016). Overall, this study demonstrates that pupillometry effectively captures subtle differences in facial and emotional processing, underscoring its potential as a valuable tool in future cognitive research and the diagnosis of disorders.

1. Introduction

Facial expressions are pivotal in social interactions, serving as essential indicators of identity, species, gender, emotion, and intention [1]. Accurately interpreting facial cues is vital for effective social engagement, positioning face processing as a significant area of interest within psychology and neuroscience. Both humans and nonhuman primates (NHPs) exhibit comparable behaviors in face observation [2,3], making NHPs an excellent model for exploring the neural mechanisms involved in facial recognition. Research utilizing functional magnetic resonance imaging (fMRI) and multi-channel electrophysiological recording has shown that NHPs’ brain activity is modulated by facial cues. For example, neurons in the prefrontal cortex encode species-specific information [4,5], while the temporal cortex contains neurons responsive to both face orientation [6] and viewing angles [7,8,9,10].
Moreover, pupil size in primates is influenced by neural activity, offering a unique “window” into brain function [11,12,13]. Key neural substrates that regulate pupil size, primarily located in the brain stem, include the locus coeruleus (LC) and pretectal olivary nucleus (PON), which receive input from retinal ganglion cells [11,14]. In NHPs, the LC receives projections from both the prefrontal cortex (PFC) and anterior cingulate cortex (ACC) [15,16], influencing PFC activity through noradrenergic pathways [17]. The PON connects to the inferior temporal and visual cortices [18,19]. Electrophysiological studies have revealed that variations in pupil size correspond to neural activity in the PFC and ACC during passive fixation and unexpected events in NHPs [20]. Additionally, microstimulation of the PFC has been shown to affect the pupil light reflex in these animals [21,22]. In humans, similar MRI studies have identified correlations between blood oxygenation level-dependent (BOLD) activity in the ACC and orbitofrontal cortex and changes in pupil size [23]. Collectively, these findings suggest that changes in pupil size can reflect underlying brain activity in primates.
Given this context, pupillometry has emerged as a valuable tool for assessing emotional and cognitive processing in NHPs [11,24]. This study aims to evaluate whether pupillometry can be applied to investigate facial processing in NHPs. Compared to traditional methods for studying primate brain activity, such as multi-channel electrophysiological recording and fMRI, pupillometry offers a cost-effective, convenient, and non-invasive alternative [20]. Because NHPs cannot communicate verbally with humans, pupillometry presents an ideal approach for uncovering their internal states [24]. We developed a face fixation task (FFT) featuring face images that differ in species, expression, orientation, and viewpoint, all of which are known to affect neural activity in NHPs [4,5,6,25,26,27,28,29,30]. Using an eye-tracking system, we monitored the pupil area and gaze location of NHPs while they observed these facial stimuli. Through this research, we aim to determine whether pupillometry effectively reveals psychological differences in facial processing among NHPs.
The key contributions of this research can be outlined as follows:
(1)
This study investigates the potential of pupil size variations as indicators of emotional responses in NHPs. Furthermore, it aims to evaluate the effectiveness of changes in pupil size as a method for interpreting the emotional states of animals.
(2)
By analyzing the pupillary reactions of NHPs to facial imagery, we can differentiate their emotional responses to conspecific faces from their responses to human faces.
(3)
Ultimately, this research seeks to explore the viability of pupil responses as a biomarker for deficits in face processing, as such impairments are a significant hallmark of several neurological disorders.
The structure of this paper is organized as follows: Section 1 (Introduction) presents the background and motivation of this study. Section 2 (Related Works) reviews recent related literature. Section 3 (Materials and Methods) outlines the animal subjects, experimental procedures, and materials used throughout the experiments, together with the data and statistical analysis methods. Section 4 (Results) presents the analysis of the study’s findings. Finally, Section 5 (Discussion) discusses the key conclusions drawn from the research and suggests directions for future investigations.

2. Related Works

Pupillometry has garnered increasing interest among psychologists and neuroscientists due to its non-invasive nature and cost-effectiveness (Table 1). For example, Chang et al. showed that pupil size correlates with various physiological measures, including heart rate, skin conductance, pulse wave amplitude, and respiratory signals, suggesting its potential as a reliable real-time indicator of autonomic arousal [31]. Additionally, studies have demonstrated that emotional stimuli, both visual and auditory, such as those prompting feelings of sadness, happiness, or fear, elicit significant pupil dilation in humans, while neutral stimuli do not [32,33,34]. Further investigations indicate that the pupil dilation associated with arousal is more pronounced in social contexts [35,36]. This effect has also been documented in rodents, although it may be suppressed under anesthesia (specifically with dexmedetomidine-isoflurane) [37]. Moreover, human studies have shown that increased cognitive workload and demands for attentional control can lead to greater pupil size [38,39], suggesting that cognitive effort is a contributing factor in pupil dilation. We therefore hypothesized that facial characteristics may influence variations in pupil size. However, there is a lack of research exploring the responses of NHPs to facial images through pupillometry.
In contrast to previous studies conducted over the past three years, this research is innovative in several ways: (1) it uses NHPs as subjects; (2) it rigorously controls the lower-level properties of images and minimizes the influence of extraneous factors on pupil size; and (3) it investigates the relationship between gaze duration and pupil variation (Table 1). While prior reviews have indicated that pupillometry can effectively measure arousal levels [11,24], we contend that it also serves as an indicator of cognitive effort and attention during facial processing.

3. Materials and Methods

The experimental procedure was as follows: we first assembled the necessary materials, specifically face images, along with the required equipment. All macaques then performed the behavioral task for data collection. Finally, the experimental data were analyzed and the findings discussed.

3.1. Animals

Seven male Rhesus macaques (age: 8–12 years; weight: 7.2–11.2 kg) were utilized in this study. All monkeys were housed individually in an AAALAC-accredited NHP facility with controlled temperature (20–26 °C) and humidity (40–70%). They lived in the same colony room, which maintained a 12:12 h light–dark cycle (light onset at 7 a.m.) [40,41]. To ensure stable eye tracking, each monkey was surgically implanted with a titanium headpost [42]. All experimental procedures adhered strictly to the Guide for the Care and Use of Laboratory Animals (Eighth Edition) and were approved by the Institutional Animal Care and Use Committee (IACUC) at the Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences.

3.2. Experimental Procedures

The monkeys sat comfortably in a custom-designed chair in a dark, quiet recording room with their heads fixed via the titanium headpost. An LED monitor (ASUS, 24 inches, 1920 × 1080 resolution, 144 Hz refresh rate) was positioned 57 cm in front of the subjects. Eye positions and pupil sizes were monitored using an eye-tracking system (EyeLink 1000 Plus; sampling rate: 1000 Hz; Figure 1). Prior to each recording, subjects underwent a nine-point calibration.
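The 57 cm viewing distance is the classic geometry in which 1 cm on the screen subtends roughly 1 degree of visual angle, which makes fixation windows and stimulus sizes easy to reason about. A minimal sketch of this arithmetic (the function and parameter names are ours, not from the study):

```python
import math

def visual_angle_deg(size_cm, distance_cm=57.0):
    """Visual angle (degrees) subtended by a stimulus of `size_cm`
    viewed at `distance_cm`. At 57 cm, 1 cm on screen is ~1 degree."""
    return math.degrees(2 * math.atan(size_cm / (2 * distance_cm)))
```

For example, at this distance the 4-degree fixation tolerance corresponds to a window roughly 4 cm wide on the screen.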
All monkeys were trained to perform a face fixation task (FFT; see Figure 1), adapted from previous studies involving humans and NHPs [43,44,45]. Each trial commenced with a fixation point at the center of the screen, which subjects had to focus on for 1 s to trigger the presentation of face stimuli, which lasted for 3 s. Subjects needed to maintain fixation within four visual degrees during the initial second; breaking this fixation would terminate the trial. Successful completion of a trial resulted in a juice reward. The background of the screen was set to the average RGB of all visual stimuli used (gray) to minimize contrast. The inter-trial interval (ITI) was set to 5–7 s, and all stimuli were presented in a random order. Overall, four distinct experiments utilizing the FFT paradigm were conducted.
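The trial schedule above can be sketched as follows. The timing values (1 s fixation, 3 s stimulus, 5–7 s jittered ITI, random stimulus order) come from the text; the balanced-repetition assumption (each stimulus shown equally often within a session) and all names are ours:

```python
import random

FIXATION_S, STIMULUS_S = 1.0, 3.0   # required fixation, stimulus duration
ITI_RANGE_S = (5.0, 7.0)            # jittered inter-trial interval

def build_session(stimuli, n_trials, seed=0):
    """Return n_trials trials in random order, repeating each stimulus
    as evenly as possible, each with its own jittered ITI."""
    rng = random.Random(seed)
    reps, rem = divmod(n_trials, len(stimuli))
    order = stimuli * reps + rng.sample(stimuli, rem)
    rng.shuffle(order)
    return [{"stimulus": s, "iti_s": rng.uniform(*ITI_RANGE_S)}
            for s in order]
```

With the 15 stimuli of Experiment 1 and 360 trials per session, this schedule would repeat each stimulus 24 times.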

3.3. Experiment 1: Responses to Different Animals and Objects

In this experiment, 12 images of faces and three images of objects were used as visual stimuli. The face stimuli included four categories: human, monkey, carnivore, and snake, with three images per category. Object images included a fire hydrant, lamp, and clock. Human, snake, and object images were sourced from the International Affective Picture System (IAPS) [46], while monkey and predator images (tiger, lion, cheetah) were obtained from publicly available sources. All subjects in the images were looking sideways. Monkeys completed two FFT sessions, each consisting of 360 trials.

3.4. Experiment 2: Responses to Monkey and Human Facial Expressions

For this experiment, six face images featuring different facial expressions were employed as visual stimuli, encompassing two species (human and monkey) and three expression types (negative, positive, and neutral). Human faces were obtained from the NimStim database [47], while monkey face images were sourced from one lab monkey displaying open-mouth, lip-smacking, and relaxed expressions, classified as negative, positive, and neutral, respectively [25,48,49,50,51]. All faces were oriented straight ahead. This experiment involved one FFT session comprising 120 trials.

3.5. Experiment 3: Responses to Faces of Different Viewing Angles

This experiment employed 12 face images captured from two rhesus macaques and two young male humans at three viewing angles (0°, 45°, and 90°), termed frontal view, middle profile, and profile view, respectively. All faces displayed neutral expressions. Monkeys performed two FFT sessions, each containing 360 trials.

3.6. Experiment 4: Responses to Faces of Different Orientations

In the final experiment, 12 face images featuring upright and inverted orientations were used as visual stimuli. This included three upright and three inverted human faces (female), as well as three upright and three inverted monkey faces. Human face images were acquired from IAPS [46], while monkey images were sourced from publicly available platforms. All faces displayed neutral expressions, oriented straight ahead. Monkeys completed two FFT sessions, each comprising 360 trials.
All images were processed using Photoshop (2017 version). The images were trimmed for consistent sizing, and face-irrelevant information was removed or obscured. Images within each experiment were balanced for luminance and spatial frequency utilizing the ShineToolbox in MATLAB (2016a version) [52]. Briefly, the luminance distributions of each image were normalized and adjusted to align with the corresponding mean and standard deviations. Additionally, each image’s spatial frequency was subjected to a fast Fourier transform to generate a spectrum, with the average spectrum established as the reference target. Subsequently, the phase information from the original image was merged with the target amplitude spectrum, and an inverse fast Fourier transform was applied to convert the data from the frequency domain back into the spatial domain.
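The luminance and spectrum equalization described above can be sketched with NumPy. This is a simplified illustration of the idea (match luminance statistics, then combine each image's phase with the group-average amplitude spectrum), not the SHINE toolbox's exact implementation:

```python
import numpy as np

def match_luminance(images):
    """Give every grayscale image the group-average mean and SD."""
    tgt_mean = np.mean([im.mean() for im in images])
    tgt_std = np.mean([im.std() for im in images])
    return [(im - im.mean()) / im.std() * tgt_std + tgt_mean
            for im in images]

def match_spectra(images):
    """Give every image the group-average amplitude spectrum
    while keeping its own phase information."""
    specs = [np.fft.fft2(im) for im in images]
    target_amp = np.mean([np.abs(s) for s in specs], axis=0)
    # Merge each image's phase with the shared amplitude spectrum,
    # then invert the FFT back to the spatial domain.
    return [np.fft.ifft2(target_amp * np.exp(1j * np.angle(s))).real
            for s in specs]
```

After `match_spectra`, all output images share (to numerical precision) the same amplitude spectrum, so any remaining differences between stimuli lie in phase structure rather than spatial-frequency content.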

3.7. Data Analysis

Eye-tracking data were preprocessed using custom MATLAB scripts (version 2016a). For the pupil size analysis, only trials in which pupil size remained stable and was continuously recorded from 1000 ms before to 1000 ms after the visual stimulus were considered. To obtain the mean pupil size change curve, pupil areas were averaged across trials aligned to 200 ms prior to visual presentation. Notably, we observed that monkey pupils began to constrict approximately 200 ms after visual presentation, reaching minimum levels between 400 and 500 ms. The mean pupil sizes were calculated during three phases: early (1–200 ms), middle (201–500 ms), and late (501–1000 ms) after stimulus onset.
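The phase-wise averaging above can be sketched directly. At the 1 kHz sampling rate, one sample corresponds to one millisecond; we assume (as described) traces aligned so the first 200 samples are the pre-stimulus baseline. Names are ours:

```python
import numpy as np

# Phase windows in ms after stimulus onset, as defined in the text.
PHASES_MS = {"early": (1, 200), "middle": (201, 500), "late": (501, 1000)}

def phase_means(trials, baseline_ms=200):
    """trials: (n_trials, n_samples) pupil-area array sampled at 1 kHz,
    with `baseline_ms` pre-stimulus samples. Returns the mean pupil
    size per phase of the trial-averaged trace."""
    trace = np.nanmean(np.asarray(trials, dtype=float), axis=0)
    return {name: trace[baseline_ms + lo - 1 : baseline_ms + hi].mean()
            for name, (lo, hi) in PHASES_MS.items()}
```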
In conducting the gaze duration analysis, data from the free viewing interval, specifically between 1000 ms and 3000 ms following visual stimulation, were extracted for evaluation. To assess the relationship between pupil size and gaze duration for each image, we first determined the average pupil size during three distinct phases and recorded the gaze duration for each image presented to the four monkeys. Subsequently, we computed the z-scores for both pupil size and gaze duration within the context of the experiment. Finally, the Pearson product-moment correlation coefficient was calculated to quantify the correlation between the z-scores of gaze duration and pupil size. All figures were produced using Prism (Version 8.0).
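The z-scoring and correlation steps can be written out in a few lines; for z-scored variables, the Pearson product-moment coefficient reduces to the mean of the elementwise products. A minimal sketch (function names are ours):

```python
import numpy as np

def zscore(x):
    """Standardize to zero mean and unit (population) SD."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

def pearson_r(x, y):
    """Pearson correlation, e.g. between the per-image z-scored
    gaze durations and mean pupil sizes."""
    return float(np.mean(zscore(x) * zscore(y)))
```

A negative r, as found in the late phase here, means images that elicited smaller pupils tended to receive longer gaze durations.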

3.8. Statistical Analysis

In Experiments 1 and 2, a Friedman test (a nonparametric test) was employed to assess the main effects between conditions, followed by Dunn’s multiple comparisons. For Experiments 3 and 4, repeated-measures two-way ANOVA (RM two-way ANOVA) was used to evaluate the effects of facial attributes (viewing angle, orientation) and species on pupil size changes. A Shapiro–Wilk test was used to assess the normality of the data, and Mauchly’s test was used to check the sphericity assumption; a Greenhouse–Geisser correction was applied when sphericity was violated. Bonferroni correction for multiple comparisons was applied when differences reached significance (α = 0.05) or marginal significance (α = 0.06). All statistical analyses were conducted using SPSS (version 25, IBM Corporation).
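For intuition, the Friedman statistic used in Experiments 1 and 2 can be sketched as follows: rank each subject's measurements across conditions, sum the ranks per condition, and compare the rank sums to their expectation under no effect. This simplified version ignores tie correction (a real analysis would use SPSS, as here, or a statistics library):

```python
import numpy as np

def friedman_statistic(table):
    """Friedman chi-square for a (n_subjects, k_conditions) table of
    repeated measures. Assumes no ties within a subject's row."""
    table = np.asarray(table, dtype=float)
    n, k = table.shape
    # Rank within each subject (1 = smallest); argsort twice gives ranks.
    ranks = table.argsort(axis=1).argsort(axis=1) + 1
    R = ranks.sum(axis=0)  # rank sum per condition
    return 12.0 / (n * k * (k + 1)) * np.sum(R ** 2) - 3.0 * n * (k + 1)
```

When every subject orders the conditions the same way, the statistic reaches its maximum for that n and k, which is what makes a consistent within-subject effect detectable even with few animals.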

4. Results

4.1. Experiment 1: Pupil Change in Response to Face Images of Different Species

We began by examining whether the pupil responses of monkeys varied when viewing images of different animals and objects. The experiment involved six monkeys, and the results concerning pupil size changes are illustrated in Figure 2A–C. Although no significant differences were detected in the early phase across conditions, a marginally significant effect of species on pupil size emerged in the middle phase (p = 0.059, Friedman test, Figure 2B), followed by a significant effect in the late phase (p = 0.048, Figure 2C). Further analysis through multiple comparisons revealed that pupil size in response to carnivores was significantly larger than to human faces in the late phase (p = 0.035, Figure 2C). Given that the physical features of the images were balanced, these variations in pupil responses likely reflect the innate responses of the subjects to different species. Thus, these findings indicate that pupil size can serve as a sensitive indicator of a monkey’s perceptual state.

4.2. Experiment 2: Pupil Change to Facial Expressions of Humans and Monkeys

Recognizing emotional states in facial expressions is crucial for primates. Therefore, we investigated whether pupillometry could yield such insights by employing the FFT paradigm. This experiment involved six monkeys and presented human and monkey facial expressions as visual stimuli. The pupil area responses to these expressions are presented in Figure 3A–C, featuring two species (human/monkey) and three expressions (negative/positive/neutral). Consistent with Experiment 1, no significant differences in mean pupil size were observed during the early phase. Notably, facial expressions exhibited significant effects on mean pupil size in the middle (p = 0.028, Friedman test) and late phases (p = 0.040, Friedman test). Dunn’s multiple comparisons indicated that pupil size in response to monkey negative expressions was greater than that for human negative expressions in the middle phase (p = 0.018, Figure 3B) and late phase (p = 0.051, Figure 3C). These results imply that pupil size is a sensitive measure for detecting nuanced differences in responses to conspecific facial expressions.

4.3. Experiment 3: Pupil Size Change in Response to Face Images with Different Viewing Angles

Next, we explored how viewing angles (frontal view, mid-profile view, profile view) affect pupil size. Pupil size changes were examined in response to human and monkey faces presented at various angles (Figure 4A–C), involving five monkeys. While no significant effects of viewing angle were seen in the early phase, significant main effects emerged in the middle phase (repeated-measures two-way ANOVA: interaction F(2, 8) = 0.883, p = 0.450; viewing angle F(2, 8) = 5.031, p = 0.038) and late phase (interaction F(2, 8) = 0.098, p = 0.908; viewing angle F(2, 8) = 8.744, p = 0.010). Post hoc analysis revealed that pupil size in the late phase was larger for the profile view compared to the frontal view (p = 0.030, Figure 4C). These findings suggest that pupil size may reflect the cognitive processing related to viewing angles.

4.4. Experiment 4: Pupil Size Change in Response to Upright and Inverted Faces

Investigating face orientation is a common approach in studies of facial processing. We further assessed the efficacy of pupil tracking in gauging differences in the processing of upright versus inverted faces. This experiment involved six monkeys, utilizing both human and monkey faces as stimuli. The results of pupil size changes in response to upright and inverted faces are displayed in Figure 5A–C. No significant differences were found across face orientations at any phase (early, middle, late). These findings suggest that pupil size may not be particularly sensitive to variations in face orientation.

4.5. Correlation Between Gaze Duration and Mean Pupil Size

Lastly, variability in gaze duration was observed among monkeys fixating on different stimuli, ranging from 2500 to 2900 ms, despite a consistent presentation duration of 3000 ms. This study aimed to identify any potential correlation between gaze duration and pupil dynamics. The z-scores were obtained for both gaze duration and pupil size for each visual stimulus across the three phases, followed by Pearson correlation analysis for all phases with data from the four monkeys who participated in all experiments. Notably, a significant negative correlation was found between gaze duration and mean pupil size in the late phase (r = −0.357, p = 0.016, Figure 6A,B, with Figure 6A depicting results for each stimulus). This suggests that pupil size in the late phase may predict a monkey’s interest in specific images; specifically, smaller pupil sizes correspond to longer gaze durations towards those images in the FFT paradigm. This correlation was absent in the early (r = −0.063, p = 0.679) and middle phases (r = −0.255, p = 0.091, Figure 6C).

5. Discussion

In this study, we demonstrated that pupillometry can effectively capture subtle differences in facial and emotional processing in NHPs. Our findings indicate that pupil size increases when NHPs view images of carnivores, negative conspecific facial expressions, and profile views. Notably, a significant negative correlation was also observed between mean pupil size in the late phase and gaze duration for each visual stimulus. These results suggest that pupil size may serve as a sensitive, indirect biomarker to gauge face processing in NHPs.

5.1. Pupil Size Changes in Face Processing

First, we found that NHPs exhibited larger pupil sizes in the late phase when viewing images of carnivores (tigers, leopards, and lions) compared to human faces. The subject monkeys had been trained to interact with humans, receiving food, water, and fruits from them, and had visual exposure to both monkeys and humans within their environments. As a result, these monkeys can be considered familiar with human faces. In contrast, they had no prior exposure to carnivores or snakes, which are recognized as primary predators of primates [53,54]. Dinh et al. [4] reported that neurons in the prefrontal cortex of NHPs show stronger responses to images of snakes and carnivores than to humans. Significant pupil size changes were not observed in response to snake images; this may be attributed to our control of low-level image properties, such as color, spatial frequency, and luminance. Snakeskin is characterized by a distinctive color and texture (smooth and hairless) that differs from that of common mammals. Equalizing these low-level attributes of the images may have complicated the monkeys’ recognition of snakes, ultimately reducing their salience.
Second, we found that NHPs displayed larger pupil sizes in response to negative facial expressions of monkeys, but not humans. The ability to recognize negative emotions from faces is crucial for survival [1], and imaging studies have shown that multiple brain regions in NHPs exhibit increased BOLD signals when viewing conspecific negative expressions. Electrophysiological research suggests that the orbitofrontal cortex and medial prefrontal cortex contain neurons responsive to these expressions [4,5,28]. Our findings align with this biological evidence regarding pupil size modulation by conspecific negative expressions. However, a similar pupil response to negative human expressions was not observed, potentially because NHPs and humans communicate emotional facial expressions differently [41,49,55], making it challenging for NHPs to infer negative emotions from human faces.
Third, viewing angle influenced pupil size, with NHPs responding with larger pupil sizes to profile views compared to frontal views. Research has shown that neurons in the temporal cortex of NHPs are sensitive to viewing angles [8,10]. The recognition of emotions and identities from frontal views occurs more quickly and accurately than from profile views [56,57,58,59]. Therefore, we propose that viewing faces from a frontal perspective allows NHPs to more efficiently gather social information, such as identity and emotions, thereby reducing uncertainty in facial recognition [56,57,58,59].
These findings indicate that changes in pupil size during the middle and late phases (200–1000 ms post-stimulus presentation) reflect the processing of facial information. Human studies have shown that pupil sizes increase under high working memory load and attentional control [38,39]. It is plausible that NHPs also require increased cognitive effort to recognize threatening or uncertain face information quickly. Moreover, our discovery of a negative correlation between gaze duration and pupil size in the late phase (500–1000 ms post-stimulus presentation) suggests that smaller pupils correspond to longer gaze durations, indicating that NHPs might allocate less attentional control and engage in prolonged viewing to comprehend facial information.

5.2. Limitations of Pupillometry in Exploring Face Processing

While our study demonstrates that pupillometry can reveal differences in face processing among NHPs, certain limitations must be acknowledged. Compared to commonly used brain research techniques, such as MRI, multi-channel electrophysiological recordings, and EEG, pupillometry offers several advantages, including shorter preparation times, no need for consumable materials, no requirement for anesthesia, and non-invasive measurements. Therefore, we posit that pupillometry is a valuable and suitable tool for studying NHPs as experimental models.
However, pupillometry does present some challenges. Firstly, it typically requires a longer latency to reveal differences in face processing. EEG and MEG studies have shown that the N170 and M170 components reflect differences in face processing in humans around 150–200 ms post-stimulus [60,61,62]. In NHPs, similar components appear approximately 112 ms after face presentation [63], while pupillometry may need 200–1000 ms to reveal processing differences. Secondly, controlling low-level properties of visual input, such as luminance, spatial frequency, and color, is essential in pupillometry studies, as these factors can significantly impact pupil size [64]. However, controlling these properties can affect subjective experiences with the images [65,66], which may also influence pupil size. Thirdly, the neural mechanisms modulating pupil size changes are not fully elucidated; thus, pupillometry serves as an indirect method to assess brain activity [11]. It is essential to consider these limitations prior to implementing pupillometry in a particular study. Integrating artificial intelligence with pupillometry may further enhance its application in future cognitive research and the diagnosis of disorders [67].

5.3. Conclusions and Directions for Future Research

In summary, our findings indicate that pupillometry is a valuable tool for uncovering variations in how NHPs process facial expressions. Arousal levels, along with cognitive control, appear to influence this mechanism. While pupillometry does have certain limitations, it presents an economical and practical approach for indirectly assessing brain activity. This research reinforces the notion that changes in pupil size can be an effective measure of emotional face processing in primates, potentially aiding in the diagnosis of neurological conditions such as autism.
Looking ahead, we aim to create a specialized task and formulate a mathematical model that will enable us to interpret the emotional states and cognitive demands of NHPs based on data derived from pupil size fluctuations [68].

Author Contributions

Conceptualization, X.L. and J.D.; methodology, X.L. and Z.Z.; software, X.L.; validation, X.L., Z.Z., and J.D.; formal analysis, X.L.; investigation, X.L.; resources, X.L. and Z.Z.; data curation, X.L.; writing—original draft preparation, X.L.; writing—review and editing, J.D.; visualization, X.L.; supervision, J.D.; project administration, J.D.; funding acquisition, J.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (32371066), the Guangdong Basic and Applied Basic Research Foundation (2022A1515010134, 2022A1515110598), the Shenzhen-Hong Kong Institute of Brain Science–Shenzhen Fundamental Research Institutions (NYKFKT2019009), and Shenzhen Technological Research Center for Primate Translational Medicine (XMHT20220104005). The APC was funded by the National Natural Science Foundation of China (32371066).

Institutional Review Board Statement

The animal study protocol was approved by the Institutional Animal Care and Use Committee (IACUC) at the Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences. (SIAT-IACUC-201123-NS-DJ-A1483, approved on 12-07-2020).

Informed Consent Statement

Not applicable.

Data Availability Statement

The datasets presented in this article are not readily available because the data are part of an ongoing study. Requests to access the datasets should be directed to the corresponding author.

Acknowledgments

We appreciate the assistance of the SIAT MRI facility and NHP veterinarians.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Hesse, J.K.; Tsao, D.Y. The macaque face patch system: A turtle’s underbelly for the brain. Nat. Rev. Neurosci. 2020, 21, 695–716. [Google Scholar] [CrossRef] [PubMed]
  2. Dahl, C.D.; Wallraven, C.; Bülthoff, H.H.; Logothetis, N.K. Humans and macaques employ similar face-processing strategies. Curr. Biol. 2009, 19, 509–513. [Google Scholar] [CrossRef] [PubMed]
  3. Shepherd, S.V.; Steckenfinger, S.A.; Hasson, U.; Ghazanfar, A.A. Human-monkey gaze correlations reveal convergent and divergent patterns of movie viewing. Curr. Biol. 2010, 20, 649–656. [Google Scholar] [CrossRef]
  4. Dinh, H.T.; Nishimaru, H.; Matsumoto, J.; Takamura, Y.; Le, Q.V.; Hori, E.; Maior, R.S.; Tomaz, C.; Tran, A.H.; Ono, T.; et al. Superior Neuronal Detection of Snakes and Conspecific Faces in the Macaque Medial Prefrontal Cortex. Cereb. Cortex 2018, 28, 2131–2145. [Google Scholar] [CrossRef]
  5. Gothard, K.M.; Battaglia, F.P.; Erickson, C.A.; Spitler, K.M.; Amaral, D.G. Neural responses to facial expression and face identity in the monkey amygdala. J. Neurophysiol. 2007, 97, 1671–1683. [Google Scholar] [CrossRef]
  6. Sugase-Miyamoto, Y.; Matsumoto, N.; Ohyama, K.; Kawano, K. Face inversion decreased information about facial identity and expression in face-responsive neurons in macaque area TE. J. Neurosci. 2014, 34, 12457–12469. [Google Scholar] [CrossRef]
  7. Desimone, R.; Albright, T.; Gross, C.; Bruce, C. Stimulus-selective properties of inferior temporal neurons in the macaque. J. Neurosci. 1984, 4, 2051–2062. Available online: https://www.jneurosci.org/content/jneuro/4/8/2051.full.pdf (accessed on 1 August 2024). [CrossRef]
  8. Perrett, D.I.; Oram, M.W.; Harries, M.H.; Bevan, R.; Hietanen, J.K.; Benson, P.J.; Thomas, S. Viewer-centred and object-centred coding of heads in the macaque temporal cortex. Exp. Brain Res. 1991, 86, 159–173. [Google Scholar] [CrossRef]
  9. Perrett, D.I.; Mistlin, A.J.; Chitty, A.J. Visual neurones responsive to faces. Trends Neurosci. 1987, 10, 358–364. Available online: https://www.sciencedirect.com/science/article/pii/0166223687900713 (accessed on 1 September 2024). [CrossRef]
  10. Perrett, D.I.; Rolls, E.T.; Caan, W. Visual neurones responsive to faces in the monkey temporal cortex. Exp. Brain Res. 1982, 47, 329–342. [Google Scholar] [CrossRef]
  11. Joshi, S.; Gold, J.I. Pupil Size as a Window on Neural Substrates of Cognition. Trends Cogn. Sci. 2020, 24, 466–480. [Google Scholar] [CrossRef] [PubMed]
  12. Ferencová, N.; Višňovcová, Z.; Bona Olexová, L.; Tonhajzerová, I. Eye pupil—A window into central autonomic regulation via emotional/cognitive processing. Physiol. Res. 2021, 70, S669–S682. [Google Scholar] [CrossRef] [PubMed]
  13. Ebitz, R.B.; Moore, T. Both a Gauge and a Filter: Cognitive Modulations of Pupil Size. Front. Neurol. 2019, 9, 1190. [Google Scholar] [CrossRef] [PubMed]
  14. Gamlin, P.D.; McDougal, D.H.; Pokorny, J.; Smith, V.C.; Yau, K.W.; Dacey, D.M. Human and macaque pupil responses driven by melanopsin-containing retinal ganglion cells. Vis. Res. 2007, 47, 946–954. [Google Scholar] [CrossRef]
  15. Arnsten, A.F.; Goldman-Rakic, P.S. Selective prefrontal cortical projections to the region of the locus coeruleus and raphe nuclei in the rhesus monkey. Brain Res. 1984, 306, 9–18. [Google Scholar] [CrossRef] [PubMed]
  16. Porrino, L.J.; Goldman-Rakic, P.S. Brainstem innervation of prefrontal and anterior cingulate cortex in the rhesus monkey revealed by retrograde transport of HRP. J. Comp. Neurol. 1982, 205, 63–76. [Google Scholar] [CrossRef] [PubMed]
  17. Morrison, J.H.; Foote, S.L.; O’Connor, D.; Bloom, F.E. Laminar, tangential and regional organization of the noradrenergic innervation of monkey cortex: Dopamine-beta-hydroxylase immunohistochemistry. Brain Res. Bull. 1982, 9, 309–319. [Google Scholar] [CrossRef] [PubMed]
  18. Steele, G.E.; Weller, R.E. Subcortical connections of subdivisions of inferior temporal cortex in squirrel monkeys. Vis. Neurosci. 1993, 10, 563–583. [Google Scholar] [CrossRef]
  19. Leichnetz, G.R. Preoccipital cortex receives a differential input from the frontal eye field and projects to the pretectal olivary nucleus and other visuomotor-related structures in the rhesus monkey. Vis. Neurosci. 1990, 5, 123–133. [Google Scholar] [CrossRef]
  20. Joshi, S.; Li, Y.; Kalwani, R.M.; Gold, J.I. Relationships between Pupil Diameter and Neuronal Activity in the Locus Coeruleus, Colliculi, and Cingulate Cortex. Neuron 2016, 89, 221–234. Available online: https://www.sciencedirect.com/science/article/pii/S089662731501034X (accessed on 6 January 2024). [CrossRef]
  21. Ebitz, R.B.; Moore, T. Selective Modulation of the Pupil Light Reflex by Microstimulation of Prefrontal Cortex. J. Neurosci. 2017, 37, 5008–5018. Available online: https://www.jneurosci.org/content/jneuro/37/19/5008.full.pdf (accessed on 10 May 2024). [CrossRef] [PubMed]
  22. Lehmann, S.J.; Corneil, B.D. Transient Pupil Dilation after Subsaccadic Microstimulation of Primate Frontal Eye Fields. J. Neurosci. 2016, 36, 3765–3776. [Google Scholar] [CrossRef]
  23. Murphy, P.R.; O’Connell, R.G.; O’Sullivan, M.; Robertson, I.H.; Balsters, J.H. Pupil diameter covaries with BOLD activity in human locus coeruleus. Hum. Brain Mapp. 2014, 35, 4140–4154. [Google Scholar] [CrossRef]
  24. Kuraoka, K.; Nakamura, K. Facial temperature and pupil size as indicators of internal state in primates. Neurosci. Res. 2022, 175, 25–37. [Google Scholar] [CrossRef] [PubMed]
  25. Liu, N.; Hadj-Bouziane, F.; Jones, K.B.; Turchi, J.N.; Averbeck, B.B.; Ungerleider, L.G. Oxytocin modulates fMRI responses to facial expression in macaques. Proc. Natl. Acad. Sci. USA 2015, 112, E3123–E3130. [Google Scholar] [CrossRef]
  26. Xu, P.; Peng, S.; Luo, Y.J.; Gong, G. Facial expression recognition: A meta-analytic review of theoretical models and neuroimaging evidence. Neurosci. Biobehav. Rev. 2021, 127, 820–836. [Google Scholar] [CrossRef]
  27. Adolphs, R.; Tranel, D.; Damasio, H.; Damasio, A. Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature 1994, 372, 669–672. [Google Scholar] [CrossRef]
  28. Barat, E.; Wirth, S.; Duhamel, J.R. Face cells in orbitofrontal cortex represent social categories. Proc. Natl. Acad. Sci. USA 2018, 115, E11158–E11167. [Google Scholar] [CrossRef]
  29. Moeller, S.; Crapse, T.; Chang, L.; Tsao, D.Y. The effect of face patch microstimulation on perception of faces and objects. Nat. Neurosci. 2017, 20, 743–752. [Google Scholar] [CrossRef] [PubMed]
  30. Tsao, D.Y.; Freiwald, W.A.; Tootell, R.B.; Livingstone, M.S. A cortical region consisting entirely of face-selective cells. Science 2006, 311, 670–674. [Google Scholar] [CrossRef] [PubMed]
  31. Chang, Y.H.; Yep, R.; Wang, C.A. Pupil size correlates with heart rate, skin conductance, pulse wave amplitude, and respiration responses during emotional conflict and valence processing. Psychophysiology 2024, 62, e14726. [Google Scholar] [CrossRef] [PubMed]
  32. Lee, C.L.; Pei, W.; Lin, Y.C.; Granmo, A.; Liu, K.H. Emotion Detection Based on Pupil Variation. Healthcare 2023, 11, 322. [Google Scholar] [CrossRef] [PubMed]
  33. Pan, J.; Sun, X.; Park, E.; Kaufmann, M.; Klimova, M.; McGuire, J.T.; Ling, S. The effects of emotional arousal on pupil size depend on luminance. Sci. Rep. 2024, 14, 21895. [Google Scholar] [CrossRef] [PubMed]
  34. Yuan, T.; Wang, L.; Jiang, Y. Multi-level processing of emotions in life motion signals revealed through pupil responses. Neuroscience 2024, 12, RP89873. [Google Scholar] [CrossRef]
  35. Yu, P.; Yu, L.; Li, Y.; Qian, C.; Hu, J.; Zhu, W.; Liu, F.; Wang, Q. Emotional and visual responses to trypophobic images with object, animal, or human body backgrounds: An eye-tracking study. Front. Psychol. 2024, 15, 1467608. [Google Scholar] [CrossRef] [PubMed]
  36. Bonino, G.; Mazza, A.; Capiotto, F.; Berti, A.; Pia, L.; Dal Monte, O. Pupil dilation responds to the intrinsic social characteristics of affective touch. Sci. Rep. 2024, 14, 24297. [Google Scholar] [CrossRef] [PubMed]
  37. Sun, J.; Zhu, L.; Fang, X.; Tang, Y.; Xiao, Y.; Jiang, S.; Lin, J.; Li, Y. Pupil dilation and behavior as complementary measures of fear response in Mice. Cogn. Neurodyn. 2024, 18, 4047–4054. [Google Scholar] [CrossRef]
  38. Keene, P.A.; deBettencourt, M.T.; Awh, E.; Vogel, E.K. Pupillometry signatures of sustained attention and working memory. Atten. Percept. Psychophys. 2022, 84, 2472–2482. [Google Scholar] [CrossRef] [PubMed]
  39. Zokaei, N.; Board, A.G.; Manohar, S.G.; Nobre, A.C. Modulation of the pupillary response by the content of visual working memory. Proc. Natl. Acad. Sci. USA 2019, 116, 22802–22810. Available online: https://www.pnas.org/doi/abs/10.1073/pnas.1909959116 (accessed on 5 November 2024). [CrossRef]
  40. Zhang, Z.; Shan, L.; Wang, Y.; Li, W.; Jiang, M.; Liang, F.; Feng, S.; Lu, Z.; Wang, H.; Dai, J. Primate preoptic neurons drive hypothermia and cold defense. Innovation 2023, 4, 100358. Available online: https://www.ncbi.nlm.nih.gov/pubmed/36583100 (accessed on 5 December 2023). [CrossRef]
  41. Liu, X.H.; Gan, L.; Zhang, Z.T.; Yu, P.K.; Dai, J. Probing the processing of facial expressions in monkeys via time perception and eye tracking. Zool. Res. 2023, 44, 882–893. Available online: https://www.ncbi.nlm.nih.gov/pubmed/37545418 (accessed on 18 September 2023). [CrossRef]
  42. Shan, L.; Yuan, L.; Zhang, B.; Ma, J.; Xu, X.; Gu, F.; Jiang, Y.; Dai, J. Neural Integration of Audiovisual Sensory Inputs in Macaque Amygdala and Adjacent Regions. Neurosci. Bull. 2023, 39, 1749–1761. Available online: https://www.ncbi.nlm.nih.gov/pubmed/36920645 (accessed on 15 March 2023). [CrossRef]
  43. Wang, C.A.; Baird, T.; Huang, J.; Coutinho, J.D.; Brien, D.C.; Munoz, D.P. Arousal Effects on Pupil Size, Heart Rate, and Skin Conductance in an Emotional Face Task. Front. Neurol. 2018, 9, 1029. [Google Scholar] [CrossRef] [PubMed]
  44. Finke, J.B.; Deuter, C.E.; Hengesch, X.; Schächinger, H. The time course of pupil dilation evoked by visual sexual stimuli: Exploring the underlying ANS mechanisms. Psychophysiology 2017, 54, 1444–1458. [Google Scholar] [CrossRef]
  45. Suzuki, T.W.; Kunimatsu, J.; Tanaka, M. Correlation between Pupil Size and Subjective Passage of Time in Non-Human Primates. J. Neurosci. 2016, 36, 11331–11337. [Google Scholar] [CrossRef] [PubMed]
  46. Lang, P.J.; Bradley, M.M.; Cuthbert, B.N. International Affective Picture System (IAPS): Affective Ratings of Pictures and Instruction Manual; University of Florida: Gainesville, FL, USA, 2008. [Google Scholar]
  47. Tottenham, N.; Tanaka, J.W.; Leon, A.C.; McCarry, T.; Nurse, M.; Hare, T.A.; Marcus, D.J.; Westerlund, A.; Casey, B.J.; Nelson, C. The NimStim set of facial expressions: Judgments from untrained research participants. Psychiatry Res. 2009, 168, 242–249. [Google Scholar] [CrossRef] [PubMed]
  48. Maréchal, L.; Levy, X.; Meints, K.; Majolo, B. Experience-based human perception of facial expressions in Barbary macaques (Macaca sylvanus). PeerJ 2017, 5, e3413. [Google Scholar] [CrossRef] [PubMed]
  49. Micheletta, J.; Whitehouse, J.; Parr, L.A.; Waller, B.M. Facial expression recognition in crested macaques (Macaca nigra). Anim. Cogn. 2015, 18, 985–990. [Google Scholar] [CrossRef]
  50. Morozov, A.; Parr, L.A.; Gothard, K.; Paz, R.; Pryluk, R. Automatic Recognition of Macaque Facial Expressions for Detection of Affective States. eNeuro 2021, 8, 1–16. [Google Scholar] [CrossRef]
  51. Wang, L.; Zhang, B.; Lu, X.; Wang, R.; Ma, J.; Chen, Y.; Zhou, Y.; Dai, J.; Jiang, Y. Genetic and Neuronal Basis for Facial Emotion Perception in Humans and Macaques. Natl. Sci. Rev. 2024, 11, nwae381. [Google Scholar] [CrossRef]
  52. Willenbockel, V.; Sadr, J.; Fiset, D.; Horne, G.O.; Gosselin, F.; Tanaka, J.W. Controlling low-level image properties: The SHINE toolbox. Behav. Res. Methods 2010, 42, 671–684. [Google Scholar] [CrossRef] [PubMed]
  53. Isbell, L.A. Snakes as agents of evolutionary change in primate brains. J. Hum. Evol. 2006, 51, 1–35. [Google Scholar] [CrossRef] [PubMed]
  54. Isbell, L.A. Predation on primates: Ecological patterns and evolutionary consequences. Evol. Anthropol. Issues News Rev. 1994, 3, 61–71. [Google Scholar] [CrossRef]
  55. Parr, L.A.; Waller, B.M.; Vick, S.J.; Bard, K.A. Classifying chimpanzee facial expressions using muscle action. Emotion 2007, 7, 172–181. [Google Scholar] [CrossRef] [PubMed]
  56. Swystun, A.G.; Logan, A.J. Quantifying the effect of viewpoint changes on sensitivity to face identity. Vis. Res. 2019, 165, 1–12. [Google Scholar] [CrossRef] [PubMed]
  57. Lee, Y.; Matsumiya, K.; Wilson, H.R. Size-invariant but viewpoint-dependent representation of faces. Vis. Res. 2006, 46, 1901–1910. [Google Scholar] [CrossRef] [PubMed]
  58. Hill, H.; Schyns, P.G.; Akamatsu, S. Information and viewpoint dependence in face recognition. Cognition 1997, 62, 201–222. [Google Scholar] [CrossRef] [PubMed]
  59. Guo, K.; Shaw, H. Face in profile view reduces perceived facial expression intensity: An eye-tracking study. Acta Psychol. 2015, 155, 19–28. [Google Scholar] [CrossRef] [PubMed]
  60. Ewbank, M.P.; Smith, W.A.; Hancock, E.R.; Andrews, T.J. The M170 reflects a viewpoint-dependent representation for both familiar and unfamiliar faces. Cereb. Cortex 2008, 18, 364–370. [Google Scholar] [CrossRef] [PubMed]
  61. Magnuski, M.; Gola, M. It’s not only in the eyes: Nonlinear relationship between face orientation and N170 amplitude irrespective of eye presence. Int. J. Psychophysiol. 2013, 89, 358–365. [Google Scholar] [CrossRef]
  62. Jacques, C.; Rossion, B. Misaligning face halves increases and delays the N170 specifically for upright faces: Implications for the nature of early face representations. Brain Res. 2010, 1318, 96–109. [Google Scholar] [CrossRef]
  63. Orczyk, J.; Schroeder, C.E.; Abeles, I.Y.; Gomez-Ramirez, M.; Butler, P.D.; Kajikawa, Y. Comparison of Scalp ERP to Faces in Macaques and Humans. Front. Syst. Neurosci. 2021, 15, 667611. [Google Scholar] [CrossRef] [PubMed]
  64. Kardon, R. Pupillary light reflex. Curr. Opin. Ophthalmol. 1995, 6, 20–26. [Google Scholar] [CrossRef]
  65. Menzel, C.; Hayn-Leichsenring, G.U.; Langner, O.; Wiese, H.; Redies, C. Fourier power spectrum characteristics of face photographs: Attractiveness perception depends on low-level image properties. PLoS ONE 2015, 10, e0122801. [Google Scholar] [CrossRef]
  66. Barros, F.; Soares, S.C.; Rocha, M.; Bem-Haja, P.; Silva, S.; Lundqvist, D. The angry versus happy recognition advantage: The role of emotional and physical properties. Psychol. Res. 2023, 87, 108–123. [Google Scholar] [CrossRef] [PubMed]
  67. Huang, T.; Xu, H.; Wang, H.; Huang, H.; Xu, Y.; Li, B.; Hong, S.; Feng, G.; Kui, S.; Liu, G.; et al. Artificial intelligence for medicine: Progress, challenges, and perspectives. Innov. Med. 2023, 1, 100030. Available online: https://www.the-innovation.org/medicine//article/doi/10.59717/j.xinn-med.2023.100030 (accessed on 15 September 2023). [CrossRef]
  68. Feng, G.; Xu, H.; Wan, S.; Wang, H.; Chen, X.; Magari, R.; Han, Y.; Wei, Y.; Gu, H. Twelve practical recommendations for developing and applying clinical predictive models. Innov. Med. 2024, 2, 100105. Available online: https://www.the-innovation.org/medicine//article/doi/10.59717/j.xinn-med.2024.100105 (accessed on 6 December 2024). [CrossRef]
Figure 1. Schematic of the behavioral task and examples of the stimuli. (A) The flow of the task. (B) Illustration of the experimental setup. (C) The stimulus set includes faces of monkeys, humans, animals, and objects, shown with different emotions, viewing angles, and orientations (upright or inverted).
Figure 2. Changes in pupil size in response to images of various animals and objects. (A) Pupil size changes from 500 ms before to 1000 ms after stimulus onset. Light gray and green zones indicate the middle (201–500 ms) and late (501–1000 ms) phases, respectively. (B,C) Averaged pupil changes during the middle and late phases in response to different animals and objects. Color coding: red for monkeys; yellow for humans; gray for objects; green for carnivores; and blue for snakes. Error bar: SEM. * p < 0.05.
Figure 3. Changes in pupil size in response to facial expressions of humans and monkeys. (A) Pupil size changes from 500 ms before to 1000 ms after stimulus onset. Light gray and green zones indicate the middle (201–500 ms) and late (501–1000 ms) phases, respectively. (B,C) Averaged pupil changes during the middle and late phases in response to different facial expressions of humans and monkeys. Color coding: blue for monkey—negative; dark gray for monkey—neutral; red for monkey—positive; purple for human—negative; light gray for human—neutral; and yellow for human—positive. Error bar: SEM. * p < 0.05.
Figure 4. Changes in pupil size in response to face images with different viewing angles. (A) Pupil size changes from 500 ms before to 1000 ms after stimulus onset. Light gray and green zones indicate the middle (201–500 ms) and late (501–1000 ms) phases, respectively. (B,C) Averaged pupil changes during the middle and late phases in response to face images with different viewing angles. Color coding: dark gray for monkey frontal; blue for monkey mid-profile; green for monkey profile; light gray for human frontal; purple for human mid-profile; and light green for human profile. Error bar: SEM. * p < 0.05.
Figure 5. Changes in pupil size in response to upright and inverted faces. (A) Pupil size changes from 500 ms before to 1000 ms after stimulus onset. Light gray and green zones indicate the middle (201–500 ms) and late (501–1000 ms) phases, respectively. (B,C) Averaged pupil changes during the middle and late phases in response to upright and inverted faces. Color coding: dark gray for monkey upright; red for monkey inverted; light gray for human upright; and yellow for human inverted. Error bar: SEM.
Figure 6. Correlation between gaze duration and mean pupil size. (A) The z-scores for gaze duration (x-axis) and pupil size (y-axis) for each visual stimulus. (B,C) The correlation between gaze duration and pupil size during the late (B) and middle (C) phases. * p < 0.05.
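The analysis summarized in Figure 6 amounts to z-scoring the per-stimulus gaze durations and pupil sizes and computing a Pearson correlation between them. A minimal sketch of that computation is shown below; the numeric values are hypothetical placeholders, not the study's data (which reported r = −0.357 for the late phase).

```python
import math

def zscore(xs):
    """Standardize values to zero mean and unit (sample) variance."""
    m = sum(xs) / len(xs)
    sd = math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))
    return [(x - m) / sd for x in xs]

def pearson_r(xs, ys):
    """Pearson correlation: mean product of the paired z-scores."""
    zx, zy = zscore(xs), zscore(ys)
    return sum(a * b for a, b in zip(zx, zy)) / (len(xs) - 1)

# Hypothetical per-stimulus values (one pair per image):
gaze_duration = [1.2, 0.8, 1.5, 0.6, 1.1, 0.9]   # seconds
pupil_size    = [0.3, 0.7, 0.2, 0.9, 0.4, 0.6]   # normalized change
r = pearson_r(gaze_duration, pupil_size)          # negative for these data
```

In practice, a library routine such as `scipy.stats.pearsonr` would also return the associated p-value; the hand-rolled version above only illustrates the z-score formulation used in panel (A).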
Table 1. Main differences between the current study and related works in the past three years.
| Study | Objective Measures | Subjects | Experimental Materials | Explore Cross-Species Effects? | Control Low-Level Properties? | Compare Effects of Viewpoint? | Compare Correlation with Gaze Duration? |
|---|---|---|---|---|---|---|---|
| Chang YH et al., 2024 [31] | Pupil size | Human | Images and words | No | Not mentioned | No | No |
| Lee CL et al., 2023 [32] | Pupil size | Human | Videos | No | Not mentioned | No | No |
| Pan J et al., 2024 [33] | Pupil size | Human | Auditory stimuli | No | Yes | No | No |
| Yuan T et al., 2024 [34] | Pupil size | Human | Motion visual stimuli | No | Yes | No | No |
| Yu P et al., 2024 [35] | Pupil size | Human | Images | Yes | Yes | No | No |
| Bonino G et al., 2024 [36] | Pupil size | Human | Touch | No | Not mentioned | No | No |
| Current study | Pupil size | NHP | Images | Yes | Yes | Yes | Yes |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Liu, X.; Zhang, Z.; Dai, J. Evaluating Pupillometry as a Tool for Assessing Facial and Emotional Processing in Nonhuman Primates. Appl. Sci. 2025, 15, 3022. https://doi.org/10.3390/app15063022

