Article

Digital Integration and Automated Assessment of Eye-Tracking and Emotional Response Data Using the BioSensory App to Maximize Packaging Label Analysis

by Sigfredo Fuentes 1,*, Claudia Gonzalez Viejo 1, Damir D. Torrico 2 and Frank R. Dunshea 1,3

1 Digital Agriculture Food and Wine Group, School of Agriculture and Food, Faculty of Veterinary and Agricultural Sciences, University of Melbourne, Parkville, VIC 3010, Australia
2 Department of Wine, Food and Molecular Biosciences, Faculty of Agriculture and Life Sciences, Lincoln University, Lincoln 7647, Canterbury, New Zealand
3 Faculty of Biological Sciences, University of Leeds, Leeds LS2 9JT, UK
* Author to whom correspondence should be addressed.
Sensors 2021, 21(22), 7641; https://doi.org/10.3390/s21227641
Submission received: 18 October 2021 / Revised: 2 November 2021 / Accepted: 15 November 2021 / Published: 17 November 2021
(This article belongs to the Special Issue Novel Contactless Sensors for Food, Beverage and Packaging Evaluation)

Abstract
New and emerging non-invasive digital tools, such as eye-tracking, facial expression, and physiological biometrics, have been implemented to extract more objective sensory responses from panelists to packaging and, specifically, labels. However, integrating these technologies, which come from different company providers with separate software for data acquisition and analysis, makes their practical application difficult for research and the industry. This study proposed a prototype integration between eye tracking and emotional biometrics using the BioSensory computer application for three sample labels: Stevia, Potato chips, and Spaghetti. Multivariate data analyses are presented, showing the integrative analysis approach of the proposed prototype system. Further studies can be conducted with this system by integrating other available biometrics, such as physiological responses based on heart rate, blood pressure, and temperature changes, analyzed while panelists focus on different label components or packaging features. By maximizing data extraction from various components of packaging and labels, smart predictive systems can also be implemented, such as machine learning to assess liking and other parameters of interest from the whole package and specific components.

1. Introduction

Packaging and labels are the first points of contact between food and beverage products and consumers. Around 95% of food and beverage products that are launched without consumer preference assessments of their packaging are likely to fail in the market [1]. The implementation of new and emerging digital technologies for sensory analysis of food, beverage, and packaging products, such as video acquisition for physiological [2,3,4,5,6], emotional [7,8,9], and eye-tracking data [10,11,12], requires multiple devices from different companies and their respective software packages for data acquisition, handling, and analysis [13]. The latter makes the data analysis process more complicated since it requires specialized personnel to simultaneously manage multiple devices and software, making the whole process time-consuming and cost-prohibitive. Hence, many studies focus on only one or a couple of biometrics at most, which are usually recorded independently [6,13].
The integration of several technologies is frequently not straightforward due to proprietary rights from different companies concerning their analysis algorithms or even images (e.g., FLIR for infrared thermal data). One computer application that has already integrated self-reported sensory data with infrared thermal imagery and visible video acquisition is the BioSensory App [14] developed by the Digital Agriculture, Food and Wine Sciences group (DAFW), The University of Melbourne (UoM), Australia. The BioSensory App can obtain, besides the self-reported data, digital information to extract (i) physiological biometrics from video of panelists, such as heart rate, blood pressure, and temperature changes; and (ii) emotional response from videos. The latter is capable of analyzing three head orientation parameters, eight emotions, valence, engagement, 21 different facial movements and 12 emojis that resemble the participants’ expressions.
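For illustration only, the following minimal Python sketch shows one generic way heart rate can be approximated from a face video using the mean green-channel signal (a basic remote photoplethysmography approach); it is not the BioSensory App's proprietary algorithm, and the function name and parameters are hypothetical.

```python
# Minimal sketch of remote photoplethysmography (rPPG)-style heart-rate estimation
# from a video; illustrative only, not the BioSensory App's proprietary algorithm.
import cv2
import numpy as np

def estimate_heart_rate(video_path, fps=30, min_bpm=45, max_bpm=180):
    """Estimate heart rate (bpm) from the mean green-channel signal of each frame."""
    cap = cv2.VideoCapture(video_path)
    signal = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # The green channel carries the strongest blood-volume pulse signal
        signal.append(frame[:, :, 1].mean())
    cap.release()

    signal = np.asarray(signal, dtype=float)
    signal -= signal.mean()                      # remove the DC component
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    # Keep only physiologically plausible frequencies (0.75-3 Hz, i.e. 45-180 bpm)
    band = (freqs >= min_bpm / 60) & (freqs <= max_bpm / 60)
    peak_freq = freqs[band][np.argmax(power[band])]
    return peak_freq * 60                        # convert Hz to beats per minute
```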
Eye-tracking devices and software have been used as a tool to analyze the gaze of panelists when looking at imagery or video with multiple and varied applications, such as multimedia learning [15], aviation [16], tourism [17], and sports [18], among others. For food and beverages [19,20], eye tracking has been helpful in the research of warning labels on sugar levels [21], healthy labels and food choice [22], fixations in different areas of interest (AOI) [23], packaging design and type [24,25], and more complex situations, such as the influence of soundtracks on visual attention and food choice [26]. Other studies have combined eye tracking with contact sensors, such as electrodermal activity, to assess food perception [27]. However, contact sensors may introduce biases in the analysis due to participants’ self-awareness [13,28,29].
Combining eye-tracking and other remote sensing biometrics, such as emotional response, has been used primarily in psychiatric research, with some research interpreting only eye-tracking data with negative emotions [30]. In food and beverage labels, eye-tracking data have been combined with self-reported data such as wine purchase intention [31]. However, combining eye-tracking data with emotional responses based on video analysis using computer vision is rarer and mainly focuses on the overall assessment of the whole label [32].
This study aimed to propose the integration of eye-tracking information and emotional response of sensory panelists to assess specific areas of interest (AOI) of labels, such as images, logos, and nutrition information, among others, and self-reported liking of the overall label. The integration system proposed and trialed relies on the timestamp synchronization between the eye tracker device and the BioSensory App to create digital time tags for automated processing using multivariate data analysis.

2. Materials and Methods

2.1. Sensory Session Description

A total of 55 participants (44% males, 56% females; 25–50 years old) were recruited from the pool of staff and students from UoM. A power analysis was conducted using the SAS Power and Sample Size 14.1 software (SAS Institute, Cary, NC, USA); the result (1 − β > 0.999; effect size: 0.59) confirmed that the number of participants was sufficient to detect significant differences between samples.
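For readers without access to SAS, an approximately equivalent check can be sketched in Python with statsmodels; treating the reported effect size as Cohen's f for a one-way ANOVA and using the total number of observations (55 participants × 3 labels) are assumptions, not details from the study.

```python
# Rough re-check of the reported power analysis using statsmodels instead of SAS.
from statsmodels.stats.power import FTestAnovaPower

analysis = FTestAnovaPower()
power = analysis.solve_power(
    effect_size=0.59,   # assumed to be Cohen's f
    nobs=55 * 3,        # assumed total observations: 55 participants x 3 labels
    alpha=0.05,
    k_groups=3,         # three label samples
)
print(f"Achieved power (1 - beta): {power:.4f}")
```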
The sensory session was conducted in the Faculty of Veterinary and Agricultural Sciences sensory laboratory at UoM and approved by the Human Ethics Advisory Group (Ethics ID: 1545786.2). The sensory laboratory, which was designed according to ISO 8589 Sensory analysis—General guidance for the design of test rooms, has 20 individual booths with uniform lighting, each equipped with a Samsung Galaxy View 18” tablet (Samsung Group, Seoul, Korea) and a Gazepoint GP3 eye tracker (accuracy: 0.5–1.0 degrees of visual angle; sampling frequency: 60 Hz; Gazepoint, Vancouver, BC, Canada). The BioSensory application (App; The University of Melbourne, Parkville, Australia) [14] was used to display the questionnaire and to record videos of participants while they evaluated the samples.
Three food labels (Stevia, Potato chips, and Spaghetti) with different AOIs (product's name, claims, nutrition facts, net content, nutrition squares, ingredients, image, manufacturer, suggested use, bar code, company logo, and product's denomination) were selected randomly and used as samples to test the proposed system integrating eye-tracking and emotional response techniques. The eye tracker was connected to a computer, and the Gazepoint software presenting the slideshow with the samples was displayed on the tablet using RemotePC™ (RemotePC™, Calabasas, CA, USA). Participants were required to complete a nine-point calibration between samples and were instructed to view each label for 10 s using the RemotePC App while the BioSensory App recorded videos in the background. After the 10 s viewing period, a screen with instructions to switch to the BioSensory App was displayed; participants were provided with a wireless keyboard to switch between Apps (Figure 1). Once in the BioSensory App, participants rated the label for overall liking (15 cm non-structured scale) and selected their preferred AOI.

2.2. Biometrics

Videos of participants were acquired using the BioSensory App and analyzed through a computer application developed by the DAFW from UoM based on the Affectiva software development kit (SDK; Affectiva, Boston, MA, USA; Figure 2). The parameters obtained from this analysis were the emotions (i) joy, (ii) fear, (iii) disgust, (iv) sadness, (v) anger, and (vi) contempt, along with (vii) the valence dimension, (viii) engagement, and (ix) the smile facial expression.
Eye-tracking data were analyzed using the Gazepoint analysis software, and the parameters extracted per AOI for each participant were (i) time to first fixation, (ii) time viewed, (iii) number of fixations, and (iv) number of revisits.
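The metrics above were produced by the Gazepoint analysis software; the sketch below only illustrates the general logic of deriving such per-AOI metrics from a fixation table, with hypothetical column names and AOI bounding boxes.

```python
# Simplified sketch of deriving per-AOI eye-tracking metrics from a table of
# fixations; column names and AOI boxes are hypothetical, not Gazepoint's format.
import pandas as pd

def aoi_metrics(fixations: pd.DataFrame, aois: dict) -> pd.DataFrame:
    """fixations: columns ['start_s', 'duration_s', 'x', 'y'] (normalized 0-1),
    one row per fixation in time order. aois: {'name': (x0, y0, x1, y1)} boxes."""
    rows = []
    for name, (x0, y0, x1, y1) in aois.items():
        inside = fixations[
            fixations.x.between(x0, x1) & fixations.y.between(y0, y1)
        ].sort_values("start_s")
        if inside.empty:
            rows.append({"AOI": name, "time_to_first_fixation_s": None,
                         "time_viewed_s": 0.0, "fixations": 0, "revisits": 0})
            continue
        # Each run of consecutive fixation indices is one visit to the AOI;
        # revisits are all visits after the first one.
        visits = int((inside.index.to_series().diff() != 1).sum())
        rows.append({
            "AOI": name,
            "time_to_first_fixation_s": inside.start_s.iloc[0],
            "time_viewed_s": inside.duration_s.sum(),
            "fixations": len(inside),
            "revisits": max(visits - 1, 0),
        })
    return pd.DataFrame(rows)
```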
Using the timestamps from both analyses (emotional responses and eye-tracking data), the emotion values were matched to each AOI to assess the participants' reactions while viewing each area. Figure S1 in the Supplementary Material shows an example of the emotions elicited per AOI.
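As an illustration of this timestamp matching, the following sketch aligns frame-level emotion estimates with the AOI being fixated at each moment using pandas; the column names and the averaging choice are assumptions, not the BioSensory module's actual code.

```python
# Sketch of the timestamp matching step: each frame-level emotion record is
# assigned to the AOI the participant was fixating at that moment.
import pandas as pd

def match_emotions_to_aois(emotions: pd.DataFrame, gaze: pd.DataFrame) -> pd.DataFrame:
    """emotions: ['timestamp_s', 'joy', 'disgust', ...] at the video frame rate.
    gaze: ['timestamp_s', 'aoi'] from the eye tracker, on the same clock."""
    emotions = emotions.sort_values("timestamp_s")
    gaze = gaze.sort_values("timestamp_s")
    # For each emotion sample, take the most recent gaze sample (backward match)
    merged = pd.merge_asof(emotions, gaze, on="timestamp_s", direction="backward")
    # Average each emotion over all samples falling within the same AOI
    return merged.groupby("aoi").mean(numeric_only=True)
```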

2.3. Statistical Analysis

Data were analyzed using analysis of variance (ANOVA) to assess significant differences (p < 0.05) between samples, with the Tukey honest significant difference (HSD) post hoc test (α = 0.05). Furthermore, a multivariate data analysis consisting of principal components analysis (PCA) and cluster analysis based on Euclidean distances was conducted using customized code written in Matlab® R2021a (Mathworks, Inc., Natick, MA, USA). A matrix was developed with the latter software to assess significant (p < 0.05) correlations between the emotional responses and the eye-tracking parameters.
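The statistical pipeline was implemented in Matlab; a rough Python equivalent is sketched below under the assumption of a hypothetical data frame with one row per participant and AOI, numeric biometric columns, and an 'aoi' grouping column.

```python
# Rough Python equivalent of the analysis pipeline (Tukey HSD, PCA, clustering);
# the data frame 'df' and its columns are hypothetical stand-ins.
import pandas as pd
from statsmodels.stats.multicomp import pairwise_tukeyhsd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

def analyze(df: pd.DataFrame, metric: str = "time_viewed_s"):
    # Post hoc comparison of one metric across AOIs (Tukey HSD, alpha = 0.05)
    tukey = pairwise_tukeyhsd(endog=df[metric], groups=df["aoi"], alpha=0.05)

    # PCA on the standardized AOI-level means of all numeric variables
    means = df.groupby("aoi").mean(numeric_only=True)
    z = StandardScaler().fit_transform(means)
    scores = PCA(n_components=2).fit_transform(z)

    # Agglomerative clustering of AOI profiles on Euclidean distances
    clusters = fcluster(linkage(z, method="ward", metric="euclidean"),
                        t=3, criterion="maxclust")
    return tukey, scores, clusters
```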

3. Results and Discussion

The analytical system proposed in this study allows the automated analysis of labels as a whole as well as separate analyses of the different label components. The results of the new applications developed are presented below in the form of processed eye-tracking data and an integrated analysis of eye tracking and emotional responses based on participants' videos and computer vision algorithms.
The analyses presented in this paper are an example of how the data may be handled; however, each user of the proposed method is free to analyze their own data according to their needs. ANOVAs may be conducted to assess differences per AOI, as presented in this paper, but also per sample and for the interaction of AOIs and samples, depending on the aim of the specific study.

3.1. Overall Label Liking and Emotional Response from Label Components

Figure 3 shows significant differences (p < 0.05) between samples for overall liking. The chips label was the most liked, with the spaghetti and stevia labels rated similarly. This may be due to the layout and colors of the labels and/or to consumers' preference for chips over spaghetti and stevia.
Table 1 shows the mean and standard error values of the emotional responses for each AOI. There were no significant differences (p > 0.05) between AOIs for the different emotions. However, the variability in the standard error (SE) shows some trends that could be used to predict liking, among other parameters, using machine learning modelling [6,33,34].

3.2. Differences in Eye-Tracking Data for Label Components

Figure 4 shows significant differences (p < 0.05) between AOIs for both the time to first view and the time viewed. The manufacturer was the AOI that took the longest for participants to first view (4.53 s), which means it was the last AOI they saw when evaluating the labels. On the contrary, the product's name took the least time to be first viewed (1.28 s), making it the first AOI on which participants focused their visual attention. On the other hand, participants spent more time viewing the suggested use (0.94 s) than any other AOI, with net content being the element on which they spent the least time (0.06 s). The large SE values were expected due to differences in participants' reactions, since subconscious responses are being evaluated and stimuli elicit different responses in each individual.
In Figure 5, it can be observed that there were significant differences (p < 0.05) between the AOIs for the number of fixations and revisits. Suggested use, nutrition facts, and image were the highest in the number of fixations (4.24, 3.85, and 3.75, respectively), while net content was the lowest (0.56). On the other hand, the image was the AOI with the most revisits (2.02), while net content had the least (0.13).

3.3. Integrating Eye Tracking and Emotional Response Data

Figure 6 shows the combined data from eye trackers and emotional responses. Figure 6a shows that considering the first two principal components (PC), the PCA represented a total of 61.88% of data variability (PC1 = 38.01%; PC2 = 23.87%). According to the factor loadings (FL), PC1 was mainly represented on the positive side of the axis by the number of revisits (FL = 0.40), number of fixations (FL = 0.37), disgust (FL = 0.35), and time viewed (FL = 0.34). On the negative side, it was represented by joy (FL = −0.29), engagement and smile (FL = −0.26 for both). On the other hand, PC2 was characterized by smile (FL = 0.40), valence (FL = 0.38), and joy (FL = 0.36) on the positive side of the axis, and contempt (FL = −0.33), time to first view (FL = −0.31), and sadness (FL = −0.27) on the negative side.
The preferred AOI was positively related to fear, disgust, and the number of revisits, and negatively related to the time to first view. The number of revisits, number of fixations, and time viewed had a positive relationship among them and with disgust. Associated with these were the AOIs nutrition facts, image, and product's name. This association coincides with results reported in an eye-tracking study evaluating olive oil dressing labels, in which higher fixations were found for the product's name and image [25], and with an eye-tracking study of organic food labels in which visual attention was higher when viewing the image [35]. On the other hand, the time to first view was positively related to contempt and associated with the AOIs manufacturer, bar code, company logo, and claims. The net content AOI was related to engagement, joy, smile, and valence. The other AOIs were more ambiguous as they are located closer to the center of the PCA. However, in Figure 6b, there are three main clusters, one of them with four subclusters. Product's name, nutrition facts, and image form one cluster; net content is independent of the other AOIs. The third cluster is composed of the subgroups (i) manufacturer, suggested use, and bar code, (ii) product denomination, (iii) nutrition squares and ingredients, and (iv) company logo and claims.
Figure 7 shows that there were significant positive correlations (p < 0.05) between disgust and time viewed (r = 0.58), number of fixations (r = 0.67), number of revisits (r = 0.76), and preferred AOI (r = 0.74). Similar results were found by Schienle et al. [36]; in their study, participants had a higher number of fixations when evaluating disgust-eliciting images. Furthermore, disgust was negatively correlated with the time to first view (r = −0.63), whilst contempt was positively correlated with the time to first view (r = 0.62). The preferred AOI had a positive correlation with the number of fixations (r = 0.58) and the number of revisits (r = 0.70). Engagement was positively correlated with smile (r = 0.74) and joy (r = 0.83), as expected. The latter was also correlated with valence (r = 0.80) and smile (r = 0.93). The correlation between valence, smile, and joy, also found in the PCA (Figure 6a), was expected as a positive valence is a measure of happiness [37].
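A sketch of how such a significance-filtered correlation matrix can be computed (Pearson correlations with non-significant cells zeroed) is given below; the input data frame and its column names are hypothetical.

```python
# Sketch of a correlation matrix that keeps only significant (p < alpha) Pearson
# correlations, as in Figure 7; 'data' has one column per biometric variable.
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

def significant_correlations(data: pd.DataFrame, alpha: float = 0.05) -> pd.DataFrame:
    cols = data.columns
    # Start from the identity matrix (each variable correlates perfectly with itself)
    out = pd.DataFrame(np.eye(len(cols)), index=cols, columns=cols)
    for i, a in enumerate(cols):
        for b in cols[i + 1:]:
            r, p = pearsonr(data[a], data[b])
            if p < alpha:
                out.loc[a, b] = out.loc[b, a] = r   # keep only significant cells
    return out
```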

3.4. Integration and Analysis of Eye-Tracking and Emotional Response

The BioSensory App used in this study was further developed through specific software modules for the post-analysis of the videos acquired from panelists. One of these modules deals with the integrated analysis of the eye-tracking and emotional response outputs by matching them based on timestamps and applying customized multivariate data analysis code for principal component (Figure 6a), cluster (Figure 6b), and correlation (Figure 7) analyses.
The use of multivariate data analysis, such as PCA, on the proposed system outputs to assess AOIs in labels may render critical information that may not be picked up by the methods used separately. This may provide an overview of the specific AOIs of the labels that could require design modifications to satisfy consumers and, therefore, increase the overall acceptability of the labels. This is an advantage of the proposed system, since the integrated method provides more precise information from consumers than traditional methods that use separate measures and focus on the overall emotional responses, or on other biometrics such as skin conductance, elicited by the entire label [10,12,27]. The latter approaches lead developers to fully redesign labels that may not be optimal to satisfy consumers, which is more time-consuming and less cost-effective.
Using the methodology proposed in this study, not only self-reported data and emotional responses can be integrated; further digital data can also be obtained with the BioSensory App system, such as physiological responses based on heart rate, blood pressure, and temperature changes from panelists. The latter data were not presented in this study to avoid overcomplicating the information presented. However, this extra information can be used for more complex modelling strategies using artificial intelligence (AI).
The proposed system allows further analysis and the development of prediction models using machine learning techniques based on biometrics. The latter approach has been used for consumer acceptability based on the visual evaluation of beer pouring videos using eye-tracking, emotional, and physiological responses [34] and for consumer acceptability of beer tasting using biometrics such as emotions, heart rate, and body temperature [33]. Other authors have used machine learning modelling to predict food choice using eye-tracking gaze data when evaluating food images [38] and to predict participants' age from their gaze patterns [39]. These digital and AI tools can be implemented at the design stage of packaging and labels by rendering images or 3D representations of them on screens for panelists or potential consumers. This could expedite the design and modification process since modifications can be readily assessed and applied digitally for immediate re-rendering, avoiding the requirement for further sensory sessions and reducing costs. Previous research has shown that sensory analysis and liking of packaging and labels do not differ statistically when packaging is rendered digitally on a screen compared to 3D physical prototypes that panelists can handle [40].
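As an illustration of this direction only, the sketch below fits a simple regression model to predict liking from integrated biometric features; the feature names and the algorithm are placeholders rather than the models used in the cited studies.

```python
# Illustrative sketch of predicting self-reported liking from integrated
# eye-tracking and emotion features; feature names and model are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

FEATURES = ["time_viewed_s", "fixations", "revisits", "joy", "disgust", "valence"]

def fit_liking_model(df: pd.DataFrame):
    """df: one row per participant x label, with biometric features and a
    'liking' column (0-15 non-structured scale)."""
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    # Cross-validated R^2 gives a rough idea of predictive value before fitting
    scores = cross_val_score(model, df[FEATURES], df["liking"], cv=5, scoring="r2")
    model.fit(df[FEATURES], df["liking"])
    return model, scores.mean()
```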

4. Conclusions

Further development of the BioSensory computer application has helped maximize the extraction of information from packaging and labels. The proposed system applies not only to whole packaging and labels but can also give more specific information about the different components or areas of interest (AOI) and the overall acceptability of the products. A potential future application using artificial intelligence could be developed to assess which components are liked by consumers and which require modifications, using only eye-tracking, facial expressions, and further biometrics. This AI system could expedite packaging design and help secure the success of food and beverage products in the market.

Supplementary Materials

The following are available online at https://www.mdpi.com/article/10.3390/s21227641/s1, Figure S1: Example of a heatmap from a label showing the different emotions elicited in consumers by each area of interest. In the top left, the identified eye section of the participant is shown. The label has been blurred to hide brands and the participant's identity.

Author Contributions

Conceptualization, S.F., C.G.V. and D.D.T.; Data curation, S.F., C.G.V. and D.D.T.; Formal analysis, S.F., C.G.V. and D.D.T.; Funding acquisition, F.R.D.; Investigation, S.F. and C.G.V.; Methodology, S.F., C.G.V. and D.D.T.; Resources, F.R.D.; Software, S.F. and C.G.V.; Validation, S.F., C.G.V. and D.D.T.; Visualization, S.F., C.G.V. and D.D.T.; Writing—original draft, S.F. and C.G.V.; Writing—review & editing, S.F., C.G.V., D.D.T. and F.R.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Australian Government through the Australian Research Council [Grant number IH120100053] ‘Unlocking the Food Value Chain: Australian industry transformation for ASEAN markets’.

Institutional Review Board Statement

The study was approved by the Human Ethics Advisory Group (HEAG) of The University of Melbourne (Ethics ID: 1545786.2).

Informed Consent Statement

Signed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data and intellectual property belong to The University of Melbourne; any sharing needs to be evaluated and approved by the University.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Buss, D. Food Companies Get Smart About Artificial Intelligence. Food Technol. 2018, 72, 26–41.
2. He, W.; Boesveldt, S.; de Graaf, C.; de Wijk, R.A. Dynamics of autonomic nervous system responses and facial expressions to odors. Appl. Olfactory Cogn. 2014, 5, 104.
3. Modica, E.; Cartocci, G.; Rossi, D.; Martinez Levy, A.C.; Cherubino, P.; Maglione, A.G.; Di Flumeri, G.; Mancini, M.; Montanari, M.; Perrotta, D. Neurophysiological responses to different product experiences. Comput. Intell. Neurosci. 2018, 2018, 9616301.
4. Schulte-Holierhoek, A.; Verastegui-Tena, L.; Goedegebure, R.P.; Fiszman, B.P.; Smeets, P.A. Sensory expectation, perception, and autonomic nervous system responses to package colours and product popularity. Food Qual. Prefer. 2017, 62, 60–70.
5. Vila-López, N.; Küster-Boluda, I. Consumers’ physiological and verbal responses towards product packages: Could these responses anticipate product choices? Physiol. Behav. 2019, 200, 166–173.
6. Gonzalez Viejo, C.; Torrico, D.; Dunshea, F.; Fuentes, S. Emerging Technologies Based on Artificial Intelligence to Assess the Quality and Consumer Preference of Beverages. Beverages 2019, 5, 62.
7. Kreibig, S.D. Autonomic nervous system activity in emotion: A review. Biol. Psychol. 2010, 84, 394–421.
8. Liao, L.X.; Corsi, A.M.; Chrysochou, P.; Lockshin, L. Emotional responses towards food packaging: A joint application of self-report and physiological measures of emotion. Food Qual. Prefer. 2015, 42, 48–55.
9. Vila-López, N.; Kuster-Boluda, I.; Alacreu-Crespo, A. Designing a Low-Fat Food Packaging: Comparing Consumers’ Responses in Virtual and Physical Shopping Environments. Foods 2021, 10, 211.
10. Cuesta, U.; Niño, J.I.; Martínez-Martínez, L. Neuromarketing: Analysis of Packaging Using Gsr, Eye-Tracking and Facial Expression. In Proceedings of The European Conference on Media, Communication & Film, Brighton, UK, 9–10 July 2018.
11. Rodríguez-Escudero, A.I.; Carbonell, P.; Moreno-Albaladejo, P. The conjoint effect of front-label claims’ surface size and distance-to-center on customers’ visual attention and emotional response. J. Appl. Packag. Res. 2019, 11, 4.
12. Songa, G.; Slabbinck, H.; Vermeir, I.; Russo, V. How do implicit/explicit attitudes and emotional reactions to sustainable logo relate? A neurophysiological study. Food Qual. Prefer. 2019, 71, 485–496.
13. Fuentes, S.; Tongson, E.; Gonzalez Viejo, C. Novel digital technologies implemented in sensory science and consumer perception. Curr. Opin. Food Sci. 2021, 41, 99–106.
14. Fuentes, S.; Gonzalez Viejo, C.; Torrico, D.; Dunshea, F. Development of a biosensory computer application to assess physiological and emotional responses from sensory panelists. Sensors 2018, 18, 2958.
15. Alemdag, E.; Cagiltay, K. A systematic review of eye tracking research on multimedia learning. Comput. Educ. 2018, 125, 413–428.
16. Peißl, S.; Wickens, C.D.; Baruah, R. Eye-tracking measures in aviation: A selective literature review. Int. J. Aerosp. Psychol. 2018, 28, 98–112.
17. Scott, N.; Zhang, R.; Le, D.; Moyle, B. A review of eye-tracking research in tourism. Curr. Issues Tour. 2019, 22, 1244–1261.
18. Kredel, R.; Vater, C.; Klostermann, A.; Hossner, E.-J. Eye-tracking technology and the dynamics of natural gaze behavior in sports: A systematic review of 40 years of research. Front. Psychol. 2017, 8, 1845.
19. Motoki, K.; Saito, T.; Onuma, T. Eye-tracking research on sensory and consumer science: A review, pitfalls and future directions. Food Res. Int. 2021, 145, 110389.
20. Duerrschmid, K.; Danner, L. Eye tracking in consumer research. In Methods in Consumer Research, Volume 2; Elsevier: Amsterdam, The Netherlands, 2018; pp. 279–318.
21. Popova, L.; Nonnemaker, J.; Taylor, N.; Bradfield, B.; Kim, A. Warning labels on sugar-sweetened beverages: An eye tracking approach. Am. J. Health Behav. 2019, 43, 406–419.
22. Fenko, A.; Nicolaas, I.; Galetzka, M. Does attention to health labels predict a healthy food choice? An eye-tracking study. Food Qual. Prefer. 2018, 69, 57–65.
23. Mokrý, S.; Birčiaková, N.; Slováčková, T.; Stávková, J.; Nagyová, Ľ. Perception of wine labels by generation Z: Eye-tracking experiment. Potravin. Slovak J. Food Sci. 2016, 10, 524–531.
24. Merdian, P.; Piroth, P.; Rueger-Muck, E.; Raab, G. Looking behind eye-catching design: An eye-tracking study on wine bottle design preference. Int. J. Wine Bus. Res. 2020, 33, 134–151.
25. Fazio, M.; Reitano, A.; Loizzo, M.R. Consumer Preferences for New Products: Eye Tracking Experiment on Labels and Packaging for Olive Oil Based Dressing. Proceedings 2021, 70, 59.
26. Peng-Li, D.; Byrne, D.V.; Chan, R.C.; Wang, Q.J. The influence of taste-congruent soundtracks on visual attention and food choice: A cross-cultural eye-tracking study in Chinese and Danish consumers. Food Qual. Prefer. 2020, 85, 103962.
27. Sung, B.; Butcher, L.; Easton, J. Elevating Food Perceptions Through Luxury Verbal Cues: An Eye-Tracking and Electrodermal Activity Experiment. Australas. Mark. J. 2021, 2021, 18393349211028676.
28. Frelih, N.G.; Podlesek, A.; Babič, J.; Geršak, G. Evaluation of psychological effects on human postural stability. Measurement 2017, 98, 186–191.
29. Gonzalez Viejo, C.; Fuentes, S.; Torrico, D.; Dunshea, F. Non-Contact Heart Rate and Blood Pressure Estimations from Video Analysis and Machine Learning Modelling Applied to Food Sensory Responses: A Case Study for Chocolate. Sensors 2018, 18, 1802.
30. Stone, A.; Potton, A. Emotional responses to disfigured faces and disgust sensitivity: An eye-tracking study. J. Health Psychol. 2019, 24, 1191–1200.
31. Monteiro, P.; Guerreiro, J.; Loureiro, S.M.C. Understanding the role of visual attention on wines’ purchase intention: An eye-tracking study. Int. J. Wine Bus. Res. 2019, 32, 161–179.
32. Gunaratne, N.M.; Fuentes, S.; Gunaratne, T.M.; Torrico, D.D.; Ashman, H.; Francis, C.; Gonzalez Viejo, C.; Dunshea, F.R. Consumer acceptability, eye fixation, and physiological responses: A study of novel and familiar chocolate packaging designs using eye-tracking devices. Foods 2019, 8, 253.
33. Gonzalez Viejo, C.; Fuentes, S.; Howell, K.; Torrico, D.; Dunshea, F. Integration of non-invasive biometrics with sensory analysis techniques to assess acceptability of beer by consumers. Physiol. Behav. 2019, 200, 139–147.
34. Gonzalez Viejo, C.; Fuentes, S.; Howell, K.; Torrico, D.; Dunshea, F.R. Robotics and computer vision techniques combined with non-invasive consumer biometrics to assess quality traits from beer foamability using machine learning: A potential for artificial intelligence applications. Food Control 2018, 92, 72–79.
35. Drexler, D.; Fiala, J.; Havlíčková, A.; Potůčková, A.; Souček, M. The effect of organic food labels on consumer attention. J. Food Prod. Mark. 2018, 24, 441–455.
36. Schienle, A.; Gremsl, A.; Übel, S.; Körner, C. Testing the effects of a disgust placebo with eye tracking. Int. J. Psychophysiol. 2016, 101, 69–75.
37. Aydin, S.G.; Kaya, T.; Guler, H. Wavelet-based study of valence–arousal model of emotions on EEG signals with LabVIEW. Brain Inform. 2016, 3, 109–117.
38. Gere, A.; Danner, L.; de Antoni, N.; Kovács, S.; Dürrschmid, K.; Sipos, L. Visual attention accompanying food decision process: An alternative approach to choose the best models. Food Qual. Prefer. 2016, 51, 1–7.
39. Dalrymple, K.A.; Jiang, M.; Zhao, Q.; Elison, J.T. Machine learning accurately classifies age of toddlers based on eye tracking. Sci. Rep. 2019, 9, 1–10.
40. Torrico, D.D.; Fuentes, S.; Viejo, C.G.; Ashman, H.; Gurr, P.A.; Dunshea, F.R. Analysis of thermochromic label elements and colour transitions using sensory acceptability and eye tracking techniques. LWT Food Sci. Technol. 2018, 89, 475–481.
Figure 1. A participant during the sensory session in an individual booth equipped with (1) a Samsung 18” Tablet containing the BioSensory App, (2) a GazePoint GP3 eye tracker, (3) a computer connecting the eye tracker, and (4) a keyboard to switch between applications in the tablet. The FLIR infrared camera is also visible on top of the tablet but was not used in this study.
Figure 2. Example of a participant’s video and the emotion analysis plotted from the outputs of the software developed using Affectiva. Left (primary) y-axis corresponds to all emotions except for valence, while right y-axis (secondary) corresponds to valence (green).
Figure 3. Mean values of the overall liking of the labels evaluated. Error bars represent the standard error. Different letters denote significant differences based on the ANOVA and Tukey honest significant difference (HSD) post hoc test (α = 0.05).
Figure 4. Mean values of the time to first view and time viewed from the eye-tracking analysis of the labels evaluated. Error bars represent the standard error. Different letters denote significant differences based on the ANOVA and Tukey honest significant difference (HSD) post hoc test (α = 0.05).
Figure 5. Mean values of the number of fixations and revisits from the eye-tracking analysis of the labels evaluated. Error bars represent the standard error. Different letters denote significant differences based on the ANOVA and Tukey honest significant difference (HSD) post hoc test (α = 0.05).
Figure 6. Multivariate data analysis based on (a) principal components analysis (PCA) and (b) cluster analysis. Abbreviations: PC: Principal Component; AOI: Area of Interest.
Figure 7. Matrix showing the significant correlations (p < 0.05) between emotional responses and eye-tracking parameters. Color bar represents the positive (blue) and negative (yellow) correlations.
Table 1. Means ± standard error of the emotional subconscious responses from consumers per area of interest (AOI).
Emotion/AOI | Bar Code | Claims | Company Logo | Image | Ingredients | Manufacturer | Net Content | Nutrition Facts | Nutrition Squares | Product Denomination | Product's Name | Suggested Use
Joy | 0.02 ± 0.02 | 2.24 ± 2.24 | 1.82 ± 1.81 | 1.39 ± 0.83 | 4.78 ± 2.02 | 1.51 ± 1.51 | 11.82 ± 8.08 | 2.49 ± 1.12 | 2.88 ± 2.14 | 1.44 ± 0.88 | 2.88 ± 1.21 | 3.54 ± 2.44
Fear | 1.82 ± 1.82 | 0.53 ± 0.37 | 0.42 ± 0.41 | 1.18 ± 0.49 | 2.83 ± 1.16 | 0.01 ± 0.01 | 0.07 ± 0.07 | 1.35 ± 0.59 | 2.15 ± 1.26 | 1.52 ± 1.05 | 3.04 ± 0.91 | 1.10 ± 0.90
Disgust | 0.47 ± 0.10 | 0.65 ± 0.11 | 0.56 ± 0.10 | 0.99 ± 0.22 | 0.44 ± 0.04 | 0.44 ± 0.07 | 0.31 ± 0.04 | 1.21 ± 0.37 | 0.92 ± 0.43 | 0.52 ± 0.07 | 1.61 ± 0.78 | 0.60 ± 0.11
Sadness | 0.09 ± 0.07 | 0.80 ± 0.74 | 0.91 ± 0.86 | 0.08 ± 0.02 | 0.11 ± 0.05 | 0.02 ± 0.00 | 0.02 ± 0.01 | 0.15 ± 0.11 | 0.03 ± 0.01 | 0.43 ± 0.29 | 0.17 ± 0.11 | 0.65 ± 0.37
Anger | 0.17 ± 0.11 | 1.56 ± 1.52 | 0.57 ± 0.53 | 0.16 ± 0.11 | 0.03 ± 0.01 | 0.01 ± 0.00 | 0.00 ± 0.00 | 0.04 ± 0.03 | 0.01 ± 0.00 | 0.87 ± 0.83 | 0.08 ± 0.04 | 0.20 ± 0.12
Contempt | 3.79 ± 3.45 | 0.22 ± 0.03 | 0.53 ± 0.30 | 1.07 ± 0.66 | 2.46 ± 1.51 | 3.76 ± 3.10 | 0.15 ± 0.02 | 0.24 ± 0.06 | 0.18 ± 0.01 | 0.21 ± 0.02 | 0.21 ± 0.03 | 3.26 ± 2.00
Valence | −5.11 ± 3.52 | −0.94 ± 1.95 | −2.13 ± 3.31 | −0.52 ± 1.00 | 2.11 ± 2.76 | 2.03 ± 1.99 | 11.67 ± 7.13 | 0.79 ± 1.68 | 1.46 ± 2.39 | 2.70 ± 2.06 | 0.51 ± 1.57 | −4.89 ± 3.03
Engagement | 9.14 ± 3.79 | 10.79 ± 3.34 | 10.55 ± 4.15 | 8.70 ± 1.87 | 11.73 ± 2.67 | 4.34 ± 2.94 | 18.88 ± 9.91 | 8.17 ± 1.89 | 11.59 ± 3.45 | 9.28 ± 2.85 | 7.99 ± 1.62 | 13.39 ± 3.67
Smile | 3.06 ± 1.72 | 2.25 ± 2.01 | 2.44 ± 1.82 | 3.03 ± 0.94 | 7.19 ± 2.22 | 2.82 ± 2.15 | 12.58 ± 7.40 | 4.80 ± 1.35 | 4.01 ± 2.26 | 5.39 ± 2.05 | 4.59 ± 1.34 | 4.15 ± 2.39
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
