Proceeding Paper

Monitoring Food Intake in an Aging Population: A Survey on Technological Solutions †

1 University of Extremadura, 10004 Cáceres, Spain
2 University of Évora, Évora, Portugal
* Authors to whom correspondence should be addressed.
Presented at the 12th International Conference on Ubiquitous Computing and Ambient Intelligence (UCAmI 2018), Punta Cana, Dominican Republic, 4–7 December 2018.
These authors contributed equally to this work.
Proceedings 2018, 2(19), 445; https://doi.org/10.3390/proceedings2190445
Published: 31 October 2018
(This article belongs to the Proceedings of UCAmI 2018)

Abstract

The aging of the population has increased research efforts focused on eldercare. In this area, nutrition is a topic of particular interest. A significant number of problems related to aging have their origin in deficient nutrition. Many elders suffer changes in their nutritional patterns, and these changes can cause serious deterioration of their physical condition. Therefore, to be able to address nutritional problems in the elderly, their nutritional patterns must be recorded and analyzed in an easy and efficient way. From a technological point of view, numerous works focus on monitoring the food intake, not only of elders but also of the general population. However, these works usually do not take into account the problems associated with an aging population. In this paper we present a survey of existing technological solutions for monitoring food intake. The analyzed solutions are categorized based on their technological implementation and their adaptation to the needs of an aging population.

1. Introduction

In the coming years, the world will experience population aging, a trend that is both pronounced and historically unprecedented. Over the past six decades, the countries of the world experienced only a slight increase in the share of people aged 65 years and older, from 8% to 10%. However, in the next four decades, this group is expected to rise to 22% of the total population, a jump from 800 million to 2 billion people [1].
Aging itself is not a problem, at least not directly. However, problems such as cognitive impairment, mobility problems, hormonal problems, or nutritional disorders compound the negative effects of aging. In particular, a significant number of problems related to aging have their origin in deficient nutrition [2].
Additionally, the problems related to nutrition are not limited exclusively to elderly people. Ill people, children, etc. also suffer from nutritional disorders. For this reason, important research efforts are dedicated to various aspects related to feeding. Specifically, one of the aspects most frequently addressed by researchers is the monitoring of people's food intake.
Food intake monitoring is intended to acquire information, such as the amount of vitamins, minerals, and other substances ingested by a person. This information is then used to identify nutritional patterns that can help detect or address nutritional problems.
With respect to food, one of the most disadvantaged groups in our society is the elderly [2]. Some of the main problems of aging are those related to nutrition. Frequently, the elderly suffer changes in their nutritional patterns that, in some cases, can cause significant damage to their physical condition. For example, if the amount of ingested food decreases, it can lead to important nutrient deficiencies, to the point that some elderly people end up experiencing episodes of anorexia. These nutritional problems directly influence the health of older people.
One of the fundamental elements to determine the health of the elderly population, and to avoid the aforementioned problems, is having precise information about their nutritional patterns and ingested food. Different studies have shown that the nutrition information of the elderly population is a valid parameter for predicting their quality of life [3,4,5]. An inadequate diet can bring about major problems in functional and cognitive status, in addition to a higher mortality rate [6]. Most works in the nutrition field focus on the nutritional state of the elderly population and its relationship with different diseases such as obesity [7], Alzheimer’s disease [8], depression [9], or metabolic syndrome [10].
In that regard, diet monitoring is one of the most important aspects of preventative health care aimed at reducing various health risks for the elderly. Manual recording has been the prevalent method of diet monitoring. However, it is tedious and often ends up with a low adherence rate. For this reason, several techniques have been developed to help monitor food intake, but they still suffer from problems with accuracy, efficiency, and user acceptance.
These technological proposals face additional difficulties when the circumstances of the aging population are taken into account, particularly when the monitored elders live in rural environments with low population density and few resources.
Continuous nutrition monitoring is essential to influence positively the nutrient content of the food supply and meet the changing nutrition needs of the population. Nutrition scientists in the food industry use nutrition monitoring data in a variety of ways that include developing nutrition communications for consumers and health professionals, guiding product development and reformulation, and applying research applications [11].
In order to understand and improve the feeding of the elderly, monitoring what they eat is necessary. In this paper, a comparative study of current monitoring techniques is presented, taking into account the difficulties of applying them to the elderly.
The rest of the paper is structured as follows. Section 2 presents the motivation for monitoring food intake in the aging population. Section 3 details the different technological proposals analyzed. Section 4 provides a discussion involving all the proposals considered. Finally, Section 5 presents the conclusions of this work.

2. Motivation

As pointed out earlier, monitoring the food intake of an aging population is important, and one of the best ways to achieve this monitoring is through technological solutions.
Works such as [12] already analyze different food monitoring technologies. These technologies are rated and described taking several aspects into account, mainly overall impression, social acceptance, and system outputs. In this work, technological solutions are analyzed from a different perspective. Specifically, our main focus is the adaptability of the proposed solutions to be used by elders living in rural areas. For example, monitoring the food intake of an aging population may depend on the place (a nursing home, a hospital, etc.), on intermediaries (e.g., a supermarket), on technology dependence (e.g., continued access to the Internet), etc.
All these particularities hinder the implementation of these systems in rural environments with an older population and a low population density: places where there are no health centers, no intermediaries, and where even certain technological components cannot be used because of the lack of infrastructure.
This means that there is a set of issues which hinder the deployment of some technological solutions. In this work, we evaluate the existing solutions from this point of view.
In particular, the issues to review are as follows:
  • Detect intake. The system, of course, must detect the food intake of the person and the moment when he or she is eating.
  • Detect food. Furthermore, the system must detect the food type and its quantity so it can evaluate the nutrient intake of the elderly.
  • Identify specific person. It is important that the system is able to identify the person who is eating at each moment.
  • Low-cost. It is also important that the system implementation is not expensive, because in these environments large investments usually cannot be undertaken.
  • Unsupervised. The system must also be usable without supervision, since as much independence as possible is sought and the elder should not need any assistance or supervision to use the system.
  • Without intermediary. In addition, the system must run without intermediaries, since, in many cases, rural environments do not have enough infrastructure to support the presence of intermediaries.
  • Portable. Finally, the system must be portable, so it can be deployed in the different places where the elderly live.
Next, we will evaluate the existing technological proposals for food intake monitoring based on the above-mentioned features; a simple sketch of how these criteria can be encoded for comparison is shown below.
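To make the comparison concrete, the criteria above can be encoded as a simple checklist per surveyed solution. The following is a minimal sketch in Python, assuming a three-valued rating (met, not met, partial/undefined) as used later in Table 1; all names are illustrative.

```python
from dataclasses import dataclass, fields
from typing import Optional

@dataclass
class Criteria:
    """Three-valued rating per criterion: True = met, False = not met, None = partial/undefined."""
    detect_intake: Optional[bool]
    detect_food: Optional[bool]
    identify_person: Optional[bool]
    low_cost: Optional[bool]
    unsupervised: Optional[bool]
    without_intermediary: Optional[bool]
    portable: Optional[bool]

def met_count(c: Criteria) -> int:
    """Number of fully met criteria for one surveyed solution."""
    return sum(1 for f in fields(c) if getattr(c, f.name) is True)

# Illustrative entry, similar in shape to a smartphone-based proposal.
example = Criteria(detect_intake=True, detect_food=True, identify_person=False,
                   low_cost=True, unsupervised=True, without_intermediary=False,
                   portable=True)
print(met_count(example))  # -> 5
```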

3. Food Intake Monitoring Techniques

In this paper we present a preliminary study of the existing technological proposals for food intake monitoring. This study did not follow a systematic literature review approach, but is based on a review of relevant literature in the area.
According to the scientific literature presented throughout this document, we can divide food intake monitoring technology into the categories shown in Figure 1.
  • Smartphone. This category includes smartphone technologies, such as proposals based on the smartphone's own devices or specific sensors, proposals based on smartphone applications (apps), etc.
  • Computer Vision. Applications, techniques, and/or algorithms that can obtain a high-level understanding from digital images or videos.
  • Wearables. Solutions based on electronic devices that can be worn on the body, either as an accessory or as part of the material used in clothing.
  • Smart Home. This category incorporates advanced automation systems that provide people with sophisticated monitoring and control over the home's functions.
  • IoT. Approaches based on Internet of Things technology, such as devices with embedded sensors connected to the Internet which send the data they collect. When an IoT device has been designed to be used at home, we regard it as Smart Home technology.
  • Microcomputers. Technologies based on a small computer, especially those used for writing documents or running small processing programs.
  • Others. Solutions that cannot be included in the preceding categories.
There are works which could be classified into two or more categories, but we classify them in the category to which their core technology belongs. For example, if a work makes use of smartphone and computer vision technologies, but its core technology is the smartphone app, then we classify this proposal in the smartphone category.
The generalized aging of the population has made eldercare a growing subject of study. Numerous works have analyzed different aspects of bringing technological advances to the nutritional area.

3.1. Smartphone

From a smartphone apps perspective, the acceptance of mobile health (mHealth) technologies, such as smartphone applications, has experienced dramatic growth, with over 259,000 mHealth apps available [13]. 58% of smartphone owners have downloaded a health-related app [14], with fitness and nutrition apps being the most frequently downloaded [15]. Smartphone apps such as those presented in [16] help professionals and private users in food monitoring, but in all cases the information about the food products is updated manually by the users, or the apps are used to manually record food intake. These types of tools provide two main advantages over existing traditional methods: they make records easier to keep and can perform calculations automatically. However, such methods clearly do not make the data collection process easier for the user, nor do they help increase the accuracy of the food intake estimation.
Other smartphone apps, such as [17,18,19], make use of computer vision techniques for identifying food intake, but these types of solutions employ a client-server configuration. The user begins the process by taking images of food plates using the camera of his or her smartphone, and then the images are sent to the server for analysis. The analysis process consists of two main parts: food identification and volume estimation. The analysis results are then sent back to the smartphone for the user to review and confirm.
However, this type of solution is usually not able to identify the person who carries out the food intake, and it always needs an intermediary.
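As an illustration of the client-server workflow described above, the following minimal Python sketch uploads a meal photo to an analysis server and reads back the estimated foods; the endpoint URL and the response fields are hypothetical, not the actual API of [17,18,19].

```python
import requests  # third-party HTTP client

# Hypothetical analysis endpoint; real systems use their own servers and APIs.
SERVER_URL = "https://example.org/api/food-analysis"

def analyze_meal_photo(image_path: str) -> dict:
    """Upload a meal photo and return the server's food/volume estimates."""
    with open(image_path, "rb") as img:
        response = requests.post(SERVER_URL, files={"image": img}, timeout=30)
    response.raise_for_status()
    # Assumed response shape: [{"food": "rice", "volume_ml": 150, "kcal": 195}, ...]
    return response.json()

if __name__ == "__main__":
    print(analyze_meal_photo("lunch.jpg"))
```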

3.2. Computer Vision

Continuing with computer vision techniques, but without using smartphones, the online system known as FoodLog [20] relies on the users to take images of their food plates using a camera and then send the images to the system by email. The FoodLog system uses image analysis methods for food identification, from which the nutrients can then be calculated. This system is not a dedicated mobile application but a web application; a dedicated mobile application would offer several features that are not available in a web-based system. For example, with a dedicated mobile application the user can take the images and know the results immediately. A further example is PlateMate [21], which allows users to upload food photographs and receive nutrition estimates within a few hours. The estimates consist of a list of foods in the photograph, with associated measurements of serving size, calories, fat, carbohydrates, and protein for each food item. This system works in a similar way to the previous projects, that is to say, the user sends the images directly to the server and waits for the response.
Along these lines, another example is [22], which proposes an augmented reality (AR)-based wearable food-monitoring system to make food nutritional information available and visible in real-life scenarios, such as at the supermarket. It combines reverse image search (RIS) and text mining to recognize the food, and then it retrieves nutritional information from a remote database located on an external server. For user-friendliness, the information is overlaid on a live feed from a Google Glass front-facing camera.
However, as in the smartphone category, this type of solution needs an intermediary and is, in addition, expensive.

3.3. Wearable

From a wearable perspective, this technology is being used more and more in numerous facets of daily life. Several authors have explored the use of chewing sounds captured through wearable devices to detect and characterize food intake activity. Projects such as [23,24] present the implementation of an acoustic ear-pad sensor device to capture air-conducted vibrations of food chewing. A spectral sound analysis is performed, and food classification performance is compared to food texture clustering results for different foods. However, there is clearly a problem: in the presence of loud foods, it is challenging to discern food properties, which makes it difficult to identify the food intake. In the same line, there is progress on specialized algorithms [25,26] to detect and characterize food intake activity making use of chewing sounds (without specifying any device). However, these algorithms process the acoustic signal achieving acceptable results only for single-meal experiments in laboratory settings, where the number of food types consumed was restricted.
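To illustrate the kind of processing involved, the following minimal Python sketch extracts band-energy features from a chewing-sound clip with a spectrogram and trains a simple k-nearest-neighbours classifier on synthetic data; it is a toy example under these assumptions, not the actual pipeline of [23,24,25,26].

```python
import numpy as np
from scipy.signal import spectrogram
from sklearn.neighbors import KNeighborsClassifier

FS = 8000  # assumed sampling rate in Hz

def band_energies(audio: np.ndarray, n_bands: int = 8) -> np.ndarray:
    """Average spectral energy in n_bands frequency bands (one feature vector per clip)."""
    f, t, sxx = spectrogram(audio, fs=FS, nperseg=256)
    bands = np.array_split(sxx, n_bands, axis=0)
    return np.array([b.mean() for b in bands])

# Synthetic stand-in clips: "crunchy" has broadband energy, "soft" is low-frequency dominant.
rng = np.random.default_rng(0)
crunchy = [rng.normal(size=FS) for _ in range(10)]
soft = [np.sin(2 * np.pi * 200 * np.arange(FS) / FS) + 0.1 * rng.normal(size=FS)
        for _ in range(10)]
X = np.array([band_energies(c) for c in crunchy + soft])
y = ["crunchy"] * 10 + ["soft"] * 10

clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(clf.predict([band_energies(rng.normal(size=FS))]))  # likely "crunchy"
```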
Additionally, works like [27] (a wearable, mobile, and wireless necklace featuring an embedded piezoelectric sensor) or [28] (a wearable and wireless chest belt) present wireless and wearable food intake monitoring systems which identify food and drink intake by detecting a person’s swallowing events. These systems are based on the key observation that a person’s otherwise continuous breathing process is interrupted by a short apnea when he or she swallows as part of the solid or liquid intake process. However, these systems are only capable of detecting the intake and do not differentiate between the types of food consumed.
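The following is a minimal Python sketch of the underlying idea, assuming a breathing signal from a chest belt: short spans where the signal amplitude stays low are flagged as possible swallow-related pauses. The thresholds and the synthetic signal are illustrative, not those of [27,28].

```python
import numpy as np

FS = 50  # assumed chest-belt sampling rate in Hz

def detect_swallow_pauses(breath: np.ndarray, amp_thresh: float = 0.15,
                          min_gap_s: float = 0.5, max_gap_s: float = 2.0):
    """Return (start, end) sample indices of short low-amplitude gaps in the breathing signal."""
    quiet = np.abs(breath) < amp_thresh          # samples with little breathing movement
    gaps, start = [], None
    for i, q in enumerate(quiet):
        if q and start is None:
            start = i
        elif not q and start is not None:
            dur = (i - start) / FS
            if min_gap_s <= dur <= max_gap_s:    # too short = noise, too long = not a swallow
                gaps.append((start, i))
            start = None
    return gaps

# Synthetic breathing: a 0.25 Hz sine with a one-second pause inserted around t = 10 s.
t = np.arange(0, 20, 1 / FS)
breath = np.sin(2 * np.pi * 0.25 * t)
breath[int(10 * FS):int(11 * FS)] = 0.0
print(detect_swallow_pauses(breath))  # one gap around samples 495-550
```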
Other works, like [29,30,31], use wearable sensors placed on the user’s hand in order to monitor the food intake. In [29], the authors proposed an approach to eating activity detection using accelerometer sensor data and video observation; however, this study only identifies whether the user eats rice or noodles. In [30], the authors proposed a similar system making use of a specific wrist-worn smartwatch, producing positive results that may be of interest in the future. Furthermore, in [31] the authors proposed a comparison between unsupervised and supervised approaches for the recognition of nine daily gestures (including food intake) making use of one sensor on the wrist and one on the index finger. However, this solution does not detect the food types, does not identify the user, etc.
The work in [32] presents a wearable multi-sensor system for monitoring food intake. The proposed device integrates three sensor modalities that wirelessly interface with a smartphone: a jaw motion sensor, a hand gesture sensor, and an accelerometer. A pattern recognition method was developed for food intake recognition. The system is able to detect food intake but is not able to distinguish between different foods.

3.4. Smart Home

From a smart home perspective, there are projects such as [33] that perform an automatic acquisition of nutritional habits based on Ambient Intelligence [34] techniques. The final goal of this project is to support the nutritionist by providing an estimation of the nutritional habits of the user using smart home technologies. The risk of abandoning healthy nutritional habits especially affects old people living alone who are responsible for their own nutrition. This project used (1) user interaction components (the elder interacts with the system through intuitive and easy-to-use methods based on touch screens) and (2) ambient context acquisition components (the system uses existing low-cost sensors available on the market, for example, magnetic contacts on shelves, the fridge, and drawers; home automation sensors to detect the use of home devices; a flow meter to detect when water is flowing through a faucet, etc.). The data acquired by this system helps nutrition professionals make nutritional management much more effective.
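As an illustration of how such ambient sensor data could be interpreted, the following minimal Python sketch groups timestamped kitchen sensor events and flags clusters as likely meal-preparation episodes; the sensor names, time window, and threshold are assumptions, not the actual rules of [33].

```python
from datetime import datetime, timedelta

# (timestamp, sensor) events as they might arrive from magnetic contacts and a flow meter.
events = [
    (datetime(2018, 10, 1, 13, 2), "fridge_door"),
    (datetime(2018, 10, 1, 13, 4), "cutlery_drawer"),
    (datetime(2018, 10, 1, 13, 6), "faucet_flow"),
    (datetime(2018, 10, 1, 18, 40), "fridge_door"),
]

def meal_preparation_episodes(events, window=timedelta(minutes=15), min_events=3):
    """Group kitchen sensor events; a group with >= min_events suggests meal preparation."""
    episodes, current = [], []
    for ts, sensor in sorted(events):
        if current and ts - current[-1][0] > window:
            if len(current) >= min_events:
                episodes.append(current)
            current = []
        current.append((ts, sensor))
    if len(current) >= min_events:
        episodes.append(current)
    return episodes

for ep in meal_preparation_episodes(events):
    print(ep[0][0].strftime("%H:%M"), "-", ep[-1][0].strftime("%H:%M"), [s for _, s in ep])
```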
There are also household appliances, such as those presented in [35] or [36], which describe an application for a smart household appliance, in this case a smart fridge with multimedia capability. These devices are designed to manage the items stored in them and to advise users on cooking methods depending on what kind of food is stored. They can also perform other functions such as dietary control, nutrition monitoring, eating habit analysis, etc. This type of project is interesting and helpful for monitoring the foodstuffs, but not for monitoring who consumes these products.

3.5. IoT

From an Internet of Things (IoT) perspective, projects are being developed to measure food nutrition facts through a pocket-size, non-intrusive near-infrared (NIR) scanner [37,38]. With this device, NIR spectra reflected from foods are recorded and used as features to predict nutrients, for example energy and carbohydrate. The researchers use partial least squares (PLS) regression and support vector regression (SVR) for prediction. However, this device is only able to infer percentages of food nutrient contents; to better monitor food intake for health and wellbeing, it is crucial that the weight of the foods is also accessible in order to calculate their exact nutritional content [39].
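The following minimal Python sketch shows partial least squares regression, one of the two techniques mentioned, applied to synthetic stand-in spectra to predict a nutrient value; it only illustrates the modelling step, not the actual calibration of [37].

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic stand-in for NIR spectra: 200 samples x 100 wavelengths, with the target
# nutrient linearly related to a few spectral regions plus noise.
X = rng.normal(size=(200, 100))
true_coef = np.zeros(100)
true_coef[20:25] = 1.5
true_coef[60:65] = -0.8
y = X @ true_coef + 0.1 * rng.normal(size=200)   # e.g., grams of carbohydrate per 100 g

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=5).fit(X_train, y_train)
print("R^2 on held-out spectra:", round(pls.score(X_test, y_test), 3))
```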
Making use of other technology, a multi-sensor system based on an ultrasonic sensor, an RGB color sensor, and machine learning algorithms is presented in [40]. The system only measures beverage volume and detects beverage types. The device can be mounted on the lid of any drinking bottle, and the system utilizes machine learning algorithms to calculate the volume and type of the liquid intake based on the readings from the ultrasonic sensor and the RGB module. The system can store the volume and type of the liquid intake continuously and in real time. When the system detects liquid being poured into the bottle, it starts measuring the liquid volume and detecting the liquid type. However, this type of solution cannot be used to detect the complete food intake.
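As a rough illustration of how such sensor readings could be turned into volume and type estimates, the following Python sketch assumes a cylindrical bottle of known dimensions and a small table of reference colors; the geometry, colors, and lookup rule are assumptions, not the calibration or the learning models of [40].

```python
import math

BOTTLE_HEIGHT_CM = 20.0   # assumed interior height from lid to base
BOTTLE_RADIUS_CM = 3.5    # assumed interior radius (cylindrical bottle)

# Illustrative reference colors (R, G, B) for a nearest-color type lookup.
REFERENCE_COLORS = {"water": (200, 200, 210), "orange juice": (240, 160, 40),
                    "cola": (60, 30, 20)}

def liquid_volume_ml(distance_to_surface_cm: float) -> float:
    """Volume from the fill height measured by the lid-mounted ultrasonic sensor."""
    fill_height = max(0.0, BOTTLE_HEIGHT_CM - distance_to_surface_cm)
    return math.pi * BOTTLE_RADIUS_CM ** 2 * fill_height  # 1 cm^3 = 1 ml

def beverage_type(rgb: tuple) -> str:
    """Closest reference color by Euclidean distance in RGB space."""
    return min(REFERENCE_COLORS, key=lambda k: math.dist(rgb, REFERENCE_COLORS[k]))

print(round(liquid_volume_ml(8.0)), "ml of", beverage_type((235, 150, 50)))
```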
Other implementations of IoT, such as Near Field Communication (NFC) tags, can provide solid and reliable food traceability. For example, [41] proposes a system to improve and optimize food traceability based on real-time tracking and remote monitoring of the food package. NFC tags make it possible to identify a given food, but they require an NFC reader attached to a PC or another device.

3.6. Microcomputer

From a microcomputer perspective, systems based on microcomputers have existed for a long time. They have the general operating features and nutrient databases needed for food intake analysis. Eight of these microcomputer systems are analyzed exhaustively in [42]. The different systems vary in cost, in the number of foods and nutrients in the database, and in ease of food entry. However, the main difference with the projects analyzed above is the low speed of analyzing the food intake and the complexity, for the elderly, of learning and using the microcomputers. They also require the supervision and management of an assistant for the elderly. Other examples of this type of microcomputer system are [43,44,45].
As can be seen, all these projects are rather old; thus, although we analyze them, we do not give them further consideration in this survey.

3.7. Others

There are other projects which, by their typology, do not fit in the preceding sections, such as [46,47], which describe a voice-based mobile nutrition monitoring system that combines speech processing, natural language processing (NLP), and text mining techniques in a unified platform to facilitate nutrition monitoring. After converting the spoken data to text, nutrition-specific data are identified within the text using an NLP-based approach. Furthermore, this information is combined with a tiered matching algorithm to search for the food name in a nutrition database and accurately compute different intake values (for example, food type or calories).
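The following minimal Python sketch illustrates the tiered-matching idea: try an exact match of the transcribed food name first, then progressively looser matches against a nutrition table; the tiny database and the specific tiers are illustrative, not those of [46,47].

```python
import difflib

# Tiny illustrative nutrition table: food name -> kilocalories per 100 g.
NUTRITION_DB = {"white rice": 130, "chicken breast": 165, "apple": 52, "whole milk": 61}

def lookup_food(spoken_name: str):
    """Tiered lookup: exact match, then substring match, then fuzzy match."""
    name = spoken_name.lower().strip()
    if name in NUTRITION_DB:                                  # tier 1: exact
        return name, NUTRITION_DB[name]
    for key in NUTRITION_DB:                                  # tier 2: substring
        if name in key or key in name:
            return key, NUTRITION_DB[key]
    close = difflib.get_close_matches(name, list(NUTRITION_DB), n=1, cutoff=0.6)
    if close:                                                 # tier 3: fuzzy
        return close[0], NUTRITION_DB[close[0]]
    return None, None

# E.g., possible outputs of a speech-to-text step.
print(lookup_food("chickin breast"))   # fuzzy tier -> ('chicken breast', 165)
print(lookup_food("rice"))             # substring tier -> ('white rice', 130)
```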
On the other hand, there are systems such as [48] that use shopping receipts to generate suggestions about healthier food items that could help to supplement missing nutrients. This system consists of three major stages. First, receipts are scanned and passed through an optical character recognition (OCR) program. Second, data from this program are passed to a database system which records historical information and also stores important nutritional information about foods and nutrients. The third component is an inferencing system that estimates what the user eats on average per week and compares this to the recommended nutrient consumption. However, this system cannot monitor each individual food intake.
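As a rough illustration of the inferencing stage, the following Python sketch aggregates nutrient estimates from receipt items over a week and compares them with recommended weekly intakes; the item list and reference values are illustrative, not those of [48].

```python
# Items as they might come out of the OCR + database-lookup stages:
# (name, estimated protein g, estimated fibre g) for the food bought in one week.
week_items = [("bread", 36.0, 22.0), ("milk", 33.0, 0.0), ("apples", 2.0, 14.0)]

# Illustrative weekly reference values (7 x a rough daily recommendation).
RECOMMENDED_PER_WEEK = {"protein_g": 7 * 50.0, "fibre_g": 7 * 25.0}

def weekly_gap(items):
    """Difference between recommended weekly intake and what the receipts suggest."""
    totals = {"protein_g": sum(p for _, p, _ in items),
              "fibre_g": sum(f for _, _, f in items)}
    return {k: round(RECOMMENDED_PER_WEEK[k] - totals[k], 1) for k in totals}

# Positive values suggest nutrients the shopping basket is likely missing.
print(weekly_gap(week_items))  # {'protein_g': 279.0, 'fibre_g': 139.0}
```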

4. Discussion

As can be seen in Figure 2, the issues addressed by the largest number of works are the detect food (18 papers) and unsupervised (16 papers) categories. On the contrary, the issues where there is still work to be done are the without intermediary (20 papers) and portable (15 papers) categories. We can also say that none of the analyzed works meets all of the features required for monitoring food intake in an aging population living in rural environments with a low population density.
Although the number of papers by itself is not representative of the relevance of a given issue, it does show the interest of the scientific community in these topics, as previously mentioned.
Once all works have been evaluated, we present them in Table 1, where it can be seen whether every related work meets (Y) or does not meet (N) the features described above. It should be noted that, in some cases, a feature may be met only partially or may not be specified in the article itself; this situation is indicated by the symbol P.
In all works it is clear whether the food intake is identified or not, but there is more ambiguity in food type detection. There are projects, such as [23,24], that are able to detect the person who is eating and can identify whether the food is liquid or solid, but not what type of food it is or its quantity.
On the other hand, there are some projects that we marked as undefined (P) because they do not indicate whether the identification is carried out by the system itself or by an assistant. Some works assume person identification but do not carry it out in a reliable manner.
As for the implementation cost of each system, in the majority of cases they are too expensive or the cost is not indicated (they may be development prototypes).
With respect to the “unsupervised” feature, in many cases the system needs assistants or experts in order to run it, either for inserting the data or for starting the system itself.
Furthermore, in the majority of cases, the intermediation of third parties is needed, whether a supermarket, an external Web service (an Internet connection is required), or experts who observe the obtained results. This means that the elder cannot carry out the food intake monitoring without supervision.

5. Conclusions

Among the various studied solutions, gaps are identified when considering a deployment in a rural environment. None of the solutions meets the established requirements. Among the studied options, the best matches with our interests are the following:
  • Making use of wearable technology, the works of [23] or [27] are interesting because they can detect the food intake and identify the specific person, without supervision and without intermediaries. However, these works cannot detect the particular food, and their prototypes are expensive.
  • In the smart home category, the work in [36] is remarkable because it makes use of a common appliance such as the fridge. This device monitors the purchased food and the food taken out of it, but it can neither detect the intake moment nor identify the person. Nor is this solution portable, which complicates things.
  • The best options are found in the smartphone category. The works of [18] or [19] have developed mobile applications (apps) that send a photo taken with the smartphone to an external web service and thereafter receive the data extracted from that photo. These solutions detect the intake and the food without supervision, but they cannot identify the person who is eating. On the other hand, the project in [16] is another app, but in this case it does not perform on-line processing; instead, the insertion of data is carried out by dietitians and other health professionals (in a supervised manner and with intermediaries).
Looking at the studies analyzed in the different fields, we can say that a complete solution does not exist for monitoring food intake in an aging population living in rural environments with a low population density. Therefore, additional efforts are needed in this area.

Acknowledgments

This work was supported by the Spanish Ministry of Economy, Industry and Competitiveness (TIN2015-69957-R (MINECO/FEDER)), by the 4IE project (0045-4IE-4-P) funded by the Interreg V-A España-Portugal (POCTEP) 2014-2020 program, by the Department of Economy and Infrastructure of the Government of Extremadura (GR15098), and by the European Regional Development Fund.

References

  1. Bloom, D.E.; Chatterji, S.; Kowal, P.; Lloyd-Sherlock, P.; McKee, M.; Rechel, B.; Rosenberg, L.; Smith, J.P. Macroeconomic implications of population ageing and selected policy responses. Lancet 2015, 385, 649–657. [Google Scholar] [CrossRef]
  2. Morley, J.E.; Silver, A.J. Nutritional Issues in Nursing Home Care. Ann. Internal Med. 1995, 123, 850. [Google Scholar] [CrossRef] [PubMed]
  3. Sayer, A.A.; Cooper, C. Early diet and growth: Impact on ageing. Proc. Nutr. Soc. 2002, 61, 79–85. [Google Scholar] [CrossRef] [PubMed]
  4. Darnton-Hill, I.; Nishida, C.; James, W.P.T. A life course approach to diet, nutrition and the prevention of chronic diseases. Public Health Nutr. 2004, 7, 101–21. [Google Scholar] [CrossRef]
  5. Gariballa, S. Malnutrition in hospitalized elderly patients: When does it matter? Clin. Nutr. 2001, 20, 487–491. [Google Scholar] [CrossRef] [PubMed]
  6. Evans, C. Malnutrition in the elderly: A multifactorial failure to thrive. Perm. J. 2005, 9, 38–41. [Google Scholar] [CrossRef]
  7. Ledikwe, J.H.; Smiciklas-Wright, H.; Mitchell, D.C.; Jensen, G.L.; Friedmann, J.M.; Still, C.D. Nutritional risk assessment and obesity in rural older adults: A sex difference. Am. J. Clin. Nutr. 2003, 77, 551–558. [Google Scholar] [CrossRef]
  8. Droogsma, E.; Van Asselt, D.Z.B.; Scholzel-Dorenbos, C.J.M.; Van Steijn, J.H.M.; Van Walderveen, P.E.; Van Der Hooft, C.S. Nutritional status of community-dwelling elderly with newly diagnosed Alzheimer’s disease: Prevalence of malnutrition and the relation of various factors to nutritional status. J. Nutr. Health Aging 2013, 17, 606–610. [Google Scholar] [CrossRef]
  9. Boulos, C.; Salameh, P.; Barberger-Gateau, P. Malnutrition and frailty in community dwelling older adults living in a rural setting. Clin. Nutr. 2016, 35, 138–143. [Google Scholar] [CrossRef]
  10. Khanam, M.A.; Qiu, C.; Lindeboom, W.; Streatfield, P.K.; Kabir, Z.N.; Wahlin, A.A. The metabolic syndrome: prevalence, associated factors, and impact on survival among older persons in rural Bangladesh. PLoS ONE 2011, 6, e20259. [Google Scholar] [CrossRef]
  11. Crockett, S.J.; Tobelmann, R.C.; Albertson, A.M.; Jacob, B.L. Nutrition Monitoring Application in the Food Industry. Nutr. Today 2002, 37, 130–135. [Google Scholar] [CrossRef] [PubMed]
  12. Kalantarian, H.; Alshurafa, N.; Sarrafzadeh, M. A Survey of Diet Monitoring Technology. IEEE Pervasive Comput. 2017, 16, 57–65. [Google Scholar] [CrossRef]
  13. Research2guidance. mHealth Economics 2016–Current Status and Trends of the mHealth App Market. Technical report. 2016. Available online: http://research2guidance.com/product/mhealth-app-developer-economics-2016/ (accessed on 7 May 2018).
  14. Krebs, P.; Duncan, D.T. Health App Use among US Mobile Phone Owners: A National Survey. JMIR mHealth uHealth 2015, 3, e101. [Google Scholar] [CrossRef] [PubMed]
  15. Aitken, M. Patient Adoption of mHealth; IMS Institute for Healthcare Informatics: New York, NY, USA, 2015. [Google Scholar]
  16. Chen, J.; Gemming, L.; Hanning, R.; Allman-Farinelli, M. Smartphone apps and the nutrition care process: Current perspectives and future considerations. Patient Educ. Counsel. 2018, 101, 750–757. [Google Scholar] [CrossRef]
  17. Ahmad, Z.; Khanna, N.; Kerr, D.A.; Boushey, C.J.; Delp, E.J. A Mobile Phone User Interface for Image-Based Dietary Assessment. Proc. SPIE Int. Soc. Opt. Eng. 2014, 9030. [Google Scholar] [CrossRef]
  18. Ocay, A.B.; Fernandez, J.M.; Palaoag, T.D. NutriTrack: Android-based food recognition app for nutrition awareness. In Proceedings of the 2017 3rd IEEE International Conference on Computer and Communications (ICCC), Chengdu, China, 13–16 December 2017; pp. 2099–2104. [Google Scholar]
  19. Kohila, R.; Meenakumari, R. Predicting calorific value for mixed food using image processing. In Proceedings of the 2017 International Conference on Innovations in Information, Embedded and Communication Systems (ICIIECS), Coimbatore, India, 17–18 March 2017; pp. 1–4. [Google Scholar]
  20. Kitamura, K.; Yamasaki, T.; Aizawa, K. FoodLog: capture, analysis and retrieval of personal food images via web. In Proceedings of the ACM multimedia 2009 workshop on Multimedia for cooking and eating activities—CEA ’09; ACM Press: New York, NY, USA, 2009; p. 23. [Google Scholar]
  21. Noronha, J.; Hysen, E.; Zhang, H.; Gajos, K.Z. Platemate. In Proceedings of the 24th annual ACM symposium on User interface software and technology—UIST ’11; ACM Press: New York, NY, USA, 2011; p. 1. [Google Scholar]
  22. Jiang, H.; Starkman, J.; Liu, M.; Huang, M.C. Food Nutrition Visualization on Google Glass: Design Tradeoff and Field Evaluation. IEEE Consum. Electron. Mag. 2018, 7, 21–31. [Google Scholar] [CrossRef]
  23. Amft, O. A wearable earpad sensor for chewing monitoring. In Proceedings of the 2010 IEEE Sensors, Kona, HI, USA, 1–4 November 2010; pp. 222–227. [Google Scholar]
  24. Amft, O.; Stäger, M.; Lukowicz, P.; Tröster, G. Analysis of Chewing Sounds for Dietary Monitoring; Springer: Berlin/Heidelberg, Germany, 2005; pp. 56–72. [Google Scholar]
  25. Amft, O.; Troster, G. On-Body Sensing Solutions for Automatic Dietary Monitoring. IEEE Pervasive Comput. 2009, 8, 62–70. [Google Scholar] [CrossRef]
  26. Paßler, S.; Fischer, W.J. Food Intake Activity Detection Using a Wearable Microphone System. In Proceedings of the 2011 Seventh International Conference on Intelligent Environments, Nottingham, UK, 25–28 July 2011; pp. 298–301. [Google Scholar]
  27. Kalantarian, H.; Alshurafa, N.; Sarrafzadeh, M. A Wearable Nutrition Monitoring System. In Proceedings of the 2014 11th International Conference on Wearable and Implantable Body Sensor Networks, Zurich, Switzerland, 16–19 June 2014; pp. 75–80. [Google Scholar]
  28. Bo, D.; Biswas, S. Wearable diet monitoring through breathing signal analysis. Conf. Proc. IEEE Eng. Med. Biol. Soc. 2013, 1186–1189. [Google Scholar] [CrossRef]
  29. Kim, H.J.; Kim, M.; Lee, S.J.; Choi, Y.S. An analysis of eating activities for automatic food type recognition. In Proceedings of the 2012 Asia Pacific Signal and Information Processing Association Annual Summit and Conference, Hollywood, CA, USA, 3–6 December 2012. [Google Scholar]
  30. Sen, S.; Subbaraju, V.; Misra, A.; Balan, R.K.; Lee, Y. The case for smartwatch-based diet monitoring. In Proceedings of the 2015 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops), St. Louis, MO, USA, 23–27 March 2015; pp. 585–590. [Google Scholar] [CrossRef]
  31. Moschetti, A.; Fiorini, L.; Esposito, D.; Dario, P.; Cavallo, F. Toward an Unsupervised Approach for Daily Gesture Recognition in Assisted Living Applications. IEEE Sens. J. 2017, 17, 8395–8403. [Google Scholar] [CrossRef]
  32. Fontana, J.M.; Farooq, M.; Sazonov, E. Automatic Ingestion Monitor: A Novel Wearable Device for Monitoring of Ingestive Behavior. IEEE Trans. Biomed. Eng. 2014, 61, 1772–1779. [Google Scholar] [CrossRef]
  33. Lazaro, J.P.; Fides, A.; Navarro, A.; Guillen, S. Ambient Assisted Nutritional Advisor for elderly people living at home. Conf. Proc. IEEE Eng. Med. Biol. Soc. 2010, 198–203. [Google Scholar] [CrossRef]
  34. Aarts, E.; Marzano, S. The New Everyday: Views on Ambient Intelligence; 010 Publishers: Tilburg, The Netherlands, 2003; p. 352. [Google Scholar]
  35. Luo, S.; Xia, H.; Gao, Y.; Jin, J.S.; Athauda, R. Smart Fridges with Multimedia Capability for Better Nutrition and Health. In Proceedings of the 2008 International Symposium on Ubiquitous Multimedia Computing, Hobart, Australia, 13–15 October 2008; pp. 39–44. [Google Scholar]
  36. Lee, Y.; Huang, M.C.; Zhang, X.; Xu, W. FridgeNet: A Nutrition and Social Activity Promotion Platform for Aging Populations. IEEE Intell. Syst. 2015, 30, 23–30. [Google Scholar] [CrossRef]
  37. Thong, Y.J.; Nguyen, T.; Zhang, Q.; Karunanithi, M.; Yu, L. Predicting food nutrition facts using pocket-size near-infrared sensor. In Proceedings of the 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Seogwipo, South Korea, 11–15 July 2017; pp. 742–745. [Google Scholar]
  38. Cozzolino, D. Near Infrared Spectroscopy and Food Authenticity. In Advances in Food Traceability Techniques and Technologies; Elsevier: Amsterdam, The Netherlands, 2016; pp. 119–136. [Google Scholar]
  39. Bidlack, W.R. Interrelationships of food, nutrition, diet and health: the National Association of State Universities and Land Grant Colleges White Paper. J. Am. Coll. Nutr. 1996, 15, 422–433. [Google Scholar] [CrossRef]
  40. Pedram, M.; Rokni, S.A.; Fallahzadeh, R.; Ghasemzadeh, H. A beverage intake tracking system based on machine learning algorithms, and ultrasonic and color sensors. In Proceedings of the 16th ACM/IEEE International Conference on Information Processing in Sensor Networks—IPSN ’17; ACM Press: New York, NY, USA, 2017; pp. 313–314. [Google Scholar]
  41. Badia-Melis, R.; Ruiz-Garcia, L. Real-Time Tracking and Remote Monitoring in Food Traceability. In Advances in Food Traceability Techniques and Technologies; Elsevier: Amsterdam, The Netherlands, 2016; pp. 209–224. [Google Scholar]
  42. Lee, R.D.; Nieman, D.C.; Rainwater, M. Comparison of Eight Microcomputer Dietary Analysis Programs with the USDA Nutrient Data Base for Standard Reference. J. Am. Diet. Assoc. 1995, 95, 858–867. [Google Scholar] [CrossRef]
  43. Bassham, S.; Fletcher, L.; Stanton, R. Dietary analysis with the aid of a microcomputer. J. Microcomput. Appl. 1984, 7, 279–289. [Google Scholar] [CrossRef]
  44. Dare, D.; Al-Bander, S.Y. A computerized diet analysis system for the research nutritionist. J. Am. Diet. Assoc. 1987, 87, 629–32. [Google Scholar] [CrossRef]
  45. Adelman, M.O.; Dwyer, J.T.; Woods, M.; Bohn, E.; Otradovec, C.L. Computerized dietary analysis systems: A comparative view. J. Am. Diet. Assoc. 1983, 83, 421–429. [Google Scholar] [CrossRef]
  46. Hezarjaribi, N.; Mazrouee, S.; Ghasemzadeh, H. Speech2Health: A Mobile Framework for Monitoring Dietary Composition From Spoken Data. IEEE J. Biomed. Health Inform. 2018, 22, 252–264. [Google Scholar] [CrossRef]
  47. Hezarjaribi, N.; Reynolds, C.A.; Miller, D.T.; Chaytor, N.; Ghasemzadeh, H. S2NI: A mobile platform for nutrition monitoring from spoken data. Conf. Proc. IEEE Eng. Med. Biol. Soc. 2016, 1991–1994. [Google Scholar] [CrossRef]
  48. Mankoff, J.; Hsieh, G.; Hung, H.C.; Lee, S.; Nitao, E. Using Low-Cost Sensing to Support Nutritional Awareness; Springer: Berlin/Heidelberg, Germany, 2002; pp. 371–378. [Google Scholar]
Figure 1. Food intake monitoring technology classification.
Figure 2. Bar chart representing the number of papers in each category.
Table 1. Comparative table. Y = feature met; N = feature not met; P = feature partially met or not specified in the article.

Type | Ref. | Detect Intake | Detect Food | Identify Person | Low-Cost | Unsupervised | Without Intermediary | Portable
Smartphone | [16] | Y | Y | Y | Y | N | N | Y
Smartphone | [17] | Y | Y | P | P | N | N | Y
Smartphone | [18] | Y | Y | N | P | P | N | Y
Smartphone | [19] | Y | Y | N | Y | Y | N | Y
Computer Vision | [20] | Y | Y | P | P | N | N | N
Computer Vision | [21] | Y | Y | N | N | N | N | N
Computer Vision | [22] | N | Y | Y | N | Y | N | N
Wearable | [23] | Y | P | Y | N | Y | Y | N
Wearable | [24] | Y | P | Y | N | Y | Y | N
Wearable | [25] | N | P | P | N | N | N | N
Wearable | [26] | N | P | P | N | N | N | N
Wearable | [27] | Y | N | Y | P | Y | N | Y
Wearable | [28] | Y | N | Y | N | Y | Y | Y
Wearable | [29] | Y | N | N | P | N | N | Y
Wearable | [30] | Y | N | N | Y | N | N | Y
Wearable | [31] | Y | N | N | Y | Y | N | Y
Wearable | [32] | Y | N | Y | N | Y | P | Y
Smart Home | [33] | P | Y | N | N | N | N | N
Smart Home | [35] | N | Y | N | N | Y | Y | N
Smart Home | [36] | N | Y | N | N | Y | Y | N
IoT | [37] | N | Y | N | N | Y | Y | Y
IoT | [38] | N | Y | N | N | Y | Y | Y
IoT | [40] | N | N | N | Y | Y | Y | N
Microcomp. | [43] | N | Y | Y | P | N | N | N
Microcomp. | [44] | N | Y | Y | P | N | N | N
Microcomp. | [45] | N | Y | Y | P | N | N | N
Others | [46] | N | Y | N | P | Y | N | N
Others | [47] | N | Y | N | P | Y | N | N
Others | [48] | N | Y | P | Y | Y | N | Y