
Sensors and Technological Ecosystems for eHealth

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensor Networks".

Deadline for manuscript submissions: closed (31 December 2021) | Viewed by 28086

Special Issue Editor

Prof. Dr. Francisco José García-Peñalvo
Guest Editor
Department of Computer Science, University of Salamanca, 37008 Salamanca, Spain
Interests: information systems; human factors in computing; project management in information-systems development; global and distributed software engineering; systems, services, and software process improvement and innovation; management information systems; business software; innovation in IT

Special Issue Information

Dear Colleagues,

eHealth technological ecosystems promote and strengthen the use of ICT in health development, from applications in the field to global governance, with special attention to the development of data-driven healthcare and digital health services.

Persons with some form of disability are a particularly important target group for these data-driven healthcare ecosystems, because around 15% of the world’s population live with some kind of disability.

Sensors have great potential to enhance healthcare systems, as was presented and addressed in the Special Issue on "Sensor Technologies for Caring People with Disabilities" of Sensors (https://www.mdpi.com/journal/sensors/special_issues/Sensor_Caring_People_with_Disabilities). This Special Issue seeks to extend the goals of that previous collection to a more general eHealth scope.

Taking this into account, contributions to this Special Issue should focus on (but are not limited to) the following topics:

  • Implementation of data-driven healthcare ecosystems;
  • Studies and best practices that demonstrate the impact of digitalizing services in the health sector;
  • Frameworks and infrastructures that allow eHealth ecosystem delivery;
  • Privacy, safety, and standardization issues regarding eHealth ecosystems;
  • Assisted living solutions;
  • Game-based or gamified solutions applied to eHealth.

Prof. Dr. Francisco José García-Peñalvo
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Sensors
  • Disabled people
  • Assisted living
  • Assisting systems
  • Health monitoring
  • Wearable technologies
  • Indoor positioning
  • Human activity recognition
  • Vital sign monitoring
  • Personalized medicine
  • eHealth ecosystems
  • eHealth ethical issues
  • Game-based eHealth solutions
  • Gamified approaches for eHealth

Published Papers (7 papers)


Research

16 pages, 9002 KiB  
Article
VES: A Mixed-Reality Development Platform of Navigation Systems for Blind and Visually Impaired
by Santiago Real and Alvaro Araujo
Sensors 2021, 21(18), 6275; https://doi.org/10.3390/s21186275 - 18 Sep 2021
Cited by 5 | Viewed by 2787
Abstract
Herein, we describe the Virtually Enhanced Senses (VES) system, a novel and highly configurable wireless sensor-actuator network conceived as a development and test-bench platform for navigation systems adapted to blind and visually impaired people. It immerses its users in “walkable”, purely virtual or mixed environments with simulated sensors, allowing navigation system designs to be validated prior to prototype development. The haptic, acoustic, and proprioceptive feedback supports state-of-the-art sensory substitution devices (SSDs). In this regard, three SSDs were integrated into VES as examples, including the well-known “The vOICe”. Additionally, the data throughput, latency, and packet loss of the wireless communication can be controlled to observe their impact on the provided spatial knowledge and the resulting mobility and orientation performance. Finally, the system was validated by testing a combination of two previous visual-acoustic and visual-haptic sensory substitution schemas with 23 normally sighted subjects. The recorded data include the output of a “gaze-tracking” utility adapted for SSDs.
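
The abstract notes that the data throughput, latency, and packet loss of the wireless link can be controlled. As a rough, minimal Python sketch of that idea (not the VES implementation, whose network layer is not described here; the class name, default values, and blocking-delay behavior are all assumptions):

    import random
    import time

    # Minimal sketch only: VES's actual network layer is not published here.
    # Applies configurable throughput, latency, and packet-loss impairments
    # to a simulated sensor packet stream.
    class ImpairedChannel:
        def __init__(self, latency_s=0.05, loss_rate=0.1, max_packets_per_s=100):
            self.latency_s = latency_s                    # one-way delay per packet
            self.loss_rate = loss_rate                    # drop probability
            self.min_interval = 1.0 / max_packets_per_s   # throughput cap
            self._last_send = 0.0

        def send(self, packet, deliver):
            """Deliver packet via callback, subject to the configured impairments."""
            now = time.monotonic()
            if now - self._last_send < self.min_interval:
                return False                  # over the throughput budget: rejected
            self._last_send = now
            if random.random() < self.loss_rate:
                return False                  # packet lost
            time.sleep(self.latency_s)        # added latency (blocking for simplicity)
            deliver(packet)
            return True

    # Usage: ImpairedChannel(latency_s=0.1, loss_rate=0.2).send({"range_cm": 42}, print)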

18 pages, 1037 KiB  
Article
Seeing through Events: Real-Time Moving Object Sonification for Visually Impaired People Using Event-Based Camera
by Zihao Ji, Weijian Hu, Ze Wang, Kailun Yang and Kaiwei Wang
Sensors 2021, 21(10), 3558; https://doi.org/10.3390/s21103558 - 20 May 2021
Cited by 11 | Viewed by 3017
Abstract
Scene sonification is a powerful technique to help Visually Impaired People (VIP) understand their surroundings. Existing methods usually perform sonification on entire images of the surrounding scene acquired by a standard camera, or on a priori static obstacles extracted by image processing algorithms from the RGB image of the scene. However, delivering all the information in the scene to VIP simultaneously causes information redundancy. In fact, biological vision is more sensitive to moving objects than to static ones, which is also the original motivation behind the event-based camera. In this paper, we propose a real-time sonification framework to help VIP understand the moving objects in a scene. First, we capture the events in the scene using an event-based camera and cluster them into multiple moving objects without relying on any prior knowledge. Then, MIDI-based sonification of these objects is performed synchronously. Finally, we conducted comprehensive experiments on scene videos with sonification audio, attended by 20 VIP and 20 Sighted People (SP). The results show that our method allows both groups of participants to clearly distinguish the number, size, motion speed, and motion trajectories of multiple objects, and that it is more comfortable to hear than existing methods in terms of aesthetics.
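
The described pipeline (cluster camera events into moving objects, then sonify each object) can be illustrated compactly. The Python sketch below is an assumption-laden stand-in, not the paper's method: it clusters (x, y, t) events with DBSCAN and maps each cluster to a hypothetical MIDI pitch/pan/velocity triple; the resolution, scaling, and mapping constants are invented for illustration:

    import numpy as np
    from sklearn.cluster import DBSCAN

    # Illustrative sketch only; the paper's clustering and MIDI mapping differ.
    # events: array of shape (N, 3) with columns (x, y, t) from one time window.
    def sonify_events(events, width=346, height=260):
        """Cluster events into moving objects and map each to a MIDI-style note."""
        # Scale time so spatial and temporal distances are comparable (assumed heuristic).
        pts = events * np.array([1.0, 1.0, 1e-3])
        labels = DBSCAN(eps=8.0, min_samples=20).fit(pts).labels_
        notes = []
        for label in set(labels) - {-1}:                    # -1 marks DBSCAN noise
            cluster = events[labels == label]
            cx, cy = cluster[:, 0].mean(), cluster[:, 1].mean()
            notes.append({
                "pitch": int(48 + 36 * (1 - cy / height)),  # higher object, higher pitch
                "pan": int(127 * cx / width),               # horizontal position, stereo pan
                "velocity": min(127, len(cluster) // 2),    # bigger object, louder note
            })
        return notes

    # Usage: sonify_events(np.random.rand(500, 3) * np.array([346, 260, 10_000]))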

23 pages, 3697 KiB  
Article
Active Game-Based Solutions for the Treatment of Childhood Obesity
by Carina S. González-González, Nazaret Gómez del Río, Pedro A. Toledo-Delgado and Francisco José García-Peñalvo
Sensors 2021, 21(4), 1266; https://doi.org/10.3390/s21041266 - 10 Feb 2021
Cited by 7 | Viewed by 3833
Abstract
Obesity is one of the biggest global health problems and, together with sedentary lifestyles, demands solutions that increase enthusiasm for physical activity. Therefore, this paper describes two solutions based on active games using the Kinect sensor and biometric sensors, designed for the outpatient treatment of childhood obesity. The solutions were applied in an intervention program based on active video games and motor games, carried out with children in treatment for childhood obesity. An ad hoc questionnaire was used to assess the level of satisfaction, fun, learning, and behavior change in the children of the experimental group that followed the intervention. The results showed a high level of satisfaction with the intervention program, as well as with the games developed. We conclude that active video games and group games are highly motivating and can promote behavior change toward healthier life habits in children.
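
As a toy illustration of how biometric readings might drive in-game feedback in such an intervention (the paper's actual sensing and feedback logic are not reproduced here; the heart-rate zone formula and thresholds are common rules of thumb, used as assumptions):

    # Illustrative sketch only: the paper's sensing and feedback logic differ.
    # Turns heart-rate samples from a biometric sensor into in-game feedback.
    def exertion_feedback(heart_rates, age_years):
        hr_max = 220 - age_years                 # common age-based estimate (assumption)
        low, high = 0.5 * hr_max, 0.8 * hr_max   # assumed moderate-activity zone
        avg = sum(heart_rates) / len(heart_rates)
        if avg < low:
            return "Speed up! Try to move a little more."
        if avg > high:
            return "Great effort! Ease off a little."
        return "Perfect pace, keep going!"

    print(exertion_feedback([110, 118, 125], age_years=10))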

26 pages, 7672 KiB  
Article
A Kinect-Based Interactive System for Home-Assisted Active Aging
by Gabriel Fuertes Muñoz, Ramón Alberto Mollineda Cardenas and Filiberto Pla
Sensors 2021, 21(2), 417; https://doi.org/10.3390/s21020417 - 08 Jan 2021
Cited by 7 | Viewed by 2760
Abstract
Virtually every country in the world is facing an unprecedented challenge: society is aging. Assistive technologies are expected to play a key role in promoting healthy lifestyles among the elderly. This paper presents a Kinect-based interactive system for home-assisted healthy aging, which guides, supervises, and corrects older users as they perform scheduled physical exercises. Interactions take place in gamified environments with augmented reality. Many graphical user interface elements and workflows were designed considering the sensory, physical, and technological shortcomings of the elderly, adapting the interaction methods, graphics, exercises, tolerance margins, physical goals, and scoring criteria accordingly. Experiments involved 57 participants aged between 65 and 80 who performed the same physical routine six times over 15 days. After each session, participants completed a usability survey. The results provided significant evidence supporting (1) the effectiveness of the system in assisting older users of different age ranges, (2) the accuracy of the system in measuring the physical progress of the elderly, and (3) a progressive acceptance of the system as it was used. As a main conclusion, the experiments verified that, despite their limited technological skills, older people can adapt positively to an interactive assistance tool for active aging if they experience clear benefits.
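
A core ingredient of such a system is checking a skeleton pose against a target with tolerance margins. The Python sketch below is illustrative rather than the paper's implementation: it computes the elbow angle from three 3D Kinect joint positions and accepts it within an assumed tolerance:

    import numpy as np

    # Illustrative sketch; the system's actual exercise validation is richer.
    def joint_angle(a, b, c):
        """Angle at joint b (degrees) formed by 3D joint positions a-b-c."""
        v1, v2 = np.asarray(a) - np.asarray(b), np.asarray(c) - np.asarray(b)
        cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

    def exercise_ok(shoulder, elbow, wrist, target_deg=90.0, tolerance_deg=15.0):
        """Accept the pose if the elbow angle is within the tolerance margin."""
        return abs(joint_angle(shoulder, elbow, wrist) - target_deg) <= tolerance_deg

    # Hypothetical Kinect joint coordinates in meters:
    print(exercise_ok(shoulder=[0, 1.4, 2], elbow=[0.3, 1.4, 2], wrist=[0.3, 1.1, 2]))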

19 pages, 4865 KiB  
Article
Improvement of the Interaction Model Aimed to Reduce the Negative Effects of Cybersickness in VR Rehab Applications
by Predrag Veličković and Miloš Milovanović
Sensors 2021, 21(2), 321; https://doi.org/10.3390/s21020321 - 06 Jan 2021
Cited by 18 | Viewed by 2894
Abstract
Virtual reality (VR) has the potential to be applied in many fields, including medicine, education, and scientific research. The e-health impact of VR on medical therapy cannot be ignored, but participants have reported problems using it, as the capabilities and limitations of users can greatly affect the effectiveness and usability of VR in rehabilitation. Previous studies of VR have focused on the development and use of the technology itself, and it is only in recent years that emphasis has been placed on usability problems involving the human factor. In this research, different ways of adapting interaction in VR were tested. One approach focused on the means of navigating through a VR environment, while the second examined the impact of the amount of animation and moving elements through a series of tests. In conclusion, the mode of navigation and the amount of animation and moving elements, as well as their combination, were shown to have a great influence on the use of VR systems for rehabilitation. The occurrence of cybersickness-related problems can be reduced if the results of this research are taken into consideration and applied from an early stage of designing VR rehabilitation applications.
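
The two factors the study varies, navigation style and the amount of animation and moving elements, can be captured as an explicit configuration that a VR rehabilitation app exposes from the start of design. The sketch below is a hypothetical encoding; the setting names, values, and risk heuristic are assumptions, not the study's measures:

    from dataclasses import dataclass

    # Hypothetical encoding of the two interaction factors the study varied.
    @dataclass
    class ComfortSettings:
        navigation: str = "teleport"     # "teleport" or "smooth" locomotion (assumption)
        animation_level: float = 0.3     # 0.0 = static scene, 1.0 = fully animated

        def risk_hint(self) -> str:
            """Very rough cybersickness-risk hint, not a clinical measure."""
            if self.navigation == "smooth" and self.animation_level > 0.5:
                return "high risk: prefer teleport and fewer moving elements"
            return "low risk"

    print(ComfortSettings(navigation="smooth", animation_level=0.8).risk_hint())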

14 pages, 5322 KiB  
Article
Development of a Smart Splint to Monitor Different Parameters during the Treatment Process
by José María De Agustín Del Burgo, Fernando Blaya Haro, Roberto D’Amato and Juan Antonio Juanes Méndez
Sensors 2020, 20(15), 4207; https://doi.org/10.3390/s20154207 - 29 Jul 2020
Cited by 3 | Viewed by 3135
Abstract
For certain complex musculoskeletal rupture injuries, the only treatment available is the use of immobilization splints. This type of treatment usually causes discomfort and certain setbacks for patients. In addition, other complications are often generated at the vascular, muscular, or articular level. Currently, there is a feasible alternative that would solve these problems and even allow a faster and better recovery, made possible by applying engineering to additive manufacturing techniques and using the biocompatible materials available on the market. This study proposes the use of these materials and techniques, including the integration of sensors inside the splints. The main parameters studied are pressure, humidity, and temperature. These readings are combined and analyzed to detect any unexpected evolution of the treatment, making it possible to monitor signals that reveal problems associated with the very initial stage of the treatment. The goal of this study is to produce a smart splint, for clinical purposes, using biomaterials and engineering techniques based on advanced manufacturing and a sensor system. The results show that the smart splint prototype collects data when it is placed on a patient’s arm. Two temperatures are read during treatment: in contact with the skin and between the skin and the splint. Humidity variations due to sweat inside the splint are read by a humidity sensor, and a pressure sensor detects slight pressure changes inside the splint. In addition, an infrared sensor has been included as a presence detector.
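
To make the monitoring idea concrete, the sketch below combines the four mentioned readings (two temperatures, humidity, pressure) into simple alerts; the thresholds and function name are assumptions for illustration, not the splint's actual firmware logic:

    # Illustrative sketch only: thresholds and names are assumptions, not the
    # splint's actual firmware logic.
    def check_splint(skin_temp_c, gap_temp_c, humidity_pct, pressure_kpa):
        """Flag unexpected treatment evolution from combined sensor readings."""
        alerts = []
        if skin_temp_c > 38.0:                   # possible inflammation or infection
            alerts.append("skin temperature high")
        if humidity_pct > 85.0:                  # excessive sweat under the splint
            alerts.append("humidity high")
        if pressure_kpa > 30.0:                  # swelling pressing against the splint
            alerts.append("pressure high")
        if skin_temp_c - gap_temp_c > 3.0:       # unusual local heating
            alerts.append("temperature gradient high")
        return alerts or ["all readings nominal"]

    print(check_splint(skin_temp_c=38.6, gap_temp_c=34.2, humidity_pct=70, pressure_kpa=12))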

19 pages, 8440 KiB  
Article
Nextmed: Automatic Imaging Segmentation, 3D Reconstruction, and 3D Model Visualization Platform Using Augmented and Virtual Reality
by Santiago González Izard, Ramiro Sánchez Torres, Óscar Alonso Plaza, Juan Antonio Juanes Méndez and Francisco José García-Peñalvo
Sensors 2020, 20(10), 2962; https://doi.org/10.3390/s20102962 - 23 May 2020
Cited by 38 | Viewed by 8596
Abstract
The visualization of medical images with advanced techniques, such as augmented reality and virtual reality, represents a breakthrough for medical professionals. In contrast to more traditional visualization tools lacking 3D capabilities, these systems use all three available dimensions. To visualize medical images in 3D, the anatomical areas of interest must be segmented. Currently, manual segmentation, which is the most commonly used technique, and semi-automatic approaches can be time consuming because a doctor is required, making segmentation of each individual case unfeasible. Using new technologies, such as computer vision and artificial intelligence for segmentation algorithms and augmented and virtual reality for visualization, we designed a complete platform to solve this problem and allow medical professionals to work more frequently with anatomical 3D models obtained from medical imaging. As a result, the Nextmed project, through its different software applications, permits the importation of Digital Imaging and Communications in Medicine (DICOM) images onto a secure cloud platform and the automatic segmentation of certain anatomical structures with new algorithms that improve upon current research results. A 3D mesh of the segmented structure is then automatically generated, which can be 3D printed or visualized using both augmented and virtual reality with the designed software systems. The Nextmed project is unique in that it covers the whole process, from uploading DICOM images to automatic segmentation, 3D reconstruction, 3D visualization, and manipulation using augmented and virtual reality. There is much research on the application of augmented and virtual reality to 3D medical image visualization; however, these are not automated platforms. Although other anatomical structures can be studied, we focused on one case: a lung study. Having applied the platform to more than 1000 DICOM images and studied the results with medical specialists, we conclude that installing this system in hospitals would provide a considerable improvement as a tool for medical image visualization.
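
The platform's core pipeline (DICOM import, automatic segmentation, 3D mesh reconstruction) can be caricatured in a few lines of Python. The sketch below uses a crude Hounsfield-unit threshold where Nextmed uses its own algorithms; the threshold value and file layout are assumptions:

    import glob
    import numpy as np
    import pydicom
    from skimage import measure

    # Illustrative sketch only: Nextmed's segmentation algorithms are more advanced.
    def lung_mesh(dicom_dir, hu_threshold=-320.0):
        """Stack a DICOM series, threshold air-filled tissue, reconstruct a mesh."""
        slices = [pydicom.dcmread(f) for f in glob.glob(f"{dicom_dir}/*.dcm")]
        slices.sort(key=lambda s: int(s.InstanceNumber))
        volume = np.stack([s.pixel_array * float(s.RescaleSlope)
                           + float(s.RescaleIntercept) for s in slices])  # Hounsfield units
        mask = (volume < hu_threshold).astype(np.float32)  # crude lung/air segmentation
        verts, faces, _, _ = measure.marching_cubes(mask, level=0.5)
        return verts, faces  # triangle mesh, ready for 3D printing or AR/VR viewing

    # Usage: verts, faces = lung_mesh("/path/to/ct_series")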
