Search Results (1)

Search Parameters:
Keywords = PRCNN

15 pages, 873 KiB  
Article
IoT-Enabled Classification of Echocardiogram Images for Cardiovascular Disease Risk Prediction with Pre-Trained Recurrent Convolutional Neural Networks
by Chitra Balakrishnan and V. D. Ambeth Kumar
Diagnostics 2023, 13(4), 775; https://doi.org/10.3390/diagnostics13040775 - 18 Feb 2023
Cited by 35 | Viewed by 2565
Abstract
Cardiovascular diseases currently present a key health concern, contributing to rising death rates worldwide. Against this backdrop of increasing mortality, healthcare is a major field of research, and the knowledge gained from analyzing health information will assist in the early identification of disease. Retrieving medical information is becoming increasingly important for making an early diagnosis and providing timely treatment. Medical image segmentation and classification are emerging areas of research in medical image processing. In this research, data collected from an Internet of Things (IoT)-based device, patients' health records, and echocardiogram images are considered. The images are pre-processed and segmented, then further processed with deep learning techniques for classification and for forecasting the risk of heart disease. Segmentation is performed via fuzzy C-means clustering (FCM) and classification with a pretrained recurrent convolutional neural network (PRCNN). Based on the findings, the proposed approach achieves 99.5% accuracy, which is higher than current state-of-the-art techniques.
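The abstract outlines a two-stage pipeline: fuzzy C-means segmentation of echocardiogram frames, followed by classification with a pretrained recurrent convolutional network. The sketch below is a hypothetical illustration of that general pattern, not the paper's implementation; the ResNet-18 backbone, GRU size, cluster and class counts, and input shapes are illustrative assumptions.

```python
# Hypothetical FCM segmentation + pretrained CNN + GRU classifier sketch.
# Illustrates the pipeline style described in the abstract; all model and
# preprocessing choices here are assumptions, not the authors' configuration.
import numpy as np
import skfuzzy as fuzz          # scikit-fuzzy provides fuzz.cmeans
import torch
import torch.nn as nn
from torchvision import models


def fcm_segment(image: np.ndarray, n_clusters: int = 3) -> np.ndarray:
    """Cluster pixel intensities with fuzzy C-means and return a label map."""
    pixels = image.reshape(1, -1).astype(float)            # (features, samples)
    _, u, *_ = fuzz.cmeans(pixels, c=n_clusters, m=2.0,
                           error=1e-5, maxiter=200)
    return np.argmax(u, axis=0).reshape(image.shape)        # hard label per pixel


class PretrainedRecurrentCNN(nn.Module):
    """Per-frame features from a pretrained CNN, aggregated over time by a GRU."""
    def __init__(self, n_classes: int = 2, hidden: int = 128):
        super().__init__()
        # Assumes a recent torchvision with the weights= API.
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        self.cnn = nn.Sequential(*list(backbone.children())[:-1])  # drop FC head
        self.rnn = nn.GRU(input_size=512, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, 3, H, W) -- a short echocardiogram clip
        b, t = frames.shape[:2]
        feats = self.cnn(frames.flatten(0, 1)).flatten(1)    # (b*t, 512)
        _, h = self.rnn(feats.view(b, t, -1))                 # final hidden state
        return self.head(h[-1])                               # (b, n_classes) logits
```

A segmented frame from fcm_segment could be used to mask or crop each clip before it is fed to PretrainedRecurrentCNN; how the two stages are actually coupled in the paper is not specified in the abstract.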