Article

Self-Detection of Early Breast Cancer Application with Infrared Camera and Deep Learning

by Mohammed Abdulla Salim Al Husaini, Mohamed Hadi Habaebi *, Teddy Surya Gunawan and Md Rafiqul Islam

IoT & Wireless Communication Protocols Laboratory, Department of Electrical Computer Engineering, International Islamic University Malaysia (IIUM), Kuala Lumpur 53100, Malaysia

* Author to whom correspondence should be addressed.
Electronics 2021, 10(20), 2538; https://doi.org/10.3390/electronics10202538
Submission received: 31 August 2021 / Revised: 13 October 2021 / Accepted: 13 October 2021 / Published: 18 October 2021
(This article belongs to the Section Bioelectronics)

Abstract:
Breast cancer is among the leading causes of death in women around the world. A new tool for the early detection of breast cancer has been developed, based on thermal imaging, deep convolutional networks, smartphone health applications, and cloud computing. Development of the smart app involved the Database for Mastology Research with Infrared Images (DMR-IR) and the training of a modified version of the deep convolutional neural network Inception V4 (MV4). A graphical user interface was designed for the application and linked with the AirDroid app to send thermal images from the smartphone to the cloud and to retrieve the suggested diagnostic result from the cloud server back to the smartphone. To verify the proper operation of the app, a set of thermal images was sent from the smartphone to the cloud server from different distances and with different image acquisition procedures, and the image quality was checked. Four effects (blur, shake, tilt, and flip) were applied to the thermal images to verify the detection accuracy. After repeated experiments, the classification results for early detection of breast cancer generated by MV4 showed high accuracy. The response time from sending a thermal image from the smartphone to the cloud until the diagnostic result returns to the smartphone via the AirDroid application is six seconds. The results show that the quality of the thermal images was not affected by the different distances and transfer methods, except when the images were compressed by 5%, 15%, and 26%; in that case, the maximum change in detection accuracy was 1%. In addition, detection accuracy increased by 0.0002% for blurry and shaken images, while diagnostic accuracy decreased by nearly 11% for tilted images.
Early detection of breast cancer using a thermal camera, a deep convolutional neural network, cloud computing, and smartphone health applications is a valuable and reliable complementary tool for radiologists to reduce mortality rates.

1. Introduction

The new smartphone technology has become a strong competitor to computers. A smartphone can be considered a portable computer characterized by communication capabilities and service applications for the user. Recently, specialized applications have appeared in various areas of healthcare, making it easier for users to access and benefit from them. These applications also foster self-responsibility and ease access to healthcare in remote areas. This growth of mobile applications in healthcare has led to a huge number of apps, indicating that it has become possible for individuals to take responsibility for a large share of their own healthcare. In 2018, there were about 600 applications for breast cancer awareness, screening, diagnosis, treatment, and disease management [1].
Mobile phone applications carry many advantages and have become part of daily life. Owing to their multiplicity, they have entered the medical field, especially breast cancer detection. Accordingly, many recent studies address solutions for the early detection and prevention of breast cancer.
The study in [2] compares previous work on deep learning and other neural networks, as well as the devices used to detect breast cancer and their effects. It recommends deep learning owing to its high accuracy compared with other neural networks. The authors also refer to breast cancer self-examination and to the World Health Organization's guidance on educating women about the risks of breast cancer. The study further urges the creation of a lightweight algorithm that can be installed as a mobile app capable of supporting self-detection.
Study [3] describes the development of bWell, a mobile application for arm and shoulder exercises after breast cancer treatment. The app was designed according to studies conducted with patients with breast cancer, healthcare professionals, and academics. It includes customized information, visual displays of training, push notifications, tracking, and progress features. Moreover, the app is easy to use and its content is clear and motivating.
Study [4] evaluates user satisfaction with a mobile health application for patients with breast cancer that implements a physical exercise program with a pedometer and provides phone consultations. The program was carried out over 12 weeks, and the patients were grouped by age and by whether they had received radiotherapy. The results indicated a high level of user satisfaction and that patients can play an active role in managing their health status through mobile applications. Users were most satisfied with the data transmission accuracy (73%) and were very satisfied with the phone counseling (62%).
Study [5] used a smartphone application for the psychological treatment of breast cancer survivors, with a 24-week follow-up of patients under the age of 50. The application includes educational videos, an entry form for the study, social networking, games, and chatting. Two applications, Kaiketsu-App and Genki-App, were used on the iPhone. The authors report that this is the first trial to measure the effectiveness of smartphone-based psychological treatment for patients with breast cancer and to facilitate therapeutic interventions without hospitalization.
Study [6] examined the effects of nurse-led support via WeChat, a smartphone app, for patients with breast cancer after surgery. Sixty patients were divided into two groups of 30 each. The first group was followed up continuously over 6 months via the smartphone application, and subjects in the intervention group participated in a WeChat-based support program (WSP) led by nurses. Patients were evaluated on physical well-being, psychological state, and social support. WSP proved a useful and effective way to provide patients who had undergone surgery for breast cancer with continued, individualized, and timely education. It also facilitates continuous communication between patients and healthcare providers and provides peer support from patients with similar treatment experiences. Nurse-led WSP support had physical, psychological, and social benefits for patients after breast cancer surgery.
Study [7] evaluated a personalized, web-based breast cancer decision-making application through a pre-post survey of 255 women with breast cancer (29–85 years old). The application was provided as an educational tool and a decision-making aid, with easy-to-access information written at a sixth-to-eighth-grade reading level with the help of patient education staff. The results showed that using the application in conjunction with medical advice, as an online educational tool providing personalized breast cancer education, was associated with increased confidence in decision-making. Most patients believed the app contained useful information, was easy to navigate, and included the right amount of information.
Study [8] shows that the MOCHA application can follow up with patients with breast cancer in terms of nutrition, physical activity, and routine contact with healthcare professionals; patients were followed for six months. The MOCHA system consists of three components: (1) the smartphone tool, (2) the application server, and (3) the provider's client. The smartphone tool and the provider's client are applications that run on the Android and iOS operating systems. The smartphone tool, a tracking and communication tool, is installed on the participants' phones; the provider's client is installed on the caregivers' phones to monitor participants' behavior and communication in real time. The app is also used to guide the behavior of patients with breast cancer to prevent disease, and the work focuses on long-term behavior modification to reduce common comorbidities among breast cancer survivors and improve the quality of cancer care for this growing population.
Study [9] examines the feasibility of a smartphone application and social media intervention for the health outcomes of breast cancer survivors. The My Guide app includes a set of features that promote breast cancer self-awareness, including social communication, contact with the health awareness team, medical advice, educational audio, and prescriptions for patients with breast cancer. Twenty-five women participated in this four-week study with a remote coaching protocol (for example, encouraging use of the application and helping resolve barriers the participant identified while using My Guide).
Study [10] used an Android app (WalkON®) to collect daily walking steps and weekly distress scores via app-based Distress Thermometer (DT) questionnaires from participants over approximately 12 weeks. The study investigated the impact of a community-based mobile phone application on promoting exercise and reducing distress among breast cancer survivors. Sixty-four participants (20–60 years old) took part. Participants were instructed to run and update the app at least once a week so that their daily walking data were sent to a central database system. The mobile community showed a significant increase in weekly steps and a decrease in distress thermometer scores.
Study [11] surveyed the mobile phone applications for breast cancer survivorship and self-management available on the Android and Apple platforms. A total of 294 applications were screened against the following criteria: available in English, free of charge to the user, and having a digital rating available on the corresponding mobile app store. A content analysis was performed on the nine applications meeting the inclusion criteria to assess the inclusion of the following mobile health self-management features derived from the chronic care model: symptom tracking; survivorship education; sharing information with family and/or caregivers; scheduling follow-up visits; personal alerts and reminders; and social networking. Survivorship education was the most common self-management feature among the apps reviewed, followed by social networking. The results highlight the paucity of mobile health resources available to breast cancer survivors.
Study [12] presents a smartphone application as a healthcare tool for patients with breast cancer. It provides patients with individually tailored information and a support group of peers and healthcare professionals. The online breast cancer support aims to enhance women's self-efficacy, social support, and symptom management. The study followed 108 women for six months. The application contains an educational forum, a discussion forum, an Ask Experts forum, and a forum for personal stories. The study supports the role of self-efficacy and social support in reducing symptom distress and the credibility of using a theoretical framework for developing a breast cancer support intervention.
In [13], a comparison was made between a high-resolution 640 × 480 pixel thermal camera and a small 160 × 120 pixel thermal camera. The thermal images were converted to grayscale and four features were extracted using a gray-level co-occurrence matrix (GLCM). The images were then classified using a k-Nearest Neighbor (KNN) classifier. The results indicated that classification accuracy exceeded 98% for both cameras; the method achieves 99.21% accuracy with KNN and outperforms traditional methods. With the rapid development of smartphones incorporating advanced digital camera technology, such systems will become more accessible as auxiliary tools for early breast cancer screening, reducing heavy costs, strict technical limitations, and pressure on scarce medical resources.
The researchers in [14] used a mobile phone equipped with a thermal camera. Images captured by the thermal imager (Cat® S60, equipped with a FLIR™ Lepton) were transmitted to an FPGA via an ultra-low-power Bluetooth link and stored on an SD card; gray-level co-occurrence matrix (GLCM) and run-length matrix (RLM) features were computed and fed into a machine learning (ML) classifier for early detection. The breast cancer screening processor targets a portable home environment and achieves a sensitivity of 79.06% and a specificity of 88.57%.
Previous studies on early breast cancer diagnosis have not ventured into the use of the Inception family of deep learning algorithms. Our proposed system uses Inception V3, Inception V4, and the modified Inception V4 (MV4) [15]. Further, previous studies indicated a lack of home diagnostic tools for early breast cancer detection. Study [14] used a mobile phone app to collect images with a thermal camera and transmit them over Bluetooth to machine learning code running on a nearby FPGA card; however, such solutions may yield low accuracy. In this paper, the proposed system uses Inception V3, Inception V4, and a modified Inception MV4 with very high accuracy and efficiency, allowing breast cancer to be detected at an early stage and supporting regular, continuous follow-up examinations without violating patients' privacy or introducing any side effects.

2. Materials and Methods

The breast cancer screening scheme is presented using a mobile app connected to a thermal camera. We present the outline first and then describe each process in detail.

2.1. Framework

Figure 1 illustrates our approach to breast cancer screening using a mobile app. The main processes are as follows: First, we train the deep convolutional neural network model Inception MV4. Second, we create a graphical user interface in MATLAB's Graphical User Interface Development Environment (GUIDE), using visual elements such as icons, buttons, scroll bars, windows, and boxes to simplify human-computer interaction. Third, we use cloud computing for the intense computation and large-scale data processing that deep convolutional neural networks require. Fourth, we use the mobile application to send thermal images to the cloud, receive the diagnostic results, and display them on the user's smartphone screen. Moreover, a set of thermal images was sent from the smartphone to the cloud from different distances and by different methods to verify image quality, as shown in Figure 2. Four effects (blur, shake, tilt, and flip) were applied to the thermal images to verify detection accuracy.
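The four processes can be summarized as a cloud-side screening loop. The sketch below is illustrative only: the function names and the 0.5 suspicion threshold are our assumptions, and the actual system is implemented in MATLAB GUIDE with AirDroid handling the phone-to-cloud transfer.

```python
def classify(image, model):
    """Stand-in for the Inception MV4 forward pass; `model` is any callable
    returning a suspicion score in [0, 1]. The 0.5 threshold is assumed."""
    return "suspicious" if model(image) >= 0.5 else "healthy"

def screening_pipeline(images, model):
    """Classify each uploaded thermal image and map it to the message the
    app displays on the patient's smartphone."""
    messages = {
        "suspicious": "It is advisable to pay a visit to a specialist clinic",
        "healthy": "You are Safe",
    }
    return {name: messages[classify(img, model)] for name, img in images.items()}
```

The two message strings are the ones the GUIDE interface shows to the user (Section 2.3).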

2.2. Deep Learning in Matlab

Deep learning is a branch of machine learning that uses deep convolutional neural networks to extract features directly from a database, achieving classification accuracy that can exceed human performance. We used a deep convolutional neural network consisting of 192 layers. Training requires a database that includes thermal images of healthy subjects and breast cancer patients. Breast thermal images were downloaded from the dynamic thermogram DMR-IR dataset, the deep convolutional neural network model Inception MV4 was loaded, the learning rate was set, and the optimization method was chosen. We divided the database into 70% for training and 30% for testing and trained the Inception MV4 network (Figure 3).
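A 70/30 split of this kind can be sketched in a few lines. This is an illustrative Python stand-in for the MATLAB pipeline; the fixed seed is our assumption, added only to make the split reproducible.

```python
import random

def split_dataset(samples, train_frac=0.70, seed=0):
    """Shuffle a list of (image, label) pairs and split it into
    70% training / 30% testing, as done with the DMR-IR thermal images."""
    rng = random.Random(seed)   # fixed seed => reproducible split (assumed)
    shuffled = samples[:]       # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_frac)
    return shuffled[:cut], shuffled[cut:]
```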
Our app employs the deep learning algorithms we developed in [15], namely Inception V3, Inception V4, and the modified Inception MV4. The modified Inception MV4 was developed for higher detection accuracy and faster arithmetic operations compared with Inception V3 and Inception V4; the major change in MV4 is that the number of layers in the Inception-B block is smaller than in Inception V4. Table 1 compares the deep convolutional neural networks Inception V3, Inception V4, and Inception MV4. Finally, all filter sizes were 3 × 3 and 1 × 1, with average pooling and max pooling [15].

2.3. Graphical User Interface Development Environment (GUIDE)

The advantage of the graphical user interface is that it is accessible to users who do not know MATLAB. The interface is built from a set of elements such as buttons, scroll bars, windows, and boxes to simplify use. The graphical user interface shown in Figure 4 contains a dedicated area for displaying the thermal image and showing the diagnostic result. In addition, auxiliary fields for diagnosis were created, such as the patient's name, age, gender, and room temperature. We also added some questions related to the patient's condition before the examination, and there is a button to reset all fields in the GUIDE. The interface displays one of two messages for the patient: if cancer is suspected, "It is advisable to pay a visit to a specialist clinic"; otherwise, "You are Safe", as shown in Figure 4. In addition, the user interface is linked to two folders: the first for the input diagnostic thermal image, and the other for storing the diagnostic result.

2.4. Cloud Computing

Cloud computing can process large volumes of data with low cost, high performance, and virtually unlimited storage; therefore, its use has greatly increased [16]. In addition, a set of high-specification GPUs can be added to a cloud computing platform. A PC was used as the cloud computing platform: thermal images received from the smartphone were processed on the PC, and the results were sent back to the smartphone via the application, as shown in Figure 5.

2.5. Smartphone Health Application

Mobile health applications provide healthcare via mobile devices and have become popular recently due to people's interest in public health. Some health apps rely on periodic monitoring and are usually connected to sensors to collect data such as heart rate and precise geographical location [17]. These applications therefore provide new solutions for digital health services. Previous studies have proposed solutions for breast cancer recurrence and prevention, but no primary diagnostic aid for breast cancer has been reported. By pairing a thermal camera with a smartphone application, we add a feature for the early detection of breast cancer, as shown in Figure 6. The proposal is to create an application that transfers data from the smartphone to the cloud computing platform and returns the results from the cloud to the phone, as shown in Figure 7.

2.6. Experiment Set Up

Figure 1 shows the steps for implementing early detection of breast cancer using a smartphone application equipped with a thermal camera. We used a PC (Core i7, 36 GB RAM, GTX 1660 GPU with 6 GB RAM), MATLAB version 2020a, the DMR-IR database, a Huawei smartphone, a FLIR One Pro thermal camera, and the AirDroid app. A deep convolutional neural network was designed as Inception MV4. To train it, we set the learning rate to 1 × 10⁻⁴ and used the Stochastic Gradient Descent with Momentum (SGDM) optimization method. After training the Inception MV4 model and verifying it on a set of tests, we transferred it to MATLAB's GUIDE. The user interface is designed in two parts: the first contains the thermal image, and the other contains the diagnostic result, as shown in Figure 8 and Figure 9. Moreover, the user interface is programmed to automatically read thermal images from a folder (input images) created on the desktop and to write the diagnostic results to a folder (diagnostic output) on the desktop.
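For reference, a single SGDM update has the following form. This is a minimal sketch with the stated learning rate of 1 × 10⁻⁴; the momentum coefficient of 0.9 is an assumption (it is MATLAB's default for the 'sgdm' solver) and is not stated in the text.

```python
def sgdm_step(weights, grads, velocity, lr=1e-4, momentum=0.9):
    """One Stochastic Gradient Descent with Momentum update:
    v <- momentum * v - lr * g ;  w <- w + v."""
    new_velocity = [momentum * v - lr * g for v, g in zip(velocity, grads)]
    new_weights = [w + v for w, v in zip(weights, new_velocity)]
    return new_weights, new_velocity
```

With zero initial velocity, a gradient of 10.0 moves a weight by −0.001 on the first step, illustrating how small each update is at this learning rate.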
The AirDroid application was installed on the smartphone and on the desktop to transfer thermal images from the smartphone to the cloud computing platform and to send the diagnostic results from the cloud server back to the smartphone. Finally, a group of thermal images captured by a thermal camera (FLIR One Pro) attached to the smartphone at Shiraz Hospital was tested, as shown in Figure 8 and Figure 9. A comparison of the performance of the three deep learning algorithms implemented in the app is shown in Table 1 below.
The experiments were conducted as follows. Two sources of thermal images were used: the first was the DMR-IR database, and the second was the FLIR One Pro connected to a smartphone (thermography from Shiraz Cancer Hospital). The first experiment used six thermal images from the database: three healthy and three with breast cancer. The second experiment used five thermal images taken by the FLIR One Pro thermal camera, all from patients with breast cancer. To verify the factors affecting thermal image quality, the images were sent from the smartphone to the cloud in different ways. In the first stage, thermal images from the database and from the FLIR One Pro were sent to the cloud via Wi-Fi under different scenarios (1 m, 5 m, and 7 m, all without barriers). In the second stage, thermal images were sent to the cloud server via Wi-Fi with barriers between the smartphone and the access point (one wall, two walls, a roof, a roof with one wall, and a roof with two walls). In the third stage, thermal images were sent to the cloud server by cable. In the fourth stage, they were sent via the 4G network. In the fifth stage, thermal images were compressed by different percentages (5%, 15%, and 26%) and sent from the smartphone to the cloud.
In addition, eight metrics were used to measure the quality of the thermal images received on the cloud server: Mean Squared Error (MSE), Peak Signal-to-Noise Ratio (PSNR), Average Difference (AD), Structural Content (SC), Normalized Cross-Correlation (NK), Maximum Difference (MD), Laplacian Mean Squared Error (LMSE), and Normalized Absolute Error (NAE).
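Three of these metrics can be sketched as follows. This is a pure-Python illustration over flattened pixel lists; the paper computed the metrics in MATLAB.

```python
import math

def mse(ref, img):
    """Mean Squared Error between reference and received pixel values."""
    return sum((r - x) ** 2 for r, x in zip(ref, img)) / len(ref)

def psnr(ref, img, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB (infinite for identical images)."""
    err = mse(ref, img)
    return math.inf if err == 0 else 10.0 * math.log10(peak ** 2 / err)

def nae(ref, img):
    """Normalized Absolute Error: sum of |differences| over sum of |reference|."""
    return sum(abs(r - x) for r, x in zip(ref, img)) / sum(abs(r) for r in ref)
```

An unchanged transfer gives MSE = 0, NAE = 0, and infinite PSNR, which is how a lossless Wi-Fi or cable transfer would read on these metrics.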
Image blur occurs when the camera moves during the exposure. In a thermal camera, blurring across temperature gradients may lead to incorrect pixel readings and unreliable thermal images [18]. The researchers in [19] studied the extent to which blurring affects the accuracy of breast cancer detection. Thermal image tilting is a common processing routine when training a deep convolutional neural network; the tilt depends on the angle and the pivot point [20]. The goal of blurring, flipping, and tilting is to enrich network training and achieve the best diagnostic outcome [21]. In addition, the increasing acquisition of thermal images by inexperienced users introduces many distortions, including shaken images caused by camera shake [22]. Accordingly, four factors affecting the accuracy of thermal imaging diagnostics were applied (blurry images in Figure 10, flipped images in Figure 11, tilted images in Figure 12, and shaken images in Figure 13), and the diagnostic accuracy was verified for each.
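The flip and blur effects, for instance, can be sketched as simple array operations. This is an illustrative stand-in, not the paper's processing code; the 3-pixel horizontal box kernel is our choice.

```python
def flip_horizontal(image):
    """Mirror each row of a 2D image stored as a list of lists."""
    return [row[::-1] for row in image]

def box_blur_rows(image, k=3):
    """Crude horizontal box blur: replace each pixel with the mean of a
    k-wide window in its row (edges use a truncated window)."""
    half = k // 2
    out = []
    for row in image:
        n = len(row)
        new_row = []
        for i in range(n):
            window = row[max(0, i - half):min(n, i + half + 1)]
            new_row.append(sum(window) / len(window))
        out.append(new_row)
    return out
```

Note that flipping only reorders pixels without changing their values, whereas blurring smooths local temperature differences, which is why the two effects can influence detection accuracy differently.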

3. Results and Discussion

We conducted many experiments applying the proposed method, and the evaluation results showed success in detecting breast cancer at an early stage. The deep convolutional neural network model Inception MV4 showed excellent performance in the training phase, with diagnostic accuracy reaching 100%. Moreover, the diagnostic period is only 6 s from sending the image through the application until receiving the diagnostic result in the interface on the smartphone. The size of the application and user interface files amounted to 1.5 GB due to the large database. In addition, thermal image processing using deep convolutional neural networks requires a high-speed GPU that is not available in smartphones, so cloud computing enables rapid diagnosis and sends the results back to the smartphone application.
The results showed that the thermal images (from the database and the FLIR One Pro thermal camera) sent from the smartphone to the cloud computing server via Wi-Fi, either directly or with barriers, did not change in quality, as shown in Table 2 and Table 3. However, when thermal images from the database were compressed and sent from the smartphone to the cloud, image quality changed, as shown in Table 4. When thermal images are compressed by 5%, 15%, and 26%, the diagnostic accuracy changes by a very small amount, 0.0001% at most, along with the change in image quality. For the compressed (5%, 15%, and 26%) thermal images from the FLIR One Pro, accuracy changed by at most 0.1% compared with the original images, along with the change in image quality (Table 5).
In the second part of the experiment, the results indicated that diagnostic accuracy fluctuated under the four distortion factors. Using thermal images from the database, detection accuracy for the first, second, and third healthy thermal images decreased by at most 1.6% for blurry, tilted, and shaken images, whereas flipped images maintained 100% detection accuracy. For breast cancer thermograms, detection accuracy increased by 0.0002% for blurry and shaken images, while diagnostic accuracy decreased by nearly 11% for tilted images; for flipped images, the accuracy remained the same as for the original pictures (Table 6).
On the other hand, in the experiment using FLIR One Pro thermal images, compressing the images and sending them from the smartphone to the cloud computing server reduced their quality. This change in quality affected detection accuracy only slightly, with a fluctuation of around 0.03%. When blurry, shaken, and flipped images were used, detection accuracy increased by a small percentage (about 0.4% at most), but with tilted images the detection accuracy fluctuated by ±0.5% (Table 7).

4. Conclusions

Health applications on smartphones have contributed to a growing culture of self-care. Previous studies have sought to reduce the incidence of breast cancer, but a primary diagnostic tool compatible with health applications on modern smartphones is still needed. The current paper proposes a home-based automated diagnostic tool built on smartphone applications, cloud computing, and thermal cameras. The experimental results confirm the effectiveness of the proposal, with breast cancer detection accuracy reaching 100%. We conclude that breast thermography using smartphone health applications and cloud computing is a good tool for the early detection of breast cancer, especially for remote areas and elderly patients, in addition to providing features related to health education, rapid response, and periodic patient follow-up. This smartphone-based tool represents a quantum leap in the efficiency of initial self-diagnosis. Moreover, the technology supports repeated use and can serve as a family diagnostic tool. Additionally, the results show that the quality of the thermal images was not affected by the different distances and transfer methods, except when the images were compressed by 5%, 15%, and 26%, where the maximum change in detection accuracy was 1%. The results also indicate that detection accuracy increased by 0.0002% for blurry and shaken images, while diagnostic accuracy decreased by nearly 11% for tilted images. Future work could lead to an integrated health application that includes health education, periodic follow-up, communication with the healthcare center, updating of patient data, prescriptions, and exercise, in addition to using this technique to detect other diseases such as lung cancer and foot ulcers.
Moreover, future work should apply a set of effects to the thermal images before training the deep convolutional neural network. Future work should also focus on improving classification and detection accuracy across different age groups, genders, and other confounding medical preconditions, which previous studies have ignored. An interesting fault diagnosis method that combines thermal imaging and machine learning is introduced in [23]; future work may investigate applying this method to the early detection of breast cancer.

Author Contributions

Conceptualization, methodology, and writing, M.A.S.A.H. and M.H.H.; software, M.A.S.A.H.; validation and revision, M.H.H., T.S.G., and M.R.I.; funding, M.H.H. and M.R.I. All authors have read and agreed to the published version of the manuscript.

Funding

This work has been partially sponsored by International Islamic University Malaysia, Publication Research Initiative Grant Scheme number P-RIGS18-003-0003.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Houghton, L.C.; Howland, R.E.; McDonald, J.A. Mobilizing Breast Cancer Prevention Research Through Smartphone Apps: A Systematic Review of the Literature. Front. Public Health 2019, 7, 298. [Google Scholar] [CrossRef] [PubMed]
  2. Roslidar, R.; Rahman, A.; Muharar, R.; Syahputra, M.R.; Arnia, F.; Syukri, M.; Pradhan, B.; Munadi, K. A Review on Recent Progress in Thermal Imaging and Deep Learning Approaches for Breast Cancer Detection. IEEE Access 2020, 8, 116176–116194. [Google Scholar] [CrossRef]
  3. Harder, H.; Holroyd, P.; Burkinshaw, L.; Watten, P.; Zammit, C.; Harris, P.R.; Good, A.; Jenkins, V. A user-centred approach to developing bWell, a mobile app for arm and shoulder exercises after breast cancer treatment. J. Cancer Surviv. 2017, 11, 732–742. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  4. Lee, H.; Uhm, K.E.; Cheong, I.Y.; Yoo, J.S.; Chung, S.H.; Park, Y.H.; Lee, J.Y.; Hwang, J.H. Patient Satisfaction with Mobile Health (mHealth) Application for Exercise Intervention in Breast Cancer Survivors. J. Med. Syst. 2018, 42, 254. [Google Scholar] [CrossRef] [PubMed]
  5. Akechi, T.; Yamaguchi, T.; Uchida, M.; Imai, F.; Momino, K.; Katsuki, F.; Sakurai, N.; Miyaji, T.; Horikoshi, M.; A Furukawa, T.; et al. Smartphone problem-solving and behavioural activation therapy to reduce fear of recurrence among patients with breast cancer (SMartphone Intervention to LEssen fear of cancer recurrence: SMILE project): Protocol for a randomised controlled trial. BMJ Open 2018, 8, e024794. [Google Scholar] [CrossRef] [PubMed]
  6. Wu, Q.; Kue, J.; Zhu, X.; Yin, X.; Jiang, J.; Chen, J.; Yang, L.; Zeng, L.; Sun, X.; Liu, X.; et al. Effects of Nurse-Led Support Via WeChat, a Smartphone Application, for Breast Cancer Patients After Surgery: A Quasi-Experimental Study. Telemed. e-Health 2020, 26, 226–234. [Google Scholar] [CrossRef] [PubMed]
  7. Wyatt, K.D.; Jenkins, S.M.; Plevak, M.F.; Pont, M.R.V.; Pruthi, S. A personalized, web-based breast cancer decision making application: A pre-post survey. BMC Med. Inform. Decis. Mak. 2019, 19, 1–11. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  8. Stubbins, R.; He, T.; Yu, X.; Puppala, M.; Ezeana, C.F.; Chen, S.; Alvarado, M.V.Y.; Ensor, J.; Rodriguez, A.; Niravath, P.; et al. A Behavior-Modification, Clinical-Grade Mobile Application to Improve Breast Cancer Survivors’ Accountability and Health Outcomes. JCO Clin. Cancer Inform. 2018, 2, 1–11. [Google Scholar] [CrossRef] [PubMed]
  9. Buscemi, J.; Buitrago, D.; Iacobelli, F.; Penedo, F.; Maciel, C.; Guitleman, J.; Balakrishnan, A.; Corden, M.; Adler, R.F.; Bouchard, L.C.; et al. Feasibility of a Smartphone-based pilot intervention for Hispanic breast cancer survivors: A brief report. Transl. Behav. Med. 2018, 9, 638–645. [Google Scholar] [CrossRef] [PubMed]
  10. Chung, I.Y.; Jung, M.; Park, Y.R.; Cho, D.; Chung, H.; Min, Y.H.; Park, H.J.; Lee, M.; Lee, S.B.; Chung, S.; et al. Exercise Promotion and Distress Reduction Using a Mobile App-Based Community in Breast Cancer Survivors. Front. Oncol. 2020, 9, 1–8. [Google Scholar] [CrossRef] [PubMed]
  11. Kapoor, A.; Nambisan, P.; Baker, E. Mobile applications for breast cancer survivorship and self-management: A systematic review. Health Inform. J. 2020, 26, 2892–2905. [Google Scholar] [CrossRef] [PubMed]
  12. Zhu, J.; Ebert, L.; Liu, X.; Chan, S.W.-C. A mobile application of breast cancer e-support program versus routine Care in the treatment of Chinese women with breast cancer undergoing chemotherapy: Study protocol for a randomized controlled trial. BMC Cancer 2017, 17, 291. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  13. Ma, J.; Shang, P.; Lu, C.; Meraghni, S.; Benaggoune, K.; Zuluaga, J.; Zerhouni, N.; Devalland, C.; Al Masry, Z. A portable breast cancer detection system based on smartphone with infrared camera. Vibroengineering Procedia 2019, 26, 57–63. [Google Scholar] [CrossRef]
  14. Majeed, B.; Iqbal, H.T.; Khan, U.; Bin Altaf, M.A. A Portable Thermogram Based Non-Contact Non-Invasive Early Breast-Cancer Screening Device. In Proceedings of the 2018 IEEE Biomedical Circuits and Systems Conference (BioCAS), Cleveland, OH, USA, 17–19 October 2018; pp. 1–4. [Google Scholar]
  15. Al Husaini, M.A.S.; Habaebi, M.H.; Gunawan, T.S.; Islam, R.; Elsheikh, E.A.A.; Suliman, F.M. Thermal-based early breast cancer detection using inception V3, inception V4 and modified inception MV4. Neural Comput. Appl. 2021, 1, 1–16. [Google Scholar] [CrossRef]
  16. Namasudra, S. Data Access Control in the Cloud Computing Environment for Bioinformatics. Int. J. Appl. Res. Bioinform. 2021, 11, 40–50. [Google Scholar] [CrossRef]
  17. Helbostad, J.L.; Vereijken, B.; Becker, C.; Todd, C.; Taraldsen, K.; Pijnappels, M.; Aminian, K.; Mellone, S. Mobile Health Applications to Promote Active and Healthy Ageing. Sensors 2017, 17, 622. [Google Scholar] [CrossRef]
  18. Zhang, X.; He, Y.; Chady, T.; Tian, G.Y.; Gao, J.; Wang, H.; Chen, S. CFRP Impact Damage Inspection Based on Manifold Learning Using Ultrasonic Induced Thermography. IEEE Trans. Ind. Inform. 2018, 15, 2648–2659. [Google Scholar] [CrossRef]
  19. Ma, W.K.; Borgen, R.; Kelly, J.; Millington, S.; Hilton, B.; Aspin, R.; Lança, C.; Hogg, P. Blurred digital mammography images: An analysis of technical recall and observer detection performance. Br. J. Radiol. 2017, 90, 20160271. [Google Scholar] [CrossRef] [Green Version]
  20. Gaster, B.R.; Howes, L.; Kaeli, D.R.; Mistry, P.; Schaa, D. Introduction to Parallel Programming. In Heterogeneous Computing with OpenCL; Elsevier Inc.: Amsterdam, The Netherlands, 2013; pp. 1–13. [Google Scholar] [CrossRef]
  21. Howard, A.G. Some improvements on deep convolutional neural network based image classification. In Proceedings of the 2nd International Conference on Learning Representations, ICLR 2014, Banff, AB, Canada, 14–16 April 2014. [Google Scholar]
  22. Oh, T.; Park, J.; Seshadrinathan, K.; Lee, S.; Bovik, A.C. No-Reference Sharpness Assessment of Camera-Shaken Images by Analysis of Spectral Structure. IEEE Trans. Image Process. 2014, 23, 5428–5439. [Google Scholar] [CrossRef] [PubMed]
  23. Glowacz, A. Ventilation Diagnosis of Angle Grinder Using Thermal Imaging. Sensors 2021, 21, 2853. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Self-detection of breast cancer based on smartphone application with infrared camera.
Figure 2. Examination room (a) thermal camera with one-meter distance from the chair, (b) patient exam.
Figure 3. Thermal Images validated after training a deep convolutional neural network inception MV4.
Figure 4. A GUIDE for breast cancer diagnosis from thermal images.
Figure 5. Diagnostic results on the application interface.
Figure 6. Block diagram of the proposed research.
Figure 7. Screenshot of AirDroid App.
Figure 8. A GUIDE for BCD.
Figure 9. Thermal Image from Shiraz Medical Center for breast cancer.
Figure 10. (a) Original Thermal Image from Shiraz Medical Center for Breast Cancer & (b) Blurry Thermal Image.
Figure 11. (a) Original Thermal Image from Shiraz Medical Center for Breast Cancer & (b) Flipped Thermal Image.
Figure 12. (a) Original Thermal Image from Shiraz Medical Center for Breast Cancer & (b) Shaken Thermal Image.
Figure 13. (a) Original Thermal Image from Shiraz Medical Center for Breast Cancer & (b) Tilted Thermal Image.
Table 1. Benchmarking Inception V3, V4, and MV4 on the app.
| Configuration | Inception V3 | Inception V4 | Inception MV4 |
| --- | --- | --- | --- |
| Augmentation | randomly flip training images along the vertical axis; randomly translate up to 30 pixels; scale up to 10% horizontally and vertically | same as V3 | same as V3 |
| Classifier head | Global Average Pooling + Fully Connected (2048) + SoftMax | Global Average Pooling + Dropout (0.8) + Fully Connected (1536) + SoftMax | Global Average Pooling + Dropout (0.8) + Fully Connected (1536) + SoftMax |
| Frozen layers | first 10 convolution layers | first 10 convolution layers | first 10 convolution layers |
| Number of parameters | 21,806,882 | 156,042,082 | 128,174,466 |
| Optimization method | ADAM | SGDM | SGDM |
| Database | 1874 thermal images from DMR-IR (70% training, 30% testing) | same as V3 | same as V3 |
| Learning rate | 1e−4 | 1e−4 | 1e−4 |
| Software | MATLAB | MATLAB | MATLAB |
| Average accuracy | 98.104% | 99.712% | 99.748% |
| Error | ±1.52% | ±0.27% | ±0.18% |
| Training time per epoch | 36.376 min (±0.015 min) | 9.554 min (±0.145 min) | 7.704 min (±0.01 min) |
Table 2. Detection accuracy and quality of thermal images (DMR-IR database) after effects.
Images 1–3 were classified as healthy and Images 4–6 as cancer. The quality metrics were identical for Cable/Data transfer and for every WiFi scenario (1 m, 5 m, 7 m, one wall, two walls, roof, roof and one wall, roof and two walls):

| Metric | Image 1 | Image 2 | Image 3 | Image 4 | Image 5 | Image 6 |
| --- | --- | --- | --- | --- | --- | --- |
| MSE | 0 | 0 | 0 | 0 | 0 | 0 |
| PSNR | Inf | Inf | Inf | Inf | Inf | Inf |
| AD | 0 | 0 | 0 | 0 | 0 | 0 |
| SC | 1 | 1 | 1 | 1 | 1 | 1 |
| NK | 1 | 1 | 1 | 1 | 1 | 1 |
| MD | 0 | 0 | 0 | 0 | 0 | 0 |
| LMSE | 0 | 0 | 0 | 0 | 0 | 0 |
| NAE | 0 | 0 | 0 | 0 | 0 | 0 |
| Accuracy (%) | 100 | 100 | 100 | 99.9998 | 99.9998 | 99.9999 |
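Tables 2 and 3 report eight full-reference image-quality metrics. As a rough illustration (not the authors' MATLAB code), the NumPy sketch below implements common textbook definitions of these metrics; the exact formulas, the wrap-around Laplacian, and the `peak` value of 255 are assumptions. For identical sent and received images it reproduces the tabulated values (MSE 0, PSNR Inf, SC 1, NK 1, and so on).

```python
import numpy as np

def laplacian(x):
    """4-neighbour Laplacian with wrap-around edges (one common choice)."""
    return (np.roll(x, 1, 0) + np.roll(x, -1, 0) +
            np.roll(x, 1, 1) + np.roll(x, -1, 1) - 4 * x)

def quality_metrics(ref, rec, peak=255.0):
    """Full-reference metrics comparing a received image against the original."""
    ref, rec = ref.astype(float), rec.astype(float)
    d = ref - rec
    mse = float(np.mean(d ** 2))
    lr = laplacian(ref)
    return {
        "MSE":  mse,
        "PSNR": float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse),
        "AD":   float(np.mean(d)),                        # average difference
        "SC":   float(np.sum(ref**2) / np.sum(rec**2)),   # structural content
        "NK":   float(np.sum(ref*rec) / np.sum(ref**2)),  # normalized cross-corr.
        "MD":   float(np.max(np.abs(d))),                 # maximum difference
        "LMSE": float(np.sum((lr - laplacian(rec))**2) / np.sum(lr**2)),
        "NAE":  float(np.sum(np.abs(d)) / np.sum(np.abs(ref))),
    }
```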
Table 3. Detection accuracy and quality of thermal images (FLIR one PRO) after effects.
All five FLIR One Pro images were classified as cancer. The quality metrics were identical for Cable/Data transfer and for every WiFi scenario (1 m, 5 m, 7 m, one wall, two walls, roof, roof and one wall, roof and two walls):

| Metric | Image 1 | Image 2 | Image 3 | Image 4 | Image 5 |
| --- | --- | --- | --- | --- | --- |
| MSE | 0 | 0 | 0 | 0 | 0 |
| PSNR | Inf | Inf | Inf | Inf | Inf |
| AD | 0 | 0 | 0 | 0 | 0 |
| SC | 1 | 1 | 1 | 1 | 1 |
| NK | 1 | 1 | 1 | 1 | 1 |
| MD | 0 | 0 | 0 | 0 | 0 |
| LMSE | 0 | 0 | 0 | 0 | 0 |
| NAE | 0 | 0 | 0 | 0 | 0 |
| Accuracy (%) | 99.9451 | 99.2805 | 99.9593 | 99.4079 | 97.6108 |
Table 4. Compressed DMRIR image and check quality.
| Compression | Metric | Healthy 1 | Healthy 2 | Healthy 3 | Cancer 1 | Cancer 2 |
| --- | --- | --- | --- | --- | --- | --- |
| 5% | MSE | 21.154531 | 20.738177 | 22.212344 | 11.707760 | 13.383073 |
| 5% | PSNR | 34.876770 | 34.963098 | 34.664860 | 37.446065 | 36.865245 |
| 5% | AD | −0.074635 | −0.101302 | −0.099323 | 0.006406 | −0.029844 |
| 5% | SC | 0.994841 | 0.995139 | 0.994897 | 0.996824 | 0.996018 |
| 5% | NK | 1.002299 | 1.002157 | 1.002257 | 1.001439 | 1.001821 |
| 5% | MD | 29 | 34 | 29 | 22 | 24 |
| 5% | LMSE | 0.050918 | 0.049811 | 0.050031 | 0.033628 | 0.034838 |
| 5% | NAE | 0.018818 | 0.018412 | 0.019231 | 0.012792 | 0.013845 |
| 5% | Accuracy (%) | 100 | 100 | 100 | 100 | 100 |
| 15% | MSE | 21.154531 | 20.738177 | 22.212344 | 20.570000 | 21.878854 |
| 15% | PSNR | 34.876770 | 34.963098 | 34.664860 | 34.998461 | 34.730558 |
| 15% | AD | −0.074635 | −0.101302 | −0.099323 | −0.050521 | 0.011979 |
| 15% | SC | 0.994841 | 0.995139 | 0.994897 | 0.997278 | 0.998356 |
| 15% | NK | 1.002299 | 1.002157 | 1.002257 | 1.001094 | 1.000533 |
| 15% | MD | 29 | 34 | 29 | 29 | 36 |
| 15% | LMSE | 0.050918 | 0.049811 | 0.050031 | 0.044961 | 0.046356 |
| 15% | NAE | 0.018818 | 0.018412 | 0.019231 | 0.017210 | 0.018037 |
| 15% | Accuracy (%) | 100 | 100 | 100 | 99.9999 | 99.9999 |
| 26% | MSE | 56.750885 | 52.876302 | 56.055312 | 20.570000 | 21.878854 |
| 26% | PSNR | 30.591077 | 30.898193 | 30.644636 | 34.998461 | 34.730558 |
| 26% | AD | −0.084948 | −0.159635 | −0.147604 | −0.050521 | 0.011979 |
| 26% | SC | 0.995986 | 0.995805 | 0.996534 | 0.997278 | 0.998356 |
| 26% | NK | 1.001228 | 1.001378 | 1.000963 | 1.001094 | 1.000533 |
| 26% | MD | 65 | 46 | 58 | 29 | 36 |
| 26% | LMSE | 0.106175 | 0.095837 | 0.098707 | 0.044961 | 0.046356 |
| 26% | NAE | 0.030304 | 0.029065 | 0.029868 | 0.017210 | 0.018037 |
| 26% | Accuracy (%) | 100 | 100 | 100 | 99.9999 | 99.9999 |
Table 5. Compressed FLIR one pro image and check quality.
| Compression | Metric | Cancer 1 | Cancer 2 | Cancer 3 | Cancer 4 | Cancer 5 |
| --- | --- | --- | --- | --- | --- | --- |
| 5% | MSE | 0.649030 | 0.675719 | 0.585162 | 0.829629 | 2.590069 |
| 5% | PSNR | 50.008153 | 49.833142 | 50.458039 | 48.941965 | 43.997690 |
| 5% | AD | 0.001027 | −0.003398 | −0.004354 | 0.002947 | −0.017099 |
| 5% | SC | 0.999920 | 0.999845 | 0.999872 | 0.999974 | 0.999875 |
| 5% | NK | 1.000030 | 1.000067 | 1.000055 | 1.000004 | 1.000037 |
| 5% | MD | 6 | 6 | 7 | 7 | 13 |
| 5% | LMSE | 0.069368 | 0.246863 | 0.075993 | 0.176870 | 0.226512 |
| 5% | NAE | 0.002163 | 0.002171 | 0.001914 | 0.002665 | 0.005469 |
| 5% | Accuracy (%) | 99.95 | 99.3311 | 99.9608 | 99.4006 | 97.469 |
| 15% | MSE | 0.649030 | 0.675719 | 0.585162 | 1.988979 | 2.590069 |
| 15% | PSNR | 50.008153 | 49.833142 | 50.458039 | 45.144501 | 43.997690 |
| 15% | AD | 0.001027 | −0.003398 | −0.004354 | 0.001004 | −0.017099 |
| 15% | SC | 0.999920 | 0.999845 | 0.999872 | 0.999964 | 0.999875 |
| 15% | NK | 1.000030 | 1.000067 | 1.000055 | 0.999996 | 1.000037 |
| 15% | MD | 6 | 6 | 7 | 11 | 13 |
| 15% | LMSE | 0.069368 | 0.246863 | 0.075993 | 0.301946 | 0.226512 |
| 15% | NAE | 0.002163 | 0.002171 | 0.001914 | 0.004966 | 0.005469 |
| 15% | Accuracy (%) | 99.95 | 99.3311 | 99.9608 | 99.4275 | 97.469 |
| 26% | MSE | 0.716193 | 0.675719 | 0.585162 | 2.520331 | 3.415912 |
| 26% | PSNR | 49.580501 | 49.833142 | 50.458039 | 44.116228 | 42.795737 |
| 26% | AD | 0.001384 | −0.003398 | −0.004354 | 0.000240 | −0.008514 |
| 26% | SC | 0.999915 | 0.999845 | 0.999872 | 0.999955 | 0.999934 |
| 26% | NK | 1.000031 | 1.000067 | 1.000055 | 0.999995 | 1 |
| 26% | MD | 6 | 6 | 7 | 13 | 15 |
| 26% | LMSE | 0.073986 | 0.246863 | 0.075993 | 0.441004 | 0.386388 |
| 26% | NAE | 0.002374 | 0.002171 | 0.001914 | 0.005745 | 0.006382 |
| 26% | Accuracy (%) | 99.9491 | 99.3311 | 99.9608 | 99.4223 | 97.5137 |
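Tables 4 and 5 show that heavier compression raises MSE and lowers PSNR while the classifier's accuracy barely moves. The toy NumPy sketch below illustrates the quality trend only, substituting uniform quantization for JPEG compression (an assumption for illustration): coarser quantization steps give larger MSE and smaller PSNR.

```python
import numpy as np

def quantize(img, step):
    # Crude stand-in for lossy compression: larger steps discard more detail.
    return np.round(img / step) * step

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64)).astype(float)

results = {}
for step in (2, 8, 32):  # heavier quantization ~ stronger compression
    mse = float(np.mean((img - quantize(img, step)) ** 2))
    psnr = 10 * np.log10(255.0 ** 2 / mse)
    results[step] = (mse, psnr)
```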
Table 6. Detection accuracy of thermal images (DMR-IR DATABASE) after effects.
All entries are detection accuracy (%).

| Image Effect | Image 1 (Healthy) | Image 2 (Healthy) | Image 3 (Healthy) | Image 4 (Cancer) | Image 5 (Cancer) | Image 6 (Cancer) |
| --- | --- | --- | --- | --- | --- | --- |
| Blurry images | 99.8514 | 99.742 | 99.9156 | 100 | 100 | 100 |
| Tilted images | 98.6175 | 98.4814 | 98.786 | 88.7197 | 90.3664 | 90.9845 |
| Shaken images | 99.9988 | 99.9999 | 99.9995 | 100 | 100 | 100 |
| Flipped images | 100 | 100 | 100 | 99.9998 | 99.9998 | 99.9999 |
Table 7. Detection accuracy of thermal images (Flir one PRO) after effects.
All entries are detection accuracy (%); all five images are classified as cancer.

| Image Effect | Image 1 | Image 2 | Image 3 | Image 4 | Image 5 |
| --- | --- | --- | --- | --- | --- |
| Blurry images | 99.981 | 99.6775 | 99.974 | 99.6294 | 97.8939 |
| Tilted images | 99.8634 | 98.173 | 99.5872 | 99.9466 | 99.0886 |
| Shaken images | 99.9582 | 99.4484 | 99.9594 | 99.5151 | 97.9629 |
| Flipped images | 99.9933 | 99.6179 | 99.9967 | 99.8228 | 98.7214 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Al Husaini, M.A.S.; Hadi Habaebi, M.; Gunawan, T.S.; Islam, M.R. Self-Detection of Early Breast Cancer Application with Infrared Camera and Deep Learning. Electronics 2021, 10, 2538. https://doi.org/10.3390/electronics10202538
