**1. Introduction**

Breast cancer is one of the most common and serious diseases threatening women's health. According to [1], female breast cancer (around 2.3 million new cases in 2020, i.e., 11.7% of all cancer cases in women, and 6.9% of all cancer-related deaths in women) and lung cancer (11.4% of cancer cases and 18% of cancer-related deaths in women) are the cancer types with the highest incidence. Breast cancer begins as an uncontrolled multiplication of breast cells that proliferate in the breast tissues. Tumors can be classified as benign (an abnormality of the breast tissue that does not threaten the patient's life) or malignant (potentially life-threatening).

**Citation:** Anghelache Nastase, I.-N.; Moldovanu, S.; Moraru, L. Image Moment-Based Features for Mass Detection in Breast US Images via Machine Learning and Neural Network Classification Models. *Inventions* **2022**, *7*, 42. https://doi.org/10.3390/inventions7020042

Academic Editor: Shoou-Jinn Chang

Received: 3 May 2022; Accepted: 14 June 2022; Published: 15 June 2022

**Copyright:** © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

Breast cancer is examined and investigated through many techniques, such as computerized tomography, histopathological imaging, magnetic resonance imaging, mammography, and breast ultrasound (BUS). BUS is a preferred tool for early breast cancer screening because it provides a significant amount of information in a short time [2]. Breast ultrasound imaging is well suited to the early detection of breast cancer: it is noninvasive and nonradioactive, making it a practical and feasible approach that yields a direct, cost-effective improvement in patient care. Early diagnosis based on ultrasound can significantly improve the cure rate. However, BUS images have poor resolution, exhibit highly variable features, and contain speckle noise. Moreover, reading these images demands a high degree of professionalism from radiologists to overcome the subjectivity of the analysis.

Techniques from the computer-aided diagnosis and deep learning fields can bring advances in this area and provide a more accurate judgment for the clinical diagnosis of breast cancer [3–6]. Automatic segmentation of breast ultrasound images (as a method to delineate anomalies containing suspicious regions of interest) and extraction of image features describing the tumor shape, size, and texture provide valuable references for classification and a much-improved clinical diagnosis of breast cancer [7–11]. The best features are then input into a classifier to establish the category of the tumors, followed by an evaluation of the accuracy of the method [12]. The accuracy of the image segmentation directly influences the performance of both feature extraction and classification of the tumors. Even though segmentation is a fundamental task in medical image analysis, the segmentation of BUS images is difficult because (i) BUS images are affected by speckle noise, have low contrast and a low peak signal-to-noise ratio, and contain tissue-dependent speckle artefacts, and (ii) there is large inter-patient variability in the breast anatomical structures. Moreover, the ground-truth images used to evaluate segmentation performance are generated from manually delineated boundaries.
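The paper does not name a specific metric for scoring a segmentation against the manually delineated ground-truth mask; as an illustrative sketch, the commonly used Dice overlap coefficient can be computed as follows (the masks and values here are toy examples, not from the BUSI dataset):

```python
import numpy as np

def dice_coefficient(pred, gt):
    """Dice overlap between a predicted binary mask and the
    manually delineated ground-truth mask (1.0 = perfect match)."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    denom = pred.sum() + gt.sum()
    return 2.0 * intersection / denom if denom else 1.0

# Two 4x4 squares shifted by one pixel: 16 px each, 3x3 = 9 px overlap.
gt = np.zeros((8, 8), dtype=int); gt[2:6, 2:6] = 1
pred = np.zeros((8, 8), dtype=int); pred[3:7, 3:7] = 1
print(dice_coefficient(pred, gt))  # 2*9 / (16+16) = 0.5625
```

Other overlap measures (e.g., Jaccard/IoU) follow the same pattern; Dice is shown here only because it is a standard choice for binary medical masks.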

This paper uses a machine learning- and deep learning-based system to assist doctors in making a more accurate judgment when classifying breast tumors as benign or malignant using BUS images. To test our classification strategy, we propose a method that integrates Hu's moments into the breast tumor analysis as handcrafted and meaningful features [13]. Our method uses the tumor masks provided as ground-truth images in a publicly available BUSI image dataset (Breast Ultrasound Images Dataset (BUSI dataset) https://academictorrents.com/details/d0b7b7ae40610bbeaea385aeb51658f527c86a16, accessed on 1 December 2021). Hu's moments provide attributes related to the shape of the tumors regardless of scale, location, and orientation; they are used to extract the shape characteristics from the BUS images. However, Hu's moments incorporate redundant information, so it is necessary to investigate the significance of the diagnostic difference contributed by each shape feature. We performed a statistical test (*t*-test, *p*-value) to assess whether each of Hu's moments differentiates benign from malignant tumors in the analyzed BUS images [14]. The *t*-test (*p* < 0.05, i.e., a 95% confidence level) is simple to apply and quickly indicates whether an analyzed feature is meaningful for the classification or less relevant. The retained features are then used to train a k-NN classifier. The k-NN technique uses instance-based learning [6] to identify group membership by exploiting feature similarity [15]. Moreover, the same selected features are fed into a radial basis function neural network (RBFNN), which discriminates between benign and malignant tumors with the highest accuracy. Multiclassification aids in gathering precise information about the cancer's current state and, equally, supports an informed diagnosis decision [16].
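As a sketch of the shape features involved, Hu's seven invariant moments can be computed directly from a binary tumor mask via central and scale-normalized moments. The NumPy implementation below is illustrative (not the authors' code, which presumably relies on a library routine such as OpenCV's `cv2.HuMoments`); a toy elliptical mask stands in for a BUSI ground-truth mask to demonstrate the invariance to orientation:

```python
import numpy as np

def hu_moments(mask):
    """First seven Hu invariant moments of a 2-D binary mask."""
    ys, xs = np.nonzero(mask)
    m00 = float(len(xs))                       # zeroth raw moment (area)
    x, y = xs - xs.mean(), ys - ys.mean()      # centered coordinates

    def eta(p, q):                             # scale-normalized central moment
        return np.sum(x**p * y**q) / m00 ** (1 + (p + q) / 2)

    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    h1 = n20 + n02
    h2 = (n20 - n02)**2 + 4*n11**2
    h3 = (n30 - 3*n12)**2 + (3*n21 - n03)**2
    h4 = (n30 + n12)**2 + (n21 + n03)**2
    h5 = ((n30 - 3*n12)*(n30 + n12)*((n30 + n12)**2 - 3*(n21 + n03)**2)
          + (3*n21 - n03)*(n21 + n03)*(3*(n30 + n12)**2 - (n21 + n03)**2))
    h6 = ((n20 - n02)*((n30 + n12)**2 - (n21 + n03)**2)
          + 4*n11*(n30 + n12)*(n21 + n03))
    h7 = ((3*n21 - n03)*(n30 + n12)*((n30 + n12)**2 - 3*(n21 + n03)**2)
          - (n30 - 3*n12)*(n21 + n03)*(3*(n30 + n12)**2 - (n21 + n03)**2))
    return np.array([h1, h2, h3, h4, h5, h6, h7])

# Toy elliptical "tumor" mask and a 90-degree rotation of it:
yy, xx = np.mgrid[0:64, 0:64]
ellipse = (((xx - 32) / 20.0)**2 + ((yy - 32) / 10.0)**2 <= 1.0).astype(int)
rotated = np.rot90(ellipse)

# The seven invariants are unchanged by the rotation:
print(np.allclose(hu_moments(ellipse), hu_moments(rotated)))  # True
```

The resulting seven-element vector per mask is the kind of handcrafted feature set that can then be screened with a per-feature *t*-test (e.g., `scipy.stats.ttest_ind` on the benign vs. malignant groups) before being fed to the k-NN or RBFNN classifier.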

The remainder of the paper is organized as follows: Section 2 discusses related studies from the literature; Section 3 presents the proposed work, as well as a concise overview of the proposed method; Section 4 discusses the experimental results obtained by employing the proposed method, as well as its performance evaluation; finally, Section 5 concludes and discusses potential future work.
