**5. Conclusions**

This paper presents several approaches that use the accelerometer commonly available in mobile devices for ADL recognition. The main contribution of this work is a comparative study of different ANN implementations to find the most appropriate method for ADL identification using only accelerometer data. The comparative study performed in this research recommends the use of a DNN for the recognition of ADLs. We proposed implementing the trained DNN, built with the DeepLearning4j framework, in a system that identifies ADLs using only the accelerometer available in off-the-shelf mobile devices. The results show an accuracy of 85.89%, a *precision* of 86.21%, a *recall* of 85.89%, and an *F1 score* of 86.05% using the following features: the five largest distances between the maximum peaks; the mean, standard deviation, variance, and median of the maximum peaks; and the standard deviation, mean, maximum, minimum, variance, and median of the raw signal.
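The feature set listed above can be sketched as follows. This is an illustrative Python outline, not the authors' implementation (which was built with DeepLearning4j): the function names and the simple local-maximum peak detector are assumptions introduced here for clarity.

```python
import statistics

def find_peaks(signal):
    """Indices of simple local maxima (strictly above both neighbours).
    Illustrative only; a production system would use a smoothed detector."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > signal[i - 1] and signal[i] > signal[i + 1]]

def extract_features(signal):
    """Build the 15-value feature vector described in the text:
    five largest inter-peak distances, four peak statistics,
    and six raw-signal statistics."""
    peaks = find_peaks(signal)
    peak_values = [signal[i] for i in peaks]

    # Five largest distances between consecutive maximum peaks,
    # zero-padded when fewer than five gaps exist.
    gaps = [b - a for a, b in zip(peaks, peaks[1:])]
    largest_gaps = sorted(gaps, reverse=True)[:5]
    largest_gaps += [0] * (5 - len(largest_gaps))

    # Mean, standard deviation, variance, and median of the peak values.
    if peak_values:
        peak_stats = [statistics.mean(peak_values),
                      statistics.pstdev(peak_values),
                      statistics.pvariance(peak_values),
                      statistics.median(peak_values)]
    else:
        peak_stats = [0.0] * 4

    # Standard deviation, mean, max, min, variance, and median of the raw signal.
    raw_stats = [statistics.pstdev(signal),
                 statistics.mean(signal),
                 max(signal), min(signal),
                 statistics.pvariance(signal),
                 statistics.median(signal)]

    return largest_gaps + peak_stats + raw_stats
```

Such a vector would then be fed, one per fixed-length acquisition window, to the trained DNN classifier.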

Nevertheless, this study has some limitations concerning the use of mobile devices. The lack of research on the best position of the mobile device for data collection remains an open question. Moreover, the energy cost of processing relative to the data-acquisition frequency is also a significant challenge, which the authors addressed by using only accelerometer data. The authors verified that the overfitting problem is not avoided, but the results obtained using only accelerometer data are similar to those obtained with multiple sensors. Additionally, the authors found that using only one sensor and a smaller number of features for training the ANN does not significantly decrease the accuracy of the results, while consuming fewer computational resources and less energy on the mobile device than the use of multiple sensors.

As future work, other implementation settings involving different machine learning methods will be studied. These implementations will include other types of classifiers, *e.g.*, ensemble learning methods and decision trees, to verify whether different approaches achieve better results on our dataset. The dataset is publicly available, so other authors can use it to compare their methods.

**Author Contributions:** Conceptualization, methodology, software, validation, formal analysis, investigation, writing—original draft preparation, and writing—review and editing: I.M.P., G.M., N.M.G., F.F.-R., M.C.T., E.Z., S.S., and M.C. All authors have read and agreed to the published version of the manuscript.

**Funding:** This work is funded by FCT/MCTES through national funds, and when applicable, co-funded EU funds under the project **UIDB/EEA/50008/2020** (*Este trabalho é financiado pela FCT/MCTES através de fundos nacionais e quando aplicável cofinanciado por fundos comunitários no âmbito do projeto UIDB/EEA/50008/2020*).

**Acknowledgments:** This article/publication is based on work from COST Action IC1303: AAPELE—Architectures, Algorithms and Protocols for Enhanced Living Environments, and COST Action CA16226: SHELD-ON—Indoor living space improvement: Smart Habitat for the Elderly, supported by COST (European Cooperation in Science and Technology). More information at www.cost.eu. COST is a funding agency for research and innovation networks. Its Actions help connect research initiatives across Europe and enable scientists to grow their ideas by sharing them with their peers, boosting their research, careers, and innovation.

**Conflicts of Interest:** The authors declare no conflict of interest.
