**6. Conclusions**

The development of a framework for the recognition of ADL [1] and their environments using mobile sensors, including the accelerometer, gyroscope, magnetometer and microphone, following the architecture presented in References [5–7], comprises several steps: data acquisition, data processing, data fusion and classification. At this stage of development, the ADL identified are running, walking, standing, going upstairs, going downstairs and sleeping, and the environments identified are bar, classroom, gym, kitchen, library, street, hall, watching TV and bedroom.
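To make the order of these steps explicit, the following is a minimal sketch of the pipeline; every function name and body in it is hypothetical and only mirrors the sequence of stages, not the authors' actual implementation:

```python
import numpy as np

# Hypothetical stage stubs mirroring the framework's steps; illustrative only.
def acquire() -> np.ndarray:
    """Data acquisition: one window of multi-sensor samples."""
    return np.random.randn(200, 4)  # placeholder for sensor streams

def process(window: np.ndarray) -> np.ndarray:
    """Data processing: extract per-sensor features from the window."""
    return np.stack([window.mean(axis=0), window.std(axis=0)])

def fuse(features: np.ndarray) -> np.ndarray:
    """Data fusion: merge per-sensor features into a single vector."""
    return features.ravel()

def classify(vector: np.ndarray) -> str:
    """Classification: a trained ANN would map the vector to an ADL or environment."""
    return "walking"  # placeholder label

print(classify(fuse(process(acquire()))))
```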

Depending on the type of sensor, several features were extracted from the sensors' data for further processing. The features extracted from the microphone are 26 MFCC coefficients and the standard deviation, average, maximum, minimum, variance and median of the raw signal. From the motion and magnetic sensors, we extracted the same features as in the previous study [4]. The method developed should adapt to the number of sensors available in off-the-shelf mobile devices and to the limited resources of these devices.
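As an illustration, the following sketch computes the microphone features listed above. It assumes Python with librosa and NumPy, which the mobile framework itself does not use, and it averages the MFCC coefficients over all frames of the clip, which is an assumption rather than the paper's documented aggregation:

```python
import numpy as np
import librosa  # assumed here for MFCC extraction; not part of the original framework

def microphone_features(signal: np.ndarray, sample_rate: int) -> np.ndarray:
    """26 MFCC coefficients plus six statistics of the raw signal."""
    # 26 MFCC coefficients, averaged over all frames (aggregation is an assumption)
    mfcc = librosa.feature.mfcc(y=signal, sr=sample_rate, n_mfcc=26).mean(axis=1)
    # Standard deviation, average, maximum, minimum, variance and median of the raw signal
    stats = np.array([signal.std(), signal.mean(), signal.max(),
                      signal.min(), signal.var(), np.median(signal)])
    return np.concatenate([mfcc, stats])

# Usage with 5 seconds of synthetic audio at 16 kHz
features = microphone_features(np.random.randn(16000 * 5).astype(np.float32), 16000)
print(features.shape)  # (32,)
```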

Consistent with the previous studies [4,16], this research compares three implementations of ANN: MLP with Backpropagation, FNN with Backpropagation, and DNN. The DNN is the best method for the recognition of general ADL and standing activities, whereas the FNN with Backpropagation is the best method for the recognition of environments. The parameters of the ANN implementations are detailed in Reference [4].
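For illustration only, the sketch below contrasts a shallow backpropagation-trained network with a deeper network using L2 regularization. The library (scikit-learn), layer sizes, training settings and synthetic data are all assumptions; the actual architectures and parameters are those detailed in Reference [4]:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the extracted feature vectors (the real data are in [63])
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 32))
y = rng.integers(0, 5, size=600)  # five hypothetical ADL classes
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Shallow feedforward network trained with backpropagation (stand-in for MLP/FNN)
shallow = MLPClassifier(hidden_layer_sizes=(30,), max_iter=500, random_state=0)

# Deeper network with L2 regularization via the alpha penalty (stand-in for the DNN)
deep = MLPClassifier(hidden_layer_sizes=(64, 64, 32), alpha=1e-3,
                     max_iter=500, random_state=0)

for name, model in [("shallow", shallow), ("deep + L2", deep)]:
    model.fit(X_train, y_train)
    print(name, model.score(X_test, y_test))
```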

The accuracies of the recognition of ADL and their environments differ across the stages of the framework. Firstly, the best accuracy for the recognition of general ADL, reported in previous studies [4,16], is 85.89%, obtained with the DNN using L2 regularization and normalized data. Secondly, the best accuracy for the recognition of environments is 86.50%, obtained with the FNN with Backpropagation using non-normalized data. Finally, the recognition of standing activities is always around 100% with all implementations studied, but, given its performance, the best method for implementation in the framework is the DNN using L2 regularization and normalized data.
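The normalization referred to above could take the form of, for example, per-feature min-max scaling; this is an assumption for illustration, as the exact preprocessing is detailed in Reference [4]:

```python
import numpy as np

def min_max_normalize(X: np.ndarray) -> np.ndarray:
    """Scale each feature column to [0, 1]; a hypothetical stand-in for the
    normalization step, whose exact form is detailed in Reference [4]."""
    x_min, x_max = X.min(axis=0), X.max(axis=0)
    span = np.where(x_max > x_min, x_max - x_min, 1.0)  # avoid division by zero
    return (X - x_min) / span
```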

As future work, we intend to further develop the framework for the identification of ADL and their environments, adapting the method to the number of sensors available on the mobile device. The recognition of environments allows the framework to identify the indoor/outdoor locations where the ADL were performed. Environment recognition can also improve the recognition of ADL and increase the number of ADL recognized. The data related to this research are available in a free repository [63].

**Author Contributions:** Conceptualization, methodology, software, validation, formal analysis, investigation, writing—original draft preparation, writing—review and editing: I.M.P., G.M., N.M.G., N.P., F.F.R., S.S., M.C.T. and E.Z.

**Funding:** This work is funded by FCT/MEC through national funds and co-funded by the FEDER-PT2020 partnership agreement under the project **UID/EEA/50008/2019**.

**Acknowledgments:** This work is funded by FCT/MEC through national funds and, when applicable, co-funded by the FEDER-PT2020 partnership agreement under the project **UID/EEA/50008/2019**. This article is based upon work from COST Action IC1303-AAPELE (Architectures, Algorithms and Protocols for Enhanced Living Environments) and COST Action CA16226-SHELD-ON (Indoor living space improvement: Smart Habitat for the Elderly), supported by COST (European Cooperation in Science and Technology). More information at www.cost.eu.

**Conflicts of Interest:** The authors declare no conflicts of interest.
