**6. Conclusions**

In summary, this publication presents a review of the DNN-NILM literature. The scope of this review comprises publications that employ deep neural networks to disaggregate appliances from low-frequency data, i.e., data with sampling rates below the AC base frequency. Our motivation for this scope is our conviction that many applications could benefit from NILM, combined with the observation that low-frequency data will most likely become available at scale in the near future and the enormous success of DNNs in other application domains. We systematically discuss the many degrees of freedom of these approaches and what has already been explored in the literature along these dimensions. One of the main contributions is Table 2, which gives a structured overview of the main characteristics of all reviewed DNN-NILM approaches. The review part is followed by a discussion of selected DNN-NILM aspects and corresponding research gaps. We present a performance comparison with respect to reported MAE and F1-scores and observe several recurring elements in the best-performing approaches, namely data sampling intervals below 10 s, a large field of view, the use of GAN losses, multi-task learning, and post-processing. Subsequently, we discuss the benefits of multiple input features and multi-task learning together with related research gaps, highlight the need for comparative studies, and point out the elements still missing for a successful deployment of DNN-NILM approaches. Finally, we outline potential future scenarios for the NILM field. Such a contribution has so far been missing from the literature and can therefore be of value. We conclude that many worthwhile research questions remain to be pursued.

**Supplementary Materials:** To facilitate future work based on the data collected for this publication, we release Table 2 as an MS Excel file. We also provide the data and code used to generate Figures 3 and 4. All data and code are available at https://github.com/ihomelab/dnn4nilm\_overview (accessed on 11 January 2021).

**Author Contributions:** Conceptualization, P.H., A.R. and A.P.; data curation, P.H.; formal analysis, P.H.; funding acquisition, A.R. and A.P.; investigation, P.H.; project administration, A.R.; supervision, A.R. and A.P.; visualization, P.H.; writing, P.H.; writing—review and editing, P.H., A.C., A.R. and A.P. All authors have read and agreed to the published version of the manuscript.

**Funding:** This research was funded by Innosuisse—Schweizerische Agentur für Innovationsförderung, grant number 36152.1 IP-EE, and the Lucerne University of Applied Sciences and Arts. The APC was funded by the Lucerne University of Applied Sciences and Arts.

**Institutional Review Board Statement:** Not applicable.

**Informed Consent Statement:** Not applicable.

**Data Availability Statement:** The data presented in this study are available at https://github.com/ihomelab/dnn4nilm\_overview (accessed on 11 January 2021).

**Acknowledgments:** We want to express our gratitude to Gianni Gugolz, who supported us in compiling Table 3.

**Conflicts of Interest:** The authors declare no conflict of interest.
