**1. Introduction**

The notion of a smart world, enabled by smart devices, smartphones, smart cars, smart homes, and smart cities (the paradigm of smart everything), has been a vigorously researched topic for many years. This concept holds that people and the world itself will be overlaid with sensing and actuation by means of the Internet of Things (IoT). Nowadays, the IoT is used in a large number of areas, such as government, industry, and academia [1], for different applications. For example, sensors are placed in buildings in an attempt to save energy [2,3]; wireless sensor networks (WSNs) are used in vehicular communications to improve safety and transportation [4], in home automation [5], and in industry [6]; and e-Health services rely on increased home sensing to support remote medicine and wellness [7].

Nowadays, more than half of the world's population lives in cities [8], with more than six devices per person connected to the Internet [9]. This means that billions of devices will be connected by 2020 to build the aforementioned smart city concept, ranging from end-user devices and wearables to vehicular communication systems, water and gas monitoring, smart lighting, structural monitoring, and smart healthcare systems, among others [10]. These solutions involve a high-density node environment, which in turn requires smaller outdoor and indoor cells, leading to heterogeneous networks (HetNets).

Moreover, the advent of next generation 5G communication systems implies the use of ultra-dense small cells in order to meet the coverage/capacity requirements inherent to the wide array of services to be offered [11]. In this context, multiple wireless communication systems can be employed to diversify service provision, leading to issues such as unpredictable interference sources or idle cell interference, which can greatly impact quality of service [12,13]. Interference analysis has also been considered as a potentially beneficial evaluation element that can be employed to enable covert communications in IoT scenarios [14].

The complexity of interference analysis stems from multiple factors, such as requested service types, heterogeneous service requirements, wireless system coordination, and user location and density. In this sense, HetNet architectures implement superimposed cell structures in order to provide adequate capacity as a function of the requested service type. However, this has a serious impact on overall interference levels, requiring cell coordination techniques such as the almost blank subframe, which minimizes transmission times [15], game theory approaches that provide a first order approximation of overall interference levels, or similar approaches to analyze multi-user multiple-input multiple-output (MU-MIMO) user assignment in 5G systems [16]. Not only do user distribution and service type have an impact on interference; hardware constraints, such as sampling rate mismatch between end-user devices and access points/base stations, also play a role in massive IoT deployments [17]. Along the same lines, interference can degrade the operation of massively deployed transceivers, such as LoRa, owing to the loss of ideal orthogonality, leading to increased packet loss [18]. In the future, these constraints can be further aggravated by the use of novel schemes, such as radio frequency (RF) energy harvesting, which in the case of IoT deployments must consider coverage/capacity relationships in order to comply both with minimum harvested energy thresholds and with uplink/downlink signal to interference ratios [19]. Hence, interference analysis and control are fundamental aspects to consider in scenarios with variable quality of service (QoS) requirements and variable user density and location, intrinsic to IoT applications [20]. Different solutions have recently been proposed to control overall interference levels, such as content-aware cognitive control [21], resource split full duplex mechanisms [22], and machine learning techniques that provide adaptive transmit power control in WSNs [23].
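As a simple illustration of the aggregate-interference arithmetic underlying these analyses (a minimal sketch, not taken from any of the cited works; all power values are hypothetical), the SINR at a receiver is obtained by summing the interfering powers in the linear (mW) domain before converting back to dB:

```python
import math

def dbm_to_mw(p_dbm):
    """Convert power in dBm to milliwatts."""
    return 10 ** (p_dbm / 10.0)

def sinr_db(signal_dbm, interferer_dbm_list, noise_dbm):
    """SINR in dB: interference and noise powers add in the linear domain."""
    interference_mw = sum(dbm_to_mw(p) for p in interferer_dbm_list)
    noise_mw = dbm_to_mw(noise_dbm)
    return signal_dbm - 10.0 * math.log10(interference_mw + noise_mw)

# Hypothetical link: -60 dBm desired signal, two co-channel interferers
# at -80 dBm and -85 dBm, and a -95 dBm noise floor
print(round(sinr_db(-60.0, [-80.0, -85.0], -95.0), 1))  # → 18.7
```

Note that averaging interferer levels directly in dBm would understate the aggregate interference, which is why the conversion to mW is needed.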

For the successful implementation of the aforementioned dense node deployments, reliable and accurate channel models are necessary, addressing the different topologies and radio links in order to guarantee service reliability, coverage, and capacity, as well as interference management. Furthermore, hot spots exhibit a non-uniform traffic demand, so realistic three-dimensional (3D) environments are necessary to achieve accurate models, which can lead to network performance improvement. The approaches followed to analyze interference in large areas with high node density are usually based on statistical channel modelling under certain model assumptions [13,24], taking scenario characteristics into account to some extent and eventually combined with measurement updates. Spatio-temporal techniques based on stochastic geometric models have also been proposed to analyze connection establishment phases in massive IoT deployments [25]. Measurement-based interference characterization in IoT scenarios, assisted by supervised learning, has also been proposed [26]. However, none of these approaches performs a complete analysis and system performance evaluation considering the whole morphology and topology of the scenario. Moreover, realistic wireless system operation exhibits complex behavior, depending on conditions such as the household/office environment, the wireless systems operating within the scenario under analysis, and the density of transceivers considered [27].
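To make the contrast with deterministic modelling concrete, the statistical approaches mentioned above typically reduce to a distance-based mean path loss plus a random shadowing term. The sketch below uses a generic log-distance model with log-normal shadowing; it is not the 3D-RL technique proposed in this work, and all parameter values are illustrative assumptions:

```python
import math
import random

def path_loss_db(d_m, d0_m=1.0, pl0_db=40.0, n=3.0, sigma_db=4.0, rng=random):
    """Log-distance path loss: PL(d) = PL(d0) + 10*n*log10(d/d0) + shadowing.

    pl0_db is the loss at reference distance d0_m, n is the path loss
    exponent, and sigma_db the shadowing standard deviation (all assumed).
    """
    shadowing = rng.gauss(0.0, sigma_db) if sigma_db > 0.0 else 0.0
    return pl0_db + 10.0 * n * math.log10(d_m / d0_m) + shadowing

def rss_dbm(tx_power_dbm, d_m, **kwargs):
    """Received signal strength under the model above."""
    return tx_power_dbm - path_loss_db(d_m, **kwargs)

# Deterministic part only (sigma_db=0): a 0 dBm transmitter seen at 10 m
print(rss_dbm(0.0, 10.0, sigma_db=0.0))  # → -70.0
```

Such models capture average behavior but, as noted above, cannot account for the specific morphology and topology of a scenario the way a full 3D deterministic simulation can.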

The analysis of wireless node density and of variations in wireless channel characteristics is relevant to applications such as wireless cooperative location systems [28] or passive location systems [29], in which wireless channel characteristics (line-of-sight/non-line-of-sight channel conditions as well as multipath propagation) influence ranging estimations and, hence, location performance. Network synchronization is another application in which node density impacts the operation of cooperative systems, influencing the value of the cooperative dilution intensity [30], which in turn depends on wireless channel conditions.

Energy analysis is a relevant aspect of the operation of wireless communication systems, and particularly of wireless sensor networks, given the inherent limitations imposed by restrictive energy sources, processing capabilities, and compact form factor requirements. In this sense, wireless sensor network energy balance analysis is essential for implementing efficient system level solutions, such as scheduling algorithms for sleep/active states that allow wireless sensor networks to operate under partial coverage conditions [31]. By studying the required coverage levels (given by receiver sensitivity thresholds, which are determined by transmission bit rates, adaptive modulation and coding schemes, and electronic device characteristics), transceivers can be dynamically set to sleep modes, resulting in effective energy reduction and, hence, enhanced operational lifetime. In this context, estimation of wireless channel behavior, in terms of coverage as well as of the distribution of interference sources, is relevant for analyzing the overall energy consumption impact, from the physical layer up to the medium access control and network layers.
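The lifetime gain from such sleep/active scheduling can be sketched with a simple duty-cycle estimate. All numeric figures below are illustrative assumptions, not values from the cited works:

```python
def average_current_ma(active_ma, sleep_ma, duty_cycle):
    """Mean current draw for a transceiver active a fraction `duty_cycle` of the time."""
    return duty_cycle * active_ma + (1.0 - duty_cycle) * sleep_ma

def lifetime_hours(battery_mah, avg_ma):
    """Idealized battery lifetime (ignores self-discharge and voltage effects)."""
    return battery_mah / avg_ma

# Assumed figures: 20 mA active, 0.005 mA sleep, 1000 mAh battery,
# and a 1% duty cycle enabled by partial-coverage scheduling
avg = average_current_ma(20.0, 0.005, 0.01)   # 0.20495 mA
print(round(lifetime_hours(1000.0, avg)))      # → 4879 (hours)
```

Under these assumed figures, the same node kept always active would last only 50 hours, which is why coverage-aware sleep scheduling is so effective.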

In this work, we present a deterministic technique to model electromagnetic propagation in high node density scenarios, specifically an in-house 3D ray-launching (3D-RL) algorithm based on geometrical optics (GO), the geometrical theory of diffraction (GTD), and its extension, the uniform theory of diffraction (UTD). With the aid of the 3D-RL simulation tool, performance evaluation and interference characterization of a high node density scenario have been carried out in order to assess the key performance indicators of the network. The contributions of this work are aimed at providing a precise tool for coverage/capacity estimation, considering the relevant multipath propagation phenomena in large, complex scenarios. The proposed simulation methodology employs an optimized 3D-RL code with hybrid simulation (combining 3D-RL with neural network interpolators, the electromagnetic diffusion equation for diffraction estimation, and collaborative filtering of deep learning database algorithms), making it possible to simulate large, complex scenarios. A new simulation module has been implemented to estimate the error vector magnitude (EVM) over the complete simulation volume, hence enabling further quality of service analysis as a function of the employed modulation scheme.
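As a sketch of how an EVM figure of this kind is computed from ideal constellation points and received samples (this is the standard RMS definition normalized to the reference constellation power; normalization conventions vary, and the symbol values below are hypothetical, not output of the simulation module):

```python
import math

def evm_percent(reference, received):
    """RMS EVM (%) of received symbols vs. their ideal constellation points."""
    err_power = sum(abs(r - s) ** 2 for s, r in zip(reference, received))
    ref_power = sum(abs(s) ** 2 for s in reference)
    return 100.0 * math.sqrt(err_power / ref_power)

# Ideal QPSK reference symbols and hypothetical noisy received samples
ref = [1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]
rx = [1.05 + 0.95j, -0.9 + 1.1j, -1.02 - 1.0j, 0.98 - 1.04j]
print(round(evm_percent(ref, rx), 2))  # → 5.85
```

A lower EVM indicates a cleaner constellation, which is what allows the simulation volume to be mapped to achievable modulation schemes and, hence, to quality of service.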

The remainder of the paper is organized as follows: The proposed simulation technique and the scenario description are explained in Section 2. Section 3 presents the simulation results for the considered high node density scenario, in the 2.4 GHz and 5.8 GHz ISM frequency bands, with the received signal strength (RSS), signal to interference plus noise ratio (SINR), and performance analysis in terms of constellation plots and EVM characterization, considering a ZigBee system (infrastructure nodes) as well as Bluetooth transceivers (high-mobility devices/users), together with coverage/capacity estimation examples. In Section 4, a measurement campaign for the same scenario is presented, achieving a good match between simulation and measurement results. In addition, the comparison between the scenario with and without people is presented in this section. Finally, conclusions and future work are summarized in Section 5.
