Article

A Fuzzy-Based Approach for Sensing, Coding and Transmission Configuration of Visual Sensors in Smart City Applications

by Daniel G. Costa 1,*,†, Mario Collotta 2,†, Giovanni Pau 2,† and Cristian Duran-Faundez 3,†
1 Department of Technology, State University of Feira de Santana, Feira de Santana 44036-900, Brazil
2 Faculty of Engineering and Architecture, Kore University of Enna, Enna 94100, Italy
3 Department of Electrical and Electronic Engineering, University of the Bío-Bío, Bío Bío Region 4051381, Chile
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Sensors 2017, 17(1), 93; https://doi.org/10.3390/s17010093
Submission received: 4 October 2016 / Revised: 15 November 2016 / Accepted: 25 November 2016 / Published: 5 January 2017
(This article belongs to the Section Sensor Networks)

Abstract

The advance of technologies in several areas has allowed the development of smart city applications, which can improve the way of life in modern cities. When employing visual sensors in this scenario, still images and video streams may be retrieved from monitored areas, potentially providing valuable data for many applications. Actually, visual sensor networks may need to be highly dynamic, reflecting the changing conditions of smart cities. In this context, characteristics of visual sensors and conditions of the monitored environment, as well as the status of other concurrent monitoring systems, may affect how visual sensors collect, encode and transmit information. This paper proposes a fuzzy-based approach to dynamically configure the way visual sensors operate concerning sensing, coding and transmission patterns, exploiting different types of reference parameters. This innovative approach can be considered as the basis for multi-system smart city applications based on visual monitoring, potentially bringing significant results for this research field.

1. Introduction

Modern cities face several issues regarding resource management, security, urban mobility and disaster recovery, among many others, which can be supported by sensors. Actually, different kinds of sensing technologies can be used to help city management by automating some tasks and predicting problems before they occur [1,2]. In this context, Wireless Sensor Networks (WSN) can be used to create a smart city infrastructure, as sensors can monitor different data, e.g., water pressure, noise, luminance, electric current and traffic, activating alarms or automated systems when particular events occur [3,4].
For many monitoring applications in a smart city environment, visual sensors can provide valuable information about monitored areas. Actually, still images and video streams can be processed for different tasks [5,6]: distributed or centralized processing of visual data, usually centered on pattern recognition, has already been exploited for many applications in urban contexts, such as parking management and traffic control. The use of visual sensing can then create a promising scenario for the adoption of Wireless Visual Sensor Networks (WVSNs), but many challenging issues are raised when visual sensors are deployed [7], demanding proper solutions.
Smart city monitoring employing cameras is a relevant research topic that has fostered the development of many research works in recent years. In [8], the authors proposed an approach that uses cameras and sensor nodes to provide an efficient surveillance system, calculating the position of events of interest for camera rotation. In [6], cameras are used to monitor the movement of targets for public security. The work in [9] employs visual sensors to manage parking lot occupancy. Actually, many such systems could be merged to provide integrated services in smart cities, but the configuration of visual sensors is still a relevant issue that should be optimized for higher efficiency.
As there may be many concurrent wireless sensor networks operating in a smart city scenario, efficient configuration of sensor nodes is of paramount importance. Therefore, a relevant issue of wireless visual sensor networks is the proper configuration of sensor nodes, as visual data transmission may be highly demanding in terms of energy consumption and transmission throughput. In this work, we define configuration as being centered on three different elements: sensing, coding and transmission. These elements are defined as follows, with an illustrative sketch given right after the list:
  • Sensing: it indicates the sensing behavior of visual sensors. For cameras retrieving still images, it may define the sampling frequency, which reflects in the number of snapshots taken per second. For video monitoring, the sensing behavior may indicate transmission bursts or continuous streaming.
  • Coding: source nodes may apply different coding algorithms, with diverse compression ratios and processing costs. Visual data resolution and color patterns are also relevant coding configurations.
  • Transmission: visual data may be transmitted in real-time, or transmission latency and jitter may not be a concern. Quality of Service (QoS) policies may also be employed over some traffic, which may be prioritized during transmission.
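To make these three configuration elements concrete, the following sketch models them as a simple data structure. It is only an illustrative representation written for this description; the enumerated options and names are assumptions, since each application defines its own sampling frequencies, resolutions, codecs and QoS classes:

    from dataclasses import dataclass
    from enum import Enum

    class SensingMode(Enum):
        SNAPSHOT = "snapshot"   # periodic still images
        BURST = "burst"         # short video bursts
        STREAM = "stream"       # continuous video streaming

    class TransmissionService(Enum):
        BEST_EFFORT = "no guarantees"
        RELIABLE = "reliable"                        # corrupted packets are recovered
        RELIABLE_REAL_TIME = "reliable and real-time"

    @dataclass
    class NodeConfiguration:
        """Sensing, coding and transmission settings assigned to one visual sensor."""
        sensing_mode: SensingMode
        sampling_rate_hz: float      # snapshots (or bursts) per second
        resolution: str              # e.g., "QCIF", "CIF", "4CIF"
        codec: str                   # e.g., "JPEG" (application-defined)
        transmission: TransmissionService

    # Example: a low-activity snapshot configuration
    low_profile = NodeConfiguration(SensingMode.SNAPSHOT, 0.2, "QCIF", "JPEG",
                                    TransmissionService.BEST_EFFORT)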
In smart cities, which may be highly dynamic [10], sensor node configurations should regard different parameters from different perspectives and with different significances. In this context, we propose herein a fuzzy-based mechanism to dynamically configure visual sensors, computing and assigning their sensing behavior, coding schemes and transmission approaches according to a series of parameters associated with visual monitoring in smart cities. This fuzzy-based solution takes internal and external parameters to define a unified mechanism to configure the way visual sensors will operate, which is represented as the Sensing/Coding/Transmission Pattern (SCTP). The proposed Fuzzy Logic Controller (FLC) was implemented on a prototyping board, and numerical results of SCTP computations are evaluated for different configuration parameters.
Therefore, exploiting fuzzy logic, we expect to provide a generic mechanism to configure visual sensors in any smart city application, requiring only proper definition of the parameters that best represent the monitoring requirements of the considered applications. To the best of our knowledge, such an approach has not been proposed before.
The remainder of this paper is organized as follows. Section 2 presents related works. Section 3 discusses the configuration of visual sensing in WVSNs. The proposed approach is defined in Section 4. Numerical results are presented in Section 5, followed by conclusions and references.

2. Related Works

Smart city monitoring is a relevant research topic that has fostered the development of many research works in recent years. Actually, some of them influence our investigation in different ways, mostly regarding visual sensing, QoS provisioning, sensor configuration and Internet of Things (IoT) applications.
Many works have addressed smart city monitoring. In [1], the authors developed a smart city monitoring system to detect leakages and failures in a pipeline infrastructure. Based on wireless sensor network technology, their system is able to detect, localize and quantify anomalies in a water supply system. In a different way, the paper in [11] describes a system to help the evacuation of a building, determining the best exit route for each part of the building according to the number of evacuees. An application for Structural Health Monitoring (SHM) in smart cities is presented in [12], where a total of 64 nodes are used to monitor vibrations in the Golden Gate Bridge, in the city of San Francisco.
Surveillance applications in smart cities can also benefit from the use of cameras [7]. In [8], the authors proposed an approach that uses cameras and sensor nodes to provide an efficient surveillance system. In their proposal, several sensor nodes are uniformly deployed on an area of interest, and a remote sink uses the data collected from the sensors to calculate the correct position of an event. Then, having the position of an event, the sink sends rotation commands to cameras near the source of the event so that they can record what caused it. In [6], cameras are used to monitor the movement of targets for public security, with many practical applications. Distributed processing of visual data is proposed in [9], where visual sensors are employed to automatically detect cars in parking areas, which can support efficient parking systems in smart cities. The work in [13] employs cameras and laser beams to monitor structural displacements. In fact, when employing cameras, relevant visual information can be retrieved, potentially enriching smart city applications.
As there are many challenges to building smart cities, research efforts have addressed different aspects of this environment. However, a recurrent relevant issue is to determine how visual sensors will behave in terms of sensing, coding and transmission patterns for each of these challenges. Different optimization approaches have been proposed in recent years, addressing such configuration problems in different ways (e.g., addressing only sensing or combined coding and transmission), a problem that is actually more critical for visual sensors than for scalar sensors due to the larger number of configuration options. As a result, context-aware wireless visual sensor networks have been considered as a feasible option for diverse monitoring functions, and many works have contributed to this area.
Optimized transmission in WVSNs allows the dynamic adjustment of the transmission rate, for example due to congestion in relay nodes or to save energy [14]. The work in [15] proposes dynamic adjustment of the transmission rate in order to save energy. In addition, transmission rate adjustment to face congestion is proposed in [16]. In these approaches, the transmission rate is adjusted for higher efficiency, but usually only information from source and relaying nodes is taken into account when optimizing the network operation, which may not be efficient in smart city scenarios.
A promising approach to change the sensing and transmission behavior of sensor nodes was proposed in [17]. The idea in that work is to define the transmission behavior of visual sensors according to their relevance for the entire network, which may be associated, for instance, with the monitoring of critical events. Similarly, the work in [18] employs scalar sensors to rapidly detect critical events, which are then reflected in higher priority to nearby visual sensors. In both works, the sensing, coding and transmission behavior of visual sensors are established according to the priority of source nodes, but external parameters of smart cities are still not considered.
Actually, many previous works have proposed node configuration following some optimization parameter. We observe that most of those works have considered at least one of five different parameters to perform configurations: events, media type, node’s status, data content and network QoS. Table 1 presents some examples of optimization approaches based on some of these parameters.
The configuration problem following optimization parameters is indeed a good approach to optimize the sensing, coding or transmission patterns of source nodes, although most works are concerned with achieving some specific performance enhancement. While some works focus on performance parameters, such as energy consumption and QoS-based transmission improvements [26], there is also a need to change the configuration of sensors to meet different application requirements. The work in [27] proposes a framework to reconfigure wireless sensor networks when heterogeneous applications are considered. The idea is to allow different applications to use the same WSN at different times, without requiring redeployment or use of additional sensors. Such reconfiguration, however, requires not only modification of the sensing, coding and transmission patterns of sensors, but also changes in the employed protocol stack, which imposes additional challenges. In [28], context-aware wireless sensor networks are also investigated. In that work, a task allocation scheme is proposed for heterogeneous applications using the same network infrastructure, with nodes being allocated to the execution of tasks. Additionally, the authors in [29] propose a common programming interface to be used for heterogeneous applications when reconfiguration is performed.
Configuration of sensor nodes may then have different perspectives. Optimal configurations of source nodes for performance enhancement may be highly beneficial, as well as the adoption of reconfiguration frameworks to adapt the network to different application requirements. Although such approaches achieve different levels of sensing, coding and transmission configurations in wireless visual sensor networks, they are not necessarily adequate for smart city scenarios composed of multiple concurrent systems. Moreover, they do not consider external parameters of networks, which are relevant to the dynamics of multi-systems smart cities. This is, in fact, the research gap that is being addressed by this article.
Thus, differently from previous works, we propose herein a fuzzy-based approach aimed at establishing the sensing, coding and transmission behavior of any number of visual sensors according to different parameters related to the operation of visual sensor networks in smart city environments, providing a broader solution that can be used as a reference for future developments in this area.

3. Visual Sensing Configuration

A visual sensor will typically be configured to perform a determined task according to the definitions of an application. For example, a visual sensor may stream video when triggered by a scalar (auxiliary) sensor, or it may continuously transmit image snapshots at some rate. Whatever the case, visual sensors may operate differently according to the characteristics of the considered application, which may define sampling frequencies, coding algorithms and transmission patterns for source nodes.
In order to achieve high performance, the operation of visual sensors may be optimized, changing their configurations in some well-defined cases. For example, if the energy supply runs low, lower-quality visual data may be transmitted by some or all sensors, potentially saving energy. From a different perspective, the occurrence of an event of interest, such as an explosion or traffic congestion, may trigger prioritized transmission from sensors in the vicinity of the event, with tagged packets receiving some QoS guarantees as they are transmitted over the network. Although such configuration approaches may perform well when a single wireless visual sensor network is considered [30], visual sensors may be operating in dissonance with the smart city environment, especially when multiple distinct visual monitoring systems are running concurrently.
When using visual sensors, some applications may consider sensors’ orientations as an important configuration element. If rotatable cameras are employed, sensing directions may be optimized and many works have been concerned with this issue [31,32]. Although we do not consider sensors’ orientations as a configuration element in our fuzzy-based configuration approach, the proposed solution can be combined with algorithms that alter sensors’ orientations. In other words, even if coverage optimization algorithms are employed, sensing, coding and transmission configurations can still be efficiently established using our proposed approach.
Different parameters may impact the overall operation of smart city systems and, although their influence may be neglected in some cases, visual sensing, coding and transmission may be adapted to such parameters, aiming at more efficient operation of wireless visual sensor networks. Actually, we define two different types of parameters in typical smart city systems that can influence the operation of visual sensors: internal and external. Internal parameters are produced and have their influence directly inside the scope of a wireless visual sensor network, and thus they may be inferred from the considered network. On the other hand, external parameters are collected from outside the network and usually affect all systems in a smart city scenario.
The next subsections will discuss these two types of parameters, which will be considered in the proposed approach.

3.1. Internal Parameters

A considerable amount of data related to the operation of wireless visual sensor networks can be exploited as optimization parameters for configuration purposes. Actually, these parameters should be properly accounted for if optimal configuration is desired, and thus the approach presented in the next section considers a set of internal parameters.
There may be many internal parameters, with different impacts on the operation of visual sensors. The proper choice of those parameters is an important design issue, but we expect that some of them will be present in most cases. These parameters are listed as follows:
  • Camera hardware: sensor nodes may be equipped with different types of cameras, which may have different hardware characteristics. Cameras with zooming and rotation capabilities may need to transmit more information for some applications. Lens quality and supported image resolutions are also important parameters that may impact sensor operation.
  • Processing power: processing and memory capabilities will determine which multimedia compression algorithms may be executed, affecting sensing quality and energy consumption over the network. This parameter can then guide the configuration of data coding in sensor nodes.
  • Event-based prioritization: visual sensors may have different priorities depending on the events being monitored [17,30]. Network services and protocols may consider event-based priorities for optimized transmissions. Moreover, the most relevant nodes may transmit more data than less relevant sensors, depending on the configurations of the considered monitoring application.
  • Residual energy: sensor nodes may operate using batteries, which provide finite energy. Therefore, the current energy level of sensor nodes may influence the way sensors retrieve visual information. For example, the sensing frequency of sensor nodes may be reduced when their energy level is below a threshold.
  • Security: some security concerns may be exploited to differentiate sensor nodes. As an example, regions with confidentiality requirements may demand the use of robust cryptography algorithms [33], which depends on available processing power and efficient energy management.

3.2. External Parameters

In a smart city scenario, visual sensor networks should also be configured exploiting external parameters. In general, we define external parameters as all data that are equally significant for all wireless sensor networks in a smart city. Although wireless sensor networks usually do not consider those parameters, monitoring and control systems in smart cities may benefit from the exploitation of such data.
We initially selected four relevant external parameters for optimizations, described as follows:
  • Luminance: some visual sensors may be able to retrieve visual information during the night or in dark places, but this may not be true for other sensors. The luminance intensity (measured in lux) can be considered when defining the sensing frequency of visual sensors.
  • Deployment area: depending on the considered application and deployment area, visual sensors may need to transmit more visual data. For instance, in public security applications, visual sensors deployed in areas with high levels of criminality may be required to transmit more data than if they were deployed in other areas (even in the absence of events of interest). As a remark, the deployment area is a parameter that has significance for a scenario that may comprise many different WVSNs, which is different from the (internal) prioritization parameter, whose significance is valid only for the considered sensor network.
  • Day of the monitoring: a city is a complex and dynamic environment, where some patterns may sometimes be identified. For example, traffic is affected by the day of the week, since fewer cars may be moving around on weekends, or even on holidays. Thus, depending on the application, visual monitoring may be influenced by the day of the monitoring.
  • Relevance of the system: in a smart city scenario, some systems may be more relevant than others. If we have multiple deployed wireless visual sensor networks, a pre-configured relevance level for the different network operations may be considered as an external parameter, impacting the configuration of the sensor nodes.

4. Proposed Approach

In general, visual source nodes may be expected to transmit data in one of three different ways: query-based, trigger-based or time-based [14]. Query-based transmissions are produced by specific queries (usually) from outside the network. For autonomous systems such as those generally deployed for smart city applications, source nodes will typically transmit information following a trigger-based (in response to some event) or time-based (in response to some schedule) pattern. For these transmission patterns, proper configuration of visual sensors is highly desired, and it should be possible to perform it dynamically over time, even in the presence of multiple concurrent wireless visual sensor networks.
We propose a Fuzzy Logic Controller (FLC) that will consider some internal and external parameters directly or indirectly inferred from a smart city scenario in order to define the sensing, the coding and the transmission configuration of any number of visual sensors. A representation of a generic smart city adopting the proposed approach is presented in Figure 1.

4.1. Fuzzy Logic Controller

Soft computing techniques fit well in wireless visual sensor network applications, since they have been proposed for the construction of new-generation artificial intelligence (high machine intelligence quotient, human-like information processing) and for solving non-linear and mathematically unmodeled systems. Soft computing techniques are appropriate for several engineering problems, especially complex ones where classical control methods do not achieve comparatively favorable results. In addition, it is useful to note that soft computing techniques can be implemented at low cost. FLCs belong to the soft computing techniques that aim to adapt to the pervasive imprecision of the real world and to obtain robust and low-cost solutions.
Actually, the use of rule-based Fuzzy Logic Controllers enables the implementation of multi-criteria control strategies [34]. In fact, fuzzy logic is capable of making real-time decisions, even with incomplete information [4]. Conventional control systems rely on an accurate representation of the environment, which generally does not exist in reality. Since fuzzy logic systems can manipulate linguistic rules in a natural way, they are particularly suitable in several contexts, such as WVSN applications. Moreover, fuzzy systems can blend different parameters and rules that, combined, may produce an optimal result. In general, an accurate computation may be too complex, and it could also be meaningless due to the quick change of network conditions. For this reason, FLCs, based on linguistic rules instead of inflexible reasoning, can be the right choice to describe a sensing, coding and transmission pattern in visual sensor networks. Therefore, fuzzy logic has been adopted because it can deal with uncertain and vague values [35,36,37].
As an important remark, an FLC does not usually require high computing power to operate, which is a desired characteristic in WSNs when sensor nodes are battery powered. In addition, many works have applied FLCs in order to prolong the battery life of sensor nodes or to manage the application in the reference scenario [38,39,40,41]. Therefore, for the proposed approach, the use of an FLC to compute and assign the SCTP for the sensors should not add significant processing and energy costs.

4.2. Configuring the FLC

Although there may be many possible parameters for the FLC, we define a set of reference parameters to be considered in the proposed approach; any other parameter could easily be exploited in future implementations. The chosen parameters may be directly or indirectly inferred by the FLC, which may communicate with source nodes or be previously configured during deployment.
Considering the internal and external parameters shown previously, it would not be appropriate to develop an FLC having six or seven fixed input parameters. Moreover, this would imply the use of a fairly large number of membership functions. For this reason, in this paper, we have chosen to implement an FLC that does not depend on a specific application and, as a consequence, we introduce a generic solution for any number of configuration parameters. Actually, it will be possible to regulate the FLC based on specific application fields. In doing so, the proposed FLC is defined as having two types of input parameters, referred to as the Internal Parameters Indicator (IPI) and the External Parameters Indicator (EPI), respectively. Figure 2 presents a generic scheme of the proposed FLC-based computation of sensors’ configurations. The output of the FLC is the SCTP for visual sensors.
Specifically, these indicators are calculated by combining the internal and external parameters, respectively. The IPI and EPI indicators depend on the relevance that each Internal Parameter (IP_i) and External Parameter (EP_i) has for obtaining acceptable performance in a specific application field. Table 2 shows a possible representation of the values assumed by an internal or external parameter on the basis of the performance thresholds defined by the application. The ranges considered in the first column, the corresponding thresholds and the percentage values associated with them can vary in number and in value depending on the type of application.
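As an illustration of how Table 2 could be applied, the short sketch below maps a raw degradation value onto an IP_i or EP_i value according to application-defined thresholds. The function name, threshold values and levels are only an assumed example written for this explanation:

    def normalize_parameter(value, thresholds, levels=(0.10, 0.25, 0.50, 0.75, 1.0)):
        """Map a raw degradation reading D onto an IP_i/EP_i value in [0, 1].

        thresholds: ascending thresholds [v1, v2, v3, v4, v_max] splitting the D range
        levels:     the IP_i/EP_i value assigned to each range (as in Table 2)
        """
        for threshold, level in zip(thresholds, levels):
            if value <= threshold:
                return level
        return levels[-1]

    # Assumed example: a 0-100 degradation scale with evenly spaced thresholds
    print(normalize_parameter(30, thresholds=[20, 40, 60, 80, 100]))  # -> 0.25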
The IPI and EPI indicators are calculated as follows:

IPI = \sum_{i=1}^{n} (a_i \cdot IP_i),    (1)

EPI = \sum_{i=1}^{m} (b_i \cdot EP_i).    (2)
In Equations (1) and (2), n is the number of internal parameters and m is the number of external parameters, while a_i and b_i represent the relevance of the IP_i or EP_i parameter in the referred application field, respectively. In detail, a_i and b_i are dimensionless weights that behave like priorities, whose minimum value is 0 and maximum is 1. Moreover, \sum_{i=1}^{n} a_i = 1.0 and \sum_{i=1}^{m} b_i = 1.0. For instance, in the case of a QoS-aware scenario, some parameters can be represented by end-to-end delay, deadline miss ratio, throughput/workload, and so on. In this specific scenario, the application level determines the relevance of each specified parameter. In the case of real-time scenarios, the most important parameter may be the deadline miss ratio of transmitted packets, which could be considered in the IPI calculation with a_i equal to 0.7, meaning that the IPI takes the deadline miss ratio into account with a prominence of 70%. Thus, depending on the type of application, each internal parameter can be characterized by a great or small importance (priority) in the reference context.
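A minimal sketch of this computation, assuming the weighted-sum form of Equations (1) and (2), is given below; the function name and the numerical values are illustrative only:

    def weighted_indicator(weights, values):
        """Compute IPI or EPI as the weighted sum of normalized parameters.

        weights: relevance factors (a_i or b_i), expected to sum to 1.0
        values:  normalized parameter values (IP_i or EP_i) in [0, 1]
        """
        if abs(sum(weights) - 1.0) > 1e-6:
            raise ValueError("relevance weights must sum to 1.0")
        return sum(w * v for w, v in zip(weights, values))

    # Illustrative example with three internal parameters
    ipi = weighted_indicator([0.7, 0.2, 0.1], [0.5, 0.8, 1.0])
    print(round(ipi, 2))  # -> 0.61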
Moreover, the IP_i and EP_i values are strictly related to the type of internal and external parameters, respectively. For instance, considering the internal parameters presented in the previous section, the Camera hardware parameter can be assigned an appropriate value on the basis of zooming and rotation capabilities, the Event-based prioritization parameter on the basis of relevance levels [30] and the Residual energy parameter on the basis of the battery level. However, as mentioned before, the solution proposed in this work is independent of the number of parameters, which are defined according to application requirements.
The fuzzy control system handles the Sensing/Coding/Transmission Pattern of visual sensors taking as input crisp values of IPI and EPI, which are converted into linguistic values by using a chosen set of membership functions. The used linguistic values are:
  • Very Low (VL);
  • Low (L);
  • Medium (M);
  • High (H);
  • Very High (VH).
The membership functions used for IPI or EPI are shown in Table 3, while their graphical representation is depicted in Figure 3.
The aim of the controller is then to elaborate on these linguistic values using an inference mechanism based on a set of if–then rules. These rules are combined in the FLC, which returns a membership function represented, in this paper, by Gaussian functional shapes. These membership functions have been chosen because, as shown in [42], Gaussian membership functions greatly increase the accuracy without degrading the computational performance.
Through the inference mechanism, it is possible to determine the correct output according to the fuzzy inference rules presented in Table 4. For instance, if the value of IPI is 0.48, the membership function corresponds to the linguistic value M, while if EPI is 0.73, the linguistic value is H. In this way, the final inference value is H. Finally, the conversion of this value into crisp logic decisions suitable for the SCTP concludes the proposed FLC (defuzzification process).
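As a simplified illustration of this inference step, the sketch below combines the crisp intervals of Table 3 with the rule matrix of Table 4 as a plain lookup. It deliberately omits the Gaussian membership functions and the defuzzification stage of the actual FLC, and the helper names are ours:

    # Rule matrix from Table 4: RULES[epi_level][ipi_level] -> SCTP level
    RULES = {
        "VL": {"VL": "VL", "L": "VL", "M": "L",  "H": "M",  "VH": "H"},
        "L":  {"VL": "L",  "L": "L",  "M": "M",  "H": "M",  "VH": "H"},
        "M":  {"VL": "L",  "L": "M",  "M": "M",  "H": "H",  "VH": "H"},
        "H":  {"VL": "M",  "L": "M",  "M": "H",  "H": "H",  "VH": "VH"},
        "VH": {"VL": "M",  "L": "H",  "M": "H",  "H": "VH", "VH": "VH"},
    }

    def to_linguistic(value):
        """Crisp mapping of an IPI/EPI value onto a linguistic value (intervals of Table 3)."""
        if value < 0.20:
            return "VL"
        if value <= 0.40:
            return "L"
        if value <= 0.60:
            return "M"
        if value <= 0.80:
            return "H"
        return "VH"

    def sctp_level(ipi, epi):
        """Return the SCTP linguistic level for crisp IPI and EPI inputs."""
        return RULES[to_linguistic(epi)][to_linguistic(ipi)]

    print(sctp_level(0.48, 0.73))  # -> "H", as in the example above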

5. Results

Visual sensor nodes may operate in different ways, providing information for some monitoring applications. It is clear that the sensors’ hardware parameters, the considered sensor network, the surrounding environment and existing parallel systems may influence such operations, as proposed in this work. The proposed FLC-based approach considers some parameters defined by applications and a smart city environment, presenting as results Sensing/Coding/Transmission Patterns for visual sensors.
In order to validate the proposed approach, we considered different strategies. Initially, a proof-of-concept was created assuming the case of public security in smart cities, which is a promising practical application. Then, since this article brings an innovative approach that comprises parameters that are not considered together by other works, a direct comparison with related works could be misleading, which led us to further evaluate the computation of the SCTP in different scenarios and for different internal and external parameters. Finally, relevant issues when employing the proposed approach in real-world applications are discussed.

5.1. Computing SCTP for a Public Security Application

A hypothetical public security application was considered for the initial evaluation of the proposed approach. For each considered visual sensor (or even for all visual sensors equally), one of five linguistic values may be computed: VL, L, M, H or VH. These values may be mapped to different configurations, according to the characteristics of the designed monitoring application.
For the WVSN considered in this example, Table 5 presents the internal and external parameters that are taken into account when computing the SCTP. In this particular case, we are considering three internal and three external parameters, but any number of parameters could be considered, depending on the application monitoring requirements and the desired level of integration of the wireless visual sensor network with the smart city scenario. For this WVSN, visual sensors transmit only image snapshots.
In the configuration defined at the bottom of Table 5, the FLC will indicate one of five values for each visual sensor, which will be reflected in a sensing frequency, an image resolution and an expected service for packets transmitted from the considered visual source nodes. In this case, sensing is defined as a frequency of snapshots taken from the monitored field, while coding is defined as a resolution that may be SQCIF (128 × 96), QCIF (176 × 144), SCIF (256 × 192), CIF (352 × 288) or 4CIF (704 × 576), all of them based on the CIF (Common Intermediate Format) standard. Finally, transmission is defined as an expected service, which may be transmissions with no guarantees, transmissions in a reliable way (corrupted packets are recovered) or even transmissions with reliability and timeliness assurance.
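Following the values at the bottom of Table 5, this mapping can be sketched as a simple lookup table; the dictionary layout below is merely illustrative:

    # SCTP level -> (snapshots per second, image resolution, transmission service)
    SCTP_PROFILES = {
        "VL": (0.1, "SQCIF", "no guarantees"),
        "L":  (0.2, "QCIF",  "no guarantees"),
        "M":  (0.5, "SCIF",  "reliable"),
        "H":  (1.0, "CIF",   "reliable and real-time"),
        "VH": (2.0, "4CIF",  "reliable and real-time"),
    }

    rate, resolution, service = SCTP_PROFILES["H"]
    print(rate, resolution, service)  # -> 1.0 CIF reliable and real-time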
It is necessary to highlight that the relevance of the parameters depends on the characteristics of the considered applications. For the considered public security surveillance system, sensors viewing areas with low luminance should be more important. Furthermore, if sensors are battery-operated, sensors with low energy levels should also be more important (since they have a shorter expected lifetime).
With the parameters defined for the application (top of Table 5) and adopting the FLC configuration shown at the bottom of Table 5, a configuration for the sensors can be computed. For this example, a model has been built in Simulink/Matlab (version R2015b, developed by MathWorks), as shown in Figure 4. The values of IP_i and EP_i are acquired as input parameters of the block called Parameters Manager. These are random values, generated through uniform random number blocks within the ranges specified in Table 5. The Parameters Manager block manages the internal and external parameters through the Simulink/Stateflow environment, an internal Matlab tool that allows the evolution of a specific system to be described by means of a finite state machine. The output values of the Parameters Manager block are the IPI and EPI, which subsequently become the input parameters of the FLC.
An FLC may receive parameters reported by sensor nodes or it may exchange information with external systems. After computing the SCTP for each visual sensor, it may assign the computed information to the corresponding nodes in different ways, but we expect the adoption of application-layer protocols like the one proposed in [17].
The model depicted in Figure 4 has been implemented on the prototyping board that is shown in Figure 1. The processing unit is the Microchip PIC24FJ256GB108 microcontroller [46], which integrates the control features of a microcontroller unit with the processing and throughput capabilities of a digital signal processor. This 16-bit microcontroller has a maximum processing power of 16 MIPS (Millions of Instructions Per Second) and offers multiple serial ports (3 × I2C, 3 × SPI), 4 × UARTs and 23 independent timers. The availability of 16 kB of RAM memory for buffering, of up to 256 kB of enhanced Flash program memory and other characteristics makes this microcontroller very suitable for embedded control and monitoring applications. The implementation presented here is just a proof-of-concept to show the feasibility of the proposed solution on COTS (Commercial Off-The-Shelf) devices. The obtained values are displayed on the LCD screen connected to the prototyping board. Moreover, in order to calculate the performance, the microcontroller continuously sends the output data to a computer through a serial cable.
Obviously, this implementation is valid for the case considered in this section, which takes three internal and three external parameters. However, as this is a proof-of-concept implementation, the hardware for the FLC has to be adapted for other configurations of WVSNs in smart cities, or other solutions may even be adopted for the computation of the SCTP, without prejudice to the proposed approach.
Considering this generic application for public security surveillance, Table 6 shows how the FLC returns the five different linguistic values for the SCTP when taking different combinations of IPI and EPI as inputs. In this example, a_1 and IP_1 refer to the Camera hardware internal parameter, a_2 and IP_2 to the Event-based prioritization and a_3 and IP_3 to the Energy, while b_1 and EP_1 refer to the Luminance external parameter, b_2 and EP_2 to the Day and, finally, b_3 and EP_3 to the Deployment area.
It is possible to see different combinations of IPI and EPI and their related parameters. For instance, considering Case 5 in Table 6, the output related to the SCTP is VH. This means that the sensors will acquire two snapshots/s, and the coding and transmission configurations will be 4CIF and reliable/real-time, respectively (Table 5). It is clear that, in this case, the security surveillance application of the smart city requires more and better data. In fact, the relevance/priority of the internal parameters (a_i) is 48% for Camera hardware, 34% for Event-based prioritization and 18% for Energy. The same remark also applies to the relevance/priority (b_i) of the external parameters. As a result, the FLC properly processes the tuning of these parameters in order to return the SCTP as output. This clearly shows the adaptability of the proposed approach to various application contexts. In addition, the obtained results highlight that a fuzzy approach allows different parameters to be ranked and characterized by their importance in the application where the FLC works. As a consequence, it is possible to obtain performance results that fit the application requirements.
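As a quick check, applying Equations (1) and (2) to the values of Case 5 reproduces the reported indicators: IPI = 0.48 · 0.93 + 0.34 · 0.98 + 0.18 · 0.94 ≈ 0.95 and EPI = 0.38 · 0.67 + 0.37 · 0.97 + 0.25 · 0.51 ≈ 0.74. According to Table 3, these values fall in the VH and H intervals, respectively, and the rule matrix of Table 4 then yields VH, as reported in Table 6.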

5.2. SCTP Computation in Smart City Scenarios

The computed SCTP may typically change over time, reflecting the dynamic nature of smart cities. In order to demonstrate such dynamism, we initially considered two distinct visual sensors in different wireless sensor networks and computed the SCTP for them. In fact, these tests are valid for generic monitoring applications in smart cities (e.g., traffic control, public security surveillance, disaster management, etc.), but different scopes could be considered [47].
Figure 5 shows the results for the computation of the SCTP every 4 h for four days, from 00:00 Friday to 24:00 Monday, assuming two different WVSNs and the same configuration for all sensors of each of those wireless visual sensor networks. Three internal parameters were considered for both sensors: IP_1 = 0.6, IP_2 = 0.6 and IP_3 = 0.6, for a_1 = 0.3, a_2 = 0.4 and a_3 = 0.3. As external parameters, two of them are considered: EP_1 for deployment area and EP_2 for luminance. During the day (after 6:00 a.m. and before 6:00 p.m.), EP_2 = 1.0, while EP_2 = 0.1 during the night (low luminance). Concerning EP_1, two different areas were considered, as depicted in Figure 5, for b_1 = 0.5 and b_2 = 0.5.
As can be seen in Figure 5, the SCTP will be lower during night periods and for the least relevant deployment areas (EP_1 = 0.1), which is a very reasonable configuration for real smart city scenarios. However, as defined in the proposed approach, this is only an example, since many WVSNs may not be affected by monitoring during the night or in dark areas.
Figure 6 presents the values of the SCTP for the same period of time, but now assuming that monitoring on Sundays is not relevant. For this, a new external parameter, EP_3, is defined. For example, in Figure 6, EP_3 = 0.0 for Sundays and EP_3 = 1.0 for the other days of the week. As a remark, the same configuration as in Figure 5 is taken for WVSN 1 (only two external parameters), while WVSN 2 takes b_1 = 0.1, b_2 = 0.5 and b_3 = 0.4 (three external parameters).
Finally, for the same WVSN 2 in Figure 6, we assumed three different configurations of internal parameters, as depicted in Figure 7, for a_1 = 0.3, a_2 = 0.4 and a_3 = 0.3.
In Figure 7, we want to highlight how internal and external parameters are relevant when defining the SCTP, which will be reflected in sensing, coding and transmission configurations using some rules, such as the one defined in Table 5.
Actually, since sensor-based monitoring in smart cities may be highly dynamic but within a known range of possibilities (parameters may assume only known values), the SCTP computation over time may be predicted and even anticipated in some cases. This characteristic may be highly desirable in smart cities with multiple concurrent WVSNs, since their sensing, coding and transmission behavior may be checked and even optimized (with specific parameters) when required.

5.3. Relevant Issues When Implementing the Proposed Approach

There are some relevant issues that need to be considered when implementing the proposed approach in real scenarios. Some of the most relevant ones are discussed here.
After defining the proper values for internal and external parameters, according to the nature of the wireless visual sensor network in the considered smart city, the SCTP can be dynamically computed. The next step is the configuration of the corresponding visual sensor nodes using any configuration protocol. As the configuration of sensor nodes by specialized protocols is not necessarily a challenging issue, we leave such implementation for future work, since it is not expected to impact the proposed approach. However, some important remarks must be made.
Although energy may not be a major concern in wireless visual sensor networks deployed in smart cities, due to the possibility of using continuous unlimited energy supplies depending on the deployment site, it is reasonable to implement energy-efficient solutions to make the overall approach flexible. In addition, an SCTP assignment protocol may be designed having energy as an important performance parameter. In practical terms, the amount of information and control messages should be kept as small as possible.
An SCTP assignment protocol could have a cross-layer design, exploiting the operation of other protocols [14,19], or it could be designed as an application-layer protocol. Whatever the case, the way it operates may follow the recommendations in [17], which takes energy efficiency as the basis of the protocol used to assign sensing relevancies to source nodes. Assuming that the SCTP is computed by a single FLC, the computed SCTP can then be directly assigned to each source node using individual messages, or all information for all sensors may be broadcast through the network in combined messages. As control messages already flow in typical wireless sensor networks, SCTP assignment should not impose considerable overhead.
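Purely as an illustration of how compact such control messages could be kept, the sketch below encodes a hypothetical SCTP assignment message; the field layout is our own assumption and does not correspond to any protocol referenced in this paper:

    import struct

    SCTP_LEVELS = ["VL", "L", "M", "H", "VH"]

    def pack_sctp_assignment(node_id, sctp_level):
        """Pack a hypothetical assignment message: 2-byte node id + 1-byte SCTP level."""
        return struct.pack("!HB", node_id, SCTP_LEVELS.index(sctp_level))

    def unpack_sctp_assignment(payload):
        node_id, level_index = struct.unpack("!HB", payload)
        return node_id, SCTP_LEVELS[level_index]

    message = pack_sctp_assignment(node_id=42, sctp_level="VH")
    print(len(message), unpack_sctp_assignment(message))  # -> 3 (42, 'VH')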
The way the FLC will be implemented is also a relevant issue. Actually, many FLCs may be employed in a city, uniformly or randomly distributed. This could be done, for instance, to reduce the overhead of control messages related to SCTP assignment. The size of the city and the number of concurrent wireless visual sensor networks is relevant information when making such a design choice.
Another relevant issue when employing the proposed approach is the interaction of concurrent systems. In this case, a WVSN may directly interfere in the behavior of another independent WVSN simply by sending a specific control message to the FLC. For this, any WVSN may be modelled as an external parameter, which is set according to the definitions of the external sensor network. As an example, WVSN 1, modelled as an external parameter for WVSN 2, may be configured to "tune" that sensor network in the occurrence of an event that can only be detected by WVSN 1. When that event happens, WVSN 1 sends a control message to the FLC, which then takes it as an external parameter of WVSN 2. This kind of interaction may be modelled in different ways and involve any number of wireless visual sensor networks.
Finally, smart cities may be classified as having six main aspects: economy, people, governance, mobility, environment and living [47]. Each of these aspects is related to different elements of the way of life in cities, and they may all be interoperable. Actually, we expect that the proposed fuzzy-based configuration approach can be efficiently applied in any of the different scenarios of modern smart cities, since the proper configuration of visual sensors is strongly associated with the efficiency of monitoring applications.

6. Conclusions

Wireless visual sensor networks can be used to retrieve valuable information in countless monitoring and control applications. However, proper configuration of sensor nodes is still a challenging task, since many parameters may influence the way sensors should operate. For higher performance, the sensing, coding and transmission configuration of visual sensors should be efficiently accomplished.
An FLC-based approach was proposed in order to configure visual sensors, considering internal and external parameters of wireless visual sensor networks. We believe that this approach can benefit many WVSN applications, especially in complex scenarios like in smart city environments.
One direction for future research on the system addressed here is to improve the approach proposed in this work with a neural network able to forecast some predictable environmental parameters, i.e., to predict the monitoring conditions at different times of the day or on different days of the week. This combination would allow the fuzzy controller to make its decision taking into account not only the current situation, as detected by the sensors, but also the probable short-term evolution of the monitored environment.

Acknowledgments

This work was supported in part by the University of the Bío-Bío, under Grants: DIUBB 161610 2/R and GI 160210/EF.

Author Contributions

All authors contributed to the development of the paper. Daniel G. Costa and Cristian Duran-Faundez contributed with experiences and ideas for the definition of sensors’ configurations, as well as with support when achieving results. Mario Collotta and Giovanni Pau designed the fuzzy logic controller and performed significant verifications. All authors contributed to the writing of the manuscript and performed revisions.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Stoianov, I.; Nachman, L.; Madden, S.; Tokmouline, T.; Csail, M. PIPENET: A Wireless Sensor Network for Pipeline Monitoring. In Proceedings of the International Symposium on Information Processing in Sensor Networks, Cambridge, MA, USA, 25–27 April 2007; pp. 264–273.
  2. Gungor, V.; Lu, B.; Hancke, G. Opportunities and Challenges of Wireless Sensor Networks in Smart Grid. IEEE Trans. Ind. Electron. 2010, 57, 3557–3564.
  3. Hancke, G.P.; Silva, B.D.C.E.; Hancke, G.P., Jr. The Role of Advanced Sensing in Smart Cities. Sensors 2012, 13, 393–425.
  4. Collotta, M.; Bello, L.L.; Pau, G. A novel approach for dynamic traffic lights management based on Wireless Sensor Networks and multiple fuzzy logic controllers. Expert Syst. Appl. 2015, 42, 5403–5415.
  5. Lo, S.W.; Wu, J.H.; Lin, F.P.; Hsu, C.H. Visual Sensing for Urban Flood Monitoring. Sensors 2015, 15, 20006–20029.
  6. Calavia, L.; Baladrón, C.; Aguiar, J.M.; Carro, B.; Sánchez-Esguevillas, A. A Semantic Autonomous Video Surveillance System for Dense Camera Networks in Smart Cities. Sensors 2012, 12, 10407–10429.
  7. Almalkawi, I.; Zapata, M.; Al-Karaki, J.; Morillo-Pozo, J. Wireless multimedia sensor networks: Current trends and future directions. Sensors 2010, 10, 6662–6717.
  8. Chen, W.T.; Chen, P.Y.; Lee, W.S.; Huang, C.-F. Design and Implementation of a Real Time Video Surveillance System with Wireless Sensor Networks. In Proceedings of the IEEE Vehicular Technology Conference, Singapore, 11–14 May 2008; pp. 218–222.
  9. Baroffio, L.; Bondi, L.; Cesana, M.; Redondi, A.E.; Tagliasacchi, M. A visual sensor network for parking lot occupancy detection in Smart Cities. In Proceedings of the 2015 IEEE 2nd World Forum on Internet of Things (WF-IoT), Milano, Italy, 14–16 December 2015; pp. 745–750.
  10. Batty, M.; Axhausen, K.; Giannotti, F.; Pozdnoukhov, A.; Bazzani, A.; Wachowicz, M.; Ouzounis, G.; Portugali, Y. Smart cities of the future. Eur. Phys. J. Spec. Top. 2012, 214, 481–518.
  11. Kruger, C.; Hancke, G.; Bhatt, D. Wireless sensor network for building evacuation. In Proceedings of the IEEE International Instrumentation and Measurement Technology Conference, Graz, Austria, 13–16 May 2012; pp. 2572–2577.
  12. Kim, S.; Pakzad, S.; Culler, D.; Demmel, J.; Fenves, G.; Glaser, S.; Turon, M. Health Monitoring of Civil Infrastructures Using Wireless Sensor Networks. In Proceedings of the International Symposium on Information Processing in Sensor Networks, Cambridge, MA, USA, 22–24 April 2007; pp. 254–263.
  13. Myung, H.; Lee, S.; Lee, B.J. Structural health monitoring robot using paired structured light. In Proceedings of the IEEE International Symposium on Industrial Electronics, Lausanne, Switzerland, 8–10 July 2009; pp. 396–401.
  14. Costa, D.G.; Guedes, L.A. A Survey on Multimedia-Based Cross-Layer Optimization in Visual Sensor Networks. Sensors 2011, 11, 5439–5468.
  15. Lee, S.; Lee, S.; Bovik, A. Optimal image transmission over Visual Sensor Networks. In Proceedings of the IEEE International Conference on Image Processing, Brussels, Belgium, 11–14 September 2011; pp. 161–164.
  16. Lee, J.H.; Jung, I.B. Adaptive-Compression Based Congestion Control Technique for Wireless Sensor Networks. Sensors 2010, 10, 2919–2945.
  17. Costa, D.G.; Guedes, L.A. Exploiting the sensing relevancies of source nodes for optimizations in visual sensor networks. Multimed. Tools Appl. 2013, 64, 549–579.
  18. Costa, D.G.; Guedes, L.A.; Vasques, F.; Portugal, P. Adaptive monitoring relevance in camera networks for critical surveillance applications. Int. J. Distrib. Sens. Netw. 2013, 2013, 836721.
  19. Rosario, D.; Zhao, Z.; Braun, T.; Cerqueira, E. A Cross-Layer QoE-Based Approach for Event-Based Multi-Tier Wireless Multimedia Sensor Networks. Int. J. Adapt. Resilient Auton. Syst. 2014, 5, 1–18.
  20. Costa, D.G.; Guedes, L.A.; Vasques, F.; Portugal, P. A routing mechanism based on the sensing relevancies of source nodes for time-critical applications in visual sensor networks. In Proceedings of the IEEE/IFIP Wireless Days, Dublin, Ireland, 21–23 November 2012.
  21. Zhang, L.; Hauswirth, M.; Shu, L.; Zhou, Z.; Reynolds, V.; Han, G. Multi-priority multi-path selection for video streaming in wireless multimedia sensor networks. Lect. Notes Comput. Sci. 2008, 5061, 439–452.
  22. Lecuire, V.; Duran-Faundez, C.; Krommenacker, N. Energy-efficient transmission of wavelet-based images in wireless sensor networks. J. Image Video Process. 2007, 2007, 47345.
  23. Costa, D.G.; Guedes, L.A.; Vasques, F.; Portugal, P. Energy-Efficient Packet Relaying in Wireless Image Sensor Networks Exploiting the Sensing Relevancies of Source Nodes and DWT Coding. J. Sens. Actuator Netw. 2013, 2, 424–448.
  24. Duran-Faundez, C.; Costa, D.G.; Lecuire, V.; Vasques, F. A Geometrical Approach to Compute Source Prioritization Based on Target Viewing in Wireless Visual Sensor Networks. In Proceedings of the IEEE World Conference on Factory Communication Systems, Aveiro, Portugal, 3–6 May 2016.
  25. Aghdam, S.M.; Khansari, M.; Rabiee, H.R.; Salehi, M. WCCP: A congestion control protocol for wireless multimedia communication in sensor networks. Ad Hoc Netw. 2014, 13, 516–534.
  26. Steine, M.; Viet Ngo, C.; Serna Oliver, R.; Geilen, M.; Basten, T.; Fohler, G.; Decotignie, J.D. Proactive Reconfiguration of Wireless Sensor Networks. In Proceedings of the 14th ACM International Conference on Modeling, Analysis and Simulation of Wireless and Mobile Systems (MSWiM ’11), Miami, FL, USA, 31 October–4 November 2011; pp. 31–40.
  27. Szczodrak, M.; Gnawali, O.; Carloni, L.P. Dynamic Reconfiguration of Wireless Sensor Networks to Support Heterogeneous Applications. In Proceedings of the 2013 IEEE International Conference on Distributed Computing in Sensor Systems, Cambridge, MA, USA, 20 May–23 May 2013; pp. 52–61.
  28. ElGammal, M.; Eltoweissy, M. Distributed dynamic context-aware task-based configuration of wireless sensor networks. In Proceedings of the 2011 IEEE Wireless Communications and Networking Conference, Cancun, Mexico, 28–31 March 2011; pp. 1191–1196.
  29. Cecílio, J.; Furtado, P. Configuration and data processing over a heterogeneous wireless sensor networks. In Proceedings of the 2011 International Conference on Distributed Computing in Sensor Systems and Workshops (DCOSS), Barcelona, Spain, 27–29 June 2011; pp. 1–5.
  30. Costa, D.G.; Guedes, L.A.; Vasques, F.; Portugal, P. Research Trends in Wireless Visual Sensor Networks When Exploiting Prioritization. Sensors 2015, 15, 1760–1784.
  31. Munishwar, V.; Abu-Ghazaleh, N. Coverage algorithms for visual sensor networks. ACM Trans. Sens. Netw. 2013, 9, 45.
  32. Osais, Y.; St-Hilaire, M.; Yu, F. Directional sensor placement with optimal sensing range, field of view and orientation. Mob. Netw. Appl. 2010, 15, 216–225.
  33. Gonçalves, D.; Costa, D.G. Energy-Efficient Adaptive Encryption for Wireless Visual Sensor Networks. In Proceedings of the Brazilian Symposium on Computer Networks and Distributed Systems, Salvador, Brazil, 30 May–3 June 2016.
  34. Zadeh, L.A. The concept of a linguistic variable and its application to approximate reasoning—II. Inf. Sci. 1975, 8, 301–357.
  35. Patricio, M.; Castanedo, F.; Berlanga, A.; Perez, O.; Garcia, J.; Molina, J. Computational Intelligence in Visual Sensor Networks: Improving Video Processing Systems. In Computational Intelligence in Multimedia Processing: Recent Advances; Studies in Computational Intelligence; Hassanien, A.E., Abraham, A., Kacprzyk, J., Eds.; Springer: Berlin/Heidelberg, Germany, 2008; Volume 96, pp. 351–377.
  36. Sonmez, C.; Incel, O.; Isik, S.; Donmez, M.; Ersoy, C. Fuzzy-based congestion control for wireless multimedia sensor networks. EURASIP J. Wirel. Commun. Netw. 2014, 2014.
  37. Lin, K.; Wang, X.; Cui, S.; Tan, Y. Heterogeneous feature fusion-based optimal face image acquisition in visual sensor network. In Proceedings of the IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Pisa, Italy, 11–14 May 2015; pp. 1078–1083.
  38. Collotta, M.; Pau, G.; Salerno, V.M.; Scatà, G. A fuzzy based algorithm to manage power consumption in industrial Wireless Sensor Networks. In Proceedings of the 2011 9th IEEE International Conference on Industrial Informatics, Lisbon, Portugal, 26–29 July 2011; pp. 151–156.
  39. Maurya, S.; Jain, V.K. Fuzzy based energy efficient sensor network protocol for precision agriculture. Comput. Electron. Agric. 2016, 130, 20–37.
  40. Collotta, M.; Cascio, A.L.; Pau, G.; Scatá, G. A fuzzy controller to improve CSMA/CA performance in IEEE 802.15.4 industrial wireless sensor networks. In Proceedings of the 2013 IEEE 18th Conference on Emerging Technologies Factory Automation (ETFA), Cagliari, Italy, 10–13 September 2013; pp. 1–4.
  41. Chen, J. Improving Life Time of Wireless Sensor Networks by Using Fuzzy c-means Induced Clustering. In Proceedings of the World Automation Congress (WAC), Puerto Vallarta, Mexico, 24–28 June 2012; pp. 1–4.
  42. Olunloyo, V.O.S.; Ajofoyinbo, A.M.; Ibidapo-Obee, O. On Development of Fuzzy Controller: The Case of Gaussian and Triangular Membership Functions. J. Signal Inf. Process. 2011, 2, 257–265.
  43. Rahimi, M.; Baer, R.; Iroezi, O.I.; Garcia, J.C.; Warrior, J.; Estrin, D.; Srivastava, M. Cyclops: In situ image sensing and interpretation in wireless sensor networks. In Proceedings of the International Conference on Embedded Networked Sensor Systems, San Diego, CA, USA, 2–4 November 2005.
  44. Hengstler, S.; Prashanth, D.; Fong, S.; Aghajan, H. MeshEye: A hybrid-resolution smart camera mote for applications in distributed intelligent surveillance. In Proceedings of the International Symposium on Information Processing in Sensor Networks, Cambridge, MA, USA, 25–27 April 2007.
  45. Rowe, A.; Rosenberg, C.; Nourbakhsh, I. A Second Generation Low Cost Embedded Color Vision System. In Proceedings of the Computer Vision and Pattern Recognition Conference, San Diego, CA, USA, 20–25 June 2005.
  46. Microchip. PIC24FJ256GB108—16-Bit PIC and dsPIC Microcontrollers; Microchip Technology: Chandler, AZ, USA, 2009.
  47. Albino, V.; Berardi, U.; Dangelico, R.M. Smart Cities: Definitions, Dimensions, Performance, and Initiatives. J. Urban Technol. 2015, 22, 3–21.
Figure 1. A generic smart city employing the proposed approach.
Figure 2. General scheme of the proposed fuzzy-based computation approach.
Figure 3. Membership functions for IPI or EPI.
Figure 4. General scheme of the simulation model.
Figure 5. Computed SCTP for four days.
Figure 6. Computed SCTP for four days. Monitoring is changed on Sunday.
Figure 7. Computed SCTP for different values of internal and external parameters.
Table 1. Examples of different parameters for configuration of sensor nodes.
Work | Parameter | Description
[19] | Events | Events of interest are detected and used to trigger transmissions from sensor nodes, using a proposed multi-tier architecture.
[18] | Events | Scalar sensors are used to detect events of interest. Different levels of configurations of visual sensors are established based on the priority of detected events.
[20] | Events | Source nodes with higher event-based priorities transmit packets through transmission paths with lower latency.
[21] | Media type | The original media stream is split into image and audio, giving to each resulting sub-stream a particular priority when choosing transmission paths.
[22] | Node’s status | Relaying nodes may decide to drop packets according to their residual energy level and the relevance of DWT (Discrete Wavelet Transform) subbands.
[23] | Node’s status | The energy level of sensor nodes is considered when processing packets to be relayed.
[24] | Data content | The viewed segments of targets’ perimeters are associated with priority levels. Most relevant sources transmit higher quality visual data.
[25] | Network QoS | The transmission rate of source nodes is adjusted when facing congestion, silently dropping lower-relevant packets at source nodes.
Table 2. Range of Degradation (D) of I P i or E P i parameters.
Range of Degradation (D) | (%) | IP_i or EP_i
v_4 < D ≤ v_max | 100 | 1
v_3 < D ≤ v_4 | 75 | 0.75
v_2 < D ≤ v_3 | 50 | 0.50
v_1 < D ≤ v_2 | 25 | 0.25
v_min ≤ D ≤ v_1 | 10 | 0.10
Table 3. Membership functions for I P I or E P I .
Linguistic Value | Interval
VL | < 0.20
L | 0.20–0.40
M | 0.41–0.60
H | 0.61–0.80
VH | > 0.80
Table 4. Inference rules of the proposed Fuzzy Logic Controller.
SCTP | IPI = VL | IPI = L | IPI = M | IPI = H | IPI = VH
EPI = VL | VL | VL | L | M | H
EPI = L | L | L | M | M | H
EPI = M | L | M | M | H | H
EPI = H | M | M | H | H | VH
EPI = VH | M | H | H | VH | VH
Table 5. FLC configuration: example of input parameters.
Parameter | D | v(min) | v(max)
Internal (Camera’s hardware)
Cyclops [43] | 0 | 0 | 10
MeshEye [44] | 5 | 0 | 10
CMUCam [45] | 10 | 0 | 10
Internal (Prioritization)
Sensing priority | 0–15 | 0 | 15
Internal (Energy)
Energy level | 0–20,000 J | 20,000 J | 0 J
External (Luminance)
Luminance | 10–100,000 lux | 100,000 lux | 10 lux
External (Day)
Monday-Friday | 10 | 0 | 10
Saturday | 5 | 0 | 10
Sunday | 0 | 0 | 10
External (Deployment area)
Avenues | 0 | 0 | 10
Streets | 5 | 0 | 10
Public parks | 8 | 0 | 10
Crowded areas | 10 | 0 | 10

SCTP | Sensing | Coding | Transmission
VL | 0.1 snapshot/s | SQCIF | No guarantees
L | 0.2 snapshot/s | QCIF | No guarantees
M | 0.5 snapshot/s | SCIF | Reliable
H | 1 snapshot/s | CIF | Reliable and real-time
VH | 2 snapshots/s | 4CIF | Reliable and real-time
Table 6. Some results of the FLC.
Case | a_1 | IP_1 | a_2 | IP_2 | a_3 | IP_3 | b_1 | EP_1 | b_2 | EP_2 | b_3 | EP_3 | IPI | EPI | SCTP
1 | 0.34 | 0.32 | 0.45 | 0.35 | 0.21 | 0.54 | 0.58 | 0.12 | 0.26 | 0.21 | 0.16 | 0.28 | 0.38 | 0.17 | VL
2 | 0.63 | 0.12 | 0.21 | 0.24 | 0.16 | 0.40 | 0.38 | 0.64 | 0.45 | 0.62 | 0.17 | 0.21 | 0.19 | 0.56 | L
3 | 0.43 | 0.35 | 0.13 | 0.55 | 0.44 | 0.34 | 0.51 | 0.71 | 0.25 | 0.83 | 0.24 | 0.67 | 0.37 | 0.73 | M
4 | 0.25 | 0.88 | 0.08 | 0.97 | 0.67 | 0.73 | 0.19 | 0.54 | 0.69 | 0.69 | 0.12 | 0.43 | 0.79 | 0.63 | H
5 | 0.48 | 0.93 | 0.34 | 0.98 | 0.18 | 0.94 | 0.38 | 0.67 | 0.37 | 0.97 | 0.25 | 0.51 | 0.95 | 0.74 | VH
