Article

Communication Architecture for Grid Integration of Cyber Physical Wind Energy Systems

1 College of Information and Communication Engineering, Sungkyunkwan University, 2066 Seobu-ro, Jangan-gu, Suwon-si, Gyeonggi-do 16419, Korea
2 Department of Communications and Electronics, Higher Institute of Engineering and Technology-King Marriott, Alexandria 23713, Egypt
* Author to whom correspondence should be addressed.
Appl. Sci. 2017, 7(10), 1034; https://doi.org/10.3390/app7101034
Submission received: 4 September 2017 / Accepted: 8 October 2017 / Published: 10 October 2017
(This article belongs to the Section Energy Science and Technology)

Abstract

As we move toward increasing the grid integration of large-scale wind farms (WFs), reliable monitoring, protection, and control are needed to ensure grid stability. WFs are considered to be large and complex cyber physical systems owing to the coupling between the electric power system and information and communication technologies (ICT). In this study, we propose a framework for a cyber physical wind energy system (CPWES), which consists of four layers: a WF power system layer, data acquisition and monitoring layer, communication network layer, and application layer. We performed detailed network modeling for the WF system, including the wind turbines, meteorological mast (met-mast), and substation, based on the IEC 61400-25 and IEC 61850 standards. The network parameters and configuration were based on a real WF (the Korean Southwest offshore project). Simulation results for the end-to-end delay were obtained for different WF applications and compared with the timing requirements of the IEEE 1646 standard. The proposed architecture represents a reference model for WF systems, and it can be used to enable the design of future CPWESs.

1. Introduction

There is growing interest in increasing the penetration of renewable energies, such as wind power, solar energy, and biomass. Among these, wind energy has received great attention, and several large-scale wind farm (WF) projects are scheduled for construction in the near future. In South Korea, the cumulative installed wind power capacity was about 610 MW and 835 MW by the end of 2014 and 2015, respectively [1]. In September 2014, the Korean government announced a long-term plan for new and renewable energies, with a target of obtaining 5.0% of the total primary energy supply from renewable energies by 2020, and 11.0% by 2035. Wind power will contribute about 18.2% of the total power supplied by new and renewable energies by 2035 [2]. The planned WF projects include Southwest phase 1, Tamra, Hanlim, Daejeong, North-East, Shinchong, and Quiduck. The complete list of WF projects, including site locations, total capacities, manufacturers, and numbers of turbines, is available on the Korea Wind Power Industry Association (KWEIA) website [3].
As an increasing number of WFs are integrated into the power grid, communication infrastructure will play an important role in enabling the real-time operation, monitoring, and control of both the wind turbines and the electric power grid to ensure grid stability [4]. The communication infrastructure of a WF is an example of an industrial network that requires special consideration in network design for the following reasons [5,6,7,8,9,10,11,12,13]: (1) WFs are built in remote locations (onshore/offshore) where abundant wind resources are available; such sites lack established communication infrastructure or cellular coverage. (2) Access to a WF is difficult, especially for offshore sites, where the only means of access is by boat or helicopter; furthermore, weather conditions may postpone or prevent access for an unspecified period, so remote monitoring and control are essential. (3) As advancements in the wind turbine industry enable turbines to migrate from shallow to deep water with taller towers and longer blades, recent studies show an increase in the failure rate of large wind turbines, which impacts their availability and their operation and maintenance costs [14]; therefore, condition monitoring systems (CMS), structural health monitoring (SHM), and supervisory control and data acquisition (SCADA) systems are used for real-time monitoring. (4) The monitoring scope of large-scale WFs has been expanded to cover the operation status of the wind turbines, generated power, meteorological masts (met-masts), and substations; therefore, a considerable amount of data needs to be transferred between WFs and their control centers, which increases the burden on the communication infrastructure.
The main components of a WF are wind turbines, electric systems, a substation, and met-masts. The communication infrastructure connects these components using wired/wireless technologies. Owing to the coupling between the physical system and the cyber communication network, a WF is considered to be a cyber physical system. Considerable research has been conducted on the WF electric power system; however, information on the corresponding communication networks of the proposed models is limited, and few studies have investigated the SCADA system and the underlying communication infrastructure. Moreover, it is difficult to find detailed information about how to design the communication network for a wind energy system, as the designs are the proprietary information of each manufacturer; the only published information consists of outline descriptions of the communication infrastructure of a number of real WF projects [4,15,16,17]. Overall, research on communication infrastructure and its role in supporting the grid integration of large-scale WFs is insufficient.
In this study, we propose a framework for cyber physical wind energy systems (CPWESs), which consists of four layers: a WF power system layer, data acquisition and monitoring layer, communication network layer, and application layer. We specified and explained the communication architecture of a wind turbine, met-mast, substation, and control center. We modeled the data transmission of the monitoring and protection systems in wind turbines, a met-mast, and a substation based on IEC 61400-25 and IEC 61850 standards. We considered an actual WF (Southwest offshore project, South Korea) as a case study to evaluate the network topology and configuration. We developed a network simulation platform for the performance evaluation of the proposed architecture. The proposed framework contributes by providing a reference architecture model that can be used for the design and implementation of future CPWESs.
The remainder of this paper is organized as follows. Section 2 presents the proposed cyber physical wind energy system. Section 3 explains WF modeling and assumptions. Section 4 discusses and analyzes the simulation results. Finally, Section 5 concludes and gives direction for future work.

2. Cyber Physical Wind Energy System

The CPWES is considered to be a large and complex system, as it comprises multiple heterogeneous domains that interact with each other. Figure 1 shows the proposed CPWES, which consists of four layers: a WF power system layer, data acquisition and monitoring layer, communication network layer, and application layer.

2.1. WF Power System Layer

This layer represents the WF electric power system, which is defined as a group of wind turbines connected together and tied to the utility through a system of transformers, transmission lines, and substations. A typical wind turbine consists of a wind turbine generator, step-up transformer, and a circuit breaker. The step-up transformer is used to step-up the generation voltage of each wind turbine. Wind turbines are divided into groups, and each group is connected to the collector bus through a circuit breaker. A high-voltage transformer is used to step-up the voltage to the transmission level.

2.2. Data Acquisition and Monitoring Layer

The data acquisition layer includes sensor nodes and measurement devices, which are the basic data sources. Sensor nodes and measurement devices are connected to different elements of the WF (inside wind turbine, at the met-mast, etc.), and their main function is to report measurements, such as temperature, speed, voltage, current, and pressure. The CMS of a wind turbine continuously collects a large volume of data signals in real time. Based on data collected from CMS, sensors, and other devices, the wind turbine controller (WTC) performs the control operation for the wind turbine (different operation modes).

2.3. Communication Network Layer

The communication infrastructure provides the connections between WF elements, and it is divided into two parts: the network inside each wind turbine (turbine area network, TAN) and the network between the wind turbines and the control center (farm area network, FAN). Inside a wind turbine, different communication protocols are used, such as fieldbus, industrial ethernet protocols, and the controller area network (CAN). The WF communication network is switch-based, with ethernet switches and communication links in every wind turbine; the network configuration is based on point-to-point communication and a local area network (LAN). To link the WF network with a remote control center, different wide area network (WAN) technologies, either wired or wireless, can be used, such as optical fiber cables, microwave links, and satellites. Each WF has a dedicated connection to a local control center for real-time monitoring and control; in addition, one control center can remotely manage and control one or more WFs.

2.4. Application Layer

SCADA systems are used for data acquisition, remote monitoring, real-time control, and data recording. Multiple applications are covered by the SCADA systems in a WF; the three main ones are as follows: the turbine SCADA system provides connectivity among the wind turbines and enables remote monitoring and control of each turbine and its associated sub-systems; the WF SCADA system connects all devices from all wind turbines, as well as the electric substation; and the security SCADA system provides IP telephony services and video surveillance. The SCADA system remotely collects process information from the WF components and stores it in historical databases and servers (historical servers, metering server, meteorological server, etc.). Based on the collected information, the control center executes appropriate actions.

3. CPWES Modeling and Assumptions

In this study, the Southwest offshore wind farm located in South Korea was considered as a case study. We aimed to design the communication infrastructure for phase 1 of the project, which consists of 20 wind turbines with a total capacity of 60 MW, as shown in Figure 2. The output voltage of each turbine was 690 V, which was stepped up to the typical collector bus voltage of 34.5 kV. All turbines were connected to an offshore substation. The spacing between turbines was about 800 m (along and between the rows). The longest cable run was about 1.2 km, between turbine No. 3 and the offshore platform, while the shortest was about 524 m, between turbine No. 1 and the offshore platform [18]. The electric topology consisted of three feeders, one bus, and one high-voltage (HV) transformer. We assumed that the communication network topology followed the WF electric topology, with the optical fiber cables integrated into the submarine cables. The WF communication architecture is shown in Figure 3.

3.1. Traffic Model for Wind Turbine Based on IEC 61400-25

In order to model the wind turbine subsystems, we considered the IEC 61400-25 standard, which is an adaptation of IEC 61850 [19]. IEC 61400-25 provides information models for the monitoring and control of wind power plants. We assumed that a wind turbine consists of 10 logical nodes (LNs): WROT, WTEM, WGEN, WCNV, WNAC, WYAW, WTOW, WTRF, WMET, and WFOU, as shown in Equation (1). Each LN represents a wind turbine sub-system, as given in Table 1.
WT_LN = {WROT, WTEM, WGEN, WCNV, WNAC, WYAW, WTOW, WTRF, WMET, WFOU}, (1)
Different types of sensor nodes and measurement devices were connected to different turbine parts to measure different parameters, such as voltage (V), current (I), wind speed (WdSpd), wind direction (WdDir), temperature (Tmp), humidity (Hum), displacement (Disp), and pressure (Pres). Equation (2) defines the types of sensor nodes (SN_TYPE) inside a wind turbine sub-system.
SN_TYPE ∈ {V, I, WdSpd, WdDir, Tmp, Hum, Disp, Pres, …}, (2)
Each sensor node (SN_i) was identified by a sensor identifier (SN_ID), a sensor type (SN_TYPE), and its physical location inside the wind turbine (WT_LN), as given in Equation (3).
SN_i = {WT_LN, SN_ID, SN_TYPE}, (3)
Regarding the data acquisition and monitoring layer, the authors of Ref. [20] designed a data acquisition system to be installed on a real wind turbine, comprising 29 sensor nodes divided into two groups by sampling rate: 50 Hz (low speed) and 20 kHz (high speed). In our model, the total number of sensor nodes defined inside a wind turbine was 108 [21]. Figure 4 shows a schematic diagram of the cyber physical model for a wind turbine system, where seven LNs are located at the turbine nacelle and three LNs at the bottom of the tower. Two data collection units (DCUs) were considered for aggregating the traffic from the different turbine subsystems.
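The sensor-node model of Equations (1)-(3) can be expressed as a small data structure. The sketch below mirrors the paper's notation; the example sensor at the end (a displacement sensor on the tower) is illustrative only, not taken from Table 1.

```python
from dataclasses import dataclass

# Logical nodes (LNs) of a wind turbine, per IEC 61400-25 (Equation (1))
WT_LN = {"WROT", "WTEM", "WGEN", "WCNV", "WNAC",
         "WYAW", "WTOW", "WTRF", "WMET", "WFOU"}

# Sensor types defined in Equation (2); the set is open-ended in the paper
SN_TYPE = {"V", "I", "WdSpd", "WdDir", "Tmp", "Hum", "Disp", "Pres"}

@dataclass(frozen=True)
class SensorNode:
    """A sensor node SN_i = {WT_LN, SN_ID, SN_TYPE} (Equation (3))."""
    ln: str       # logical node (location) the sensor is attached to
    sn_id: int    # sensor identifier
    sn_type: str  # measured quantity

    def __post_init__(self):
        # Reject identifiers outside the model of Equations (1)-(2)
        assert self.ln in WT_LN and self.sn_type in SN_TYPE

# Illustrative (hypothetical) example: displacement sensor No. 1 on the tower
sensor = SensorNode(ln="WTOW", sn_id=1, sn_type="Disp")
```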
To determine the amount of data generated by each sensor node (data rate, SN_DR), we defined the sample size (N_B), the sampling rate (F_S), and the number of channels (N_C), as shown in Equation (4).
SN_DR = N_B × F_S × N_C, (4)
Taking the displacement sensor as an example: it generates 10 samples/s on 2 channels, so with a sample size of 16 bits, the total data rate is 16 × 10 × 2 = 320 bits/s. Table 2 shows how the measurement requirements are calculated for the different sensor nodes. We calculated the sensing data for all 108 sensor nodes inside a wind turbine, and Table 3 shows the traffic configuration at each wind turbine subsystem.
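Equation (4) and the displacement-sensor example can be sketched as a one-line helper (function and variable names are ours):

```python
def sensor_data_rate(sample_bits, sampling_hz, channels):
    """SN_DR = N_B x F_S x N_C (Equation (4)), in bits/s."""
    return sample_bits * sampling_hz * channels

# Worked example from the text: displacement sensor with 16-bit samples,
# 10 samples/s, and 2 channels -> 320 bits/s
disp_rate = sensor_data_rate(16, 10, 2)
```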

3.2. Traffic Model for an Offshore Substation Based on IEC 61850

In this work, a WF is divided into different protection zones: the wind turbine zone, collector feeder zone, collector bus zone, and high-voltage transformer zone. Each zone has one or more protection devices associated with it, such as a wind turbine protection relay, step-up transformer protection relay, feeder protection relay, bus protection relay, and distance relay for the transmission line [22]. We considered three types of IEDs: a circuit breaker (CB) IED, merging unit (MU) IED, and protection and control (P&C) IED [23,24,25,26]. Based on IEC 61850, the function of an MU IED is to acquire the voltage and current signals from the field current transformers (CTs) and potential transformers (PTs). The function of a CB IED is to monitor the state and control the operation of the circuit breaker; it also receives control signals from the P&C IEDs and reports status changes to the protection IEDs. The P&C IED is a universal device that integrates protection and control functionalities and operates at the bay level in the substation.
All acquired signals from the different zones were transmitted to the central protection and control (CPC) system on the offshore substation using point-to-point fiber communication. We considered one CPC system, which is responsible for all protection and control devices in the WF, as shown in Figure 5. In addition, each wind turbine generator (WTG) was configured with a merging unit to acquire current, voltage, and breaker signals for wind turbine protection. Data packets were transmitted over the communication network to a central relaying unit (CRU) for P&C functions of the whole WF. For the MU IEDs, we assumed that the sampling frequency of the voltage and current data was 6400 Hz for a 50 Hz power system, with each sample represented by 2 bytes [27]. Considering the 3-phase voltage and current measurements, the MU IEDs send updated values of 76,800 bytes/s to the P&C server at the local control center [25]. Moreover, each CB IED sends a status value of 16 bytes/s to the P&C server. The details of the process level, bay level, and station level are as follows:
  • Process Level: The primary equipment includes merging units (MUs), actuators, and I/O devices.
  • Bay Level: The secondary equipment includes different IEDs and protection relays.
  • Station Level: It is located on the offshore platform and includes a central protection and control unit, remote terminal units (RTU), and a human machine interface (HMI).
The communication network was constructed using an ethernet-based architecture. The offshore substation consisted of three feeders, one bus, and one transformer. The protection devices at the wind turbines, feeders, and bus were each modeled as a subnet consisting of one CB IED, one MU IED, one P&C IED, and one ethernet switch. The step-up transformer was modeled as a subnet consisting of one CB IED, one MU IED, two P&C IEDs, and an ethernet switch. Table 4 shows the IED configuration. The traffic configuration and data flow for the CB IED and MU IED considered in our model are given in Table 5. The IEDs and relays were connected through a 100 Mbps ethernet network. At the station level, there were a station PC, a protection and control server, ethernet switches, and communication links.
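The MU IED sampled-value rate quoted above can be reproduced from the stated assumptions (6400 samples/s, 2 bytes per sample, 3-phase voltage plus 3-phase current, i.e. six channels; variable names are ours):

```python
SAMPLING_HZ = 6400    # samples/s for a 50 Hz power system
SAMPLE_BYTES = 2      # bytes per sample
PHASES = 3            # 3-phase measurement
QUANTITIES = 2        # voltage and current

# Sampled-value stream from one MU IED to the P&C server (bytes/s)
mu_ied_rate = SAMPLING_HZ * SAMPLE_BYTES * PHASES * QUANTITIES

# Breaker status stream from one CB IED (bytes/s), as given in the text
cb_ied_rate = 16
```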

3.3. Traffic Model for Meteorological Mast

We considered a met-mast to measure the wind conditions in the WF area. Based on Reference [28], we defined the number, types, and locations of the sensor nodes and measuring instruments installed at the met-mast, as shown in Figure 6. The platform height was 10 m above mean sea level, and the top of the met-mast was at about 97 m. Anemometers were installed at 97 m, 96 m, 86 m, 86 m, 76 m, 66 m, 56 m, 46 m, and 26 m. Wind vanes were installed at 96 m, 76 m, 56 m, and 46 m. Barometers and temperature and humidity sensors were installed at 94 m and 13 m. The details of the sensor types and their locations at the met-mast are given in Table 6. An acoustic Doppler current profiler (ADCP) was used to measure the current direction and velocity at sea level.

3.4. Wind Farm Timing Requirements

In a power system, it is important to meet the latency requirements of the different real-time monitoring and protection applications. Therefore, the WF communication network should bound the end-to-end delay for data transmission between the wind turbines, the met-mast, the substation, and the control center. The IEC 61400-25 standard does not specify timing requirements for WFs, so in this study we considered the latency requirements of the electric power system. Table 7 shows the communication timing requirements for electric substation automation based on IEEE 1646 [29]. We mapped the WF monitoring and control information as follows: protection information from the IEDs was mapped to the protection class, analogue measurements and status information to the monitoring and control class, and meteorological data from the met-mast to the operation and maintenance class.
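The mapping described above can be captured as a small lookup table (the class names follow IEEE 1646 as cited in the text; the concrete latency bounds themselves are in Table 7 and are deliberately not reproduced here):

```python
# Mapping of WF data types to IEEE 1646 information classes (Section 3.4).
# Keys and values paraphrase the text; this is a bookkeeping sketch only.
IEEE_1646_CLASS = {
    "IED protection information": "protection",
    "analogue measurements": "monitoring and control",
    "status information": "monitoring and control",
    "met-mast meteorological data": "operation and maintenance",
}

def timing_class(data_type):
    """Return the IEEE 1646 information class for a WF data type."""
    return IEEE_1646_CLASS[data_type]
```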

4. Simulation Results

The WF communication architecture was modeled using the OPNET Modeler [30], as shown in Figure 3. The proposed communication architecture for the Southwest WF consists of 22 subnetworks (20 wind turbines, one met-mast, and one offshore substation) [31]. Upstream data from the WF subsystems are transmitted to the local control center at the offshore substation. We simulated different scenarios for WF monitoring, protection, and control, as shown in Table 8. The simulation cases are as follows:
  • Standalone wind turbine: includes data transmission of the condition monitoring system during different operation modes.
  • Wind farm: consists of 20 wind turbines configured in a cascade architecture.
  • Met-mast: includes data transmission of sensors and measurement devices installed at the met-mast.
  • Substation: consists of protection and control devices (IEDs) at wind turbines, feeders, and offshore substation.

4.1. Standalone Wind Turbine Results

We considered four operation modes for a wind turbine: idle, start-up, power production, and shutdown, as shown in Figure 7. Based on the wind conditions, the WTC determines the turbine operation mode [8]. We assumed that the control center operator turned on a wind turbine for operation. The WTC first checked the condition of the wind turbine; if no faults were detected, the WTC activated the idle mode. During the idle mode, if the average wind speed exceeded the cut-in wind speed for a certain time, the WTC activated the start-up mode. During the power production mode, the WTC adjusted the blades' pitch angle and the position of the nacelle to optimize the turbine operation. If the wind speed exceeded the cut-out speed, the WTC activated the shutdown mode.
We classified the sensor nodes inside the wind turbine into four different categories: mechanical measurement (rotor speed, pitch angle, displacement, vibration, etc.), electrical measurements (voltage, current, power, frequency, etc.), meteorological data (wind speed, wind direction, temperature, humidity, etc.), and foundation measurements, as shown in Table 9. Based on the turbine operation mode, the amount of monitoring data forwarded to the WTC was different. This classification meant that some of the sensor nodes were continuously working, while other sensor nodes were generating the monitoring data based on the turbine operation mode.
The OPNET Modeler was used to simulate the data transmission of the condition monitoring system for a wind turbine during three operation modes (shutdown, idle, and operation). The nacelle dimensions were configured as 12 m × 4 m. The total number of monitoring parameters and sensors was 108. The height of the tower was configured as 85 m (the distance between the DCU and the main ethernet switch). Based on the IEC 61400-25 standard, different profiles were defined, configured, and assigned to each LN. The simulation time was 20 min.
Figure 8a shows the total traffic received at the WTC for the different operation modes. The maximum received traffic was about 227,036 bytes/s (225,544 bytes/s of analogue measurements, 58 bytes/s of status information, and 1434 bytes/s of foundation measurements) during turbine operation, while the received traffic was about 5808 bytes/s during the idle mode. The difference was due to the sensing data of the electric measurements (voltage, current, power, power factor, and frequency), which amounted to about 221,228 bytes/s. No traffic was received at the WTC while the turbine was in the shutdown mode.
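The per-turbine traffic breakdown above can be checked with simple arithmetic (values taken from the text; variable names are ours):

```python
# Traffic received at the WTC during power production (bytes/s), per Figure 8a
analogue_measurements = 225_544
status_information = 58
foundation_measurements = 1_434

total_operation = analogue_measurements + status_information + foundation_measurements

# The idle-mode traffic differs from the operation total by exactly the
# electric measurement data (voltage, current, power, power factor, frequency)
idle_traffic = 5_808
electric_measurements = total_operation - idle_traffic  # 221,228 bytes/s
```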
Figure 8b shows the end-to-end delay inside the wind turbine, between the sensor nodes and the WTC. The link capacities inside the wind turbine were configured as 10 Mbps and 100 Mbps. During the power production mode, the end-to-end delay was about 1.1 ms and 0.098 ms for link capacities of 10 Mbps and 100 Mbps, respectively. The traffic received from the pitch angle sensor, wind speed sensor, and tilt sensor was 18 bytes/s, 36 bytes/s, and 40 bytes/s, respectively, as shown in Figure 9a. The measurements were received at the WTC just after the control center operator activated the wind turbine for operation at t = 300 s. For the electric measurements, the received traffic for the current and voltage was 73,728 bytes/s and 147,456 bytes/s, respectively. Note that the electric measurements were transmitted to the WTC at t = 420 s, and no data were received during the idle mode, as shown in Figure 9b.
In the future, wind turbines will integrate more and more sensor nodes and monitoring devices, requiring the communication network to support even higher bandwidths. We studied the end-to-end delay for the cases where the number of sensor nodes inside the wind turbine is doubled (case 2×) and tripled (case 3×). Figure 10 shows the received traffic and the end-to-end delay at the WTC during the different operation modes. The end-to-end delay increased from 1.1 ms (base case with 108 sensor nodes) to 3.08 ms for case 3×. The ETE delay was proportional to the network traffic and the number of sensor nodes inside the wind turbine, as shown in Table 10.

4.2. Wind Farm Results

We configured the WF communication network based on data from real large-scale WF projects (Greater Gabbard WF in the UK [16], Horns Rev WF in Denmark [15], and Yeung Heung in South Korea [17]). The network topology was configured as a cascaded architecture. A main ethernet switch was located on the offshore platform, and dedicated optical fibers connected it to the nearest wind turbines. Connectivity between turbines was provided through the ethernet switches installed at the base of each turbine.
Figure 11 shows the total traffic received at the control center (SCADA server). The traffic received for analogue measurements, status information, foundation measurements, and protection information was 4,510,880 bytes/s (225,544 × 20), 1160 bytes/s (58 × 20), 28,680 bytes/s (1434 × 20), and 1,536,320 bytes/s (76,816 × 20), respectively. Figure 12 shows the average ETE delay of the real-time monitoring data for all wind turbine applications; it was about 7.83 ms and 0.78 ms for link capacities of 100 Mbps and 1 Gbps, respectively. Table 11 shows the maximum and minimum ETE delays for the SCADA, MU IED, and CB IED traffic between each of the 20 wind turbines and the control center.
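The farm-level totals above are simply the per-turbine rates scaled by the number of turbines; a sketch of that aggregation (per-turbine values from Sections 3.1 and 3.2, dictionary keys ours):

```python
N_TURBINES = 20

# Per-turbine upstream rates (bytes/s)
per_turbine = {
    "analogue measurements": 225_544,
    "status information": 58,
    "foundation measurements": 1_434,
    "protection information": 76_816,  # 76,800 (MU IED) + 16 (CB IED)
}

# Aggregate traffic arriving at the SCADA server for the whole farm
farm_totals = {k: rate * N_TURBINES for k, rate in per_turbine.items()}
```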

4.3. Met-Mast Results

We assumed that the dimensions of the met-mast platform were 10 m × 8 m. Four data collection units (DCUs) were installed at different levels (90 m, 70 m, 50 m, and 20 m) to collect the measurement data. All sensing data were collected at the measurement PC located on the met-mast platform. Each DCU had a dedicated communication link to the measurement PC so that data reception is maintained if other DCUs fail. The electric power needed for the measurement equipment and other devices at the met-mast could be supplied from the WF or from photovoltaic panels. Figure 13 shows the received traffic at the measurement PC for pressure, ADCP, humidity, wind speed, and wind direction, which was 150 bytes/s, 400 bytes/s, 4 bytes/s, 48 bytes/s, and 24 bytes/s, respectively. Figure 14a shows that the total received data at the measurement PC was about 638 bytes/s, while the ETE delay was about 0.54 ms and 0.058 ms for link capacities of 10 Mbps and 100 Mbps, respectively. The amount of data stored at the measurement PC was about 55 MB per day, and the data were transmitted to the met-mast server at the local control center for storage.
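The daily storage figure follows directly from the aggregate rate (the ~55 MB/day value is consistent with 638 bytes/s; variable names are ours):

```python
TOTAL_RATE = 638          # bytes/s received at the measurement PC
SECONDS_PER_DAY = 86_400

stored_per_day = TOTAL_RATE * SECONDS_PER_DAY   # bytes/day
stored_mb = stored_per_day / 1_000_000          # ~55 MB/day
```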
In the real system, the data acquisition system at the met-mast uses wireless WCDMA communication to transmit the collected data from the measurement PC to the remote control center [32]. In our model, we assumed that the met-mast was connected to the nearest wind turbine in the WF (WT16). The real-time measured data were stored at the measurement PC and transmitted to the local control center (met-mast server) through the WF communication network. The IP ETE delay of the met-mast data through the WF network is shown in Figure 15.

4.4. Substation Automation Results

The communication network for the offshore WF substation is shown in Figure 3. The architecture is based on IEC 61850, with three levels: the process level, bay level, and station level [33]. We assumed that all MU IEDs and CB IEDs were sending metering values (76,800 bytes/s) and breaker status (16 bytes/s), respectively, to the central P&C unit at the station level on the offshore platform. Three scenarios were configured with link capacities of 10, 100, and 1000 Mbps. To validate the results, we compared the amount of traffic generated by the different IEDs with the traffic received at the server. The FTP traffic received at the server was about 416 bytes/s (26 CB IEDs × 16 bytes/s) for the CB IEDs and 1,920,000 bytes/s (25 MU IEDs × 76,800 bytes/s) for the MU IEDs. Figure 16 shows the received traffic, which agrees with our calculations.
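The validation arithmetic above can be sketched directly (device counts and per-device rates from the text; variable names are ours):

```python
CB_IEDS, CB_RATE = 26, 16        # breaker IEDs and per-device status rate (bytes/s)
MU_IEDS, MU_RATE = 25, 76_800    # merging-unit IEDs and per-device SV rate (bytes/s)

# Aggregate traffic expected at the central P&C server
cb_total = CB_IEDS * CB_RATE     # breaker status traffic
mu_total = MU_IEDS * MU_RATE     # sampled-value traffic
```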
Figure 17 shows the average ETE delay for all protection and control information for link capacities of 100 Mbps and 1 Gbps. Compared with the timing requirements of the IEEE 1646 standard, the 100 Mbps link capacity cannot satisfy the delay requirement, owing to a high delay of about 8.5 ms, whereas the 1 Gbps link capacity satisfies the protection requirement with a delay of less than 1 ms.
A channel capacity of 10 Mbps was not sufficient, as the amount of generated traffic was much higher than the link capacity. Figure 18 and Figure 19 show this worst-case scenario, in which data loss and higher delays occur. Table 12 shows the ETE delays of the MU IEDs and CB IEDs for the feeders (F1, F2, and F3), bus, and transformer with link bandwidths of 100 Mbps and 1 Gbps. Figure 20 shows the ETE delay of the MU IED and CB IED at the collector bus.

5. Conclusions

In this work, we proposed a framework for the grid integration of cyber physical wind energy systems (CPWESs) that consists of four layers: a wind farm power system layer, data acquisition and monitoring layer, communication network layer, and application layer. A real WF project (Korean Southwest, phase 1) was considered as a case study. We developed a communication network model for the WF using the OPNET Modeler; the network architecture was modeled based on the IEC 61400-25 and IEC 61850 standards. Different scenarios were configured to evaluate the communication network performance at different levels, including: data transmission of the condition monitoring system for a standalone wind turbine during different operation modes; the communication network for a real WF consisting of 20 wind turbines; data transmission of the sensors and measurement devices installed at the met-mast; and the communication network for the protection and control devices at the wind turbines, feeders, and offshore substation. The proposed architecture was evaluated with respect to network topology, link capacity, and end-to-end delay for the different applications. The simulation results indicated that a channel capacity of 1 Gbps satisfies the requirements of the IEEE 1646 standard for the WF system. This work contributes a reference architecture for modeling, simulating, and evaluating the communication network of a WF, which can be used for the design and implementation of future CPWESs.

Acknowledgments

This work was supported by the National Research Foundation of Korea under Grant funded by the Korean government (MSIP) (No. 2015R1A2A1A10052459).

Author Contributions

All authors contributed to the publication of this paper.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Global Wind Energy Council. Global Wind Statistics 2015. 2016. Available online: http://www.gwec.net/wp-content/uploads/2015/02/GWEC_GlobalWindStats2014_FINAL_10.2.2015.pdf (accessed on 1 August 2017).
  2. Report on Wind Power Industry 2015, South Korea, Jaehee Park, Project Officer Wind Energy & High Tech. 2016. Available online: https://www.rvo.nl/sites/default/files/2016/03/Rapport%20Windenergie%20Zuid-Korea.pdf (accessed on 1 August 2017).
  3. Korea Wind Power Industry Association (KWEIA). Wind Power Statistics Information. Available online: http://www.kweia.or.kr/ (accessed on 1 August 2017).
  4. Yu, F.; Zhang, P.; Xiao, W.; Choudhury, P. Communication systems for grid integration of renewable energy resources. IEEE Netw. 2011, 25, 22–29. [Google Scholar] [CrossRef]
  5. Gao, Z.; Geng, J.; Zhang, K.; Dai, Z.; Bai, X.; Peng, M.; Wang, Y. Wind Power Dispatch Supporting Technologies and Its Implementation. IEEE Trans. Smart Grid 2013, 4, 1684–1691. [Google Scholar]
  6. Gallardo-Calles, J.-M.; Colmenar-Santos, A.; Ontañon-Ruiz, J.; Castro-Gil, M. Wind control centres: State of the art. Renew. Energy 2013, 51, 93–100. [Google Scholar] [CrossRef]
  7. Singh, B.K.; Coulter, J.; Sayani, M.A.G.; Sami, S.M.; Khalid, M.; Tepe, K.E. Survey on communication architectures for wind energy integration with the smart grid. Int. J. Environ. Stud. 2013, 70, 765–776. [Google Scholar] [CrossRef]
  8. Karthikeya, B.R.; Schutt, R.J. Overview of Wind Park Control Strategies. IEEE Trans. Sustain. Energy 2014, 5, 416–422. [Google Scholar] [CrossRef]
  9. Alizadeh, S.M.; Ozansoy, C. The role of communications and standardization in wind power applications—A review. Renew. Sustain. Energy Rev. 2016, 54, 944–958. [Google Scholar] [CrossRef]
  10. Moness, M.; Moustafa, A.M. A Survey of Cyber-Physical Advances and Challenges of Wind Energy Conversion Systems: Prospects for Internet of Energy. IEEE Internet Things J. 2016, 3, 134–145. [Google Scholar] [CrossRef]
  11. Pettener, A.L. SCADA and communication networks for large scale offshore wind power systems. In Proceedings of the IET Conference on Renewable Power Generation (RPG 2011), Edinburgh, UK, 6–8 September 2011; pp. 1–6. [Google Scholar]
  12. Kunzemann, P.; Jacobs, G.; Schelenz, R. Application of CPS within Wind Energy—Current Implementation and Future Potential. In Industrial Internet of Things: Cybermanufacturing Systems; Springer: Cham, Switzerland, 2017; pp. 647–670. [Google Scholar]
  13. Ahmed, M.A.; Kang, Y.-C.; Kim, Y.-C. Modeling and simulation of ICT network architecture for cyber-physical wind energy system. In Proceedings of the 2015 IEEE International Conference on Smart Energy Grid Engineering (SEGE), Oshawa, ON, Canada, 17–19 August 2015; pp. 1–6. [Google Scholar]
  14. Tchakoua, P.; Wamkeue, R.; Ouhrouche, M.; Slaoui-Hasnaoui, F.; Tameghe, T.A.; Ekemb, G. Wind Turbine Condition Monitoring: State-of-the-Art Review, New Trends, and Future Challenges. Energies 2014, 7, 2595–2630. [Google Scholar]
  15. Kristoffersen, J.R.; Christiansen, P. Horns Rev Offshore Windfarm: Its Main Controller and Remote Control System. Wind Eng. 2003, 27, 351–359. [Google Scholar] [CrossRef]
  16. Goraj, M.; Epassa, Y.; Midence, R.; Meadows, D. Designing and deploying ethernet networks for offshore wind power applications—A case study. In Proceedings of the 10th IET International Conference on Developments in Power System Protection (DPSP 2010), Manchester, UK, 29 March–1 April 2010; p. 84. [Google Scholar]
  17. Park, J.Y.; Kim, B.J.; Lee, J.K. Development of condition monitoring system with control functions for wind turbines. World Acad. Sci. Eng. Technol. 2011, 5, 269–274. [Google Scholar]
  18. Demonstration Project: Southwest Offshore Wind Park Development; Ministry of Trade, Industry and Energy. 2015. Available online: http://www.motie.go.kr/common/download.do?fid=bbs&bbs_cd_n=6&bbs_seq_n=63153&file_seq_n=2 (accessed on 1 August 2017).
  19. IEC 61400-25-2. International Standard, Wind Turbines Part 25-2: Communications for Monitoring and Control of Wind Power Plants—Information Models; International Electrotechnical Commission (IEC): Geneva, Switzerland, 2006.
  20. Cruden, A.; Booth, C.; Catterson, Y.M.; Ferguson, D. Designing wind turbine condition monitoring systems suitable for harsh environments. In Proceedings of the 2nd IET Renewable Power Generation Conference (RPG 2013), Institution of Engineering and Technology, Beijing, China, 9–11 September 2013; pp. 1–4. [Google Scholar]
  21. Ahmed, M.; Kim, Y.-C. Hierarchical Communication Network Architectures for Offshore Wind Power Farms. Energies 2014, 7, 3420–3437. [Google Scholar] [CrossRef]
  22. Cardenas, J.; Muthukrishnan, V.; McGinn, D.; Hunt, R. Wind farm protection using an IEC 61850 process bus architecture. In Proceedings of the 10th IET International Conference on Developments in Power System Protection (DPSP 2010), Managing the Change, Manchester, UK, 29 March–1 April 2010; pp. 1–5. [Google Scholar]
  23. Wei, M.; Chen, Z. Intelligent control on wind farm. In Proceedings of the 2010 IEEE PES Innovative Smart Grid Technologies Conference Europe (ISGT Europe), Gothenburg, Sweden, 11–13 October 2010; pp. 1–6. [Google Scholar]
  24. Golshani, M.; Taylor, G.A.; Pisica, I. Simulation of power system substation communications architecture based on IEC 61850 standard. In Proceedings of the 2014 49th International Universities Power Engineering Conference (UPEC), Cluj-Napoca, Romania, 2–5 September 2014; pp. 1–6. [Google Scholar]
  25. Thomas, M.S.; Ali, I. Reliable, Fast, and Deterministic Substation Communication Network Architecture and its Performance Simulation. IEEE Trans. Power Deliv. 2010, 25, 2364–2370. [Google Scholar] [CrossRef]
  26. Sidhu, T.S.; Yin, Y. Modelling and Simulation for Performance Evaluation of IEC61850-Based Substation Communication Systems. IEEE Trans. Power Deliv. 2007, 22, 1482–1489. [Google Scholar] [CrossRef]
  27. Wei, M.; Chen, Z. Communication Systems and Study Method for Active Distribution Power Systems. In Proceedings of the 9th Nordic Electricity Distribution and Asset Management Conference, Aalborg, Denmark, 6–7 September 2010. [Google Scholar]
  28. Oh, K.-Y.; Kim, J.-Y.; Lee, J.-K.; Ryu, M.-S.; Lee, J.-S. An assessment of wind energy potential at the demonstration offshore wind farm in Korea. Energy 2012, 46, 555–563. [Google Scholar] [CrossRef]
  29. 1646-2004—IEEE Standard Communication Delivery Time Performance Requirements for Electric Power Substation Automation. 2005. Available online: http://ieeexplore.ieee.org/servlet/opac?punumber=9645 (accessed on 1 August 2017).
  30. OPNET Technologies. OPNET Modeler. Available online: https://www.riverbed.com (accessed on 1 August 2017).
  31. Ahmed, M.; Kim, Y.-C. Communication Network Architectures for Southwest Offshore Wind Farm. J. Korean Inst. Commun. Inf. Sci. 2017, 42, 1–10. [Google Scholar] [CrossRef]
  32. Ko, S.-W.; Ju, Y.-C.; Jang, M.-S. The Measurement using a Wireless Transmission System with a WCDMA Communication for Metrological Data. J. Int. Counc. Electr. Eng. 2013, 3, 330–333. [Google Scholar] [CrossRef]
  33. Zhabelova, G.; Vyatkin, V. Multiagent Smart Grid Automation Architecture Based on IEC 61850/61499 Intelligent Logical Nodes. IEEE Trans. Ind. Electron. 2012, 59, 2351–2362. [Google Scholar] [CrossRef]
Figure 1. Proposed architecture of the cyber physical wind energy system. SCADA: supervisory control and data acquisition; CCTV: closed-circuit television; VOIP: voice over internet protocol; IED: intelligent electronic device; WTG: wind turbine generator; WTC: wind turbine controller; HV: high voltage; Met. Mast: meteorological mast.
Figure 2. Configuration of the Southwest wind farm project (Test WF).
Figure 3. Proposed OPNET model for the wind farm communication architecture. OPNET: optimized network engineering tool.
Figure 4. Schematic diagram of the cyber physical model for the wind turbine system.
Figure 5. Schematic diagram of the cyber physical model for the wind farm substation.
Figure 6. Schematic diagram of the cyber physical model for met-mast.
Figure 7. Power curve and general operation modes of a typical wind turbine.
Figure 8. (a) Total received traffic at WTC for different operation modes; (b) End-to-end delay for the internal communication inside the wind turbine.
Figure 9. Traffic received at WTC for different sensors. (a) Wind speed, pitch angle, tilt; (b) Voltage and current.
Figure 10. (a) Total received traffic at WTC for different operation modes; (b) ETE delay for the internal communication inside the wind turbine.
Figure 11. Total received traffic at the SCADA server in the control center (20 wind turbines).
Figure 12. Average ETE delay for the wind farm data (20 wind turbines).
Figure 13. Traffic received at measurement PC for different sensors. (a) Voltage and current; (b) Wind speed, pitch angle, and tilt.
Figure 14. (a) Total received traffic at the measurement PC for different sensors; (b) ETE delay for the met-mast sensors.
Figure 15. ETE delay for met-mast data (link capacity 100 Mbps and 1 Gbps).
Figure 16. Traffic received at the measurement central P&C unit (channel capacity 100 Mbps, 1 Gbps). (a) CB-IEDs (b) MU-IEDs.
Figure 17. Average ETE delay for P&C information in the WF (channel capacity 100 Mbps, 1 Gbps).
Figure 18. Traffic received at measurement central P&C unit (channel capacity 10 Mbps). (a) CB-IEDs (b) MU-IEDs.
Figure 19. ETE delay for P&C information in the WF (channel capacity 10 Mbps).
Figure 20. ETE delay for the MU-IED and CB-IED at the collector bus (Link capacity 100 Mbps, 1 Gbps).
Table 1. Wind turbine logical nodes [18].

| LN | Description | LN | Description |
|---|---|---|---|
| WROT | Wind turbine rotor information | WNAC | Wind turbine nacelle information |
| WTRM | Wind turbine transmission information | WYAW | Wind turbine yawing information |
| WTGEN | Wind turbine generator information | WTOW | Wind turbine tower information |
| WCNV | Wind turbine converter information | WFOU | Wind turbine foundation information |
| WTRF | Wind turbine transformer information | WMET | Wind power plant meteorological information |
Table 2. Measuring requirements for sensor nodes.

| Sensor Type | Sampling Frequency | Sample Size | Number of Channels | Data Rate |
|---|---|---|---|---|
| Temperature | 1 Hz | 16 bits | 1 | 2 bytes/s |
| Wind Speed | 1 Hz | 16 bits | 1 | 2 bytes/s |
| Displacement | 10 Hz | 16 bits | 2 | 40 bytes/s |
| Vibration | 200 Hz | 16 bits | 3 | 1200 bytes/s |
| Voltage | 2048 Hz | 16 bits | 3 | 12,288 bytes/s |
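Each data-rate entry in Table 2 is the product of sampling frequency, sample size (16 bits = 2 bytes), and channel count. A quick consistency check (a sketch; the variable names are ours):

```python
# Data rate (bytes/s) = sampling frequency (Hz) x sample size (bytes) x channels.
# Rows taken from Table 2; 16 bits = 2 bytes per sample.
sensors = {
    # name: (freq_hz, sample_bytes, channels, expected_rate_bytes_per_s)
    "Temperature":  (1,    2, 1, 2),
    "Wind Speed":   (1,    2, 1, 2),
    "Displacement": (10,   2, 2, 40),
    "Vibration":    (200,  2, 3, 1200),
    "Voltage":      (2048, 2, 3, 12288),
}

for name, (freq, size, ch, expected) in sensors.items():
    rate = freq * size * ch
    assert rate == expected, f"{name}: {rate} != {expected}"
    print(f"{name}: {rate} bytes/s")
```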
Table 3. Traffic configuration for a wind turbine.

| LN | # Sensors | Data Rate | LN | # Sensors | Data Rate |
|---|---|---|---|---|---|
| WROT | 14 | 642 bytes/s | WNAC | 12 | 112 bytes/s |
| WTRM | 18 | 2828 bytes/s | WYAW | 7 | 220 bytes/s |
| WTGEN | 14 | 73,764 bytes/s | WTOW | 4 | 8 bytes/s |
| WCNV | 14 | 74,060 bytes/s | WFOU | 6 | 1434 bytes/s |
| WTRF | 12 | 73,740 bytes/s | WMET | 7 | 228 bytes/s |
Table 4. IED configuration in the wind farm substation. IED: intelligent electronic device.

| Zone | CB IED | MU IED | P&C IED |
|---|---|---|---|
| Wind Turbine | 1 | 1 | 1 |
| Collector Feeder | 1 | 1 | 1 |
| Collector Bus | 1 | 1 | 1 |
| Substation Transformer | 2 | 1 | 1 |
Table 5. Traffic configuration for IEDs.

| Zone | Measurement | From | To |
|---|---|---|---|
| CB IED | Breaker Status | CB IED | Station Server |
| MU IED | Sampled 3-phase V, I message | MU IED | Station Server |
Table 6. Locations, types of sensors, and measurement devices installed at the met-mast.

| Type | Level | Sampling Rate | Data Rate | Type | Level | Sampling Rate | Data Rate |
|---|---|---|---|---|---|---|---|
| Anemometer | 97 m | 3 Hz | 6 bytes/s | Anemometer | 56 m | 3 Hz | 6 bytes/s |
| Anemometer | 96 m | 3 Hz | 6 bytes/s | Wind Vane | 56 m | 3 Hz | 6 bytes/s |
| Wind Vane | 96 m | 3 Hz | 6 bytes/s | Anemometer | 46 m | 3 Hz | 6 bytes/s |
| Temperature | 93 m | 1 Hz | 2 bytes/s | Wind Vane | 46 m | 3 Hz | 6 bytes/s |
| Humidity | 93 m | 1 Hz | 2 bytes/s | Anemometer | 26 m | 3 Hz | 6 bytes/s |
| Pressure Sensor | 93 m | 100 Hz | 200 bytes/s | Humidity | 14 m | 1 Hz | 2 bytes/s |
| Anemometer | 86 m | 3 Hz | 6 bytes/s | Temperature | 14 m | 1 Hz | 2 bytes/s |
| Anemometer | 76 m | 3 Hz | 6 bytes/s | Pressure Sensor | 14 m | 100 Hz | 200 bytes/s |
| Wind Vane | 76 m | 3 Hz | 6 bytes/s | Rain Sensor | 10 m | 4 Hz | 8 bytes/s |
| Anemometer | 66 m | 3 Hz | 6 bytes/s | ADCP | Foundation | - | 150 bytes/s |
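Summing the per-sensor rates in Table 6 gives the total offered load of the met-mast. A rough tally (the list layout is ours, assuming one sensor per listed row):

```python
# Aggregate offered load of the met-mast sensors listed in Table 6.
# Each entry: (sensor type, count of installed units, data rate per unit, bytes/s).
met_mast = [
    ("Anemometer",      8, 6),    # 97, 96, 86, 76, 66, 56, 46, 26 m
    ("Wind Vane",       4, 6),    # 96, 76, 56, 46 m
    ("Temperature",     2, 2),    # 93, 14 m
    ("Humidity",        2, 2),    # 93, 14 m
    ("Pressure Sensor", 2, 200),  # 93, 14 m
    ("Rain Sensor",     1, 8),    # 10 m
    ("ADCP",            1, 150),  # foundation
]

total = sum(count * rate for _, count, rate in met_mast)
print(f"Total met-mast load: {total} bytes/s")  # 638 bytes/s
```

The aggregate load is only a few hundred bytes per second, which is why the 1 s delay requirement for meteorological data is met comfortably even on the slower link options.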
Table 7. Wind farm communication requirements based on the IEEE 1646 standard.

| Information Type | Delay Requirement |
|---|---|
| Protection Information (high speed) | 4 ms |
| Analogue Measurement and Status Information | 16 ms |
| Meteorological data | 1 s |
| Audio and Video data stream | 1 s |
| Image files | 10 s |
Table 8. Summary of simulation cases.

| Scale | Level | Options |
|---|---|---|
| 1 | Standalone Wind Turbine | 10 Mb/s, 100 Mb/s |
| 2 | Wind Farm | 100 Mb/s, 1 Gb/s |
| 3 | Met-Mast | 10 Mb/s, 100 Mb/s |
| 4 | Substation | 10 Mb/s, 100 Mb/s, 1 Gb/s |
Table 9. Classification of data types based on different operation modes.

| Data Type | Idle Mode | Operation Mode | Shutdown Mode |
|---|---|---|---|
| Status | X | | |
| Mechanical | | X | |
| Electrical | | X | X |
| Meteorological | | X | |
| Foundation | | X | |
Table 10. ETE delay inside WTs with different numbers of sensor nodes.

| Case | # Sensor Nodes | Idle Mode | Operation Mode |
|---|---|---|---|
| Case 1 (×1) | 108 | 0.53 ms | 1.1 ms |
| Case 2 (×2) | 216 | 0.65 ms | 1.77 ms |
| Case 3 (×3) | 324 | 0.69 ms | 3.08 ms |
Table 11. Simulation results of the ETE delay from the wind turbine zone to the control center.

| Wind Turbine Zone | Channel Speed | ETE Delay (Min) | ETE Delay (Max) |
|---|---|---|---|
| SCADA | 100 Mbps | 11.87 ms | 13.06 ms |
| SCADA | 1 Gbps | 1.16 ms | 1.28 ms |
| CB-IED | 100 Mbps | 0.56 ms | 5.40 ms |
| CB-IED | 1 Gbps | 0.20 ms | 0.45 ms |
| MU-IED | 100 Mbps | 9.36 ms | 11.62 ms |
| MU-IED | 1 Gbps | 0.53 ms | 1.13 ms |
Table 12. End-to-end delay of the IEDs at different zones.

| Zone | IED Type | ETE Delay (100 Mbps) | ETE Delay (1 Gbps) |
|---|---|---|---|
| Collector Feeder 1 | CB-IED | 1.31 ms | 0.20 ms |
| Collector Feeder 1 | MU-IED | 14.05 ms | 1.35 ms |
| Collector Feeder 2 | CB-IED | 1.14 ms | 0.18 ms |
| Collector Feeder 2 | MU-IED | 13.87 ms | 1.33 ms |
| Collector Feeder 3 | CB-IED | 1.32 ms | 0.19 ms |
| Collector Feeder 3 | MU-IED | 13.96 ms | 1.35 ms |
| Collector Bus | CB-IED | 1.00 ms | 0.16 ms |
| Collector Bus | MU-IED | 13.77 ms | 1.32 ms |
| Substation Transformer | CB1-IED | 1.62 ms | 0.32 ms |
| Substation Transformer | CB2-IED | 0.82 ms | 0.15 ms |
| Substation Transformer | MU-IED | 13.62 ms | 1.31 ms |
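The MU-IED delays in Table 12 can be compared against the 16 ms IEEE 1646 limit for analogue measurement information to quantify the remaining margin at each link speed. A short sketch (values copied from Table 12; the variable names are ours):

```python
# Margin of the simulated MU-IED delays (Table 12) against the 16 ms
# IEEE 1646 limit for analogue measurement information.
LIMIT_MS = 16.0
mu_ied_delay_ms = {  # zone: (delay at 100 Mbps, delay at 1 Gbps), in ms
    "Collector Feeder 1":     (14.05, 1.35),
    "Collector Feeder 2":     (13.87, 1.33),
    "Collector Feeder 3":     (13.96, 1.35),
    "Collector Bus":          (13.77, 1.32),
    "Substation Transformer": (13.62, 1.31),
}

for zone, (d100, d1g) in mu_ied_delay_ms.items():
    # Both link speeds meet the limit, but 1 Gbps leaves roughly 10x more headroom.
    assert d100 < LIMIT_MS and d1g < LIMIT_MS
    print(f"{zone}: margin {LIMIT_MS - d100:.2f} ms @100 Mbps, "
          f"{LIMIT_MS - d1g:.2f} ms @1 Gbps")
```

This makes the conclusion concrete: at 100 Mbps the sampled-value traffic passes with only about 2 ms of headroom, while the 1 Gbps configuration keeps all zones an order of magnitude below the limit.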

A. Ahmed, M.; Kim, C.-H. Communication Architecture for Grid Integration of Cyber Physical Wind Energy Systems. Appl. Sci. 2017, 7, 1034. https://doi.org/10.3390/app7101034
