1. Introduction
Connectivity technologies are an essential part of modern agriculture, known as smart farming (SF), which utilizes various digital technologies to improve the efficiency of farming operations. SF applies the newest technologies to make farming environmentally, socially, and economically sustainable. These developments are catalyzed by drivers such as the United Nations’ Sustainable Development Goals (SDGs) and the EU’s Green Deal. Sustainable farming is pursued with precision farming (PF) technologies that provide customized, correctly timed treatment for each individual location within a field. Recently, intensive acquisition of site-specific and market-related data has been introduced to farm management information systems (FMISs). In an FMIS, real-time data and decision support software increase farmers’ situational awareness and add intelligent features to the decision-making processes of farms. To be realized, SF demands communication solutions that are reliable and operational in rural and remote areas.
The current trend in SF technologies is increased automation and remote control of robots, tractors, and tractor implements. Major agricultural tractors and equipment manufacturers aim to achieve fully autonomous systems. One of the steps toward fully autonomous operation is the development and assessment of remotely controlled tractors. This development increases the need for data and hence communication between machines [
1]. One special case of increased automation, the principle of which was initially developed more than a century ago, is a master–slave tractor system in which the farmer controls one or several slave tractors from a master one [
2]. The slave tractor can operate mainly independently, but the farmer takes control whenever needed. The first implementations were mechanical, but radio-controlled versions soon replaced them. Nowadays, the solutions, where applied, are fully digital. A remotely controlled tractor, regardless of whether it is controlled from the farm office or the cabin of another tractor, relies on reliable, high-quality communications and extended network coverage. Farms are usually located in rural and sparsely populated areas, where the data connection often limits or prevents farmers from utilizing solutions that demand high amounts of data [
3]. Consequently, the main idea of the experiment reported here was to establish an ad hoc private network or a private network bubble to enable remote operations of a tractor performing a fertilization task in a field.
Autonomous and remote-controlled tractors have gained high interest from researchers in increasing productivity [
4,
5,
6,
7,
8]. Approaches such as cloud-based dynamics control have been studied to improve the maneuverability, mobility, and handling performance of unmanned ground vehicles [
9]. The remote control study defined in this paper focuses on supporting commonly used systems in agriculture, such as ISO 11783 [
10], commonly known as ISOBUS. ISOBUS is a widely used communication standard for providing an open, interconnected system for on-board electronics. It allows connections between tractors and implements without dependence on the brand. ISOBUS is based on SAE J1939 protocol [
11], which includes the controller area network (CAN) and enables electronic control units (ECUs) to communicate through CAN buses; together, they create a uniform system. Autonomous control algorithms have been studied for agricultural tractors using ISOBUS standard-based communication [
12]. Our study aims to enable remote control of the tractor by interconnecting the in-vehicle CAN bus of the tractor with the CAN bus of the remote control system. CAN traffic relaying over IP networks has been identified as a potential solution for monitoring and remotely controlling vehicles [
13,
14,
15,
16]. Cannelloni [
17] is software that allows controller area networks to be connected over a local area network but does not support secure data transmission over the internet or a mobile network. This paper describes a solution based on Cannelloni, Socat [
18], and secure shell (SSH) to securely transfer control messages over networks. We propose a solution that adapts to the limited resources in networks by enabling the transfer of only selected CAN frames.
Emerging communication technologies—5G, sensor networks, satellite systems, and tactical networks as wireless backbones—provide new opportunities for SF [
19,
20,
21]. 5G provides a reliable, low-latency, high-capacity channel for vehicle communications, control, and surveillance [
22]. Wi-Fi, satellite, and long range wide area network (LoRaWAN)-based sensor networks support monitoring, surveillance, and situational awareness in the farm environment. As SF and autonomous tractor scenarios involve external stakeholders and connectivity with different sensors and assets, cybersecurity is an essential requirement [
23,
24]. Cybersecurity architecture can build on wireless security [
25], application layer security solutions for SF [
26], the internet of things (IoT) [
27], and vehicles [
28]. However, there is a need to identify requirements for a combined SF cybersecurity architecture that addresses threats and challenges stemming from farm vehicles, IoT sensor networks, cooperative business models, scarce cybersecurity resources in SFs, and private network technologies. More trials and comparative research are also needed to verify the feasibility of different wireless networks and network interconnection solutions for SF applications.
This study is part of the PRIORITY project, which develops and trials communication and digitalization technologies for business- and mission-critical end users. One of our agricultural scenarios—grass cultivation on a dairy farm—consists of four use cases: (1) the sensor systems in the animal shelter and fields, (2) drones and satellite remote sensing (Airbus Verde [
29]) for data collection, (3) artificial intelligence for recognizing fertilizer spreading and foreign objects, and (4) unmanned tractors, in which communication plays a significant role. Based on the scenario, a field trial with a private mobile network was built on an educational farm. The farm consisted of several different locations. Hence, the private network consisted of several private network bubbles [
30], i.e., elements that can form independent ad hoc private networks. These were used for tests in different environments to guarantee network performance at every location.
The study contributes to the scientific community by verifying the feasibility of different private networking and internetworking technologies for emerging smart farm and unmanned tractor applications. We collected and consolidated the requirements for telecommunications networks and cybersecurity. Practical experiences enabled us to define qualitative and quantitative demands related to functionality, performance, interoperability, and cybersecurity, as well as to identify technology gaps for future research. We also explored approaches to customize and optimize applications and networks based on needs in SF.
The remainder of this article is organized as follows. In
Section 2, we describe the requirements for networks and security solutions.
Section 3 describes the technology enablers—private networks, remote control software, and security components—that were trialed. The section also highlights the main results and insights gained during the development and trial.
Section 4 describes our contributions in light of related research. In
Section 5, we discuss the conclusions and recommend potential directions for future research.
2. Requirements from Autonomy and Remote Control in Smart Farms
Unmanned tractors with different capabilities and different ways to operate impose various technical requirements on the underlying information technology and communication infrastructure. Similarly, smart farm environments set their own restrictions and demands. This section explores these requirements, particularly from the perspective of network quality in
Section 2.1 and cybersecurity in
Section 2.2.
The autonomous and remotely controlled operations of farm machinery are actually the end points of a spectrum of operation modes. SAE International (the Society of Automotive Engineers) has defined the autonomy levels of vehicles, ranging from no driving automation (Level 0) to full driving automation (Level 5) [
31], as listed below:
Level 0—No Driving Automation
Level 1—Driver Assistance
Level 2—Partial Driving Automation (“hands off”)
Level 3—Conditional Driving Automation (“eyes off”)
Level 4—High Driving Automation (“mind off”)
Level 5—Full Driving Automation
A range of assisting technologies enables intermediate modes, where some of the functions are explicitly controlled by an operator, while others can be decided independently by the machinery. Such assisting technologies include global navigation satellite system positioning, video cameras, and proximity sensors. The trend is to move from remote control toward fully autonomous operation, as this will lessen the operator workload and hence improve efficiency. However, legislation and regulations set the boundaries for operation. Currently, legislation in several countries, such as Finland, does not permit unmanned vehicles to operate outside closed environments. Before fully unmanned autonomous or remotely controlled tractors can become more common, legislation needs to be developed [
32] to clarify liabilities.
2.1. Requirements for Communications Networks
The communication needs of remote control depend on the autonomy level of the vehicle: Level 0 (no driving automation) is the most demanding for remote control, while full driving automation may not need any remote control support.
A remote control system with control services, including cameras, sensors, and control devices, may involve many other services, such as weather forecasts, engine remote maintenance, cultivation plans, navigation, collision avoidance, and logistic information services. However, the most demanding service in terms of wireless communication is remote control in which the tractor is operated by a person from a remote area based on data from tractor cameras and sensors. For the scenario of remote control of the tractor, the communication medium must meet the requirements of latency and bandwidth for video and control data transfer.
Control data include steering-related information as well as feedback information. The status information from different tractor subsystems is also included in the transmission, such as temperature and battery levels. In the current trial, the tractor control was based on the CAN bus [
33]. The CAN protocol specification allows bit rates of up to 1 Mbps. Our implementation optimized CAN communication by filtering CAN frames so that the tunneled network traffic utilized an 85-kbps bitrate from the tractor and a 53-kbps bitrate to the tractor. These bitrates come from measurements that may include small amounts of other data traffic. The trialed tractor had five cameras: the forward-facing camera required a minimum of 2 Mbps constant bit rate, while the forward side-cameras, which did not need as good a video quality, could use 1 Mbps. In total, the five cameras required an uplink bitrate of 8 Mbps. A forward-moving tractor requires a minimum view from three cameras.
Concerning human remote control based on video camera views, humans need 100 ms to recognize or become aware of stimuli, and a minimum of 180 ms to see an object and make a movement such as pressing a key [
34]. Thus, the end-to-end latency of the system can be up to 180 ms in human-in-the-loop remote control. The end-to-end system latency consists of information processing delays, such as video processing, and packet transmission delays, with video processing playing the dominant role [
30].
When vehicle-to-everything (V2X) remote control involves autonomous services without human interaction, different latencies [
35] and quality requirements apply. 5GAA Automotive Association [
36] specified that the maximum latency tolerable for gaining awareness of pedestrians in danger is 100 ms, of which the recommended latency for communication is 20 ms when the assumed vehicle speed is 50 km/h. This indicates that the vehicle will move an additional 28 cm due to a network delay. 3GPP has also specified V2X requirements for cellular networks, as listed in
Table 1, which vary between 5 ms and 100 ms (video stream to a remote cabin and CAN messages to the tractor). The trialed scenario did not involve use cases requiring close cooperation with other vehicles, such as platooning, but it supported pedestrian safety, for which 10 ms was the minimum one-way latency introduced by the network. If we assume a tractor velocity of 15 km/h and a safety limit of 28 cm of travel due to network delay, the allowed latency is 67.2 ms (33.6 ms in one direction). However, when tractors are equipped with sensors that can autonomously prevent collisions, or when tractors are used not in road traffic but in private fields without external pedestrians, these requirements may be relaxed.
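These speed-and-delay figures can be checked with simple arithmetic. The following sketch uses the values quoted above (vehicle speeds, communication delays, and the 28 cm safety limit):

```python
def extra_distance_m(speed_kmh: float, delay_ms: float) -> float:
    """Distance traveled during a network delay."""
    return (speed_kmh / 3.6) * (delay_ms / 1000.0)

def allowed_latency_ms(speed_kmh: float, distance_m: float) -> float:
    """Latency budget that keeps the extra travel within a safety limit."""
    return distance_m / (speed_kmh / 3.6) * 1000.0

# 50 km/h with a 20 ms communication delay -> ~28 cm of extra travel
print(round(extra_distance_m(50, 20), 2))      # 0.28 (m)

# 15 km/h with a 28 cm safety limit -> 67.2 ms allowed latency
print(round(allowed_latency_ms(15, 0.28), 1))  # 67.2 (ms)
```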
2.2. Requirements for Cyber and Network Security
Security in smart farming and unmanned tractor scenarios is motivated by different cyber threats, such as discontinued business due to ransomware, network unavailability due to botnet signaling, and risks to human, animal, and food safety due to integrity-violated machinery and vehicles or tampered situational awareness. Security challenges in the farm context vary. First, the heterogeneous device landscape means that security management and delivering security updates to devices are more challenging. Different devices are also dependent on each other. Business models are becoming more complex from a networking point of view. For instance, specialized service providers, such as tractor entrepreneurs, may need to connect to information sources on the farm. External input, such as satellite images or weather forecasts, may come from various sources. Moreover, maintenance and repair services are often outsourced, and external users or devices may need to be granted access to assets, such as tractor systems, for a temporary period of time. Further, security cultures in farms vary widely. Farm personnel often have limited experience, skills, and resources for security configurations, making security error-prone. Finally, physical protection is often limited. The farm network can span large geographical areas without human guarding or monitoring.
The cyber and network security architecture for private network bubbles must be able to address these challenges and support known applications and devices. At the same time, the architecture must be flexible to address emerging applications and security threats, and support restrictions in smart farms (e.g., demand of being easy to use and suitable for harsh conditions). Central objectives as well as functional and qualitative requirements for the security architecture are highlighted in
Table 2.
3. Field Trialed System
This section describes the components and subsystems that were integrated into the field trial that was held in Saarijärvi, Finland, in June 2021. Different technologies and assets in the smart farm were interconnected with several alternative wireless private networks. The network architecture and technology configurations are described in
Section 3.1. A smart unmanned tractor prototype, Valtra N175D Direct Model [
40], from AGCO, was remotely controlled through the network.
Section 3.2 describes our implementation and optimization to enable remote control. We also tested the tractor’s autonomous driving capabilities, which were at Level 2 (“hands off”). A large number of sensors exist on smart farms. These sensors provide situational information that may be applied, for example, when planning and performing tractor operations. Access to sensitive information, tractor subsystems, and other farm assets is restricted to identified, authorized, and trusted users and devices.
Section 3.3 describes cyber and network security building blocks.
3.1. Local Private Network
This section describes the architecture of the local private network used in the smart farming trial for remote areas.
Figure 1 shows the trial network topology at a high level.
The topology enables remote control of a tractor via multiple local access points and via the internet. It also offers a connection to and inside a smart barn for remote surveillance. More specifically, the optical and thermal cameras and air quality sensors allowed us to monitor the welfare of the animals and the quality of their surroundings inside the cow shed. Soil sensors provided environmental information about the field.
The trial topology is a hybrid combination of multiple ad hoc network bubbles.
Figure 2 shows the setup of the trialed ad hoc private network. Here, the Bittium Tactical Wireless IP Network (TAC WIN MESH) [
41] offers the backbone network for three local 4G access points and one local 5G access point. The backbone network is built with three tactical nodes, i.e., tactical routers with radio heads. These routers can automatically reroute any given data from the source to the destination via alternative routes if the primary route fails. One tactical router has a wired connection to the internet. The internet connection is then shared with the office and with the end users of the access points.
Every tactical router has a wired connection to either a 4G or a 5G access point, or both. Next to the office, there is a 5G stand-alone access point for wireless users. It uses a stand-alone 5G core for authentication. The office bubble has a stand-alone 4G access point in a van. This 4G access point uses an EPC in the van to authenticate the end users. The van EPC is also used for a 4G access point next to the field that needs to be fertilized with the tractor. There is a separate stand-alone 4G access point outside the smart barn. This access point uses its own 4G EPC for authentication. With this kind of setup, bubbles can also be used independently.
In addition to the setup shown in
Figure 1, air sensors are connected to the internet via LoRaWAN [
42,
43] and soil sensors in the fields are connected with LoRaWAN and with ISM bands and 3G [
44]. This technology provides a long range and a good battery life. Moreover, the tractor can use a commercial 4G network with a higher priority subscriber identity module (SIM), denoted as QC128.
A satellite connection is also available as a backup communication system [
45]. The geostationary (GEO) satellite provides sufficient throughput but cannot support remote control of the tractor due to its large latency. Nevertheless, GEO satellites can be efficiently used to share situational awareness data for farms. Although commercial mobile network infrastructure is widespread, there are still gaps in rural and sparsely populated areas. In some locations, it might even be impossible to deploy terrestrial infrastructure. Satellite systems can provide a fast deployment solution: once the antenna is deployed and line-of-sight to the satellite is in place, transmission is possible. See
Table 3 for more specific information about the trial access points.
3.2. Remote Controlled Tractor
In the field test, a tractor was remotely controlled over mobile networks. The objective was to drive the tractor, equipped with a fertilizer work machine, remotely to the field. After that, the work machine was able to fertilize the field automatically together with the tractor. Remote control was then used to monitor and guide the tractor. To do this, the remote control cabin had a steering wheel, pedals, and other controls needed for operating the tractor (
Figure 3). Three screens were attached to display video feeds from the tractor.
Figure 4 shows an overview of the network and devices for remotely controlling the tractor. All network traffic went over a virtual private network (VPN) connection, and from the devices’ perspective, they were all in the same local area network. A router [
48] from Goodmill Systems connected the tractor to the tactical network using various mobile networks. By default, the router uses all these connections simultaneously and switches between them to provide the best possible service. For the trial, the router was configured to use the networks one at a time for testing purposes.
3.2.1. Remote Control Tunnel between the Tractor and the Remote Control Cabin
The target of the remote control work was to verify the remote controllability of the tractor through the different trial networks. This was done by transmitting CAN messages over the network. While exploring possible technologies, we found SocketCAN [
49]. SocketCAN allows the CAN interface to be controlled and programmed as a standard network interface. SocketCAN userspace utilities and tools make it possible to set up and test a CAN network with ease. Furthermore, the Cannelloni tool allows CAN data to be transferred over an Ethernet tunnel.
Cannelloni supports tunneling over the user datagram protocol (UDP) or the stream control transmission protocol (SCTP). UDP is faster but unreliable, while the optional SCTP provides reliable transport. At this point in the research, the assumption was that all messages should be delivered. Therefore, SCTP was chosen.
Initial testing was conducted using virtual CAN interfaces from SocketCAN. Functionality was then tested using real hardware. For this, the Kvaser Leaf Light v2 was chosen, which has a SocketCAN driver available directly in the Linux kernel. The Kvaser adapters were verified to work using a test CAN network, which was a cable with a single termination resistor between the high and low CAN lines. Once CAN tunneling worked in a local area network, the next goal was to carry the traffic securely over the public network. SSH port forwarding was chosen to transfer the CAN traffic over the network and encrypt it. However, SSH port forwarding does not work with SCTP. Therefore, one more step was required to convert SCTP to the transmission control protocol (TCP). Protocol conversion was done using the Socat utility.
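Conceptually, Socat bridges the two transports by copying bytes between a socket of each type. A minimal per-direction relay loop, of the kind Socat runs internally, could look like the following sketch (illustrative only, shown with plain TCP-style sockets since stdlib SCTP support varies by platform):

```python
import socket

def relay(src: socket.socket, dst: socket.socket, bufsize: int = 4096) -> None:
    """Copy bytes from src to dst until src closes (one direction of a Socat-style bridge)."""
    while True:
        data = src.recv(bufsize)
        if not data:  # peer closed the connection
            break
        dst.sendall(data)
```

In the trialed setup, Socat performed this kind of bridging between Cannelloni’s SCTP socket and the TCP socket forwarded by SSH.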
Figure 5 shows the components and protocols used to tunnel CAN traffic over networks.
Both the tractor and the remote control cabin have Raspberry Pi 4 computers connected to the local CAN using a Kvaser USB adapter. The Raspberry Pi on the tractor side works as a client that establishes a connection to the server Raspberry Pi in the remote cabin. The idea is that the remote control cabin server has a known address on the internet, while the tractor can be behind the mobile operator’s network address translation (NAT) and/or firewall. The software on the tractor side repeatedly attempts to establish a connection to the server until the connection is made. If the connection to the server is lost, the client keeps trying to make a new connection.
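The client-side reconnection logic can be sketched as follows. This is a minimal stdlib illustration with a raw TCP socket; the actual implementation establishes the SSH tunnel rather than a plain connection, and the host and port values are placeholders:

```python
import socket
import time

def connect_with_retry(host: str, port: int, retry_delay_s: float = 1.0) -> socket.socket:
    """Keep trying to reach the remote control cabin server until a connection succeeds."""
    while True:
        try:
            return socket.create_connection((host, port), timeout=5)
        except OSError:
            time.sleep(retry_delay_s)  # server unreachable; wait and try again

# On connection loss, the caller simply invokes connect_with_retry() again.
```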
Devices on the tractor generate a lot of CAN traffic, most of which is not relevant for remote control. The same is also true for the remote control cabin. Therefore, message filtering was added at both ends to pass through only the required messages. Filtering was implemented by modifying the Cannelloni software to accept filtering rules for outgoing and incoming traffic. During the field trial, it was sufficient to filter outbound traffic at both ends.
3.2.2. Filtering, Frame Aggregation, and Overhead Measurements
In the Cannelloni program, we can specify how long to wait for new CAN messages before forwarding them through the tunnel. A longer wait allows several CAN frames to be sent in one Ethernet frame, which improves throughput but increases latency. For time-critical messages, it is possible to provide their CAN IDs and customized timeout values. During the field trial, a very short timeout of 5 ms was used to minimize latency. A short timeout causes a lot of network overhead, but this was not an issue because filtering also reduced the number of CAN frames to be sent to a minimum.
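The timeout-versus-aggregation trade-off can be illustrated with a simple buffer that flushes either when the configured timeout has elapsed since the first buffered frame or when the buffer is full. This is an illustrative sketch, not Cannelloni’s actual implementation, and the default sizes are hypothetical:

```python
import time

class FrameAggregator:
    """Collect CAN frames and release them as one batch after a timeout or when full."""

    def __init__(self, timeout_s: float = 0.005, max_frames: int = 10):
        self.timeout_s = timeout_s
        self.max_frames = max_frames
        self.buffer = []
        self.first_frame_at = 0.0

    def add(self, frame: bytes) -> None:
        if not self.buffer:
            self.first_frame_at = time.monotonic()
        self.buffer.append(frame)

    def ready(self) -> bool:
        """True when the buffered frames should be sent in one Ethernet frame."""
        if not self.buffer:
            return False
        full = len(self.buffer) >= self.max_frames
        expired = time.monotonic() - self.first_frame_at >= self.timeout_s
        return full or expired

    def flush(self):
        frames, self.buffer = self.buffer, []
        return frames
```

A shorter timeout_s lowers latency at the cost of more, smaller packets; filtering keeps the packet count manageable even with the 5 ms timeout used in the trial.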
Filtering rules can be assigned to the SocketCAN interface. A filtering rule includes a mask and a filter ID. A received CAN frame is passed if:
received_can_id & mask == filter_id & mask,
where & is the bitwise AND operator and == is the equality comparison.
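In code, the filter check is a single masked comparison. The IDs below are hypothetical examples:

```python
def frame_passes(received_can_id: int, filter_id: int, mask: int) -> bool:
    """SocketCAN-style filter: the frame passes when the masked IDs match."""
    return (received_can_id & mask) == (filter_id & mask)

# With mask 0x7F0, only the upper bits of an 11-bit CAN ID are compared:
print(frame_passes(0x123, 0x120, 0x7F0))  # True  (0x120 == 0x120)
print(frame_passes(0x200, 0x120, 0x7F0))  # False (0x200 != 0x120)
```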
The Cannelloni program was modified to allow filtering rules to be read from files. A rule file defines masks and filter IDs. These rules can be applied to both incoming and outgoing CAN frames. Outgoing frames from the local CAN network are filtered using SocketCAN, directly in the Linux kernel. A similar implementation was added to Cannelloni for frames arriving over the network, which checks CAN frames after they are decoded.
Cannelloni performs CAN frame aggregation by adding a 5-byte header containing the protocol version, frame type, sequence number, and the number of CAN frames included. These data are then encapsulated in an SCTP message, which adds around 32 extra bytes, depending on how much chunk padding is needed. Next, the SCTP message is converted to TCP using Socat. The conversion removes the chunk padding and replaces the SCTP header with the TCP header, which is 4 bytes longer. Finally, the data are encrypted and sent using SSH, which also increases the packet size. In the worst case, sending one 13-byte CAN frame requires 171 extra bytes, which corresponds to 93% overhead.
Table 4 shows the amount of network overhead depending on how many CAN frames are sent together. The CAN data column is the number of bytes of actual CAN traffic. The SCTP and TCP columns indicate packet sizes after the aggregation and conversion steps; these values exclude the Internet Protocol version 4 (IPv4) and Ethernet II headers. The total bytes column contains all the bytes sent over SSH, including the ACK message. The extra bytes and overhead columns show how many extra bytes are needed and what share of the transmitted bytes is required to tunnel the payload over the networks.
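The worst-case figure can be reproduced from the byte counts above (a small check using the measured constants from this section):

```python
def overhead_pct(payload_bytes: int, extra_bytes: int) -> float:
    """Share of all transmitted bytes that is tunneling overhead."""
    return 100.0 * extra_bytes / (payload_bytes + extra_bytes)

# One 13-byte CAN frame requires 171 extra bytes end to end:
print(round(overhead_pct(13, 171)))  # 93 (%)
```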
A customized CAN message was used to control the tractor. The information required to drive the tractor was packed into a single message. The electronic control unit (ECU) was programmed to control the tractor based on this message as long as messages arrived within at most 100 ms of each other. If the interval between messages exceeds 100 ms, the ECU begins stopping the tractor for safety reasons. The interval target imposes a requirement on the evenness of network communication and on how long Cannelloni can wait for frame aggregation. When the interval requirement is not met, steering and driving become jerky because the ECU occasionally tries to stop the tractor.
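The ECU’s safety behavior can be sketched as a watchdog on control message arrival times (an illustrative model; the actual ECU implementation is not public):

```python
class ControlWatchdog:
    """Trigger a safe stop when control messages stop arriving within the limit."""

    def __init__(self, max_interval_s: float = 0.1):
        self.max_interval_s = max_interval_s
        self.last_message_at = None

    def on_message(self, now_s: float) -> None:
        self.last_message_at = now_s

    def should_stop(self, now_s: float) -> bool:
        if self.last_message_at is None:
            return True  # no control message received yet
        return now_s - self.last_message_at > self.max_interval_s

# Messages arriving every 50 ms keep the tractor driving; a 200 ms gap stops it.
```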
3.2.3. Remote View from the Tractor
For the remote control of a tractor, it is essential to see where the tractor is going. Therefore, various cameras (listed in
Table 5) were tested during the trial to provide views from the tractor. For safety reasons, a human driver was also in the tractor cockpit, ready to take manual control if needed.
The shortest latency was obtained using the H.264 codec; the other codecs available from the cameras were H.265 and MJPEG. The Hikvision camera with the Chrome browser extension provided the best screen-to-screen results, with a delay of about 100 ms. For the other cameras, the delay was about 250 ms. Multiple cameras were tested simultaneously by opening their views on separate screens, but the best results were obtained with a single-camera setup. When several video transmissions were used, the video would lag and freeze from time to time.
The cameras were connected using two different router setups. With the Goodmill router, the camera traffic was forwarded through a cloud server, and with the Teltonika router, the traffic was directly forwarded using port forwarding. Dynamic DNS (DDNS) was also used to allow a connection to the cameras with dynamically changing IP addresses.
3.2.4. Network Quality Measurements
This section describes how we conducted a performance analysis of the available networks. Network quality must be measured to verify reliable remote control. In this trial, adequate remote control required at least two high-definition videos, which in turn required a total throughput of 5 Mbps in the uplink direction. An uplink throughput of 5 Mbps was obtained with the 2.3 GHz time division duplex van-mounted base station when the reference signal received power (RSRP) was over −98 dBm.
Figure 6 shows the RSRP measurements of the base station. From this figure, it is possible to roughly estimate the remote driving range of this base station configuration. Note that the uplink-downlink frame configuration had two subframes for uplink and six for downlink due to the obtained frequency license.
The round-trip time (RTT) of the same base station is shown in
Figure 7. Note that the one-way delay is approximately half of the RTT; thus, the control messages reach the tractor faster. Whenever a network is good enough for video, the control messages are fast enough over the local network. We also measured commercial network round-trip times. The RTT was approximately 30–40 ms shorter with a local base station than with a commercial network. The delays were sufficient for the use case.
3.3. Cyber and Network Security
This section describes the applied security solutions that fulfil the objectives for security architecture, which are presented in
Table 2.
The baseline security—confidentiality and authenticity—of communications is based on standard security architectures [
39,
47] for access and backhaul networks. Radio access is secured with appropriate security protocols for 3GPP or Wi-Fi networks. Key distribution is based on physical SIM in the case of 4G and 5G networks and on passwords, which can be shared with QR codes, for example, in the case of Wi-Fi in the satellite bubble. Further, different application-layer security protocols were utilized to provide additional security, such as SSH for CAN and TLS for sensor communication and web-based information services.
Communication between different private networks, which were inter-connected with a tactical core network, was secured with frequency hopping and adaptive modulation [
41], and connections that spanned the internet were secured with an IPsec-based VPN. The internet and other external services were behind firewalls, which were hosted within the tactical core network or outsourced to a commercial service provider.
Custom security needs in smart farming and remote-controlled tractor scenarios include fine-grained access control mechanisms to segregate users and devices, thus protecting assets and minimizing the threat of malware. Custom security needs in remote-controlled tractor scenarios emphasize the trustworthiness, reliability, and integrity of the safety-critical components as well as the availability of low latency communication needed for navigation.
3.3.1. Access Control and Security Posture Management for the IoT
We trialed a solution for identity, access, and trustworthiness management. The proposed access control service authenticates and verifies IoT devices that request access to services within the network. The widely used OAuth 2.0 identity and access management framework [
50] authorizes new users and devices for the information and application services in the farm. In addition to authenticating IoT devices, the solution verifies that devices with trusted execution environments are running the expected, trusted software. To support networked business models in SF, authorizations can be temporary.
During the trial we verified the feasibility of our extensions to the OAuth 2.0 framework. The extension, which we specified in [
51], allows IoT devices to prove their software integrity using device attestation as part of OAuth’s delegated authorization protocol. The approach is able to detect and locate compromised devices and malware as well as verify the trustworthiness of devices requesting access to services. The solution and performance measurements [
51] demonstrated the feasibility of the approach. The latency (during the security handshake phase) depends on the processing capabilities of the IoT platform. The total latency of the handshake, which is performed when a device first accesses a service and contains the authentication and attestation phases, was around 11.1 s for an ARM TrustZone-based Nuvoton M2351 class microcontroller. The attestation procedure is a relatively rare operation and will not impair the feasibility of sensor applications.
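The core idea of the delegated attestation extension can be illustrated with a minimal sketch: the device includes attestation evidence, here a keyed MAC over its firmware digest bound to a server nonce, in its OAuth-style token request, and the authorization server verifies the evidence against a trusted reference before granting access. All field names, keys, and message shapes below are illustrative assumptions, not the actual protocol specified in [51].

```python
import hashlib
import hmac
import json

# Hypothetical values for illustration only: a per-device secret
# provisioned at manufacture and the reference (trusted) firmware image.
DEVICE_SECRET = b"per-device-key-provisioned-at-manufacture"
TRUSTED_FIRMWARE = b"sensor-firmware-v1.2"

def attestation_evidence(firmware: bytes, nonce: bytes) -> dict:
    """Device side: bind the firmware digest to a server nonce with a MAC."""
    digest = hashlib.sha256(firmware).hexdigest()
    mac = hmac.new(DEVICE_SECRET, digest.encode() + nonce, hashlib.sha256)
    return {"firmware_digest": digest, "mac": mac.hexdigest()}

def token_request(firmware: bytes, nonce: bytes) -> str:
    """Device side: an OAuth-style request extended with attestation evidence."""
    return json.dumps({
        "grant_type": "client_credentials",
        "client_id": "field-sensor-17",  # illustrative identifier
        "attestation": attestation_evidence(firmware, nonce),
    })

def authorize(request_json: str, nonce: bytes) -> bool:
    """Server side: grant access only if the evidence matches the
    MAC computed over the trusted reference firmware digest."""
    req = json.loads(request_json)
    expected = attestation_evidence(TRUSTED_FIRMWARE, nonce)
    return hmac.compare_digest(req["attestation"]["mac"], expected["mac"])
```

A device running the trusted firmware obtains a grant, while a device whose software has been modified fails verification, which is how compromised devices can be detected and located.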
3.3.2. Resilience through a Satellite Link
The network can recover from detected threats and failures in different ways, for example, by replacing compromised components with trusted ones. We also studied recovery with alternative connections, particularly using alternative private networks, commercial networks, and a satellite link.
A satellite link provides a backup backhaul channel that may be utilized when the commercial backhaul is unavailable. Our satellite solution used a GEO satellite; the connection was provided by a Dawson Ka-Sat SC-Zero 70K nomadic terminal from Viasat. This satellite terminal is capable of providing broadband connectivity. Transmission was performed using the Ka-band. The satellite terminal was then connected to the Goodmill router. The router is a multi-channel solution that allows us to monitor the networks and change the wireless transmission technology based on the QoS or other rules employed at the router. It is also possible to direct specific traffic flows to certain tunnels and set the desired backhaul for them. This approach can add an extra layer of protection for security-critical endpoints (such as a UE) by directing sensitive traffic through the most trustworthy route. Thus, when security incidents are detected in the commercial network, the Goodmill router can, based on the specified traffic rules, switch connectivity to the secured satellite backup link.
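The rule-based backhaul selection described above can be sketched as follows. The Goodmill router's actual rule engine is proprietary; the link names, the single latency metric, and the rule structure here are assumptions made purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class Link:
    """One available backhaul, with a measured QoS metric and a trust flag
    (e.g., the secured satellite backup link is marked trusted)."""
    name: str
    up: bool
    latency_ms: float
    trusted: bool

def select_backhaul(links, flow_sensitive: bool, incident: bool):
    """Pick a backhaul for a traffic flow: sensitive flows, or any flow
    while a security incident is detected in the commercial network,
    must use a trusted link; otherwise use the lowest-latency link
    that is up. Returns the chosen link name, or None if no link fits."""
    candidates = [l for l in links if l.up]
    if flow_sensitive or incident:
        candidates = [l for l in candidates if l.trusted]
    if not candidates:
        return None
    return min(candidates, key=lambda l: l.latency_ms).name
```

Under normal conditions a flow takes the faster commercial network, while sensitive traffic, or all traffic during an incident, is pinned to the satellite link despite its higher latency.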
Apart from the backup satellite link, the utilization of satellite systems can be beneficial in terms of traffic balancing. The terrestrial network could be used to stream the video and the satellite network could be used to broadcast situational awareness data.
4. Related Works
This section reviews and compares related research with our contributions. In particular, we will survey and analyze efforts related to vehicle remote control as well as to 5G networks and cybersecurity architectures for smart farms.
The CAN bus protocol allows communication between the electronic control units inside a vehicle and has been widely researched as a means to obtain tactical and status information from the vehicle and its surroundings. The protocol has been applied for data collection and monitoring, such as temperature acquisition for mining equipment [
52] and status monitoring for off-road studies [
53]. There are also research studies concentrating on steering-system-based headlight control using CAN [
54] and on utilizing the CAN protocol to obtain data from the driver for further analysis [
55]. However, many of the reviewed works related to the CAN protocol concentrate on modeling and simulations rather than actual integrated solutions. One of the few implementation works [
56] describes a combination of real-time kinematic positioning, a global navigation satellite system, and the CAN bus protocol for the control algorithms of a tractor. The work concentrated on the accuracy of automated tractor movement rather than on network performance. Hu et al. [
57] focused on remote steering of the tractor's wheels using the CAN bus and the CANtest software. Their results showed sufficient response speed and accuracy, satisfying the requirements of an agricultural automatic navigation system.
Our study implements remote control of the tractor by relaying the tractor's in-vehicle CAN messages to the CAN bus of the remote control system over IP through the mobile network. Ditze et al. [
13] studied relaying ISOBUS messages and implemented IP over CAN by encoding an IP address into a CAN identifier. Lindgren et al. [
14] proposed a similar approach to a modular lightweight IP stack for embedded platforms. A wireless solution with an IEEE 802.11b WLAN was proposed by Bayilmis et al. [
15]. Similar to our approach, Johanson et al. [
16] developed a CAN-over-IP tunneling protocol for diagnostic and monitoring purposes. Our approach utilizes the Cannelloni [
17] gateway software to relay selected CAN messages between two systems over an IP tunnel. The solution was implemented in a real tractor and tested in a mobile network environment.
Several reviews [
19,
20,
21] have identified the opportunities that mobile communication technologies, particularly 5G, promise for agriculture and SF scenarios. 5G has been proposed as a reliable, low-latency, high-capacity channel that can enable new applications, such as automated machines, real-time monitoring, virtual consultation, predictive maintenance, data analytics, and cloud repositories. Consequently, 5G will facilitate the transformation of farms to become more secure, reliable, environmentally friendly, and energy-efficient. For instance, 5G can provide communication channels for controlling and monitoring different aerial and ground vehicles [
22], including unmanned tractors [
58] and bulldozers [
59]. Other wireless technologies—Wi-Fi, satellite, and LoRaWAN networks—have been considered alternative channels to gather, for example, sensor data, situational awareness, and tractor localization information [
60]. To enable interoperability between different systems in SFs, both application [
61,
62] and network layer solutions have been proposed. We focused on the connectivity between wireless networks and studied wireless backbones. These reliable mesh-based tactical network architectures have previously been used in both enterprise [
63] and military [
64] domains. Existing field trials and pilots have addressed, for example, IoT and mobile technologies for SF [
65], LoRaWAN for IoT [
66], as well as 5G for V2X communications [
67]. We extend these verification efforts by trialing a unique combination of mobile, tactical, sensor, and satellite network technologies.
Existing cybersecurity research [
23,
24,
26,
68,
69,
70] has identified threats, challenges, and requirements that must be addressed when deploying wireless networks for SF scenarios. Central challenges include external threats and insider issues due to a lack of security awareness, information technology expertise, and resources. Cybersecurity research has also covered vehicular communications [
28,
71,
72,
73], including tractors [
74] and other unmanned work vehicles [
75], and has reviewed [
76,
77] threats, risks, and solutions for 5G-based IoT [
76] and critical communications [
77]. Our proposal for the security architecture objectives builds on these studies. We combine different viewpoints and propose an approach that highlights SF-specific security threats and characteristics. Our approach addresses the risks of farm vehicles but also acknowledges that modern farms and agricultural tractors do not operate in isolation: they involve large numbers of connected users and assets, including sensors and external services, which must be protected and trustworthy. Remote attestation solutions have been proposed to verify the trustworthiness of IoT sensors [
78] and components in vehicles [
79]. Our delegated device attestation proposal [
51], which was part of our SF security architecture, specified how integrity verification can be integrated into the OAuth 2.0 identity and access management framework. The framework is commonly used on the web, for example, by Google, Microsoft, Amazon, and Apple. Previous research [
80,
81] demonstrated the integration of OAuth2 and remote attestation. Our novel contribution demonstrated how the approach works with IoT devices and the device identifier composition engine protocol [
82]. Consequently, we demonstrated a practical approach to managing the security and trust of different kinds of devices and components within SFs.
5. Conclusions and Future Research
We expect the remote control of machinery to assume a role in future smart farming. Technological advances in various fields are increasing the range of functions that machinery can perform autonomously, and analogous developments are taking place in road traffic.
The PRIORITY project has concentrated on researching and testing the capability for remote control using a local private mobile network as the communication medium. We have successfully demonstrated the viability of the remote operation of a tractor in a realistic environment. Future research is needed to support the transition toward more autonomy and thus improved efficiency and productivity. Appropriate regulatory and legislative changes must also be addressed within the boundaries of maintaining a high level of security and safety. The emergence of such changes will likely differ between different countries and regions. In fact, the application of new technologies may actually promote better safety by removing human error.
The development of the trialed system taught us the need to customize applications and optimize protocols for private networks, where the available resources can be limited. In particular, we demonstrated the optimization of control channel signaling. In the future, emerging network and application technologies and architectures may provide new opportunities. More research is needed to find innovations to (a) adapt applications to the restrictions caused by the network or (b) customize the network to support applications. For instance, the tractor remote control traffic was implemented using the SSH protocol. The SSH tunnel is convenient, but because SCTP was used with Cannelloni, a protocol conversion had to be performed using the Socat program. In the future, there is room to replace SSH with end-to-end secure connections that offer better efficiency, by eliminating the processing caused by the conversion, and better security, by removing the need for the trusted conversion component. Furthermore, Cannelloni could use UDP instead of SCTP, allowing some packets to be dropped as a form of congestion control. Dropping packets is preferable to inserting stale messages into the CAN bus after retransmission.
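The preference for dropping packets over retransmission can be made concrete with a small receiver-side sketch: tunneled CAN frames that arrive too late, for example after a retransmission, are discarded instead of being injected into the bus as stale control messages. The frame layout (a timestamp attached to each frame) and the 50 ms staleness threshold are assumptions chosen for illustration, not values from our trial.

```python
import time

# Frames older than this are considered stale and must not reach the bus
# (illustrative threshold, not a measured requirement).
MAX_AGE_S = 0.05

def fresh_frames(frames, now=None):
    """Keep only frames young enough to be safely written to the CAN bus.
    Each frame is a (timestamp, can_id, data) tuple, with timestamps
    taken from the same monotonic clock as `now`."""
    now = time.monotonic() if now is None else now
    return [f for f in frames if now - f[0] <= MAX_AGE_S]
```

With a UDP-based tunnel, late or retransmitted frames would simply fail this check and be dropped, matching the congestion-control behavior discussed above.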
Approaches to adapting the quality parameters of low-latency video streams to support different 5G use cases were evaluated in our prior work [
83,
84] within the project. Video adaptation solutions provide a promising approach for optimizing the video channels used by unmanned tractors. Future research is needed to develop and verify adaptation approaches for unmanned tractor scenarios.
The security approach for coupling identity and access management and our new protocol [
51] for attesting the trustworthiness of IoT devices have now been demonstrated with farm sensors. Future research is needed to explore whether our combined identity management and delegated attestation architecture would be feasible for unmanned tractors.
The trialed GEO satellites are able to provide wide coverage and good throughput capabilities, but the main disadvantage of this solution is its high latency, which limits the applicability of GEO satellites mostly to monitoring use cases. Future work and trials are needed to explore the opportunities provided by low Earth orbit (LEO) satellites. LEO satellites allow for dramatically decreased transmission latency due to their shorter distance from Earth (e.g., 500–1400 km compared to 35,786 km for GEO). LEO satellite connectivity has been trialed, resulting in a throughput of 195 Mbps and round-trip latency levels of 70 ms [
85]. This connectivity performance would be sufficient not only for monitoring use cases but potentially also for the remote control of an unmanned tractor.