The ethical governance proposed in this paper resembles the ethical governance discussed in the literature only in that it evaluates the ethical appropriateness of a mission before the mission is executed. Most discussions of machine ethics concern how the machine conducts itself, autonomously or semi-autonomously, during the mission's operation. The proposed value model turns the various ethical concerns that arise when conducting sUAS operations among the civilian population into metrics. This includes civil applications as well as applications used by law enforcement, e.g., surveillance.
3.1. Value Model Formulation of Autonomy Level and Technological Readiness Level
The first part of the value model concerns the characteristics of the sUAS itself that influence how the ethical governor affects the sUAS. One of these characteristics is the autonomy level (AL) of the sUAS. The AL, shown in Table 1 below, influences how the ethical governor is used. This goes back to the notion of who carries the blame for ethical violations. If the AL is high, then the ethical governor is encoded into the sUAS systems themselves so that it evaluates whether a mission is ethical and acts accordingly. This requires an accompanying metric, the UAS's technological readiness level (TRL), shown in Table 2 below, which addresses how much testing has been done on the system itself alongside the AL. Together, these two characteristics allow an Autonomy and Technological Readiness Assessment (ATRA) to be conducted [15]. An example of the relationship between the TRL and AL within the ATRA is shown in Figure 1.
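The ATRA relationship can be sketched as a simple gate: the higher the autonomy level, the higher the technological readiness required before the onboard ethical governor is trusted to act. The threshold values in the sketch below are illustrative assumptions, not values taken from Table 1, Table 2, or Figure 1.

```python
def atra_gate(al: int, trl: int) -> bool:
    """Illustrative ATRA check: a sUAS with autonomy level `al`
    (1-5, higher = more autonomous) is cleared for onboard ethical
    governance only if its technology readiness level `trl` (1-9)
    meets an assumed minimum for that autonomy level."""
    # Assumed mapping: each step up in autonomy demands roughly
    # two more steps of demonstrated technological readiness.
    required_trl = {1: 3, 2: 5, 3: 6, 4: 8, 5: 9}
    if al not in required_trl:
        raise ValueError("autonomy level must be 1-5")
    return trl >= required_trl[al]
```

Under this sketch, a fully autonomous system (AL 5) would need flight-proven readiness (TRL 9) before the governor is encoded on board, whereas a highly autonomous system with only mid-level readiness would be flagged.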
3.2. Incorporating Ethical Concerns
As mentioned, there is a notion that perceived social benefit can be a way of measuring how acceptable an application is to the public. This could be turned into a metric in which various applications are rated based on their benefits to society. The application of UASs to structural health monitoring, for example, benefits society not only by enabling the monitoring of infrastructure health but also by eliminating the need for roads, bridges, etc., to be shut down while monitoring occurs, because fewer human personnel are required. On the other hand, a hobbyist flying his sUAS around town to take pictures for monetary gain that only he receives has a relatively low social benefit. Accordingly, a distinction must be drawn between applications that are acceptable only when they are needed and those conducted at a constant rate. For example, the use of sUAS for surveillance by the police has a perceived high social benefit, but there needs to be a distinction between surveillance used by the police only as a mission calls for it and constant surveillance. Constant surveillance can lead to detrimental effects on society [10] that may outweigh its perceived benefit to society. Beyond social benefit, the possible ethical concerns that certain applications could violate should be quantified in the model. Since privacy is a broad ethical concern, it can be divided into the aforementioned categories: intrusion upon seclusion, appropriation of a name or likeness for profit, public disclosure of private facts, and presentation of an individual in a false light [8]. This privacy concern applies to any operation in which a sUAS operates a camera, i.e., aerial visual inspection, surveillance, journalism, etc. Another ethical concern is safety: ensuring that the public will not come to harm while sUAS are in operation. This correlates directly with the ATRA. The ATRA needs to indicate that a high AL is matched by a high TRL; otherwise, there may be concern about the sUAS's susceptibility to hacking and system malfunctions, which could result in crashes harmful to the populace. This, in turn, affects the ethical concern regarding safety. This is where the mission objectives and the environment wherein the mission is being conducted come into play (Figure 2).
Depending on the environment and the objective of the mission, certain ethical concerns may or may not come into play. For example, a structural health monitoring operation on a wind turbine in an unpopulated area raises no ethical concerns regarding the civilian population, whereas the same operation conducted on a busy bridge raises safety concerns as the sUAS carries out its mission.
3.3. Proposed Model
The challenge of quantifying such qualitative concerns stems from deciding how metrics can be used to represent the various factors effectively in the model. These factors cannot be quantified as easily as, for example, the autonomy level and technological readiness. The proposed way to quantify these concerns is to determine and evaluate the tradeoff between perceived social benefits and possible ethical violations. Once again, the example of using sUAS for police surveillance illustrates that tradeoff. If the police are pursuing a suspect and that suspect has gone into hiding, the social benefit of the police using a sUAS overhead to find the suspect is high. Constant generalized surveillance of an area, however, has different implications. Its social benefit rests solely on the chance that a specific event with a favorable outcome is captured on film. Outside of that instance, constant surveillance will more likely than not collect information on everyone in the area without their permission or, possibly, their knowledge. This violates the concern for privacy. Therefore, an ethical governor may prevent this sort of continuous-surveillance mission from being conducted by the sUAS on which it is encoded. The question remains: how can a metric be created for ethical concerns? To create a metric, the ethical concerns must be assigned values so that they can be used in the model. If the literature is any indication, privacy is the top ethical concern regarding sUAS used among the civilian population, with safety next in line [8]. While this may be counterintuitive, the public is more concerned with what the drone is doing flying around than with the possibility of it crashing. The evaluation of an ethical concern is proposed to be the probability of an ethical violation multiplied by the magnitude of that violation. This value is then compared to the value assigned to the social benefit of the mission, depending on where that mission falls within the social benefit metric. If the ethical concern value exceeds the perceived social benefit, the ethical governor will keep the sUAS from executing that particular mission.
As stated, the Learned Hand formula model is broken down into three parts: the social benefit (B), the likelihood (L) of an ethical concern being violated, and the value assigned to the magnitude (M) based on the number of ethical concerns within the mission space. The social benefit of the mission is based on the task being conducted. Essentially, if a mission potentially saves lives, it has the highest social benefit; if the mission is conducted only for the pilot's personal gain, it has the lowest social benefit.
Table 3 below gives the metric for social benefit. In this table, Public Service refers to uses such as those of police or firefighters. Construction refers to any work done using UAS on infrastructure. Journalism is the use of UAS for video and pictures submitted for news reporting purposes. Photography describes the various businesses contracted to take aerial pictures, and the Hobbyist category covers anyone who uses a UAS for operations where no monetary gain is given. This metric gives the value that the product of the probability and the magnitude must either match or fall below for the mission to be deemed ethical.
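As a sketch, the social benefit metric can be represented as a simple lookup. The endpoint values follow the worked examples in this section (Public Service = 5, Hobbyist = 1); the values for the intermediate categories are assumptions standing in for Table 3, which is not reproduced here.

```python
# Illustrative social benefit metric (B). The endpoints match the
# worked examples (Public Service = 5, Hobbyist = 1); the intermediate
# values are assumptions standing in for Table 3.
SOCIAL_BENEFIT = {
    "Public Service": 5,   # police, firefighters
    "Construction":   4,   # assumed
    "Journalism":     3,   # assumed
    "Photography":    2,   # assumed
    "Hobbyist":       1,   # no monetary gain
}

def social_benefit(mission_type: str) -> int:
    """Return the social benefit value B for a mission category."""
    return SOCIAL_BENEFIT[mission_type]
```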
The second part, the “Magnitude of Ethical Concerns,” concerns the factors of the environment that could lead to ethical concerns being violated. Currently, the factors are population density, structures in the mission space, size of the UAS, and prevalence of private property. The population density, simply put, indicates how many people are potentially in danger if a UAS fails and crashes, and how many people could potentially have their privacy violated by a UAS flying overhead. Structures in the mission space states how many buildings are in the area that the UAS would have to avoid during the mission. Size of the UAS considers not all UAS but only those allowed to operate within civilian airspace (i.e., UAS weighing less than 55 lb). Prevalence of private property is the number of private residences and other properties within the mission space. Each of these factors is given a value of 1–3 based on the actual measurement of the factor itself, as shown in Table 4.
The third part, the “Likelihood of violating an ethical concern,” ultimately decides whether a mission is ethical within this model. This part is broken down into the factors of proximity to restricted airspace, proximity to private property, TRL, AL, obstacle avoidance sensors, and payload. Proximity to restricted airspace concerns how close the operation is to airport airspace or other airspace where flight is prohibited or permission is needed. Obstacle avoidance sensors, if operational, rule out the possibility of a UAS crashing into a structure, and payload states whether the payload of the UAS is considered intrusive or non-intrusive. An example of an intrusive payload would be infrared imaging, as opposed to non-intrusive regular photography.
It should be noted, as shown above in Table 5, that the likelihood factors are all assigned the same metric, with each factor able to contribute up to 15% to the likelihood of a violation. The assumption is that there is never a 100% likelihood of an ethical violation. Each of the aforementioned factors, for both the likelihood and the magnitude, has its own way of being measured, and those measurements determine the value it is given within the metric. If the product of M and L is less than the value of B, then the mission is deemed ethical.
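The decision rule can be sketched in a few lines of code. Consistent with the worked examples that follow, the sketch assumes that M is the sum of the four magnitude factor values (each 1–3) and L is the sum of the per-factor likelihood contributions (each 0 to 0.15); the function and parameter names are illustrative, not from the paper.

```python
def is_mission_ethical(benefit, magnitude_factors, likelihood_factors):
    """Learned Hand-style decision: the mission is ethical iff M * L < B.

    benefit            -- social benefit value B from the social benefit metric
    magnitude_factors  -- magnitude factor values (each 1-3); M is taken
                          as their sum (assumed aggregation)
    likelihood_factors -- per-factor likelihood contributions (each 0,
                          0.08, or 0.15); L is taken as their sum
                          (assumed aggregation)
    """
    m = sum(magnitude_factors)
    l = sum(likelihood_factors)
    return m * l < benefit
```

For instance, with the values of Example 2 below (B = 5; magnitude factors 3, 3, 2, 2; likelihood contributions 0, 0.15, 0.08, 0.08, 0, 0.15), M × L = 10 × 0.46 = 4.6 < 5, so the mission is deemed ethical.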
3.4. Example 1 Using Proposed Model
A hobbyist photographer is taking pictures of the landscape in his neighborhood, which is near an airport, using a DJI Mavic Pro.
Starting with the “social benefit” metric, this operator is a hobbyist, meaning his social benefit receives a value of 1. For establishing the magnitude (M), “population density” is given a value of 2 because the mission is conducted in a suburban neighborhood with a moderate population density. The many residences in the area give a value of 3 to “structures in the mission space,” and the numerous private residences likewise give a value of 3 to “prevalence of private property.” The size of the Mavic Pro is given a value of 2, as it is neither the largest UAS someone can purchase nor a micro UAS. These values are shown in Table 6.
Next, the likelihood of an ethical violation occurring needs to be evaluated. For this example, since the neighborhood is near an airport, the “proximity to restricted airspace” factor is assigned a value of 0.15. Since the mission is operated in a residential area with many private properties, the “proximity to private property” factor is assigned 0.15. The TRL of a Mavic Pro is high, so that factor is assigned 0; its AL is medium, so the AL factor is assigned 0.08. The Mavic Pro has obstacle avoidance, so that factor is given a 0, and the camera on a Mavic Pro is a regular HD camera, so the “payload” factor is given a 0 as well. Table 7 displays these values.
In this example, the product of the magnitude of the ethical concerns (2 + 3 + 3 + 2 = 10) and the likelihood of ethical concerns being violated (0.15 + 0.15 + 0.08 = 0.38) is 3.80, which outweighs the social benefit value of 1. The mission in this example is therefore deemed unethical by the proposed model.
3.5. Example 2 Using Proposed Model
The local police department is planning to use an in-house-built UAS modeled after the DJI Matrice 900, equipped with an infrared camera, to search for a murder suspect hiding in a downtown apartment building.
Since this mission type falls under “Public Service,” the social benefit of this mission is 5. The population density of a downtown residential area is high, so that factor receives a value of 3. There are numerous structures, including skyscrapers, in the area, so the “structures in mission space” factor receives a 3. The size of the UAS is given a value of 2, and the prevalence of private property in the area is moderate, so that factor is given a 2. The magnitude for this example is displayed in Table 8.
For the likelihood, shown in Table 9, the mission does not take place anywhere near restricted airspace, so the “proximity to restricted airspace” factor receives a value of 0. The mission will operate close to some private properties in the downtown area, so the “proximity to private property” factor receives a value of 0.15. Being in-house built, the UAS has a medium TRL, receiving a value of 0.08, and is semi-autonomous, so it receives a value of 0.08 for the AL factor. Sensors were placed on the UAS, so the obstacle avoidance factor receives a value of 0, and since the UAS is equipped with an infrared camera, which is considered an intrusive payload, its “payload” factor is given a value of 0.15.
In this example, the product of the magnitude (3 + 3 + 2 + 2 = 10) and the likelihood (0.15 + 0.08 + 0.08 + 0.15 = 0.46) is 4.6, which does not outweigh the social benefit value of 5, so the model deems this operation ethical.
These examples illustrate how the model can be used. While the result of each example may seem intuitive, it is important to see how different values in any of the metrics, especially those within the likelihood, could make even a mission with a high social benefit unethical. This suggests that the metric for social benefit needs to be re-evaluated, potentially breaking the social benefit element into multiple factors that affect its value, as is done for the likelihood and the magnitude. Determining the best way to do so will require further research. In addition, it should be noted that the metric values here were assumed based on the environment. In the future, these metrics will have defining measures, i.e., population density will need to reach a certain number to fall under a certain value; in Example 2, for instance, the assumption is that all downtown residential areas have a high population density. Further research will be conducted to validate such notions.