Article

The Level of Automation in Emergency Quick Disconnect Decision Making

1 Institute of Maritime Operations, University College of Southeast Norway, 3603 Kongsberg, Norway
2 Department of Science and Industrial Systems, University College of Southeast Norway, 3603 Kongsberg, Norway
* Author to whom correspondence should be addressed.
J. Mar. Sci. Eng. 2018, 6(1), 17; https://doi.org/10.3390/jmse6010017
Submission received: 28 November 2017 / Revised: 30 January 2018 / Accepted: 5 February 2018 / Published: 12 February 2018
(This article belongs to the Special Issue Maritime Environment Monitoring)

Abstract:
As a key measure for safety and environmental protection during offshore well operations, drill rigs are equipped with Emergency Quick Disconnect (EQD) systems. However, an EQD operation is in itself considered a risky operation with a major economic impact. For this reason, it is of great importance to aid the operators in their assessment of the situation at all times, and help them make the best decisions. However, despite the availability of such systems, accidents do happen. This demonstrates the vulnerability of human decision-making capabilities in extremely stressful situations. One way of improving the overall human-system performance with respect to EQD is to increase the level and quality of the automation and decision support systems. Although there is plenty of evidence that automated systems have weaknesses, there is also evidence that advanced software systems outperform humans in complex decision-making. The major challenge is to make sure that EQD is performed when necessary, but there is also a need to decrease the number of false EQDs. This paper applies an existing framework for levels of automation in order to explore the critical decision process leading to an EQD. We provide an overview of the benefits and drawbacks of existing automation and decision support systems vs. manual human decision-making. Data are collected from interviews of offshore users, suppliers, and oil companies, as well as from formal operational procedures. Findings are discussed using an established framework for the level of automation. Our conclusion is that there is an appropriate level of automation in critical situations related to loss of position of the drill rig, and that there is promising potential to increase the level of autonomy in mid- and long-term situation assessment.

1. Introduction

Accidents in the offshore petroleum industry have proved to have devastating consequences for human lives, the marine environment, and economies. Tragic examples include major gas leaks with resulting explosions and fires, such as Piper Alpha [1], and oil rig collapses [2]. Ship groundings and collisions with large oil spills have caused massive environmental damage [3]. The tragic incident in April 2010, when the offshore drilling platform Deepwater Horizon lost control of the subsea Macondo well, illustrates the dramatic effects of such a loss of well control, in terms of the natural environment [4,5], loss of human lives, and impact on the wider society [6].
Health, Safety, and Environment (HSE) has thus become a central and integrated part of offshore operations, culture, and systems. Numerous industrial regulations point to safety barriers as a key concept to ensure safe operations, such as the Seveso II directive [7], the Machinery directive [8], and the IEC 61508 standard on functional safety [9]. According to [10], “safety barriers are physical and/or non-physical means planned to prevent, control or mitigate undesired events or accidents”.
One such safety barrier is Emergency Quick Disconnect (EQD), which is the process of rapidly, but safely, disconnecting a mobile offshore drilling unit (MODU) from a subsea oil and gas well. While in operation, MODUs are connected to the subsea wells via a marine riser as illustrated in Figure 1, and this system is exposed to dynamic forces caused by waves and currents [11]. In order to minimize the effect of these forces, components such as tensioners and telescopic and flexible joints are in place. A large Blowout Preventer (BOP) is placed on top of the subsea wellhead located at the sea bottom, as an additional safety barrier to prevent hydrocarbon leakage in case the riser system is damaged.
However, the main measure to prevent such loads and potential damage is to keep the rig in position directly above the well, i.e., within the green zone. For this, MODUs are equipped with a dynamic positioning (DP) system [12,13], which controls a set of powerful thrusters. However, different events may occur [14] that cause the MODU to lose position. This is denoted as a drift-off or drive-off event. A drift-off is when the thrusters are not producing enough force to compensate for wind and waves, while a drive-off is when failure in the DP-system causes the thrusters to push the rig out of position.
Drifting away, i.e., exceeding the yellow and even the red zone, generates excessive strain, which may compromise the mechanical integrity of the riser and the well [15]. If the vessel goes beyond the green limit and enters what is called the yellow zone, then all operations must be stopped and the crew must prepare for a possible disconnect from the well. In shallow water, the green zone will be smaller than in deep water, as the riser is shorter, and even a small movement of the rig away from the central position will add significant tension to the wellhead. If the rig moves outside the red limit, or if other critical events arise, a decision needs to be made as to whether an EQD should be initiated.
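The zone logic described above can be sketched as a simple classification of the rig's horizontal offset. This is purely illustrative: the actual green/yellow/red limits are well-specific and derived from riser and wellhead analysis, so the default radii below are hypothetical placeholders, not real operating limits.

```python
# Illustrative sketch of the green/yellow/red watch-zone classification.
# The limit radii are hypothetical; real limits are defined per well,
# and shrink with water depth as the riser gets shorter.

def classify_zone(offset_m: float,
                  green_limit_m: float = 5.0,
                  yellow_limit_m: float = 12.0) -> str:
    """Map the horizontal offset from the well centre to a watch zone."""
    if offset_m <= green_limit_m:
        return "green"    # normal operations
    if offset_m <= yellow_limit_m:
        return "yellow"   # stop operations, prepare for possible disconnect
    return "red"          # EQD decision required
```

A shallow-water well would simply be configured with smaller `green_limit_m` and `yellow_limit_m` values than a deep-water well.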
In this paper, we explore the decision making process related to initiating an EQD, in particular to what degree this process is manual or automated. As technology is evolving more rapidly than the capabilities of human beings, it seems reasonable to expect that automation will play an increasingly important role in the further development of EQD. Thus, the objective of this paper is to explore the current level of automation in the EQD decision making process, as well as the benefits and challenges of current automation, and to discuss whether an increased level of autonomy is a way forward to further strengthen EQD as a safety barrier.
This paper is organized by first presenting a literature review of EQD and theoretical aspects of the decision making process. Then, we provide a description of our research framework, methods, and more specific research questions, followed by our results and analysis. Finally, we provide a discussion and conclusion.

2. Literature Review on the EQD Process

2.1. The Emergency Quick Disconnect Function

Chen et al. [16] present the safety system of DP drilling operations in terms of three barrier functions: prevent loss of position, arrest vessel movement, and prevent loss of well integrity. These three barrier functions were included to safeguard the DP drilling operations, and they correspond to the green, yellow, and red circles as illustrated in Figure 1. The DP control system works continuously to prevent a loss of position event (green circle). If the vessel goes beyond the green limit and enters what is called the yellow zone, then all the operations must be stopped and the crew must prepare for a possible disconnect from the well. Verhoeven et al. [14] discussed the failure modes, applicable frequencies, and probabilistic modeling for both the loss and recovery of position.
In the unfortunate case that the MODU drifts into the red zone, the third barrier function “prevent loss of well integrity” should be activated. There are three elements that together achieve this third Barrier Function [16], namely the EQD (Emergency Quick Disconnection system), SDS (Safe Disconnection System), and well shut-in function.
Normally, all three barrier elements are activated by one operation and run fully automatically once an operator has triggered the activation. We will, for simplicity, name the whole sequence of activating the barrier the Emergency Quick Disconnect System (EQD). The main purpose is to safely seal the well, disconnect the riser, and ensure that a later re-connection to the well is possible. This enables the rig to move away from the hazardous area and return for further operation later. The time from activation to physical disconnection is approximately 40 s in shallow water operations.
However, an EQD is considered a brutal disconnection of the system compared to a planned disconnection, as the EQD sequence also includes cutting whatever is inside the BOP. This is normally a drill string a few inches in diameter, coiled tubing, or an intervention string. Recovering the well includes retrieving these detached tools and strings. This is time-consuming, difficult, and costly, and in some cases risky. The industry is working to create better and stronger cutting mechanisms in the BOPs, but still, not all equipment can be cut, such as sand screens and casings. If such items are inside the BOP, the EQD cannot be activated immediately. Normally, the Dynamic Positioning Operator (DPO) gives a signal/alarm to the rest of the crew warning them about a situation, and the drilling crew are the ones to pull the EQD trigger, as the driller always knows the well status.
The final disconnection sequence will typically involve an operation of 15–20 (or more) functions [15], which are done automatically. The intention is to be able to seal off the well with the BOP, close the bottom of the riser, and at the same time not damage the BOP as the riser is disconnected, thus avoiding oil spills to sea.
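The automatic execution of the final disconnection sequence can be sketched as an ordered list of interlocked steps that must each complete before the next begins. The step names below are illustrative only; a real sequence comprises 15-20 or more vendor-specific BOP and riser functions [15].

```python
# Sketch of the automated EQD sequence: execute each function in order,
# aborting if a step reports failure. Step names are illustrative, not
# taken from any real BOP control system.

from typing import Callable, List, Tuple

def run_eqd_sequence(steps: List[Tuple[str, Callable[[], bool]]]) -> List[str]:
    """Run each named step in order; raise if a step fails."""
    completed = []
    for name, action in steps:
        if not action():
            raise RuntimeError(f"EQD step failed: {name}")
        completed.append(name)
    return completed

demo_steps = [
    ("shear_drill_string", lambda: True),       # cut what is inside the BOP
    ("close_blind_shear_ram", lambda: True),    # seal the well
    ("close_riser_lower_valve", lambda: True),  # close the bottom of the riser
    ("unlatch_lmrp_connector", lambda: True),   # physical disconnection
]
```

The point of the sequencing is the one stated in the text: seal the well, close the riser, and disconnect without damaging the BOP, thus avoiding oil spills to sea.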

2.2. Parameters Related to “Loss of Position”

As the rig and riser system behaves according to the laws of physics, the likely development, and thus the expected criticality, of a DP drive-off/drift-off event can be estimated. It is possible to simulate the expected drive-off trajectories of a DP drillship using vessel attitude and environmental parameters, along with the mechanical characteristics of the riser system [15], but we found no indications that this is applied in real operations.
Changes in vessel position are determined by vessel speed and direction, which in turn are determined by a large and complex set of parameters including initial speed, initial direction, and applied forces, i.e., the vessel thruster schedule (how much thrust is applied, in what direction, and for how long). Thruster force may be automated or controlled manually. The motion is further affected by rig displacement (inertia), shape (hydrodynamics, friction), and the degree of heave, yaw, pitch, and roll.
There are a number of relevant environmental parameters, the most prominent being wind, waves, and currents, but water depth and the characteristics of the seabed soil (the soil model) also have an impact. The distance to shore, other nearby vessels, and installations will also play a role.
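To make the physics concrete, a heavily simplified one-dimensional trajectory estimate can be obtained by Euler integration of Newton's second law. All numbers and names below are illustrative assumptions; a real simulation of the kind referenced in [15] would use full hydrodynamics, damping, and the riser/soil mechanical model.

```python
# Sketch: 1-D drift-off offset estimate by Euler integration of F = m*a.
# Assumes a constant net environmental force and no hydrodynamic damping,
# so this deliberately overestimates the drift; illustration only.

def drift_off_offset(env_force_n: float, mass_kg: float,
                     duration_s: float, dt: float = 1.0) -> float:
    """Offset (m) from the well centre after duration_s of uncompensated drift."""
    v = 0.0  # initial velocity: station-keeping before loss of thrust
    x = 0.0  # offset from the well centre
    t = 0.0
    while t < duration_s:
        a = env_force_n / mass_kg  # constant acceleration from net force
        v += a * dt
        x += v * dt
        t += dt
    return x
```

For a hypothetical 50,000-tonne rig under a 500 kN net force, this gives roughly 18 m of offset after one minute, which illustrates why the available decision time shrinks so quickly once position is lost.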

2.3. Auto-EQD

The EQD sequence can be manually activated either by the driller or by the DP operator. Manual activation is the general practice in DP drilling operations worldwide. The automatic method is called auto-EQD. With auto-EQD, the sequence is activated by the DP control system when the rig position registered in the DP software crosses the pre-defined red limit. Prior to 2007, auto-EQD had only been implemented on two DP drilling units for shallow water operations on the Norwegian Continental Shelf (NCS). After the Macondo accident, the auto-EQD became more common.
According to Chen et al. [16], the design of auto-EQD is effective for drive-off/drift-off scenarios that involve a change of the estimated position in the DP software. During one drive-off incident on the NCS, the auto-EQD system was successfully activated when the estimated position crossed the red limit. However, there are drive-off scenarios in which the estimated position in the DP software is erroneously registered to be within the limit area, while the vessel is actually beyond the red limit. Under such circumstances, where there is a delay or error in the GPS data, the auto-EQD will not function. The recommendations by Chen et al. are as follows: The auto-EQD is effective at ensuring timely activation of EQD, and should be considered for DP drilling operations where there are small rig position error margins, such as in shallow waters, or in operations where the wellhead is fragile. Given the current design of auto-EQD, DP operators should have sufficient knowledge, training, and experience with respect to the scenarios in which the auto-EQD will not function.
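The auto-EQD trigger condition, and its reported limitation, can be sketched as follows. The real DP implementation is vendor-specific; the staleness guard below is our illustration of the failure mode described by Chen et al. [16] (a delayed or erroneous position fix), not the actual system logic.

```python
# Sketch of auto-EQD trigger logic. Function and parameter names are
# illustrative. The guard on fix age mirrors the reported limitation:
# if the estimated position is stale or invalid, auto-EQD cannot act
# and the decision falls back to the human operators.

def should_auto_eqd(estimated_offset_m: float,
                    red_limit_m: float,
                    fix_age_s: float,
                    max_fix_age_s: float = 2.0) -> bool:
    """Trigger only when a sufficiently fresh fix crosses the red limit."""
    if fix_age_s > max_fix_age_s:
        return False  # stale position data: do not trigger automatically
    return estimated_offset_m > red_limit_m
```

The sketch makes the gap visible: with a frozen position estimate, the first condition never fires, which is precisely the scenario operators must be trained to recognize.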

2.4. The Well Specific Operating Guidelines

There can be different reasons for activating the EQD, such as a loss of position, damaged equipment, a faulty operation, or other safety issues. The main tool for safety evaluation and decision making in DP operations is the Well Specific Operating Guidelines (WSOG). This type of document has roots back to the UK offshore industry in the early eighties, but found its present form about 20 years later. In 2001, a revised version of the WSOG process and structure, based on numerous case studies and input from the industry, was presented at the International Association of Drilling Contractors (IADC) Northern Deepwater Conference in Norway. The work was followed up in 2004 by Det Norske Veritas (DNV), bringing standardization further in the report “Guidance on safety of well testing” (4273776/DNV). A large number of drilling contractors adopted this in their operations manuals for dynamic positioning, resulting in the Recommended Practice (RP) guidelines DNVGL-RP-E307 [17].
These guidelines apply the barrier functions described earlier. The core of the WSOG is a matrix containing a column of parameters that have proven to be of critical importance for rig and well safety and integrity. These parameters need to be continuously monitored by the operators. Each parameter has three or four states, or conditions, ranging from “normal” (green, everything is ok), to “advisory” (confer with defined team members), to “yellow” (critical, stop operations, prepare for disconnection), and finally to “red” (safety is breached, minimize consequences by clearing away crew and disconnecting from the well).
The RP [17] also gives example tables that are used to create a WSOG. Some of the parameters have defined numeric limits, while others are situation-specific. Table 1 is a simplified illustration of a WSOG (adapted from [17]). Parameters such as the DP position footprint and the Riser Limitation Upper Flex Joint (UFJ) have defined limits, while, e.g., how to evaluate the wind direction is less clear.
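The WSOG matrix lends itself to a simple data-structure sketch: each monitored parameter carries a current state, and the overall rig status is the most severe individual state. The parameter names below are illustrative, loosely following the style of Table 1 and [17].

```python
# Sketch: a WSOG as a mapping from monitored parameter to its current
# state, with the overall rig status taken as the worst individual state.
# Parameter names are illustrative placeholders.

SEVERITY = {"green": 0, "advisory": 1, "yellow": 2, "red": 3}

def overall_status(wsog_states: dict) -> str:
    """Return the most severe state among all monitored parameters."""
    return max(wsog_states.values(), key=lambda state: SEVERITY[state])

example_wsog = {
    "dp_position_footprint": "green",
    "ufj_riser_angle": "advisory",
    "wind_direction": "green",
}
```

The worst-state rule captures the operational logic of the matrix: a single parameter reaching yellow or red is sufficient to change the rig's overall status and the required crew response.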

2.5. Reported EQD Incidents

Despite all these efforts to improve the safety of MODU operations, accidents and incidents do occur. The International Marine Contractors Association (IMCA) has made available annual, voluntary, and anonymous incident reports from DP vessels worldwide [18]. The IMCA database consists of in total 1171 reported incidents from the year 2000 to the year 2016, an average of about 69 reported incidents per year. The IMCA reports show (amongst other things) the different causes that led to incidents with loss of position.
In a separate database, Petrobras has made available 571 incident reports from 1992 to 2005 [19]. This kind of company report is rarely found in the public domain. It normally stays inside each company, thus out of reach of common knowledge.
Hauff [20] combined input from the IMCA database with the incident reports, and presented the mean percentage of the different causes. His article shows that a majority of the unsafe actions were caused by “rule-based acts”, while “knowledge-based” acts and “slips/lapses” together accounted for less than half of the 46 investigated unsafe acts.
Broyde et al. [21] searched for relevant information among 1171 reported incidents in databases by repeatedly using relevant keywords. They identified in total 73 emergency disconnections reported to the International Marine Contractors Association [18] and in the Petrobras DP Incident Database [19]. The IMCA database reported between two and three EQDs per year. The oil company Petrobras alone reported on average two EQDs per year (1995–2006). It is thus obvious that a high number of incidents are never reported to the IMCA database. Broyde et al. conclude that a high number of the EQD activations were unnecessary, and a result of human error. Their analysis showed that the three most frequent main causes were (1) environmental, (2) human error, and (3) procedure. Human error and/or procedure were the secondary cause in over 50% of the emergency disconnections.
Martinsen [22] interviewed 13 experienced DP Operators (DPO) and found the main characteristics of critical incidents in DP operations: situation awareness, experience, human/automation issues, and decision strategies. Her work showed that the DP operator’s mental simulation of potential future incidents was an important part of the decision-making process. Pedersen [23] analyzed dynamic positioning systems during drilling operations in the Arctic, with emphasis on the dynamic positioning operator. He performed a rather extensive review of DP and risk analysis in his master’s thesis, and investigated the vulnerability of using DP in Arctic areas.

2.6. EQD Scenario Development with DP Operators

A detailed, but hypothetical, EQD scenario is presented in the Petro-HRA guidelines [24], organized in a four-stage decision process model resembling that of Parasuraman et al. [51]. The scenario was developed by the authors together with a group of DPOs for the purpose of extracting knowledge about the EQD process.

2.6.1. Stage 1: Information Acquisition: Detect Loss of Position, 10 s

The underlying, root cause of this hypothetical event is a DP malfunction, which results in increased and seemingly uncontrollable thruster activity, gradually reaching full power (100%). The activity level of the thrusters is registered by the DP system, and is registered by the operator by looking at the human-machine interface (HMI). In addition, the change in thruster sound is a direct cue identifiable by the operators. When the thruster force level reaches 80%, it triggers a preset alarm sound in the HMI. The extra power consumption also triggers a startup of standby generators, which is registered and presented in the HMI of the power management system. Also here, direct sound (from the generators) is available as an additional direct cue to the operators. The resulting movement of the rig is registered by the system, and presented visually to the operator in the HMI. A visual alarm in the HMI is triggered when the rig has reached a 3 m offset. The operator further collects information about the speed and bearing of the rig, as well as the angle of the riser, and then iteratively examines the trends of these and other various (unspecified) parameters, in the HMI.
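The Stage-1 alarm thresholds in this scenario (80% thruster force triggers an audible alarm, a 3 m offset triggers a visual alarm) can be sketched as simple checks. The function and alarm names are illustrative, not taken from any real DP system; only the two threshold values come from the scenario.

```python
# Sketch of the Stage-1 alarm thresholds from the hypothetical scenario:
# 80% thruster force and 3 m position offset. Alarm identifiers are
# illustrative placeholders.

def stage1_alarms(thrust_pct: float, offset_m: float) -> list:
    """Return the alarms raised for the given thruster level and offset."""
    alarms = []
    if thrust_pct >= 80.0:
        alarms.append("thruster_force_audible_alarm")
    if offset_m >= 3.0:
        alarms.append("position_offset_visual_alarm")
    return alarms
```

Note that these automated alarms are only part of the picture: the scenario stresses that direct cues, such as thruster and generator sound, reach the operator outside the HMI entirely.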

2.6.2. Stage 2: Information Analysis: Diagnose Drive off Event, 5 s

The operator is aware that there is a delay from changes in thruster activation until there is an effect on the acceleration and speed of the rig (due to inertia); this constitutes critical knowledge and is used for the assessment of the situation. The scenario also states that the riser angle is less important than the position offset, but that it supports the diagnosis. Based on this knowledge, the available information from direct cues, and the parameters in the HMI, the operator concludes (by “cognitive action”) that there is a drive-off, a critical event.

2.6.3. Stage 3: Decision Selection: Decide on Mitigating Actions

Based on this information, operators are also able to project the development of the event. This includes estimating how much time they have available for further analysis and decision-making. Based on his own assessment, the operator finds time to be very short—there is no time to analyze or seek to solve the initial problem, only time to minimize the consequences.
The first decision made by the operator is how to deal with the problem of the uncontrollable thruster activation. Two options are available: either initiate an emergency stop of the thrusters, or initiate an automatic EQD (which also will shut down the thrusters). The scenario does not elaborate on how these two options are evaluated, but concludes that the operator chooses to first initiate the emergency stop of the thrusters, and then initiate the EQD.

2.6.4. Stage 4: Action Implementations: Initiate Emergency Disconnect Sequence

In order to prepare the crew and rig for EQD, the operator manually changes the rig status to yellow (in the HMI). He then further pushes emergency stop buttons for each thruster. The operator can verify that the thruster is shut down by checking a visual signal and detecting changes in sound levels. The next action is to press the enable button and the EQD button simultaneously, which light up when activated, and which also trigger public rig alarms. He also informs other crew members of the situation. Then, an automated process disconnects the rig from the well in about 30 s.

3. Literature Review on the Process of Decision-Making

This section reviews theoretical aspects of decision making processes in general. First, we investigate the nature, strengths, and weaknesses of human decision making, followed by a similar presentation of machine (computer) decision making.

3.1. The Rational Approach

The rational, or analytical, approach to human decision making has its roots in utilitarian theory and the later notion of homo economicus [25]. It is based on the premise that human choice is motivated by two primary drivers: to optimize value and to minimize cost. In order to accomplish this, Barth Emely et al. [26] define four conditions that need to be met:
  • The decision maker needs to be capable of generating all possible scenarios and potential outcomes of the situation.
  • The decision maker should be able to evaluate differences in the attractiveness of available alternatives.
  • It should be possible to aggregate the partial, or local, evaluations into a global evaluation.
  • The decision maker chooses the global alternative that has the most favorable evaluation.
This approach has later been developed into practical tools and guiding principles for organizational decision-making, often characterized by complex matters, multiple stakeholders, and long time perspectives. One example of such a tool is the Analytic Hierarchy Process (AHP) illustrated in Table 2 (adapted from [27]).
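The third and fourth conditions above, aggregating local evaluations into a global one and choosing the best alternative, can be sketched as a weighted sum in the spirit of AHP. The criteria, weights, and alternatives below are illustrative assumptions, not drawn from [26] or [27].

```python
# Sketch of the rationalistic aggregation step: combine criterion-wise
# (local) evaluations into a global score per alternative, then choose
# the alternative with the most favorable global evaluation.
# All names and numbers are illustrative.

def best_alternative(scores: dict, weights: dict) -> str:
    """scores: alternative -> {criterion: value}; weights: criterion -> weight."""
    def global_score(local_scores: dict) -> float:
        return sum(weights[c] * v for c, v in local_scores.items())
    return max(scores, key=lambda alt: global_score(scores[alt]))
```

Note that the sketch presumes exactly what Simon later questioned: that the decision maker can enumerate all alternatives and score each one, which is rarely realistic under time pressure.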

3.2. Criticism of Rationalistic Decision Making

A significant and long-lasting critique of the rational approach arose with the work of the scholar Herbert Simon, who in his seminal paper [28] questioned whether rational decision making was a description of how decisions are actually made, or merely an idea of how they should be made. He argued that human rationality is restricted by limitations in the tractability (inherent complexity) of the decision problem itself, the actual time available for making the decision, as well as limitations in human cognitive capacity. These restrictions result in an incomplete and subjective representation of the context and nature of the problem, within which the decision maker seeks to reason rationally. Simon called this the principle of bounded rationality [29]. Given these limitations, the goal and process of human decision-making is not directed towards finding the optimal choice, but a satisfactory one [30]. Later research revealed that humans apply various heuristics and biases in the process of decision making [31], such as availability, representativeness, and framing effects. The Moving Basis Heuristics [32] is another example of non-rational aspects of human decision making:
  • Parsimony: the decision maker uses only a small part of available information
  • Reliability: the information is considered to be sufficiently relevant to justify the decision made
  • Decidability: the information applied in the decision-making process may vary between people and decisions.

3.3. NDM—Recognition Primed Decision Making

There was a growing interest in the 1980s to explore, in an inductive manner, how humans actually make their decisions, resulting in a branch of research labelled naturalistic decision making (NDM). The recognition-primed decision making model [33] is based on a number of studies of how operational experts, such as firefighters, made decisions in time-restricted, critical situations. As the name implies, recognition was found to be a central component in the decision-making process. The model defines four aspects of recognition: plausible goals (given the circumstances), relevant cues (central pieces of information observed or sensed in the environment), expectancies (replacing missing information with expectations, based on previous experience), and action alternatives. As situations develop, the decision-maker continuously compares his or her expectations with the realities, in order to adjust the mental model and, if needed, seek more information on critical issues. Once a sufficient degree of recognition of the situation has been achieved, the decision-maker undertakes mental simulations of action alternatives based on previous experience. In contrast to analytical decision-making, but in line with Simon's concept of bounded rationality, the evaluation of alternatives is stopped once the decision maker has found a solution that is likely to work. The chosen action does not need to be the optimal one, as long as it solves the problem at hand, in due time.
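The contrast between recognition-primed satisficing and rationalistic optimization can be sketched in a few lines. The options, evaluation function, and threshold below are illustrative assumptions; the point is only the stopping rule, not the domain content.

```python
# Sketch: satisficing (stop at the first workable option) versus
# exhaustive optimization (evaluate everything, pick the best).
# Options and scores are illustrative placeholders.

def satisfice(options, evaluate, good_enough):
    """Return the first option whose evaluation clears the threshold."""
    for option in options:
        if evaluate(option) >= good_enough:
            return option  # stop searching: it will work, in due time
    return None

def optimize(options, evaluate):
    """Exhaustively evaluate every option and return the best one."""
    return max(options, key=evaluate)
```

Under time pressure, `satisfice` may return a workable but sub-optimal choice while `optimize` is still evaluating, which is exactly the trade-off the NDM literature observed among operational experts.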

3.4. Situational Awareness

Situational Awareness (SA) has been identified as an important construct for operators in modern complex socio-technical systems [34]. SA can be defined as the perception of the information elements in the dynamic environment, the comprehension of their meaning, and the projection of their future status [35]. Consistent with this definition, three levels of SA are identified, Level-1, Level-2, and Level-3 (see Figure 2), corresponding to the perception, comprehension, and projection stages for the operator. Taxonomic classification according to these levels is also used to determine whether the operator made an error in perceiving the information or in comprehending its meaning and, accordingly, whether changes in system design, procedures, or training should be applied. The SA of an operator acts as an input for decision making and performance, and it is affected by various factors such as experience, training, and attention. It is theorized that the more experienced individuals are, the better developed their mental models, which in turn enables them to better generate SA. Conversely, novices spend significant cognitive resources in generating SA.

3.5. Computers as Decision Makers

While the human brain evolves very slowly, computer processing power, memory, and data transmission capacity have undergone immense improvements during the last twenty years [37]. Rooted in the early development of artificial intelligence (AI), the modern computer has proved itself a valuable advisor for human decision makers. These expert systems (ES) are able to receive input in terms of human expertise, often in terms of task-specific knowledge. This knowledge and expertise have been explored and defined using a set of established knowledge elicitation techniques [38]. The computer may then be connected with a sensor network in order to obtain real time information about the environment, combine the input, and give as output advice to the operator in a reactive or proactive manner [39].
In later years, computers have proven to be increasingly capable, not only as advisors, but also as decision makers in domains previously ruled by the human mind. One such example is the famous chess match in 1997, when IBM's Deep Blue defeated world champion Garry Kasparov [40]. More recent research indicates that using human expertise as a basis for computer-based assessment and decision making may not be necessary, or even beneficial. Information elicited from experts provides a useful guideline, but it also has significant drawbacks such as high cost and lack of reliability, or it may simply be unavailable. A novel approach to AI and ES development is to enable technology systems to learn, train, and improve themselves by combining recent developments in data analytics and reinforcement learning algorithms [41]. As a result, technologies may have the potential to exceed human assessment and decision-making capabilities, and to handle new types of tasks and domains in which human expertise is lacking. A recent example of powerful self-learning algorithms is related to the highly complex Chinese board game Go. AlphaGo, the first algorithm to beat human champions, was in 2017 defeated by a margin of 100 to 0 by a newer program, AlphaGo Zero [42]. The most prominent feature of Zero is that it is trained without any use of human data or supervision whatsoever, solely by self-play reinforcement learning, starting with totally random play.

3.6. Challenges with Computers as Decision Makers

Despite the promising capacity of AI in terms of decision-making, we know from previous research that a computer is far from perfect as a decision maker, and may cause different types of safety issues. The fact that automation caused problems was a particular issue within the airline industry in the 1980s and 1990s. This issue was explored by various scholars (see for example [43,44,45,46,47]). Onken and Schulte [48] describe five phenomena that reduce the quality of computer-based decision-making, summarized in Table 3.

3.7. Allocation of Functions between Human and Machines

So how do we organize the work and distribute the tasks in human-machine systems to obtain the best from both humans and machines? This intriguing question is not new. It became relevant along with the industrial revolution, and in 1951 Fitts [49] published a report that has become a widely used and referenced standard for allocating tasks and functions between man and machine [50]. Although Fitts was concerned with automation, the ability of post-war technology to take over cognitive work from humans was very limited. Machines were clearly superior in terms of speed, power, precision, and replication, while humans reigned in tasks that required evaluation and judgment of complex and dynamic information about more holistic system parameters and interfaces.
The increasing capabilities of computer and software technology have later led to increased automation and even autonomous technology systems, making the division of cognitive work between man and machines less clear. This has created a need for new, more elaborate function allocation models for human-automation collaboration.
The distinction between automation and autonomy is an issue currently under clarification among scholars. In this paper, we apply the same definitions as [51], considering autonomy as an extreme degree of automation. Thus, it follows that a purely autonomous system differs from a purely manual system by various degrees (levels) of automation (not by degrees of autonomy), as presented in Figure 3.
In order to analyze, discuss, and define these interfaces, scholars have proposed various matrix models. These are typically represented with the decision-making process as one axis, and different “levels of automation” as the other. Despite the complexities of human cognitive processing, as demonstrated in the principles of naturalistic decision making, many of these models apply a simplified, sequential model. For a thorough review, see Vagia et al. [52].
The second axis of these models is the “levels of automation”, an elaboration on Fitts’ concept of function allocation, but with a higher fidelity on cognitive work. The levels are defined in varying numbers of steps, in which the lowest level is the human doing (deciding) everything (the computer doing nothing), and the highest is the computer doing (deciding) everything (i.e., the human doing nothing).
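As a concrete illustration, the ten-level scale of Parasuraman et al. [51], which we apply in our analysis later in the paper, can be sketched as a simple lookup. The level descriptions below paraphrase the commonly cited scale; the dictionary and the helper function are our own illustrative additions, not part of any cited system.

```python
# Illustrative sketch of the ten-level automation scale of
# Parasuraman et al. [51]. Descriptions paraphrase the published scale.
LEVELS_OF_AUTOMATION = {
    1: "The computer offers no assistance; the human does everything",
    2: "The computer offers a complete set of decision/action alternatives",
    3: "The computer narrows the selection down to a few alternatives",
    4: "The computer suggests one alternative",
    5: "The computer executes that suggestion if the human approves",
    6: "The computer allows the human limited time to veto before executing",
    7: "The computer executes automatically, then necessarily informs the human",
    8: "The computer informs the human only if asked",
    9: "The computer informs the human only if it decides to",
    10: "The computer decides everything and acts autonomously, ignoring the human",
}

def describe_level(level: int) -> str:
    """Return the textual description of a given automation level."""
    return LEVELS_OF_AUTOMATION[level]
```

In the analysis in Section 5, we refer to these numeric levels when evaluating each stage of the EQD decision process.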

4. Research Questions and Method

4.1. Research Framework

We base our research on the level of automation model by Parasuraman et al. [51], a process for improving human-machine systems through correct levels of automation. It consists of six steps. The first addresses and defines the scope of the work, i.e., what is, or may be, automated. The second step is to identify the cognitive decision-making subtasks of information acquisition, information analysis, decision selection, and action implementation. The third step is to evaluate the current level of automation for each subtask. The final steps are to apply evaluative criteria related to the relative strengths and weaknesses of humans vs. automation, ending with suggestions for new types and levels of automation. Our framework is an adaptation of this model, and is shown in Figure 4.

4.2. Research Questions

Based on this framework, we specify the following research questions:
RQ1:
Which information is available and relevant for EQD decision making?
RQ2:
How is the information acquired?
RQ3:
How is the information evaluated and analyzed?
RQ4:
Once decided, how is an EQD implemented?
RQ5:
To what degree is automation applied in the process, i.e., what is the level of automation?
RQ6:
May increasing the level of automation improve EQD as a safety barrier?

4.3. Methods for Data Collection and Analysis

In order to explore these questions, we use a qualitative research approach [53] and draw on two types of information sources. The first type is existing relevant documentation such as research publications, industry reports, and guidelines, all of which are related to the EQD decision making process. This has already been presented in the previous literature review. The second source is first-hand, primary data from the actual decision makers, which has been collected by means of semi-structured interviews [54]. These data are presented in the following chapter.
The resulting data set has been analyzed by the research team, which includes two domain experts with several years of working experience with sub-sea drilling operations, according to principles for document analysis [55]. Our analysis includes deducing a set of higher-level common attributes (a coding system) among the information parameters that seem relevant to the current and future level of automation. In order to identify the current level of automation for each of the four decision-making stages, we found little practical guidance in [51]. Thus, these levels have been evaluated subjectively by the research group.
Our findings are organized and discussed in accordance with the research framework in Figure 4. Based on this, we reach a conclusion with recommendations on areas for further improvement and exploration of levels of automation in the EQD process.

5. Results

5.1. Interviews of Operators

These interviews of personnel directly involved with operation of EQD were conducted by one of the authors during summer 2017. To investigate how the operators make the decision to activate the EQD, a semi-structured, qualitative interview method [54] was chosen.
We performed interviews of one Offshore Installation Manager (OIM), two DP Operators (DPO), and one expert drilling advisor with experience from shallow water operations on the Norwegian Continental Shelf (NCS). Table 4 shows the profiles of our interviewees.
Three of the interviewees have operational experience from DP rigs/ships with manual activation of EQD by pushbutton. One of the interviewees has operational experience from DP rigs with auto-EQD. Two of the interviewees have experienced an activation of EQD. The interviews also gave information on several operational scenarios where an EQD was considered, and avoided. In total, our interviewees have close to 40 years of operational experience.
Prior to conducting the interviews, we received consent from the interviewees and informed them of the purpose of our research. The interviews were recorded and lasted about 1–1.5 h each. The recordings were then transcribed into mind maps, as this was considered the most beneficial way to visualize the interview results and share them with the project team. The researchers have translated the quotes from Norwegian.
Our interviewees were very open and eager to share their experience, and found the research purpose to be important for their work.
In Table 5, we present findings from the interviews related to information parameters relevant for the EQD decision.
We also extracted information about how operators perceive the auto-EQD system, which is presented in Table 6.
A short story from one of the interviews illustrates a weakness of auto-EQD. It occurred during an operation where the weather was increasing in intensity, and the rig was kept in a stable position by the DP system. Then a large wave pushed the rig out of position. It happened so fast that the DP system was not able to keep track of the situation, resulting in the rig moving outside limits without a warning (red status) being given. The DPO assessed data directly from the GPS, and after discovering that the rig was outside limits, he chose to disconnect. This story is an example of a situation where the operator reacted before auto-EQD (had it been turned on) would have, as auto-EQD is affected by delays between the raw GPS data and the DP system.
One clear advantage of auto-EQD that was emphasized by one of the DPOs is that once the limit is breached (no matter how much), the sequence for disconnect starts. It is not possible to stop it. As it is the system that is correctly responding to the given parameters, which is in full accordance with procedures, it is easy afterwards to justify the EQD.
But as one operator said, “if you see that the rig is moving outside the limits, and you keep a clear mind, it is possible to disable the auto-EQD”. This is done by buttons on the desk, but disabling is only meant to be done during maintenance. Our interviews indicate that disabling is a touchy topic in the industry, because it implies a breach of formal procedures.
We found our interviewees to be generally positive to auto-EQD, despite some weaknesses, such as loss of GPS data causing the system to react too late. Auto-EQD provides additional safety, and also makes operators less concerned about having to defend the decision to make an EQD. However, comments on auto-EQD also show that operators sometimes push beyond predefined limits if they think the rig will recover within the next seconds.

5.2. Analysis of the Current Level of Automation

Here we present our analysis of both the interviews and the reviewed literature. It is organized in four sections according to our research framework (Figure 4). For each section, we propose a level of automation according to [51].

5.2.1. Information Acquisition

From our interviews and the literature study, we have identified 45 different information parameters of relevance for EQD decision making. The full list of information parameters is provided in Table A1. As these parameters are of a very different nature, we have used a deductive coding approach [55] to organize the parameters by four different attributes.
The first attribute is how static (ST) or dynamic (DY) the information is. We term this attribute variability. An example of static information is the displacement, or weight, of the rig, as this does not change significantly during the development of a critical event. We also classify general operational knowledge (expertise) that is applied during the decision process as static information. Dynamic information, on the other hand, may change significantly and affect the need for an EQD, such as, for example, wave conditions.
The second attribute we identified is the time perspective, i.e., how long ahead of the decision the information may be acquired. This ranges from the very imminent, such as the occurrence of a sudden high voltage blackout, which in some cases may directly lead to an EQD in a matter of seconds. In contrast, weather forecasts are very long term, counting in hours or even days. We distinguish between three time perspectives: short-term, where information occurs or changes in a matter of seconds (S); mid-term, where the time scale is minutes (M); and long-term, measured in hours (H).
A third attribute is the source of the information. Some key information, such as the current situation on the tool deck, needs to be acquired directly by human observation (HO). Another example of such a source is irregular sounds, such as broken bearings in machinery. In contrast, other information parameters, such as riser angle or wind speed, are measured by the technology monitoring and control system, using sensors and computer processing, and are presented to the decision maker by means of instruments and displays. Thus, we label this source monitoring systems (MS). We also found that DPOs use information which originates from their own knowledge and experience, i.e., expertise (EX). An example of this is that there is a delay between thruster force being applied and the resulting movement of the rig, due to the physics of inertia and rig displacement. It should be noted that there are principal differences between knowledge and information, but for the sake of this analysis we have chosen to combine them.
The fourth attribute we have extracted is the type of operational rule for the parameter, which should lead to specific decisions or actions. Rules may be in the form of limits, such as for the position offset, where there are green, yellow, and red zones with defined threshold values (TV). There are also simpler yes/no rules (YN), for example whether motion sensors are operational or not. Other parameters have no clear rules (NR), leaving it to the decision maker to make subjective interpretations. An example of the latter is the wind direction parameter, which is, according to the WSOG presented earlier (in Table 1), to be interpreted in a “situation-specific” manner.
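The four-attribute coding scheme can be sketched as a small data structure. The parameter examples and their codes below are taken from Table A1; the class itself is our own illustrative construct, not part of any operational system.

```python
from dataclasses import dataclass

@dataclass
class InformationParameter:
    """One EQD information parameter coded along the four attributes."""
    name: str
    variability: str       # "ST" (static) or "DY" (dynamic)
    time_perspective: str  # one or more of "S" (seconds), "M" (minutes), "H" (hours)
    source: str            # "HO" (human observation), "MS" (monitoring systems),
                           # "EX" (expertise)
    rule_type: str         # "TV" (threshold values), "YN" (yes/no), "NR" (no rules)

# Two coded examples, as listed in Table A1:
position_offset = InformationParameter(
    "Position offset", variability="DY", time_perspective="S, M",
    source="MS", rule_type="TV")
thruster_sound = InformationParameter(
    "Sound from thrusters", variability="DY", time_perspective="S",
    source="HO", rule_type="NR")
```

Coding each of the 45 parameters in this way allows the tallies per attribute, reported below, to be computed directly.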
The full coding of the parameters is provided in Table A1, while Table 7 provides an overview of our findings.
We have found that 39 of the 45 information parameters are dynamic, and that 30 of the parameters affect the short-term (seconds) development of events. Most of the information is obtained by the monitoring and control systems (27 parameters), but a substantial number of information parameters (20) are also acquired directly by the human operator. Of the latter, 10 parameters are results of direct observation, including hearing, as sounds were key cues in several situations. The remaining 10 are based on knowledge and experience (expertise). All of the static information parameters (5) come from expertise (for example, that the shape of the rig affects how it behaves in the ocean), but expertise is also applied during the short-term development of events (for example, delays in position updates from the Global Positioning System, GPS, into the DP system). Further, we found that most of the information parameters need to be evaluated in a holistic and subjective manner, based on the specific situation at hand, without clear rules or thresholds guiding further action. We also identified 12 parameters which basically verify whether certain equipment is working as it should (yes or no). There were also 6 parameters for which we did not find any indication of what type of rule should be applied.
These numbers indicate that most of the information acquisition happens automatically. The monitoring and control system acquires information automatically and informs the human, which corresponds to a level of automation of 7.
At the same time, we find that the information that operators collect manually is highly critical, such as sounds (normal vs. abnormal), visual information about the outside environment (in particular the sea state and individual waves), and activities in the work area (moon pool). Thus, some direct visual and auditory information is acquired at automation level 1 (computer offers no assistance).

5.2.2. Information Analysis

Based on the above analysis, the amount of information available for analyzing the need for an EQD is very large. The information is also to a large degree dynamic, and may need to be analyzed in events that develop within seconds only. This appears to be a daunting task for the human mind.
During critical, short-term situations (seconds and minutes), our data indicate that only a handful of the parameters are used in practice. Power blackout and the position offset appear to be very central parameters for the EQD decision. We found little evidence that the monitoring and control systems provide much decision-making support for the operator. It is up to the operator to interpret the available information, while drawing on expertise and previous experience. However, the system does provide alarms when it recognizes short-term problems, along with recommendations to change status and eventually perform an EQD.
Thus, we define the short-term analysis stage without auto-EQD to correspond to level 4 (the computer suggests one alternative). Auto-EQD, on the one hand, uses only very limited information (the position offset), but on the other hand, this is a very central parameter for assessing if an EQD is needed. Auto-EQD is fully automated, but informs the operator if any actions will be taken. Thus, we find the information analysis stage when auto-EQD is applied to correspond to automation level 7.
We also found that the operators are very concerned about assessing the likely development of events, in line with Endsley’s third level (projection of future state). The “future” here means hours or maybe days, in contrast to the short-term events analyzed above. Due to the high number of parameters available, this is potentially a very complex task. We found no evidence that technology systems help with these projections, as decision support or automation during operations. Our findings indicate that these predictions are made in a purely manual manner, based on subjective assessment and the “feel” of the operators. Thus, we perceive the level of automation for long-term predictions to be very low, at level 1: the operator must do all of the assessment manually.

5.2.3. Decision Selection

The decision we address in this paper is whether an EQD should, or should not, be initiated. Thus, the alternative decisions are limited to a yes or a no. Our interview data indicate that operators are very aware of the potentially high costs of an EQD, which may influence them to push limits. As one operator stated, sometimes it is possible to “ride the position”, meaning that the rig is brought out of position by a large wave and is expected to return within limits when the wave has passed. This shows that the decision is not made based only on the current situation, but also on projections, or mental simulations, of the near-future state. When the operator is making the decision manually, the system will suggest, by means of alarms, to do an EQD if certain parameter limits are breached, indicating an automation level of 4. In contrast, if auto-EQD is activated, the level of automation is very high, at level 10, as the decision to do the EQD is taken instantly once certain parameter thresholds are reached.
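The rule-based decision logic of auto-EQD, as described by our interviewees, can be illustrated with a minimal sketch: once the position offset breaches the red limit, the disconnect sequence starts and cannot be stopped. The limit values below are hypothetical placeholders; real watch-circle limits are rig- and site-specific.

```python
# Hypothetical watch-circle thresholds (metres); real limits depend on
# water depth, riser configuration and the rig in question.
YELLOW_LIMIT_M = 5.0
RED_LIMIT_M = 10.0

def position_status(offset_m: float) -> str:
    """Map a position offset onto the green/yellow/red zones."""
    if offset_m >= RED_LIMIT_M:
        return "red"
    if offset_m >= YELLOW_LIMIT_M:
        return "yellow"
    return "green"

def auto_eqd_triggered(offset_m: float, auto_eqd_enabled: bool) -> bool:
    """Auto-EQD starts the disconnect sequence as soon as red is breached,
    regardless of how far beyond the limit the rig has moved."""
    return auto_eqd_enabled and position_status(offset_m) == "red"
```

The contrast with manual operation is visible in this sketch: a human operator may choose to “ride the position” past the red limit, whereas the automated rule fires unconditionally at the threshold.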

5.2.4. Action Implementation

Once the decision to do an EQD is reached, either manually or by auto-EQD, our data indicate that the implementation is fully automated (autonomous) and takes about 40 s to execute. This implies the highest autonomy level of 10.

5.3. Current Level of Automation in EQD

Based on these data and this analysis, we summarize our evaluation of the current level of automation in the EQD decision making process in Table 8.
We find that the current level of automation varies significantly depending on differences in information attributes, whether auto-EQD is installed and in use, and between the stages of the EQD decision process.

6. Discussion

Here we discuss our findings in terms of whether an increase in autonomy level may improve EQD as a safety barrier.

6.1. Information Acquisition

There is one issue in particular that stands out in terms of information acquisition: the position data. Correct and timely position data are related to the most central criterion for EQD, the position offset. The auto-EQD is also based on these position data, and as long as these are not fully reliable, the true position and position offset need to be continuously assessed by the operator. This monitoring and comparison process appears to be a central and demanding part of information acquisition. The lack of reliable position data also limits the potential of automation, which should not be increased from its current level. Manual control and monitoring will remain a necessity until these issues are resolved.

6.2. Analysis and Decision of Initiating EQD

Is it complex, or difficult, to analyze if there is a need to do an EQD? Our findings point in two directions, depending on the time perspective. Short-term, imminent situations occur when limits are actually breached; the time perspective is in the order of seconds. Then, actuating an EQD may be based on one single parameter, such as the position offset. If the rig has so much position offset that it has entered the red zone, procedures simply state that an EQD should be made. This type of rule-based decision making is, according to our data, successfully taken care of by automation in the form of the so-called auto-EQD. However, there is another factor adding to complexity, which is that some operators, in some situations, do make holistic assessments about not doing an EQD, despite breaching the defined limits. They may let the rig drift outside limits for a short period due to one wave, expecting the next to bring them back within limits. However, it is hard to see arguments for how automation could increase barrier quality by allowing specific limits, stated in established procedures, to be exceeded even for short periods of time. We suggest that in a short time perspective, the level of automation is appropriate by means of the existing auto-EQD system, but the reliability of the positional data applied in the system needs to be improved. This also indicates that installing auto-EQD on MODUs with current manual systems will increase the barrier quality of the EQD.
The other time perspective is the future, which includes situations that develop gradually over some minutes up to several hours. Here, the operator’s analysis is focused on projecting and anticipating what is going to happen some time ahead. This corresponds to Endsley’s third level of SA, the projection of future status, and Klein’s concepts of recognition and mental simulation of actions. We found that operators consider this the more difficult analysis, in particular the analysis of whether to make a planned disconnect, which is generally a safer and better alternative than risking an EQD. We found no evidence that the automation provides any support when making these projections, so the interpretation and processing of the large number of information parameters rely on the operators. This clearly challenges the limited human cognitive capacity [56], particularly when analysis and decisions become time critical and prioritization needs to be done [57]. Time constraints may explain why we found that only a few parameters were taken into consideration when assessing critical situations. We recognize here the three Moving Basis Heuristics [32], as well as the principles of naturalistic decision-making [33]. These decision-making strategies are superior to the analytic approach in terms of speed, but are inferior in terms of thoroughness, as (according to our findings) only a few information parameters are taken into consideration.
Based on this, we suggest that automation has the most promising potential to improve barrier quality in mid- and long-term predictions of how events will develop. Operators find it difficult to decide if a planned disconnect should be arranged instead of risking an EQD. Thus, an increased level of automation could complement the strengths of the human, naturalistic decision-making process with a more thorough analysis of the large amount of dynamic and static information available. In this way, operators could be presented with predicted scenarios along with a set of suggested decision and action alternatives.

6.3. Action Implementation

The implementation, after the decision is made, appears to us as fully autonomous, and we have no findings that indicate that there is a need to change this.

6.4. Limitations and Further Research

This study has provided some insights into the EQD decision-making process, but we cannot claim that this analysis is sufficient to understand all the potential and limitations of increased automation. In particular, we were surprised by the large number of seemingly relevant information parameters we detected, and their variation in terms of different attributes. We believe we provide a significant contribution to improving the EQD decision-making process, but acknowledge that there is a need for more in-depth analysis of many of the issues we have raised in this paper. This relates, in particular, to RQ3: how the operators analyze all the available information, including the so-called “situation dependent” parameters.
We find the framework of Parasuraman et al. useful and applicable on a conceptual level, but when going into detail, it became challenging to connect the complexities of EQD decision making with the model. In particular, the model provides no systematic way to handle different time perspectives (i.e., imminent, short-term, mid-term, and long-term), nor the iterative nature of situation awareness. This has major practical implications for a decision such as EQD, in which time and projections of the future state are prominent factors.

7. Conclusions

Petroleum-related accidents in sensitive marine environments inherently possess critical consequences for humans, marine life, economies, and societies. This article has explored the feasibility of increased automation for improving the quality of EQD as a safety barrier in offshore drilling operations. With autonomy and autonomous systems on the rise, this work contributes to applying existing theoretical frameworks on the level of automation, to analyze both potential benefits, as well as drawbacks, before deploying them in complex operations.
Deciding if an EQD is necessary may be, despite its critical nature, a relatively simple decision if a specific key information parameter—the position offset—is reliable and is exceeded. In these situations, operators may also have the aid of the highly automated auto-EQD system (if this is installed on the rig). On the other hand, it also may be a very complex decision to make, if parameters involved are situation dependent, and thus calls upon the subjective assessment of the operator (team). We have identified 45 relevant information parameters that may be included in such an assessment, and also found that the time for information analysis may be very restricted. Most of these parameters seem to be intended for subjective operator assessment.
Our findings indicate that the current automation for situations that actually reach a state of emergency is working as intended, due to the centrality of the position offset parameter. Auto-EQD is appreciated by the operators, and they are well aware of its weaknesses, which are mostly related to the unreliability of the positional data (due to erroneous input and time delays). However, there is little automation to support mid- and long-term projections, in particular the decision of whether to initiate a planned disconnect or continue operations. Our findings indicate that the latter type of analysis and decision is the most complex for operators. Thus, we suggest increasing the level of automation in this phase, in order to better support operators with presentations of alternative scenarios, including suggestions regarding decisions and actions.

Acknowledgments

This research has been supported by the MARKOM2020-project, funded by the Norwegian Ministry of Education and Research.

Author Contributions

Imset Marius has conceived the design of the study and written the main body of the work. Kjørstad Marianne has collected and presented interview data, and Falk Kristin has contributed with literature review and subject-matter expertise in EQD operations. Nazir Salman has contributed with literature review and editing.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Complete Table with Identified Information Parameters and Their Attributes.
Abbreviations: Variability: Static (ST), Dynamic (DY). Time Perspective: Seconds (S), Minutes (M), Hours (H). Parameter Source: Human Observation (HO), Monitoring Systems (MS), Expertise (EX). Operational Rules: Threshold Values (TV), Yes/No (YN), No Rules (NR).

No | EQD Information Parameter | Explanation | Variability | Time Perspective | Parameter Source | Operational Rules
1 | HV (high voltage) switchboards/blackout | Loss of power means that the MODU cannot hold its position | DY | S | HO, MS | YN
2 | Position offset | Offset from intended position increases strain on riser system and well (deeper water provides more time) | DY | S, M | MS | TV
3 | Heading offset | Offset due to current and winds, might affect how easy it is to operate vessel | DY | S | MS | NR
4 | Power consumption each network (3-split configuration) | Helps operator to detect anomalies in critical systems | DY | S | MS | NR
5 | Thruster force on each switchboard | Thruster force is the means for the vessel to move in the wanted direction | DY | S | MS | NR
6 | Position reference available | Needed to keep the vessel in position | DY | S | MS | YN
7 | DP control system operational | A malfunction in the DP system can compromise control of the rig position | DY | S | MS | YN
8 | Wind sensors operational | Wind force may exceed the power of the thrusters, leading to loss of position | DY | M | MS | YN
9 | Motion sensors (MRU) operational | Motion sensor is used for aiding and compensation of vessel motions | DY | S | MS | YN
10 | Heading sensors (Gyro) operational | Provides the heading information from which the DP system computes steering commands | DY | S | MS | YN
11 | DP-UPS operational | UPS offers power protection and energy backup to DP control system | DY | S | MS | YN
12 | IAS System, DP Network operational | IAS is the Integrated Automation System, or the "brain" in the DP system | DY | S | MS | YN
13 | Fire Alarm System status | Malfunction may lead to increased response time in case of fire | DY | S | MS | NR
14 | Communication Systems | Malfunction makes it difficult to coordinate team effort and distribute information | DY | S | MS | YN
15 | Riser limitation UFJ | The UFJ (Upper Flex Joint) is a part of the riser, and indicates the riser strength | DY | S | MS | Not known
16 | Riser limitation LFJ | The LFJ (Lower Flex Joint) is a part of the riser, and indicates the riser strength | DY | S | MS | Not known
17 | Riser angle | The angle between the vertical plane and the riser | DY | S | MS | Not known
18 | Wind speed | Wind force may exceed the power of the thrusters, leading to loss of position | DY | M | MS | NR
19 | Wave height | Increased forces from waves may compromise position, stability and deck operations | DY | M | HO | Not known
20 | Heading Deviation from BOP Landing | Affects the behaviour of the rig at a certain moment in time, helping in assessing if operations may continue or EQD is needed | DY | M | MS | Not known
21 | Presence of unshearable tools/equipment in BOP | May prevent a safe and effective EQD | DY | M | MS | YN
22 | GPS raw data available | May indicate delays or imprecisions in the DP system data | DY | S | MS | YN
23 | Other sensor data/parameters (unspecified) | Other information from sensors and instruments, details not known | Not known | Not known | MS | Not known
24 | Sound from thrusters | May help indicate malfunction | DY | S | HO | NR
25 | Sound from bearings | May help indicate malfunction | DY | S | HO | NR
26 | Sound of generators | May help indicate malfunction | DY | S | HO | NR
27 | Position of crew/people | May influence communication and decisions | DY | S | HO, EX | NR
28 | Presence of hydrocarbons on deck | Procedures can be different if there are hydrocarbons on deck | DY | S | MS | NR
29 | Presence of tools/equipment in moonpool | May prevent a safe and effective EQD | DY | S | HO | NR
30 | Areas to be cleared of people if EQD | Moving parts during EQD process may injure people in close proximity | DY | M | EX | NR
31 | Time delay of position updates in DP system | May allow the rig to move beyond limits without proper alarms being activated | DY | S | EX | NR
32 | Delay from activation of thrusters until effect on rig movement (inertia) | Indicates need to anticipate and take preventive action | DY | S | EX | NR
33 | Different systems have different limits | Allows for subjective evaluation of parameters and overall situation | ST | Not relevant | EX | NR
34 | Speed of the rig | Affects the behaviour of the rig at a certain moment in time, helping in assessing if operations may continue or EQD is needed | DY | S | MS | NR
35 | Vessel displacement (weight) | Affects delays in rig movement when exposed to force (acceleration and retardations) | ST | Not relevant | EX | Not relevant
36 | Vessel shape | Affects how rig responds to forces from wind, currents and thrusters | ST | Not relevant | EX | Not relevant
37 | Vessel translation and rotation (3 dimensions) | The resulting behaviour of the rig based on the overall weather situation, helping in assessing if operations may continue or EQD is needed | DY | S | HO | Not relevant
38 | Mechanical properties of riser and BOP (joints, tensioners, connectors, casings) | Provides limitations for vertical and horizontal movement and forces from the rig | ST | Not relevant | EX | Not relevant
39 | Soil model | Provides limitations for vertical and horizontal movement and forces from the rig | ST | Not relevant | EX | Not relevant
40 | Sea current | Resulting forces may exceed the power of the thrusters, leading to loss of position | DY | H | MS | NR
41 | Weather forecast | Helps project to what degree the rig will be exposed to forces from wind, waves and currents | DY | H | Other | NR
42 | High cost of EQD | Operators are aware of the potentially high cost of initiating an EQD, which may delay the decision during a critical incident, or make them push beyond predefined limits | DY | H | EX | NR
43 | Emergencies or blackout on nearby vessels | May drift towards the rig and become a collision hazard | DY | S | HO | NR
44 | Confirmed fire | Depends on location and severity, OIM to evaluate | DY | S | HO, MS | NR
45 | Uncontrollable situation in the well | Normally try to get control using the well control procedures or diverter. EQD activated if the well threatens the safety of the MODU | DY | S | MS | NR

References

  1. Pate-Cornell, M.E. Learning from the piper alpha accident: A postmortem analysis of technical and organizational factors. Risk Anal. 1993, 13, 215–232. [Google Scholar] [CrossRef]
  2. Holen, A. The North Sea oil rig disaster. In International Handbook of Traumatic Stress Syndromes; Springer: New York, NY, USA, 1993; pp. 471–478. [Google Scholar]
  3. Peterson, C.H.; Rice, S.D.; Short, J.W.; Esler, D.; Bodkin, J.L.; Ballachey, B.E.; Irons, D.B. Long-term ecosystem response to the Exxon Valdez oil spill. Science 2003, 302, 2082–2086. [Google Scholar] [CrossRef] [PubMed]
  4. Shapiro, K.; Khanna, S.; Ustin, S.L. Vegetation impact and recovery from oil-induced stress on three ecologically distinct wetland sites in the Gulf of Mexico. J. Mar. Sci. Eng. 2016, 4, 33. [Google Scholar] [CrossRef]
  5. White, H.K.; Hsing, P.Y.; Cho, W.; Shank, T.M.; Cordes, E.E.; Quattrini, A.M.; Nelson, R.K.; Camilli, R.; Demopoulos, A.W.; German, C.R.; et al. Impact of the Deepwater Horizon oil spill on a deep-water coral community in the Gulf of Mexico. Proc. Natl. Acad. Sci. USA 2012, 109, 20303–20308. [Google Scholar] [CrossRef] [PubMed]
  6. Grattan, L.M.; Roberts, S.; Mahan, W.T., Jr.; McLaughlin, P.K.; Otwell, W.S.; Morris, J.G., Jr. The early psychological impacts of the Deepwater Horizon oil spill on Florida and Alabama communities. Environ. Health Perspect. 2011, 119, 838–843. [Google Scholar] [CrossRef] [PubMed]
  7. European Commission. Seveso II, Council Directive 96/82/EC of 1996. 1996. Available online: http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:01996L0082-20120813&from=EN (accessed on 28 November 2017).
  8. European Parliament, Council of the European Union. The Machinery Directive 2006/42/EC. EC of 2006. 2006. Available online: http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32006L0042&rid=1 (accessed on 28 November 2017).
  9. International Electrotechnical Commission (IEC). IEC 61508. In Functional Safety of Electrical/Electronic/Programmable Electronic Safety-Related Systems; International Electrotechnical Commission (IEC): Geneva, Switzerland, 2010. [Google Scholar]
  10. Sklet, S. Safety barriers: Definition, classification, and performance. J. Loss Prev. Process Ind. 2006, 19, 494–506. [Google Scholar] [CrossRef]
  11. Cabrera-Miranda, J.M.; Jeom Kee, P. On the probabilistic distribution of loads on a marine riser. Ocean Eng. 2017, 134, 105–118. [Google Scholar] [CrossRef]
  12. Bray, D. Dynamic Positioning; Oilfield seamanship; Oilfield Publications Limited: Ledbury, UK, 1999; Volume 9. [Google Scholar]
  13. Sørensen, A.J.; Leira, B.; Strand, J.P.; Larsen, C.M. Optimal setpoint chasing in dynamic positioning of deep-water drilling and intervention vessels. Int. J. Robust Nonlinear Control 2001, 11, 1187–1205. [Google Scholar] [CrossRef]
  14. Verhoeven, H.; Chen, H.; Moan, T. Safety of Dynamic positioning Operation on Mobile Offshore Drilling Units. In Proceedings of the Dynamic Positioning (DP) Conference, Houston, TX, USA, 28–30 September 2004. [Google Scholar]
  15. Bhalla, K.; Cao, Y.S. Watch Circle Assessment of Drilling Risers during a Drift-Off and Drive-Off Event of a Dynamically Positioned Vessel. Presented at the Dynamic Positioning Conference, Houston, TX, USA, 15–16 November 2005; Marine Technology Society: Washington, DC, USA, 2005. [Google Scholar]
  16. Chen, H.; Moan, T.; Verhoeven, H. Safety of dynamic positioning operations on mobile offshore drilling units. Reliab. Eng. Syst. Saf. 2008, 93, 1072–1090. [Google Scholar] [CrossRef]
  17. Det Norske Veritas and Germanischer Lloyd. DNVGL-RP-E307: Dynamic Positioning Systems—Operation Guidance; Recommended Practice: Oslo, Norway, 2015. [Google Scholar]
  18. International Marine Contractors Association (IMCA). Dynamic Positioning Station Keeping Incidents—Incidents Reported from 2000–2016; IMCA: London, UK, 2016. [Google Scholar]
  19. Costa, M.S.R.; Machado, G.B. Analyzing Petrobras DP Incident. Presented at the Dynamic Positioning Conference, Houston, TX, USA, 17–18 October 2006; Marine Technology Society: Washington, DC, USA, 2006. [Google Scholar]
  20. Hauff, K.S. Analysis of Loss of Position Incidents for Dynamically Operated Vessels. Master’s Thesis, Department of Marine Technology, Norwegian University of Science and Technology (NTNU), Trondheim, Norway, 2014. [Google Scholar]
  21. Broyde, H.; Falk, K.; Arntzen, A.A.B. Autonomous Security System Specification within Oil and Gas Industry for Offshore Vessel: Requirement and Concept. Presented at the Society for Design and Process Science (SDPS), Birmingham, AL, USA, 5–8 November 2017. [Google Scholar]
  22. Martinsen, T.J.S. Characteristics of Critical Incidents in Dynamic Positioning. Master’s Thesis, Høgskolen i Vestfold, Horten, Norway, 2013. [Google Scholar]
  23. Pedersen, R.N. QRA Techniques on Dynamic Positioning Systems During Drilling Operations in the Arctic: With Emphasis on the Dynamic Positioning Operator. Master’s Thesis, UiT the Arctic University of Norway, Tromsø, Norway, 2015. [Google Scholar]
  24. Bye, A.; Laumann, K.; Taylor, C.; Rasmussen, M.; Øie, S.; van de Merwe, K.; Øien, K.; Boring, R.L.; Paltrinieri, N.; Wærø, I.; et al. The Petro-HRA Guideline (IFE/HR/F-2017/001); Institute for Energy Technology: Halden, Norway, 2017. [Google Scholar]
  25. Pareto, V. Manual of Political Economy; Kelley: New York, NY, USA, 1927. [Google Scholar]
  26. Barthélemy, J.P.; Bisdorff, R.; Coppin, G. Human centered processes and decision support systems. Eur. J. Oper. Res. 2002, 136, 233–252. [Google Scholar] [CrossRef]
  27. Saaty, T.L. Decision making with the analytic hierarchy process. Int. J. Serv. Sci. 2008, 1, 83–98. [Google Scholar] [CrossRef]
  28. Simon, H.A. A behavioral model of rational choice. Q. J. Econ. 1955, 69, 99–118. [Google Scholar] [CrossRef]
  29. Simon, H.A. Human nature in politics: The dialogue of psychology with political science. Am. Political Sci. Rev. 1985, 79, 293–304. [Google Scholar] [CrossRef]
  30. Simon, H.A. Theories of decision-making in economics and behavioral science. Am. Econ. Rev. 1959, 49, 253–283. [Google Scholar]
  31. Tversky, A.; Kahneman, D. Judgment under uncertainty: Heuristics and biases. In Utility, Probability, and Human Decision Making; Springer: Dordrecht, The Netherlands, 1975; pp. 141–162. [Google Scholar]
  32. Barthélemy, J.P.; Mullet, E. Choice basis: A model for multi-attribute preference. Br. J. Math. Stat. Psychol. 1986, 39, 106–124. [Google Scholar] [CrossRef] [PubMed]
  33. Klein, G. Naturalistic decision making. Hum. Factors 2008, 50, 456–460. [Google Scholar] [CrossRef] [PubMed]
  34. Naderpour, M.; Nazir, S.; Lu, J. The role of situation awareness in accidents of large-scale technological systems. Process Saf. Environ. Prot. 2015, 97, 13–24. [Google Scholar] [CrossRef]
  35. Endsley, M.R. Toward a theory of situation awareness in dynamic systems. Hum. Factors 1995, 37, 32–64. [Google Scholar] [CrossRef]
  36. Øvergård, K.I.; Sorensen, L.J.; Nazir, S.; Martinsen, T.J. Critical incidents during dynamic positioning: Operators’ situation awareness and decision-making in maritime operations. Theor. Issues Ergon. Sci. 2015, 16, 366–387. [Google Scholar] [CrossRef]
  37. Hilbert, M.; López, P. The world’s technological capacity to store, communicate, and compute information. Science 2011, 332, 60–65. [Google Scholar] [CrossRef] [PubMed]
  38. Kidd, A. (Ed.) Knowledge Acquisition for Expert Systems: A Practical Handbook; Springer Science & Business Media: Berlin, Germany, 2012. [Google Scholar]
  39. Liao, S.H. Expert system methodologies and applications—A decade review from 1995 to 2004. Expert Syst. Appl. 2005, 28, 93–103. [Google Scholar] [CrossRef]
  40. Hsu, F.H. IBM’s Deep Blue chess grandmaster chips. IEEE Micro 1999, 19, 70–81. [Google Scholar]
  41. Tiwari, V.; Keskar, A.; Shivaprakash, N.C. Towards creating a reference based self-learning model for improving human machine interaction. CSI Trans. ICT 2017, 5, 201–208. [Google Scholar] [CrossRef]
  42. Silver, D.; Schrittwieser, J.; Simonyan, K.; Antonoglou, I.; Huang, A.; Guez, A.; Hubert, T.; Baker, L.; Lai, M.; Bolton, A.; et al. Mastering the game of go without human knowledge. Nature 2017, 550, 354–359. [Google Scholar] [CrossRef] [PubMed]
  43. Wiener, E.L.; Curry, R.E. Flight-deck automation: Promises and problems. Ergonomics 1980, 23, 995–1011. [Google Scholar] [CrossRef]
  44. Billings, C.E. Aviation Automation: The Search for a Human-Centered Approach; CRC Press: Boca Raton, FL, USA, 1997. [Google Scholar]
  45. Sarter, N.B.; Woods, D.D.; Billings, C.E. Automation surprises. In Handbook of Human Factors and Ergonomics; Wiley: Hoboken, NJ, USA, 1997; Volume 2, pp. 1926–1943. [Google Scholar]
  46. Norman, D.A. The ‘problem’ with automation: Inappropriate feedback and interaction, not ‘over-automation’. Philos. Trans. R. Soc. Lond. B Biol. Sci. 1990, 327, 585–593. [Google Scholar] [CrossRef] [PubMed]
  47. Strauch, B. Ironies of Automation: Still Unresolved After All These Years. IEEE Trans. Hum. Mach. Syst. 2017. [Google Scholar] [CrossRef]
  48. Onken, R.; Schulte, A. System-Ergonomic Design of Cognitive Automation: Dual-Mode Cognitive Design of Vehicle Guidance and Control Work Systems; Springer: Berlin, Germany, 2010. [Google Scholar]
  49. Fitts, P.M. Human Engineering for an Effective Air-Navigation and Traffic-Control System; National Research Council: Washington, DC, USA, 1951. [Google Scholar]
  50. De Winter, J.C.; Dodou, D. Why the Fitts list has persisted throughout the history of function allocation. Cogn. Technol. Work 2014, 16, 1–11. [Google Scholar] [CrossRef]
  51. Parasuraman, R.; Sheridan, T.B.; Wickens, C.D. A model for types and levels of human interaction with automation. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 2000, 30, 286–297. [Google Scholar] [CrossRef]
  52. Vagia, M.; Transeth, A.A.; Fjerdingen, S.A. A literature review on the levels of automation during the years. What are the different taxonomies that have been proposed? Appl. Ergon. 2016, 53, 190–202. [Google Scholar] [CrossRef] [PubMed]
  53. Patton, M.Q. Enhancing the quality and credibility of qualitative analysis. Health Serv. Res. 1999, 34, 1189. [Google Scholar] [PubMed]
  54. Kvale, S. Doing Interviews (Qualitative Research Kit); Sage Publications Ltd.: London, UK, 2008. [Google Scholar]
  55. Bowen, G.A. Document analysis as a qualitative research method. Qual. Res. J. 2009, 9, 27–40. [Google Scholar] [CrossRef]
  56. Marois, R.; Ivanoff, J. Capacity limits of information processing in the brain. Trends Cogn. Sci. 2005, 9, 296–305. [Google Scholar] [CrossRef] [PubMed]
  57. Véronneau, S.; Cimon, Y. Maintaining robust decision capabilities: An integrative human–systems approach. Decis. Support Syst. 2007, 43, 127–140. [Google Scholar] [CrossRef]
Figure 1. Rig connected to the well by marine riser string elements. Not to scale. Green dotted circle indicates safe operating area and red dotted circle the limits where EQD is needed. (Source: Authors).
Figure 2. Endsley’s model of SA (reproduced with permission from [36], copyright Øvergård et al., 2015).
Figure 3. Our understanding and definition of automation vs. autonomy.
Figure 4. Our research framework for identifying and improving the level of automation in EQD.
Table 1. Well-specific operating guidelines, with examples of parameters containing precise and imprecise (“situation-specific”) criteria.

| | Green | Blue | Yellow | Red |
|---|---|---|---|---|
| Label | Normal operations | Advisory status | Reduced status | Emergency status |
| Definition | All systems are fully operational. Operations are proceeding within acceptable limits. | Operations are approaching performance or alarm limits. Operations may continue, but risk must be assessed. And/or there is a failure that does not affect redundancy. | Operational or performance limits are reached, and/or there are component or system failures that result in loss of redundancy. The vessel is not out of position. | Pre-defined operational or performance limits are exceeded, and/or there are critical component or system failures. Loss of control or position. |
| Response | Meet all conditions in the green column in order to commence operations. | Conduct a risk assessment to determine if operations should be adjusted or ceased. | Stop operations, initiate contingency procedures and prepare to disconnect. Operations may be resumed if redundancy is recovered and all operational risks have been assessed. | Cease operations. Take immediate action and initiate the EQD sequence. Ensure the safety of people, the environment, the vessel and equipment. |
| Example 1: DP position footprint | <5 m | >5 m | 10 m | 15 m |
| Example 2: Riser limitation UFJ | 0–1.5 deg. | 2 deg. | Situation-specific | Situation-specific |
| Example 3: Wind direction | Situation-specific | Situation-specific | Situation-specific | Situation-specific |
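The precise criteria in Table 1 lend themselves to direct rule-based classification, while the situation-specific ones do not. As a minimal sketch, the DP position footprint thresholds of Example 1 could be encoded as follows (Python; the function name is ours, the thresholds are the table's example values only, and real limits are well-specific):

```python
def dp_footprint_status(offset_m: float) -> str:
    """Map a DP position footprint (Example 1 in Table 1) to an
    operational status colour. The band boundaries between blue,
    yellow and red are our reading of the example values."""
    if offset_m < 5.0:
        return "green"   # normal operations
    if offset_m < 10.0:
        return "blue"    # advisory: assess risk
    if offset_m < 15.0:
        return "yellow"  # reduced: prepare to disconnect
    return "red"         # emergency: initiate EQD sequence
```

A classifier like this covers only the threshold-value parameters; the situation-specific cells (e.g., wind direction) have no such closed-form rule.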
Table 2. Example of a rational decision-making technique, illustrating the systematic approach.

Process steps:
1. Define the problem and what kind of knowledge is needed.
2. Make a decision hierarchy, starting with the goal of the decision at the top, then state the objectives from a broad perspective at the intermediate levels. Formulate criteria and draw lines representing dependencies to subsequent elements. The lowest level should represent a set of alternatives.
3. Make matrices allowing for pairwise comparison. Upper-level elements should be used to compare elements in the level below.
4. Generate priorities from the comparisons, and use them to weight the priorities in the level below.
5. For each element in the level below, apply its weighted values to obtain its global priority. Continue this process of weighting and prioritizing throughout the hierarchy down to the alternatives in the bottommost level.
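The procedure in Table 2 resembles Saaty's Analytic Hierarchy Process [27]. A common approximation for step 4, deriving priorities from a pairwise comparison matrix by column normalisation and row averaging, can be sketched as follows (Python; the example judgments are invented purely for illustration):

```python
def ahp_priorities(matrix):
    """Approximate the priority vector of a pairwise comparison
    matrix (steps 3-4 in Table 2): normalise each column to sum
    to 1, then average each row. Returns weights summing to 1."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    normalised = [[matrix[i][j] / col_sums[j] for j in range(n)]
                  for i in range(n)]
    return [sum(row) / n for row in normalised]

# invented example: alternative A judged 3x as preferable as B
# and 5x as preferable as C (reciprocals fill the lower triangle)
weights = ahp_priorities([[1,   3,   5],
                          [1/3, 1,   2],
                          [1/5, 1/2, 1]])
```

The resulting weights are then propagated down the hierarchy as described in step 5, each level weighting the priorities of the level below.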
Table 3. Challenges in automation.

| Phenomenon | Description |
|---|---|
| Brittleness | Modern socio-technical systems may be so complex that it is almost impossible to define all relevant functions and alternatives, as well as the scope of system limits and relevant interfaces with other systems. |
| Opacity | Technology systems have limited capability to express and explain what they are doing, and what they are planning to do next, to the human operator. |
| Literalism | Automata stick to the rules and instructions given by their programmers or operators (the process), even if these may lead to obviously undesired outcomes (lack of goal orientation). |
| Clumsiness | The system has little understanding of the work situation of the operator, and thus does not aid when needed or call for attention when operator workload is very high. |
| Data overload | The system produces large amounts of information, of which only a small part is useful for the operator. The situation may also be the opposite: the system does not produce information that would obviously be helpful from the perspective of the operator. |
Table 4. Profiles of interviewees.

| Role | Vessel | EQD System | Water Depths (m) | Years of Experience | No. of EQDs Experienced |
|---|---|---|---|---|---|
| OIM | DP3 rig | Manual | 250–500 | 16 | 1 |
| DPO | DP3 rig | Automatic | 310–360 | 5 | 1 |
| DPO | DP3 ship | Manual | 150–200 | 8 | 0 |
| Expert Drilling Advisor | DP3 rig | Manual | 100–1700 | >20 | 0 |
Table 5. Interview findings of information parameters used for assessing the need for EQD.

| No. | Parameter | Input from Role |
|---|---|---|
| 1 | Red lamp and sound alarm | All |
| 2 | DP system failure | All |
| 3 | Rig moving towards positional limits | All |
| 4 | Weather conditions (current situation and weather forecast) | All |
| 5 | Blackout on supply vessel nearby (lost communication with supply vessel, and vessel drifted towards the rig) | OIM |
| 6 | Confirmed fire (depends on where, OIM to evaluate) | OIM, Expert |
| 7 | Failure on machinery or thrusters | OIM |
| 8 | Unshearable tubulars inside the BOP upon emergency | All |
| 9 | Sudden failure of heave compensation detected by the operator (by sound) | OIM |
| 10 | Limits specified in the Well Specific Operational Guideline (WSOG), procedure placed on the DP console at all times | DPO drill rig |
| 11 | GPS raw data (used when the DP system failed to read the sudden change in position due to a large wave that pushed the rig out of position) | DPO drill rig |
| 12 | DP system reference failure (sensor failure; DPO calls for EQD without reaching the position limit due to not knowing the position of the rig) | DPO drill rig |
| 13 | Communication between driller/DPO/OIM upon emergency (you would like to know that the driller is aware of the situation; preferably the driller should activate the EQD) | DPO drill rig, Expert |
| 14 | DP drive off/drift off/blackout | DPO drill ship |
| 15 | Detection of hydrocarbons on deck | OIM, DPO drill ship |
| 16 | Wind sensor from DP system | Expert |
| 17 | Uncontrollable situation in the well | Expert |
Table 6. Interview findings about the use of auto-EQD.

OIM:
- “I think it is positive that we do not have auto-EQD, so we can move slightly outside the limit of 15 m.” … “need to keep a cool head to let it (the rig) drift outside the red circle”.
- “If installed, I will trust the auto-EQD system” … “If required by the oil company, we will install it”.
- “A drift off goes very fast, auto-EQD provides safety”.
- “Have not heard frustrations from those (rigs) using auto-EQD”.
- “If operating in deep waters you have plenty of time (to decide), but at shallow water depths there is almost no time”.

DPO drill rig:
- “This is the system (auto-EQD) that I know of today, and I am satisfied with it”.
- “Have heard several times that the DP loses references, and the DPO pushes the button (starts EQD) before breaching predefined limits”.
- “Auto-EQD is there to improve safety”.
- “The operator does not need to defend the decision of doing the EQD—if limits are exceeded, the system initiates”.
- “If you see that the rig is moving outside the limits, and you keep a clear mind, it is possible to disable the auto-EQD. This is not according to procedure. Has been discussed, but cannot be done without changing the procedure”.

DPO drill ship:
- “If you have a drive off instead of a drift off…or a black out, then an auto-EQD would be good”.
- “In some situations, it is good to be able to chase the position (slightly exceed the limits)”.
- “Could be good to combine auto-EQD with type of situation, drive off, drift off, black out, hydrocarbons on deck…”.
- “What is good with the auto-EQD, is that the company has already made the decision (to disconnect)”.

Expert:
- “A reliable auto-EQD will increase the safety”.
- “A more automatic system will reduce the probability of failing to disconnect due to the angle”.
- “My concern is that if you include more software and more automatic systems, failure can arise and there is no one to interpret it, and suddenly you disconnect without intention”.
- “I would have worked on a rig with auto-EQD, but I would also have asked some questions…how do you do maintenance, when was it checked the last time, DP trial…”.
- “The challenge with the manual system is that you have to trust humans all the way, and the communication, the communication needs to be interpreted correctly”.
Table 7. Summary table of information parameters, coded with EQD information attributes (note that some parameters belong to more than one category, so the sum for each category may be larger than 45).

| Variability | Time Perspective | Parameter Source | Operational Rules |
|---|---|---|---|
| Static (ST): 5 | Seconds (S): 30 | Human observation (HO): 10 | Threshold values (TV): 1 |
| Dynamic (DY): 39 | Minutes (M): 7 | Monitoring systems (MS): 27 | Yes/No rule (YN): 12 |
| Not known: 1 | Hours (H): 3 | Expertise (EX): 10 | No clear rules (NR): 22 |
| | Not known: 1 | Other: 1 | Not known: 6 |
| | Not relevant: 5 | | Not relevant: 5 |
Table 8. Summary of current level of EQD automation, based on our research framework. Level 1 means that the system offers no assistance; humans must take all decisions and actions. Level 4 means that the system suggests one alternative. Level 7 means that the system executes automatically, then informs the human. Level 10 means the system acts autonomously, ignoring the human.

| Information Acquisition | Information Analysis | Decision Selection | Action Implementation |
|---|---|---|---|
| 7 (from HMI) | 7 (with auto-EQD) | 10 (with auto-EQD) | 10 (fully automated sequence) |
| 1 (manual observation) | 4 (without auto-EQD) | 4 (without auto-EQD) | |
| | 1 (long term) | | |

Share and Cite

MDPI and ACS Style

Imset, M.; Falk, K.; Kjørstad, M.; Nazir, S. The Level of Automation in Emergency Quick Disconnect Decision Making. J. Mar. Sci. Eng. 2018, 6, 17. https://doi.org/10.3390/jmse6010017
