**1. Introduction**

If it were possible for all organizations to learn effective lessons from the past, the effects of future unwelcome events might be limited [1]. Aviation safety depends to a large extent on the efficacious efforts of all involved in the system [2]. Research has acknowledged the importance of event information when it comes to learning and preventing recurrence [3]. Thankfully, major events such as accidents are becoming less frequent, and they therefore generate fewer points for learning [4]. In contrast, there are numerous incidents with less severe consequences and, if appropriately considered, these could offer an earlier insight into the circumstances that enable unwelcome events. Predefined and relevant information harvested from incident reporting systems is a major element of learning and of preserving acceptable levels of safety. Hobbs and Williamson [5] highlight the importance of aircraft maintenance staff being aware of the cumulative effect of "seemingly insignificant" incidents, as this amplifies the need to be proactive when it comes to learning from incidents. This research undertook a qualitative examination of staff involved in aircraft maintenance and continuing airworthiness operations in order to identify factors that could augment learning from incidents within this industry sector.

In the areas of continuing airworthiness and aircraft maintenance, safety management systems include incident and occurrence reporting [6] as an obligation. It is common for incidents to be discovered within organizations and reported with the assistance of such "systems of systems" [7]. On an operational level, initial training on human factors and company procedures is intended to specify and re-affirm the category and type of occurrence and incident that should be reported. Recent developments in European Union (EU) regulations [8] empower voluntary and confidential reporting and are independent of all other individual obligations. Detecting and identifying hazards highlighted through incident reporting systems is also recommended by the International Civil Aviation Organization (ICAO) standards and recommended practices as an effective means of augmenting levels of safety. However, Gerede [9] strongly suggests that a failure to foster a just culture has a negative impact upon effective data collection (reporting), organizational learning and the subsequent ability to learn from incidents.

Drupsteen and Wybo [10] reaffirm that organizations use experience gained from past events in order to improve safety. Effective learning can be considered a successful translation of safety information into knowledge. Utilizing information from events with learning potential can actively improve the operating environment and help prevent recurrence. Learning in this context is often experienced as modifying or implementing new knowledge in which cultural, technical or procedural elements are integrated. Therefore, when learning is transformed into measures to prevent recurrence, an organization often has a reasonable means of mitigating future similar events. Argyris and Schön [11] highlight the importance of learning to detect errors and to develop effective responses to them. Their "theory in action" concept is the focal point for this determination. The first of its two components, "theory in use", is the one that guides a person's behavior. It is often "tacit" and describes how people behave routinely; very often these observed "habits" are unknown to the individual. The second component, known as "espoused theory", is what people say or think they do. Drupsteen and Guldenmund [12] note that espoused theory comprises "the words we use to convey what we do, or what we like others to think we do".

However, it is important to re-affirm the linkages that exist between individual and organizational learning. The introduction of safety management systems (SMS) has initiated a shift in how organizational errors are viewed. Firstly, equipment has become increasingly reliable, but human performance has not improved at the same rate. Secondly, the impact of the complexities associated with an increasing cognitive load for staff is only beginning to be realized. The potential for blaming an individual is now being weighed against organizational responsibilities. Prior to this, event causation was often misrepresented, or the human input was over-quantified, as organizational factors were not always considered. Safety management systems offer an insight into the connection between individual actions and organizational initiatives designed to secure the best safety outcomes. Fogarty et al. [13] also recognize the role that both individual and organizational factors have in human error and the input both can have in preventing recurrence.

ICAO Doc 9859 [14] defines a template for aviation operators and regulators to support the application of a variety of proactive, predictive and reactive oversight methodologies. In addition to routine monitoring schemes, voluntary and mandatory reporting and post-incident follow-up, there are also regular safety oversight audits. These audits and inspections often set out to establish whether there is a difference between espoused theory and the theory in use (e.g., is the task being correctly performed in accordance with the documented procedure/work instruction, or is there a deviation from approved data and practice?). However, Drupsteen and Guldenmund [12] caution auditors not to "focus too much on the documentation of procedures" alone. In such cases, the oversight audit may be ineffective because it focuses solely on the espoused theories of the organization and not on the theory in use. These authors translate this idea of poor focus on theory in action into a valid learning component arising from incidents. They also highlight the "espoused" aspect, where those attempting to learn from incidents often fail to experience the desired learning because outcomes are not fully aligned with the practical objectives of a learning from incidents (LFI) initiative. For learning to be most effective, espoused theory and theory in use should be reasonably well aligned. Ward et al. [15] propose that it is necessary to further develop an operational model that can account for "what is meant to happen and what actually happens".

Continuing airworthiness and aircraft maintenance activities performed in EU member states are subject to rules that mandate the reporting of defined issues. Repositories of reported data tend to be populated by sources that are predominantly the subject of mandatory reporting requirements. Conventional safety oversight models also only verify the presence of reporting media and repositories in this segment of the industry. Jacobsson et al. [16] acknowledge the degree of interest invested in learning from incidents but question its efficiency in some organizations. Although unwelcome events are less prevalent, less severe events still provide learning opportunities. Organizations often focus primarily on reporting in line with each state's own reporting obligations. Unfortunately, a narrow focus on this single element of an incident's lifecycle can negate the potential benefits of learning from incidents at an organizational level. The absence of clearly defined competency requirements [6] that support a pedagogy for learning from incidents for continuing airworthiness staff could also be considered an impediment to effective learning in the domain.

The featured industry sector is regulated through the application and upkeep of numerous requirements in the jurisdictions of operation. In general, a costly regulatory overhead tends to be carried by regulating states and operators to support safe and viable activity. However, a growing tendency to increase regulatory requirements in pursuit of safer activity across the segments may not always offer the same returns as previously realized by states. Brunel [17] (p. 45) suggests, "... it is impossible to make men perfect: the men will always remain the same as they are now and no legislation will make him have more presence of mind ...". Furniss et al. [18] reviewed Hollnagel's [19] Functional Resonance Analysis Method (FRAM), which explores how functional variability resonates within systems, i.e., how well constituent elements function in a system. They also consider how FRAM can be modified to support complex socio-technical system improvements. Perhaps, as the paradigm supporting the linearity of regulatory oversight shifts, proactive regulatory inputs will also influence more effective safety outputs as intricacy increases.
