### *6.5. Aircraft Swap*

After progressing with their planned course of action to complete checks on other aircraft, the maintenance crew took a break from work and then obtained the equipment and supplies required to replenish the IDG oil levels of the accident aircraft. The two technicians then went to another aircraft, believing it was the one they intended to service; it was, however, the wrong aircraft. They were surprised to find the cowl doors closed and latched, and thought that perhaps other staff had locked them. Deciding to check that all was well, they opened the cowl doors and noted that the IDG oil levels on both engines were acceptable. They rationalized that the oil levels had risen to acceptable levels as the engine had cooled. They then correctly closed the cowl doors and verified each other's work [35], on the wrong aircraft.

This is an example of an aircraft swap error, that is, "required maintenance being carried out on an incorrect aircraft" (p. 78) [35]. The investigators found aircraft swap errors to be "an occasional, infrequent occurrence" (p. 78) [35]. In having the correct plan of returning to the accident aircraft to complete the service, but incorrectly carrying out that plan by going to the wrong aircraft, the technicians' unsafe act was a slip, rather than a mistake [39]. The combined outcome of the normalized departures from the promulgated procedures for opening cowl doors and the aircraft swap error was that the accident aircraft was dispatched to service with the cowl doors still on the hold-open device and not fully latched.

### *6.6. Active Failures and Latent Conditions*

Reason's [39] first description and illustration of the now ubiquitous "Swiss Cheese" model had five "planes". These planes were defenses against accidents occurring. The first four planes did not have holes through them. The last plane interacted with local events and, along with the limited window of opportunity afforded by a hole in that plane, an accident could occur. A further refinement of the model was published by Reason in 1997. He saw that to understand accidents there needed to be three elements: hazards, defenses, and losses. The defenses had latent conditions arising from organizational factors as well as local workplace factors.

The defenses the accident organization had in place against an event such as the cowl doors not being fully latched involved a member of the flight crew and a ground crew worker conducting separate visual checks of the aircraft, including the latching devices. Unfortunately, workplace failures punched holes in this localized defense. Neither the co-pilot on his walk-around inspection nor the tug driver on his inspection noticed the locking devices protruding below the cowls. The latent condition that contributed to the lack of visual recognition of the unsafe condition of the cowl doors was the positioning of the latching devices close to the ground, where they were not easy to see. To visually check these devices, the workers were required to get down on hands and knees on the ground [35], an option that obviously did not appeal to the co-pilot or tug driver.

One of the "planes" in Reason's (1990) original iteration of the Swiss Cheese model was labeled "fallible decisions". Kourousis et al. [40] identified the increased defenses the manufacturer of the A32X series of aircraft inserted into the safety system in the aftermath of this accident, in an effort to reduce or eliminate occurrences of cowl doors not being fully latched at take-off. Mandated modifications included new hardware and new procedures. However, the authors noted the potential latent failings that could arise from these newly implemented modifications [40]. These latent conditions, arising from fallible decisions made by people who are not proximal to the accident, may hinder or even work directly against the desired effect the mandated modifications are seeking: a reduction in the number of cowl door incidents.
