**5. Director's Dashboard**

The Director's Dashboard is a Graphical User Interface (GUI) designed to let the editorial team define shooting missions easily during the pre-production phase and interact with the autonomous system during mission execution. We designed this tool for media production in several steps. First, we defined the high-level interactions between the Dashboard and its users, summarized in a UML diagram depicting use cases. Second, we created a database to store all information entities considered in our XML-based cinematography language. Third, we used UML activity diagrams to define the workflows that the Dashboard software shall support, and built the GUI's appearance and behavior upon them.
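Since the actual schema of the XML-based cinematography language is not reproduced here, the following sketch only illustrates how one mission entity might be assembled and serialized; all element and attribute names (`mission`, `triggerEvent`, `shootingAction`, and so on) are hypothetical.

```python
import xml.etree.ElementTree as ET

# Hypothetical mission entity; element and attribute names are illustrative
# assumptions, not the actual schema of the cinematography language.
mission = ET.Element("mission", id="m1", name="race-start")
trigger = ET.SubElement(mission, "triggerEvent")
trigger.text = "race_started"
ET.SubElement(mission, "shootingAction",
              type="lateral", target="race_leader", duration="30s")

# Serialized form of the entity as it could be stored in the database.
xml_string = ET.tostring(mission, encoding="unicode")
```

In a scheme like this, each serialized entity maps directly to a database record, which is what allows the Dashboard to persist missions between the pre-production and execution phases.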

#### *5.1. Use Case Analysis*

Figure 2 shows the use case breakdown of the Dashboard. The roles involved in the process include both human actors and machine system components (shown, respectively, as person icons and gray rectangles in Figure 2). The director is responsible for defining, leading, and coordinating the editorial shooting missions. The cameramen operate the cameras on board the drones, replacing their autonomous operation when production requirements make it necessary. The editorial staff deals with the data entry of events, possibly organizing them hierarchically into sub-events. The system administrator is responsible for the effective provisioning, configuration, operation, and maintenance of the Dashboard.

**Figure 2.** High-level use case diagram of the Dashboard.

The Mission Controller is the autonomous module in charge of controlling the mission production workflow. It generates signals upon the occurrence of leaf events to trigger the associated shooting actions on board the drones. Some leaf events may be triggered manually by the director, e.g., to start a race. Others may be generated automatically, e.g., upon detecting the appearance of a point of interest. The Mission Controller is also in charge of managing semantic maps, i.e., Keyhole Markup Language (KML) files [20] representing a set of geolocalized features such as landing/take-off spots, no-fly zones, or points of interest. These maps are first created offline during the pre-production phase and displayed on the Dashboard. Then, during mission execution, they may be updated automatically if new features are detected, e.g., obstacles or crowd regions detected by the drone cameras.
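As a concrete illustration of such a semantic map, the sketch below parses a minimal KML file and extracts its geolocalized features. The zone geometry and feature name are made up, but the `Placemark`/`coordinates` structure and the `lon,lat,alt` coordinate order follow the KML specification.

```python
import xml.etree.ElementTree as ET

# Minimal KML semantic map with a single no-fly zone; in practice such maps
# are created offline during pre-production. The feature name and geometry
# are illustrative only.
KML = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>no-fly-zone</name>
      <Polygon><outerBoundaryIs><LinearRing>
        <coordinates>
          -3.70,40.41,0 -3.69,40.41,0 -3.69,40.42,0 -3.70,40.42,0
        </coordinates>
      </LinearRing></outerBoundaryIs></Polygon>
    </Placemark>
  </Document>
</kml>"""

NS = {"kml": "http://www.opengis.net/kml/2.2"}
root = ET.fromstring(KML)
zones = {}
for pm in root.iterfind(".//kml:Placemark", NS):
    name = pm.findtext("kml:name", namespaces=NS)
    coords = pm.findtext(".//kml:coordinates", namespaces=NS)
    # Each coordinate tuple is lon,lat,alt per the KML specification.
    zones[name] = [tuple(map(float, c.split(","))) for c in coords.split()]
```

Updating the map during mission execution would then amount to adding or replacing entries in this feature set when the drone cameras detect new obstacles or crowd regions.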

The human and machine actors defined above interact with each other to create and manage an editorial shooting mission (see Figure 2). A higher-level process precedes any shooting mission: event management, i.e., the process by which events are organized hierarchically and by which specific events are associated with the missions planned to be executed in reaction to them.
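The event hierarchy described above can be sketched as a simple tree in which each event holds optional sub-events and the identifiers of the missions associated with it; class and field names are assumptions for illustration, not the Dashboard's actual data model.

```python
from dataclasses import dataclass, field

# Illustrative event tree: events may contain sub-events, and leaf events
# may have planned missions associated with them (held here as mission ids).
@dataclass
class Event:
    name: str
    sub_events: list = field(default_factory=list)
    missions: list = field(default_factory=list)

    def is_leaf(self):
        return not self.sub_events

race = Event("cycling_race")
start = Event("race_start", missions=["m1"])
sprint = Event("final_sprint", missions=["m2", "m3"])
race.sub_events += [start, sprint]

# Leaf events are the ones whose occurrence can trigger shooting actions.
leaf_names = [e.name for e in race.sub_events if e.is_leaf()]
```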

The process of mission production is split into mission planning and mission execution. Mission planning is the process in which all aspects of each mission are defined, e.g., expected starting time and duration, shot types, and shooting targets such as the leader of a race or geographic points of interest along the race route. The mission planning process can be further broken down into a mission configuration process, optionally supported by a mission simulation process that depends on the former. A fundamental subprocess of mission planning is mission validation, i.e., the process by which the flight plan originated by a shooting mission is checked and validated in terms of safety and security.
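The validation criteria themselves are not detailed here, but one plausible safety check is rejecting any flight plan with a waypoint inside a no-fly zone. The sketch below implements such a check with a standard ray-casting point-in-polygon test; it is one possible interpretation of the validation step, not the system's actual algorithm.

```python
def point_in_polygon(pt, polygon):
    """Ray-casting test: True if the (x, y) point lies inside the polygon."""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this polygon edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def validate_flight_plan(waypoints, no_fly_zone):
    """A flight plan passes only if no waypoint enters the no-fly zone."""
    return all(not point_in_polygon(wp, no_fly_zone) for wp in waypoints)

zone = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]  # unit square
assert validate_flight_plan([(2.0, 2.0), (3.0, 0.5)], zone)  # outside: valid
assert not validate_flight_plan([(0.5, 0.5)], zone)          # inside: invalid
```

A production validator would of course also consider altitude limits, battery margins, and separation between drones, but the structure (flight plan in, pass/fail out) would be similar.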

Mission execution is the process that takes place while the mission is actually carried out, and it always follows mission planning, i.e., a shooting mission cannot be started if it has not been previously planned and validated. Mission execution is broken down into two distinct subprocesses, namely, mission management and shooting management. Mission management is the process by which the director makes the final decisions about which of the several missions associated with an event will actually be executed, i.e., the specific mission and shooting action sequence roles are selected. Shooting management is the process that allows the cameraman to watch the live streams coming from the available video sources (drone cameras) and to change camera parameters such as zoom, translation, and rotation.
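The camera parameter changes issued during shooting management could travel as small structured messages from the Dashboard to a drone. The sketch below is purely illustrative: the actual Dashboard-to-drone protocol is not described here, so the message fields and their encoding are assumptions.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical camera-control message generated by a cameraman's action;
# field names are illustrative, not the system's real protocol.
@dataclass
class CameraCommand:
    drone_id: str
    zoom: float      # zoom factor
    pan_deg: float   # rotation around the vertical axis, degrees
    tilt_deg: float  # rotation around the lateral axis, degrees

cmd = CameraCommand(drone_id="drone-2", zoom=2.5, pan_deg=15.0, tilt_deg=-10.0)
payload = json.dumps(asdict(cmd))  # wire format for the drone side to decode
```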
