Systematic Review

Mobile AR Interaction Design Patterns for Storytelling in Cultural Heritage: A Systematic Review

by
Andreas Nikolarakis
* and
Panayiotis Koutsabasis
Department of Product and Systems Design Engineering, University of the Aegean, 84100 Syros, Greece
*
Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2024, 8(6), 52; https://doi.org/10.3390/mti8060052
Submission received: 1 May 2024 / Revised: 3 June 2024 / Accepted: 5 June 2024 / Published: 17 June 2024

Abstract:
The recent advancements in mobile technologies have enabled the widespread adoption of augmented reality (AR) to enrich cultural heritage (CH) digital experiences. Mobile AR leverages visual recognition capabilities and sensor data to superimpose digital elements onto the user’s view of their surroundings. The pervasive nature of AR serves several purposes in CH: visitor guidance, 3D reconstruction, educational experiences, and mobile location-based games. While most literature reviews on AR in CH focus on technological aspects such as tracking algorithms and software frameworks, there has been little exploration of the expressive affordances of AR for the delivery of meaningful interactions. This paper (based on the PRISMA guidelines) considers 64 selected publications, published from 2016 to 2023, that present mobile AR applications in CH, with the aim of identifying and analyzing the (mobile) AR (interaction) design patterns that have so far been discussed sporadically in the literature. We identify sixteen (16) main UX design patterns, as well as eight (8) patterns with a single occurrence in the paper corpus, that have been employed—sometimes in combination—to address recurring design problems or contexts, e.g., user navigation, representing the past, uncovering hidden elements, etc. We analyze each AR design pattern by providing a title, a working definition, principal use cases, abstract illustrations that indicate the main concept and its workings (where applicable), and an explanation with respect to examples from the paper corpus. We discuss the AR design patterns in terms of a few broader design and development concerns, including the AR recognition approach, content production and development requirements, and affordances for storytelling, as well as possible contexts and experiences, including indoor/outdoor settings, location-based experiences, mobile guides, and mobile games.
We envisage that this work will thoroughly inform AR designers and developers about the current state of the art and the possibilities and affordances of mobile AR design patterns with respect to particular CH contexts.

1. Introduction

This paper reviews mobile applications that promote cultural heritage (CH) storytelling with the aim of identifying and analyzing user experience (UX) design patterns for augmented reality (AR). Our goal is to systematically identify and analyze these patterns—a discussion that has so far been presented sporadically in related work—in order to lay the groundwork for a reflection on the current approaches in mobile AR design for CH within the community of UX designers and CH professionals, and to stimulate further research, design, and development.
Derived from the domain of architecture with the works of Christopher Alexander, the concept of design patterns has since been widely adopted in other areas, such as software development, human–computer interaction (HCI), and game design [1,2,3]. For the purposes of our review, we have formulated a working definition for UX mobile AR patterns based on the observations of game design patterns mentioned by Björk and Holopainen (2004) [2]. They propose design patterns as semi-formal tools or elements for addressing recurring problems. Our approach adopts the concepts of recurring design elements, their semi-formal style, and the interdependence of patterns and adapts them to the context of mobile AR UI design for CH. Thus, we define UX mobile AR patterns as semi-formal, interdependent descriptions of commonly recurring parts of the design of a mobile AR experience in a cultural heritage context. Moreover, the key characteristics of our working definition are concisely explained below.
  • Semi-formal descriptions: We have chosen to follow a descriptive rather than a prescriptive approach (e.g., guidelines) [1,2]. UX mobile AR design patterns present a general design approach rather than a directly reusable solution. They emerge from elegant designs to address problems in particular contexts; therefore, their review and analysis are valuable to the designers and developers of mobile CH experiences.
  • Design patterns interrelationship: It is quite common for AR design patterns to coexist in a mobile AR application; thus, they form relationships that affect the UX. The types of relationships vary and might include parent–child relationships, chains of two patterns, and conflicts. It should also be noted that the coexistence of two or more patterns could be either intentionally designed or emergent [2].
  • Cultural heritage contexts: Mobile applications, as well as other kinds of interactive systems, that target cultural heritage contexts, such as museums, historical buildings, and archaeological sites, should be compatible with several factors that occur within these contexts, including, but not limited to, exhibits and cultural content, spatial arrangements, museological approaches, exhibition curation, and audience-related aspects.
  • Different from UI widgets: UX design patterns are different from user interface (UI) widgets, which are composite user interface components (or controls or views) that are directly reusable. Typically, the software implementation of a UI widget is well established, while, in the case of UX design patterns, it is more demanding both in terms of interaction design and development.
Over the last few years, several examples of UX design patterns have been applied in mobile CH applications, including virtual characters that provide narrations about CH (e.g., [4]), the appearance of 3D models with various interaction affordances to augment the user’s view of an environment (e.g., [5]), overlays of 2D texts or signs to assist user navigation (e.g., [6]), etc. However, there is no systematic identification, recording, or analysis of these patterns for the designers of mobile AR user experiences for CH, with the consequence that the use of AR in CH is often ambiguous or limited to the mere superimposition of 3D models, without further meaningful interactions. A few previous reviews of mobile AR in CH approach the issue of UX design patterns indirectly but do not provide an in-depth analysis. For example, according to Goh, Sunar, and Ismail (2019) [7], all AR interaction techniques fall into one of three categories: touch-based, mid-air gesture-based, and device-based interaction techniques. Touch-based interactions involve users tapping the touchscreen with their fingertips, and mid-air gesture-based ones recognize not only fingers but the user’s whole hand as input. Device-based interactions are derived from rotate, tilt, and skew transformations and the movement of the mobile device itself. Other surveys review software frameworks or design processes for AR applications in CH and present discussions of UX patterns between the lines (for example, [8,9,10,11]). Some surveys focus on technological advances in the fundamental elements of mobile AR, such as tracking and registration, network connectivity, software frameworks, AR-enabled devices, and system performance [12]. Finally, a few surveys assess mobile AR from fields neighboring CH, like history learning or education, with a focus on the learning effects for students or learners [13,14,15].
This paper reviews selected publications published from 2016 to 2023 that present mobile AR applications in CH, with the aim of identifying and analyzing AR interaction design patterns. However, before addressing this principal concern, we must first construct the main dimensions of the current state of the art on mobile AR for CH. In addition, the reporting of UX design patterns will be undertaken with regard to selected user interaction and storytelling challenges. Therefore, the main objectives of the review are to
  • Present important dimensions of the current state of mobile AR for CH concerning the purposes, contexts of use, and main technologies of application;
  • Identify and explain the main UX design patterns that have been proposed or adopted to materialize mobile AR experiences;
  • Discuss the interaction challenges that are addressed by these design patterns and the degree to which these are addressed satisfactorily, indicating good practices and limitations, as well as areas for further research and development.
The paper is structured as follows. Section 2 presents the method of the review. Section 3 presents the general dimensions of mobile AR for CH, which are organized into the following categories: purpose (tour, game, installation, etc.), CH field (historic site, museum, etc.), environment (indoors, outdoors), user mode (single/multiple), AR tracking approach (2D images, 2D markers, etc.), visual AR content (3D models, images, texts, etc.), and other non-AR content (maps, audio narratives, etc.). Section 4 presents and explains the identified UX design patterns for mobile AR experiences in CH. Section 5 presents an overall discussion of the design patterns in terms of broader design and development concerns, such as incorporating storytelling, requirements for content production, interaction affordances, and the location or environment of use. Finally, Section 6 presents the summary and conclusions.

2. Method of the Review

We have followed the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines for this review paper [16]. The PRISMA guidelines were developed to systematize the reporting of clinical trials; however, they are commonly adopted in other scientific domains. They propose a checklist of 27 items for the reporting of a systematic review. With respect to the method of the review, we report on the following items.
Eligibility criteria: The paper corpus consists of full papers that describe mobile applications that promote CH with AR technology. The publication dates of the papers are within [2016, 2023]. We set 2016 as the starting year because it was then that many mobile software frameworks and AR-capable devices were released. All papers had to include a section on empirical evaluation.
Information sources: We used the following search engines and academic libraries: Google Scholar, Scopus, ACM Digital Library, and IEEE Xplore. These online services provide direct access to most high-quality journals and conferences of interest.
Search strategy: At first, we explored the search results with a few search terms that combined (“augmented reality” or “mobile” or “cultural heritage” or “storytelling”). We observed that the most comprehensive lists of search results were obtained when we used two queries: (“mobile cultural heritage” and “augmented reality”) and (“mobile augmented reality” and “storytelling”). Therefore, we decided to retain these queries for all online sources.
Selection process: We examined approximately the first 100 search results of each online library (if available) since we (gradually) observed that the results after the first 100 were less relevant to our search. In addition, we used the feature “cited by” in cases of highly cited papers to identify more results that may have not been identified directly by the queries employed.
The flowchart diagram of the PRISMA guidelines is depicted in Figure 1.
In total, we downloaded and screened more than 600 search results (based on their titles and abstracts) based on the following criteria: (a) they potentially referred to a mobile AR application that promoted CH; (b) they were scientific papers (i.e., not a book, a thesis, an editorial, etc.); (c) they were accessible by subscription from our academic institution (e.g., not a mere citation without a link to the source publication); (d) they were relevant to this study, even if not a paper to be included in the corpus (e.g., other survey papers). A total number of 95 papers met these criteria.
Furthermore, we reviewed these papers by abstract and by rapid examination of their structure and content to (a) exclude other relevant papers (identified in the previous screening, criterion (d)); (b) eliminate duplicates; (c) identify whether they included an empirical evaluation session; (d) validate that each paper referred to a unique study (when more than one paper referred to the same study, the most comprehensive was kept); (e) exclude short papers and works in progress. Finally, the papers were reviewed in further detail to ensure that they included reasonably sufficient information.
A total of 64 papers remained, and these comprised the paper corpus for this review. Table 1 summarizes the dates of publication from 2016 to 2023. Table 2 summarizes the areas of publication and highlights some of the journals and conferences in which the papers were published. Almost half of the papers were published in scientific journals.
Data collection process: The first author read all papers and filled in information about the data items in a shared spreadsheet file. The second author examined the papers to confirm the data collected. Any discrepancies were resolved cooperatively.
Data items: We identified the following data items per paper: date; citation; journal or conference (J/C); contribution (in short); purpose (tour, game, etc.); environment (indoors/outdoors/both); cult. her. field (museum, historic site, etc.); AR device; AR tracking technology; AR software; app platform; AR content (visual); other non-visual AR content; single- or multiplayer app; storytelling (yes/no); game genre (if a game); main interactive tasks; UX AR design patterns; evaluation methods; evaluation dimensions; free-form comments.
Synthesis methods: Data synthesis is an iterative process of learning and ensuring consistency. During this step, we gradually standardized the information provided in the data fields to the widest degree possible, to allow for automated descriptive statistics. We used both tabular and diagrammatic formats to present the information clearly.
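The step from standardized data fields to automated descriptive statistics can be sketched minimally as follows; the rows, field names, and values are hypothetical stand-ins for the shared spreadsheet, not data from the actual corpus, and the sketch uses only the Python standard library.

```python
from collections import Counter

# Hypothetical rows from a standardized review spreadsheet;
# the field names mirror the data items listed above.
corpus = [
    {"environment": "outdoors", "purpose": "tour"},
    {"environment": "indoors", "purpose": "game"},
    {"environment": "outdoors", "purpose": "tour"},
    {"environment": "both", "purpose": "installation"},
]

def describe(rows, field):
    """Counts and percentages for one standardized data field."""
    counts = Counter(row[field] for row in rows)
    total = len(rows)
    return {value: (n, round(100 * n / total, 1)) for value, n in counts.items()}

print(describe(corpus, "environment"))
# {'outdoors': (2, 50.0), 'indoors': (1, 25.0), 'both': (1, 25.0)}
```

Standardizing field values first (e.g., always "outdoors", never "outdoor") is what makes such counting reliable across dozens of papers.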

3. General Dimensions of Mobile Augmented Reality Experiences for Cultural Heritage Storytelling

Before addressing the main objective of our analysis (AR design patterns), we provide an overview of a few more generic dimensions of the paper corpus that are important to convey the application context. An overview of these dimensions is presented in Figure 2. These were organized into the following areas.
IDE + AR SDK. There are several popular integrated development environments (IDEs) that have been utilized to develop mobile AR applications. The most prevalent ones are Unity (50%), which supports cross-platform development (Android and iOS), and Android Studio (3%), which is the official IDE for Android application development, provided by Google. In recent years, a notable variety of AR software development kits (SDKs) have been released, targeting different types of AR and OS platforms, as well as supporting advanced technological features such as computer vision, object and plane recognition, geo-location tracking, and sophisticated image rendering capabilities. The cross-platform Vuforia SDK is the most prevalent choice (28.1%), while its combination with Unity for the development of mobile AR apps is also well established (25%). Other choices include Google’s ARCore (9.4%), which supports both iOS and Android devices, and ARKit (4.7%), provided by Apple, which targets only iOS devices. The well-known open-source computer vision library OpenCV, with exceptional object recognition and image processing algorithms, has also been utilized to build AR applications (4.7%). Other widely used AR SDKs are Wikitude (6.3%) and EasyAR (4.7%). Apart from the aforementioned SDKs, which usually require programming skills to a varying extent, some no-code or low-code online services for the building of AR experiences have also become available recently, like HP Reveal and Wikitude Studio.
OS Platform. The choice of operating system (OS) platform for the delivery of AR experiences is strongly tied to the reach of the target audience, as it significantly influences the adoption of the application, especially when a “bring-your-own-device” policy is adopted. Overall, 14.1% of the identified mobile AR apps provide cross-platform support. Over half of the AR apps (54%) target only Android devices and only a few (7.8%) target iOS devices exclusively.
Environment. We use the term “environment” to denote the surrounding CH place or space and the context of the end-users when they interact with mobile AR applications. There are several types of environments for mobile UX in CH, which were classified as “indoors”, “outdoors”, and “independent”. Indoor CH environments are often well organized or curated and include museums, historic buildings, galleries, libraries, etc. Over a quarter (28.1%) of the paper corpus included mobile CH experiences used in indoor environments. Outdoor CH environments are open-air places or spaces that often include archeological sites, gardens, historic cities, or neighborhoods. Over half of the papers examined (54.7%) presented mobile applications for CH for outdoor environments. A few mobile CH experiences (7.8%) were designed for environments that were both indoors and outdoors, like the work of Koutsabasis et al. (2021) [17], which presents a mobile location-based game that enables learning about the intangible cultural heritage of olive oil production and a local factory (now a museum); this game is played both inside the factory (indoors) and in the village nearby (outdoors). Lastly, we characterize as “independent” those approaches where the issue of the environment is not of direct relevance, like the work of Chiang et al. (2023) [18], who test the learning effect of a mobile application for learning about local art CH that activates AR content by recognizing various objects and markers that may be set up in various types of environments, like a classroom (indoors) or a local street (outdoors).
User Mode. The large majority of the papers examined describe mobile AR applications intended for use by a single user only (85.95%), despite the fact that CH visits are typically made in groups of people. There were 4/64 (6.25%) mobile apps that were multi-user only, while 5/64 (7.8%) apps supported both user modes.
AR Tracking. AR tracking refers to the approach or technology employed to sense the environment and activate digital content. The most common approach to AR tracking in mobile CH applications is object-based (45.3%), i.e., the recognition of a physical object in the environment and the activation of AR content with respect to the identified object. An example of this approach is the work of Marques and Costello (2018) [50], who developed a mobile CH app at the Smithsonian’s National Museum of Natural History that can recognize animal skeletons to overlay 3D models of the animals on the user device. Location-based (GPS) tracking was common (28.1%), especially for various CH mobile tour apps, and this approach was often employed in combination with other techniques. Next, the recognition of 2D images like museum posters or paintings was also a common approach (21.9%) to enable various types of content, like audio narratives or slides with image and textual narrations (as, for example, in the work of Teixeira et al. (2021) [72]). Fewer works (10.9%) make use of plane recognition technology (like Google’s ARCore). Plane recognition may be supported by specific devices only and can be used to position 3D models onto a flat surface in the physical space (i.e., a plane), like in Sabie et al. (2023) [62], who developed a mobile app that presents interactive 3D objects of households to simulate rituals between migrants and the host community. Finally, 2D markers (like QR codes) have been employed in a few works (9.4%).
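Location-based (GPS) tracking of this kind typically reduces to a proximity check: a point of interest's AR content is activated when the device's GPS fix falls within a radius of the registered coordinates. A minimal illustrative sketch follows; the haversine formula is the standard great-circle distance, while the 25 m radius and the point-of-interest data are our own assumptions, not taken from any reviewed app.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def triggered_poi(user_lat, user_lon, pois, radius_m=25.0):
    """Return the first point of interest whose geofence contains the user."""
    for poi in pois:
        if haversine_m(user_lat, user_lon, poi["lat"], poi["lon"]) <= radius_m:
            return poi["name"]
    return None

# Hypothetical point of interest for a mobile tour app
pois = [{"name": "old_factory", "lat": 37.4450, "lon": 24.9420}]
print(triggered_poi(37.4451, 24.9420, pois))  # within ~11 m, prints "old_factory"
```

In practice, such a check runs on every location update, often combined with a compass check or visual recognition so that content appears only when the user actually faces the site.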
Visual AR Content. Another important feature of any mobile AR UX is the (type of) visual AR content that is presented to users. In the large majority (68.8%) of the papers examined, this was 3D models of various sizes and types, like digital renderings of historic figures (e.g., [32]) or historic buildings, like the Tower of Pisa [31]. The second most common (34.4%) type of visual AR content is images, which are overlaid upon the recognition of some feature of the physical environment, like in the work of Foukarakis et al. (2023) [34], who augment tourist guides with historic photographs that can be activated upon the visitor’s arrival at places of interest. Notably, some mobile AR experiences made use of more than one type of visual AR content (e.g., images and texts).
Non-AR Content. In most of the papers examined, we found several types of non-AR content, i.e., digital content that is supplementary to the AR experience—most notably, maps (29.7%) and audio narratives (25%).

4. Design Patterns for Mobile User Experiences and Storytelling in Augmented Reality

The analysis of the paper corpus emphasizes the identification of mobile AR design patterns. We have identified sixteen (16) main design patterns (listed in Table 3). In this section, we present each design pattern with a title, an abstract illustration that indicates the main concept and its workings (where applicable), a table with concise information and references, and an analytical description with respect to examples from the paper corpus. At the end of this section, we briefly describe another eight (8) design patterns with a single occurrence in the paper corpus.

4.1. AR Recreation of the Past Pattern

This design pattern facilitates the blending of virtually reconstructed elements and recreated events from historical eras, to a variable extent, with the contemporary real world (Table 4). One of the most common aspects that mobile AR applications cover to deliver views of the past involves the virtual integration of 3D reconstructions of monuments and historical buildings that have either vanished or are partially damaged into contemporary real-world settings [5,23,30,39,48,69]. Moreover, in archaeological sites, the virtual reconstruction is superimposed over the actual ancient ruins [25,51], which, in most cases, are severely damaged. Since many mobile AR apps for outdoor settings rely on GPS coordinates to register their AR content (markerless AR), and due to the actual scale of the buildings, users can physically walk around the augmented site and examine it from multiple perspectives [5,22,31,51,66,69]. An approach that achieves a similar affordance in indoor environments borrows elements from the “AR tabletop board” pattern and presents human-scale models (e.g., [43,59]). Galani and Vosinakis (2022) [35], who followed the aforementioned approach, augmented a 3D-printed small-scale model of an old factory with 3D animated human models as workers in order to demonstrate the leather tanning process. Apart from individual 3D models, other means of projecting an entire recreated historical scene via AR in outdoor settings include 360-degree photospheres embedded with 3D models [57] and 2D images that use a combination of a fading effect and perspective transformation to seamlessly blend them with the real world [27,33]. Furthermore, significant enhancements to this AR pattern include introducing animated digital characters [78] or enabling users to actively participate in scene recreation.
Concerning the former, Liestøl (2018) [48] supplemented the virtual reconstruction of a historic square, consisting of several 3D buildings, with animated digital humans and vehicles aligned with the depicted period. Since the primary focus of this AR design pattern is the presentation of AR content, users tend to act as viewers, which might result in limited interaction opportunities. Tan and Ng (2022) [71] introduced an interactive AR activity in their mobile game that asks users to virtually place white candles in their surroundings, while keeping a distance between them, in order to simulate a ritual for the festivals of the Kristang community.

4.2. AR Tricorder Pattern

Drawing inspiration from the “Star Trek” television series, Lamantia (2009) [79] proposes the AR tricorder pattern by describing a portable handheld device that scans the environment and displays relevant information overlaid onto it through the device screen (Figure 3/Table 5). A key UX aspect is the users’ ability to freely point the device toward points of interest, while simultaneously watching the augmented result via the screen. Lamantia (2009) [79] also argues that this constitutes the most prevalent design pattern in AR experiences that provides an enhanced view of the real world. The “tricorder pattern” has been referenced several times in the literature to describe works related to AR (e.g., [80]). Herein, we aim to demonstrate the variety of implementations of this pattern focused on CH contexts according to the body of reviewed papers.
Perhaps the most essential interaction in AR involves users triggering the AR content by either scanning a target (e.g., printed material, the façade of a building, a physical exhibit, etc.) or reaching a specified location of interest. This aligns completely with the primary purpose of mobile guide applications in cultural heritage contexts, which aim to connect artifacts to informative and explanatory multimedia content [81]. In these cases, the “tricorder pattern” is implemented since the content is digitally anchored to real-world objects, shaping a blended view for users and thus facilitating a conceptual–semantic connection. AR 2D annotations, like text and images, have been extensively used in this design pattern [18,24,40,52,58,77]. When implemented in cultural exhibition settings, the “tricorder pattern” can deliver supplementary content that otherwise might be difficult to physically display, like 3D models [18,37]. Augmenting printed exhibition labels can also be a variant of this pattern [43]. Instead of superimposing text and images upon exhibit recognition, Paliokas et al. (2020) [56] implemented a side panel UI with explanatory text and images, while keeping the live camera feed unchanged. Videos can also be used with this AR pattern [63,67,74,75]. Although mostly used in smartphones, Schaper et al. (2018) [63], by following the “Window-on-the-World” interaction paradigm, utilized the “tricorder pattern” in a handheld pico-projector that superimposed videos onto the physical surroundings through projections. Lastly, Tian et al. (2023) [41] utilized 3D animations to artistically illustrate traditional Chinese poems for an AR experience in Jichang Garden, China.
Notably, AR annotations used for wayfinding purposes also fit into the “tricorder pattern”, but, given their well-targeted scope, they are described under the more general “AR navigation pattern”.

4.3. AR 3D Object Manipulation

This AR pattern allows users to interact with virtual objects overlaid onto the real environment by using a variety of mainly touch-based single-hand interactions on the screens of mobile devices [7] (Table 6). The AR manipulation tasks include natural hand gestures and motions such as place, grab, drag, move, rotate, and scale (e.g., the finger-pinch zooming gesture). This design pattern might also involve simulation techniques for realistic object manipulation behavior (e.g., physics). The types of AR content that can be manipulated range from small-scale items, like digitized exhibits and cultural heritage items [17,38,47,56], embalmed animals [53], and 3D traditional food assets [71], to 3D reconstructions of historical buildings [43,45,60]. Regarding the latter, human-scale representations of buildings are presented via AR to achieve a better user experience during manipulation. Although, in many cases, 3D virtual object manipulation is provided for demonstration purposes, it can also be contextualized. Following the “augmented access” AR pattern, the capacity for manipulation is transformed, UX-wise, into a unique opportunity to examine protected exhibits that users are otherwise prohibited from touching [38,53,56]. Koutsabasis et al. (2021) [17] implemented this pattern to provide users with a means to showcase their inventory via AR, which they gradually acquire throughout the game. In the educational AR book by Ntagiantas et al. (2021) [54], users can resize and move AR content related to each page of the printed book to create their own AR scenes. An interactive and storytelling-driven application of this pattern is presented in the work by Sabie et al. (2023) [62], in which users can perform several cultural rituals around everyday objects via AR.
More specifically, a scene consisting of the various 3D objects required for a ritual (e.g., preparing Turkish coffee) is overlaid onto a real-world surface, so that users can select objects and interact with them to accomplish the successive steps of the ritual. The mobile app provides rich object-specific AR manipulation (e.g., pouring into a cup), and it is accompanied by audio narration for each task, as well as sound effects.
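The finger-pinch zooming gesture mentioned above is commonly implemented by scaling the virtual object with the ratio of the current to the initial distance between the two touch points. A minimal illustrative sketch follows; the function name and the clamping range are our own assumptions, not taken from any reviewed app.

```python
import math

def pinch_scale(initial_touches, current_touches, base_scale=1.0,
                min_scale=0.25, max_scale=4.0):
    """Scale factor for a 3D object from a two-finger pinch gesture.

    Each touch argument is a pair of (x, y) screen coordinates; the
    object's scale follows the ratio of the current to the initial
    finger spread, clamped to an assumed comfortable range.
    """
    def spread(touches):
        (x1, y1), (x2, y2) = touches
        return math.hypot(x2 - x1, y2 - y1)

    ratio = spread(current_touches) / spread(initial_touches)
    return max(min_scale, min(max_scale, base_scale * ratio))

# Fingers move apart from 100 px to 200 px, so the object doubles in size
print(pinch_scale([(0, 0), (100, 0)], [(0, 0), (200, 0)]))  # 2.0
```

Rotation and drag gestures follow the same structure: a delta between the initial and current touch state is mapped onto a transform of the anchored 3D object.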

4.4. AR Digital Character Narrator Pattern

Based on the reviewed papers, the digital characters presented in mobile AR applications usually play the role of the presenter or the virtual guide for the visitors of a cultural heritage site [78] (Table 7). They can be either historical figures [32,55,70] or fictional ones, with an appearance and backstory related to the respective historical period [4,46]. In the former case, the embedded storytelling experience might include references to the protagonists’ lives, like the biography of the author Italo Svevo [33] or of Dean Jonathan Swift [55]. Although many of these are presented via AR as 3D models, 2D imagery can also be used effectively: a video of actors with a transparent background [4,32] or a talking silhouette of the protagonist with voiceover narration [33]. The anthropomorphism technique, which refers to the attribution of human characteristics to non-human entities, has also been used to generate AR characters, such as animals [54], animated exhibit statues like the figurine of the “Venus of Willendorf” [64], and fantasy creatures like ghosts [73]. The content is delivered either via audio narratives (e.g., [4]), which can also be accompanied by background music (e.g., [33]), or via text prompts (e.g., [70]).
Since AR is fundamentally linked to real-world settings, a key factor in introducing virtual characters is incorporating a storytelling approach that aligns with the context of use. For instance, in order to give meaning to the interactions required for self-navigation in mobile guide apps, the main characters appear as ghosts or spirits of the past, with fragmented memories that are gradually recovered throughout the tour [4,73].
The digital characters for location-based AR experiences in historic buildings can be their past residents [32,70] or people that regularly visited them [55]. Similarly, an important exhibit in a museum exhibition could play the role of a guide for visitors [64]. A potential next step to increase engagement could be to introduce interactive dialogs between the digital characters and users.

4.5. AR Navigation Pattern: 2D Navigation Annotations

AR has been widely utilized to assist user navigation in both indoor and outdoor environments (Figure 4). Navigation via AR involves both visual recognition capabilities, such as in the marker-based approach [82], and digitally overlaid navigational and/or informative content. This design pattern pertains to the presentation of visual annotations on the user's live view of the real world for wayfinding purposes (Table 8). These visual annotations fall into two groups: (a) textual annotations and (b) graphic annotations. The former may include the names of points of interest, their categories (e.g., historical buildings, religious sites), and their distances from the user's current position [5,24,44,48]. The latter usually consists of graphical elements that point users to destinations or locations of interest and help them follow a specified route [5,24,43]. In addition, some mobile AR apps feature a radar-like user interface (UI) over the live camera feed that displays nearby points of interest relative to the user's location and heading [5,6].
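As an illustrative sketch, the radar-style annotations described above amount to computing each point of interest's distance and bearing relative to the user's position and device heading. The following minimal Python sketch shows one way this could be done; the function names, return format, and spherical-Earth approximation are our own assumptions, not taken from the reviewed apps:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points (spherical Earth)."""
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 to point 2, in degrees clockwise from north."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def radar_annotation(user, heading_deg, poi):
    """Distance label and on-radar angle (relative to device heading) for one POI.

    `user` and `poi` are (lat, lon) tuples; `heading_deg` is the compass
    heading reported by the device's sensors.
    """
    d = haversine_m(user[0], user[1], poi[0], poi[1])
    rel = (bearing_deg(user[0], user[1], poi[0], poi[1]) - heading_deg + 360.0) % 360.0
    return {"distance_m": round(d), "relative_bearing_deg": round(rel, 1)}
```

A renderer would then place each blip on the radar widget at `relative_bearing_deg` and label the corresponding AR annotation with `distance_m`.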

4.6. AR Internal Structure Revelation Pattern (X-ray Effect)

This AR design pattern is used to visualize hidden or obscured internal structures, components, and mechanisms of physical objects and to demonstrate how they are constructed and function (Table 9). It typically involves animated or interactive 3D models, which is why it may require significant 3D modeling effort to visualize extensive detail and to animate the models (Figure 5). An exemplary case is described in [29], in which visitors to a Leonardo DaVinci exhibition use a mobile AR app to scan displayed sketches and visualize 3D animated models of the machines depicted on them. As a result, visitors gain a deeper understanding of the structure and working principles of DaVinci's inventions. The same approach was followed in [50] to demonstrate the complex tongue mechanism of a woodpecker at the Smithsonian's National Museum of Natural History. The assembly process of artworks with multiple components can also be safely showcased through this AR pattern without touching the exhibited artifact, as with the wooden puzzle portrait of Italo Svevo in [33].
Apart from small-scale physical artifacts, the internal structure revelation pattern can also illustrate the construction processes or interior layouts of historical buildings. Plecher et al. (2019) [59] implemented marker-based AR in their educational AR game to overlay 3D models of Celtic buildings onto printed markers, which also functioned as board game components. When the player moves their device closer to the virtual model, the roof fades out and the interior of the house is revealed. Koo et al. (2019) [43] incorporated construction-simulation AR games into their mobile guide application, allowing users to virtually construct, step by step, historical buildings located inside the Hwaseong Fortress, South Korea. A similar, but technically simpler, approach to exploring indoor spaces was followed in [45]: after the superimposition of the 3D model of a stone pagoda temple over a QR code marker, users can navigate inside its virtual space (in a non-AR way) and play treasure-hunt-like games. Based on these works, the revelation effect can be categorized as either passive (e.g., merely scanning the AR marker) or active, in which case users perform more engaging interactions.
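The distance-triggered roof fade reported for [59] can be abstracted as a simple mapping from camera distance to layer opacity. The sketch below is a hypothetical Python reconstruction of that mechanic; the threshold values are illustrative, not taken from the original game:

```python
def roof_alpha(camera_distance_m, reveal_near=0.3, reveal_far=0.8):
    """Opacity of the roof layer of a marker-anchored 3D building model.

    Fully opaque when the device is far from the marker, fully transparent
    up close (revealing the interior), with a linear fade in between.
    The reveal_near/reveal_far distances are assumed values in metres.
    """
    if camera_distance_m <= reveal_near:
        return 0.0  # close enough: roof hidden, interior visible
    if camera_distance_m >= reveal_far:
        return 1.0  # far away: roof fully shown
    return (camera_distance_m - reveal_near) / (reveal_far - reveal_near)
```

In an AR engine, `camera_distance_m` would be read each frame from the tracked pose of the device relative to the marker anchor.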

4.7. AR Inventory Pattern

The classic game mechanism of a player's inventory enables the collection and management of portable virtual objects across the game world [83] (Table 10). Depending on the type of game, avatars might hold different inventories and engage in decision-making upon finding and collecting items, which, through conflicting attributes, may also present game challenges. In this regard, the AR equivalent of the inventory pattern involves users collecting digital items that are either superimposed onto the real world via AR or whose use triggers a visual AR effect. In Sekhavat (2016) [65], a 3D model of a kiosk-like art exhibition room is placed by the user onto a specified physical location, displaying embedded 3D models of their digital artworks. Other players can visit the locations of these virtual kiosks, meet the artist–owner in situ, and see the presented artworks via AR. In mobile location-based AR games, the AR inventory pattern can also provide a record of the user's virtual possessions and accomplishments [72] and act as a reward system for a cultural heritage site [17,77]. Furthermore, in [77], the gathered AR objects resemble the pieces of a puzzle and, when merged, reveal a digital reconstruction of a stone statue. In [17], players collect 3D models of traditional tools for olive oil production by correctly answering location-based quizzes; they can later overlay these on any surface for a post-visit cultural experience. A reverse approach to this pattern, which also facilitates storytelling, is described in [49]: the player is a postman whose mission is to deliver party invitations by visiting physical locations, where they interact with 3D avatars of historical inhabitants of the town via AR.
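At its core, the AR inventory pattern is a small data structure that records which virtual items were collected and where, plus a completeness check for puzzle-like mechanics such as the statue reconstruction in [77]. A minimal Python sketch follows; the class and field names are our own, not from any reviewed implementation:

```python
from dataclasses import dataclass, field

@dataclass
class ARInventory:
    """Record of virtual items the player has collected at physical locations."""
    items: dict = field(default_factory=dict)  # item_id -> metadata

    def collect(self, item_id, location, model_ref):
        """Store a collected item together with where it was found and its 3D asset."""
        self.items[item_id] = {"found_at": location, "model": model_ref}

    def is_complete(self, required_ids):
        """True when every required piece (e.g., of a puzzle statue) is held."""
        return set(required_ids) <= set(self.items)
```

When `is_complete` returns true, the app could trigger the merged AR visualization (e.g., the reconstructed statue) as a reward.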

4.8. AR Navigation Pattern: Visual Information Gap

This AR design pattern could be described as the visual equivalent of the “information gap” technique, which involves purposefully withholding pieces of information in order to stimulate curiosity [84]. It seeks to conceal or selectively present visual elements of an image or environment to provide visitors with navigational guidance in an engaging and gamified way [81] (Table 11). An example of concealing aspects of the real world is the presentation of the outlined shape of an object, while making the interior details obscure or empty (as illustrated in Figure 6). In [53], visitors are challenged to match the silhouettes presented in the mobile app with the appropriate embalmed animals exhibited in the Natural History Museum. Similarly, in [43], the silhouettes of real objects that act as AR markers, such as signboards and wall panels, aid users’ self-navigation. Furthermore, this pattern could also assist visitors in orienting their sight towards defined areas within a location. Damala et al. (2016) [28] used exhibit outlines in their tangible AR Loupe to focus the gaze on museum displays, while outlines of buildings were utilized in an outdoor setting in [4]. In the latter work, the outlined images were seamlessly intertwined with storytelling as the abstract representations of reality symbolized the faded memories of the spirit-like main character [4].

4.9. AR Time Travel Effect

The time travel effect aims to transport users to different time periods, presenting the historical contexts and events of present-day locations via AR. What differentiates this AR design pattern from simply overlaying 3D models of historical buildings is that time control is given to the user, who can actively select a historic period and instantly see the respective AR visualizations (Table 12). In [31], a typical button-based timeline shows different construction phases of the Leaning Tower of Pisa, Italy, with a separate 3D model for each one. Gao et al. (2018) [36] implemented a two-position toggle switch for time: while touched, it showed ancient images of the user's location, and, when released, the current view reappeared with the 3D AR content. Although the time travel effect mostly involves presenting images of the past with modern-day settings as the starting reference point, the opposite can also occur. In the mobile AR app presented by Foukarakis et al. (2023) [34], users can select physical postcards that span different periods of Heraklion City, Greece, and overlay them with images of the same locations as they exist today.
Due to the visualization of multiple time periods, implementing this design pattern can be quite demanding in terms of the required AR content. Specifically, when overlaying 3D models rather than 2D historic photographs, a large number of models may need to be constructed to cover the whole scene, or multiple versions of the same buildings may be required. Since its main purpose is to present a scene of the past, the time travel pattern might offer limited interaction capabilities for users. A more intuitive timeline UI design, such as the respective design concept in [85], could enhance the user's control over the time travel effect.
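Conceptually, the pattern reduces to a mapping from a user-selected period to the AR asset displayed, whether driven by timeline buttons [31] or a toggle switch [36]. A hypothetical Python sketch of such a controller (the names, asset references, and structure are illustrative only):

```python
class TimeTravelController:
    """Maps a user-selected period on the timeline to the AR asset to display."""

    def __init__(self, periods):
        # periods: list of (label, asset_ref) tuples, ordered chronologically,
        # with the present day as the final entry.
        self.periods = periods
        self.index = len(periods) - 1  # start at the present day

    def select(self, label):
        """Jump the timeline to the named period and return its asset."""
        for i, (name, _) in enumerate(self.periods):
            if name == label:
                self.index = i
                return self.periods[i][1]
        raise KeyError(label)

    def current_asset(self):
        return self.periods[self.index][1]
```

A button-based timeline UI would call `select` on each tap; a press-and-hold toggle would call it on touch-down and revert to the present on release.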

4.10. Augmented Access Pattern

This AR pattern allows protected artifacts to be accessed virtually through realistic 3D models (Table 13). Users can digitally examine and interact with items that cannot be touched in reality for preservation reasons. The options for interactivity with the virtual object draw upon the “3D object manipulation” AR pattern. In the prototype mobile AR app presented by Liang et al. (2020) [47], a high-accuracy 3D-scanned model of a porcelain vase was presented to users via AR so that they could observe its pattern and glaze in detail. The illusion of access to items protected within display cases arises because the 3D model is visualized close to the actual exhibit (as illustrated in Figure 7) [53,56]; users can thus examine the physical and virtual artifact in parallel. Regarding the UX of an AR experience, the exclusive access provided by this AR pattern might increase visitors' interest and trigger a sense of discovery. Lastly, because of the focus on manipulation, the 3D models presented must be created with extensive detail.

4.11. AR Tabletop Board Pattern

This AR pattern requires a physical board and other printed materials (e.g., cards) (as illustrated in Figure 8) that play the role of AR markers for virtual overlays [42,59] (Table 14). Users might interact with both the physical and virtual environments. Due to the small-scale game pieces, the AR scene usually adopts an all-seeing top-down perspective (e.g., [76]), which is compatible with the typical perspective of board games [83]. An exemplary use of the top-down perspective is presented in [35], where a 3D-printed scale model of an old leather tanning factory served as the overlay surface for the 3D AR content. Users could view the whole 3D model with its animated content while walking around the model or varying their distance from it while holding a smartphone. Although, in the aforementioned case, the main area of interaction is the virtual environment, in educational AR board games the digital content supplements rather than fully replaces the physical components, for example by superimposing 3D models of historical buildings onto game boards [42,59]. However, several actions that affect gameplay can also be performed via AR, as in [59], where players upgrade household items and weapons and configure the physical distances between the AR markers and, consequently, the overlaid buildings.

4.12. Invisible Element Revelation Pattern

This design pattern serves a similar goal to “AR recreation of the past”, i.e., virtually restoring historical artifacts and visualizing their original states as accurately and authentically as possible (Table 15). Although virtual recreation usually involves the superimposition of 3D models onto the real world, this pattern applies AR overlays to existing physical cultural objects in order to restore faded colors and markings (as illustrated in Figure 9) or to unveil invisible mechanics and backstories. Two examples of the former case are the recoloring of white Ancient Greek statues [28] and the revelation of faded prehistoric rock art paintings [21]. Regarding the latter, the mobile guide application presented by Aitamurto et al. (2018) [19] used an AR overlay to highlight the perspective lines of exhibited paintings, which would otherwise be overlooked by most visitors. The contexts in which this pattern has been applied indicate that it is best suited to location-based AR experiences.

4.13. AR Photo Capturing Pattern

This design pattern offers a method for user-generated content [36] and involves capturing still images, in which the real-world scene is overlaid with virtual content through the device screen (Table 16). Users can also be part of the AR photo (“selfie”) and thus appear to interact with virtual cultural objects [47] or historical figures [36,68]. To better achieve this effect, users can move around while holding their devices and utilize different poses and perspectives within the AR scene, resulting in an engaging activity. Apart from image generation, Gao et al. (2018) [36] have also implemented a sharing functionality within their mobile app to promote social interactions within the community.

4.14. AR Quiz Pattern

This AR design pattern utilizes the visual recognition capabilities inherent to marker-based AR to transform typical quiz challenges into real-world, on-site experiences (Table 17). Instead of written answers, an AR quiz accepts visual input from a live camera feed as the user's response, as depicted in Figure 10. In the work by Tan and Ng (2022) [71], users have to guess the meaning of a word in the Kristang language that describes an everyday object (e.g., a bottle) and answer by taking a photo of the actual object in their physical surroundings. Paliokas et al. (2020) [56] implemented a treasure-hunt-like AR quiz to encourage visitors to a silversmithing museum to self-assess their gained knowledge. Visitors are given short verbal descriptions of exhibited artifacts, and they have to search for each corresponding item in the museum collections and scan it with their device's camera. The AR quiz also incorporates two common game mechanics: a points reward system and a public top-ten leaderboard.
While the AR quiz design pattern does not provide any augmentation of the real world, as most AR applications do, it can transform quizzes into engaging location-based activities that encourage visitors to actively explore cultural heritage sites. Unlike other AR patterns that might require the production of complicated digital content for AR overlays, such as 3D models or animations, the AR quiz could be simpler to implement.
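The core logic of an AR quiz can be sketched independently of any specific recognition engine: an image classifier returns candidate labels with confidence scores, and the quiz accepts the photo if the expected object appears with sufficient confidence. The Python sketch below is a simplified illustration; the threshold and scoring values are our assumptions, not taken from [56,71]:

```python
def check_answer(recognized_labels, expected_label, threshold=0.6):
    """Accept the photo as a correct answer if the expected object is among
    the recognizer's candidate labels with sufficient confidence.

    recognized_labels: list of (label, confidence) pairs, as an image
    classifier or object detector might return for the captured frame.
    """
    for label, conf in recognized_labels:
        if label == expected_label and conf >= threshold:
            return True
    return False

def score_quiz(responses, points_per_correct=10):
    """Points reward system: responses is a list of booleans from check_answer."""
    return sum(points_per_correct for ok in responses if ok)
```

The resulting score would feed the points system and leaderboard described in [56].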

4.15. AR Spotlight Effect

The AR spotlight effect (Table 18) aims to guide users' gaze to a specified area of a physical or digital object, which is illuminated while the surrounding elements are visually dimmed or disabled via AR overlays, as illustrated in Figure 11. The visual effect can be achieved through differences in brightness, contrast, and saturation. Marques and Costello (2018) [50], in their “Skin & Bones” AR app, superimpose 3D models of animals over the respective exhibited skeletons at the Smithsonian's National Museum of Natural History. An indicative implementation of the AR spotlight effect occurs on the virtual model of a woodpecker, whose whole body is dimmed with the exception of the skull in order to illustrate its complex tongue mechanism via a 3D animation (using the internal structure revelation pattern described in Section 4.6). A simpler application of this pattern is described in the work by Aitamurto et al. (2018) [19], in which only specified areas of paintings remain illuminated, while the rest are darkened, to highlight specific details alongside explanatory AR textual annotations. Since the AR spotlight pattern has an explanatory purpose, audio narratives would likely be more engaging and effective than relying solely on AR textual annotations.
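As a rough illustration, the dimming component of the spotlight effect can be modeled as scaling pixel intensities outside a highlighted region. The following Python sketch operates on a toy greyscale image represented as a nested list; a real implementation would use a shader or an image-processing library, and the rectangle and dim factor here are illustrative assumptions:

```python
def spotlight(image, region, dim_factor=0.25):
    """Return a copy of `image` (2D list of grey levels 0-255) in which pixels
    outside the rectangular `region` (x0, y0, x1, y1) are darkened, so the
    user's gaze is drawn to the illuminated area."""
    x0, y0, x1, y1 = region
    out = []
    for y, row in enumerate(image):
        out_row = []
        for x, v in enumerate(row):
            inside = x0 <= x < x1 and y0 <= y < y1
            out_row.append(v if inside else int(v * dim_factor))
        out.append(out_row)
    return out
```

The same idea extends to 3D: instead of pixels, whole model parts (e.g., the woodpecker's body) are rendered with reduced brightness while the focal part keeps its full intensity.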

4.16. AR Tangible Viewer Pattern

Tangible interfaces have long been integrated into cultural heritage exhibitions to provide engaging hands-on experiences to visitors while keeping the user interactions simple and intuitive [86]. These interfaces are usually small-scale replicas of actual objects, which might simulate their functionalities or possess a conceptual connection with them or with the context of use (including cultural exhibition themes). Tangible interfaces that support AR features belong to this design pattern (Table 19). Damala et al. (2016) [28] described an AR tangible design for a storytelling-guided museum tour that resembled a magnifying glass (“Loupe”) and was built by enclosing a smartphone in a wooden case. Users could use the AR Loupe to self-navigate inside the museum gallery via its exhibit recognition capabilities and to watch AR animations overlaid onto the displayed exhibits (e.g., the recoloring of statues). The conceptual connection with reality, and with the exhibition, was facilitated mainly through the magnifying glass metaphor: enhanced vision of whatever area the user views through it. Romano et al. (2016) [61] presented an AR tangible design resembling an Ancient Roman brick, which enclosed an Android smartphone with only its screen exposed and was connected to a microprocessor with NFC reading functionality. For human-guided tours, the AR brick was placed into holes in totem-like installations with large screens, while, for individual self-guided tours, the user placed it in a simple holder in front of an exhibit to watch AR augmentations through its embedded screen. In addition, vertically rotating the AR brick served as an interaction for selecting another AR experience for a given exhibit.
Apart from appropriately integrating an AR tangible design in a cultural heritage context, its potential to facilitate storytelling should also be considered during the design process.

4.17. Other AR Design Patterns

The AR design patterns described above were the most frequently identified in the paper corpus. In this section, we concisely present some additional AR patterns that, despite having been addressed individually in only a few publications, seem promising for the delivery of meaningful interactions and the enhancement of UX in cultural heritage contexts.

4.17.1. AR Photo-Matching

Drawing inspiration from the photographic concept of “then-and-now”, Liestøl (2018) [48] devised an AR photo-matching challenge that prompts users to visit locations in a historic square in Estonia and match the spot and viewing angle of a historic photo to the actual location. Valid matches reward users with 3D-animated reconstructions of historical buildings digitally overlaid onto the contemporary setting of the site. In addition, a fade-in visual effect akin to the “Hot & Cold” game provides hints by gradually revealing the 3D model to indicate whether the user is close to a correct match [48].
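The matching logic in [48] can be approximated as a tolerance check on position and viewing direction, plus a proximity-driven fade-in for the “Hot & Cold” hint. The Python sketch below makes several assumptions of our own: a local metric coordinate frame for positions and illustrative tolerance values:

```python
def photo_match(user_pos, user_heading, photo_pos, photo_heading,
                pos_tol_m=5.0, angle_tol_deg=15.0):
    """True when the user stands close enough to the historic photo's
    viewpoint and faces roughly the same direction.

    Positions are (x, y) in a local metric frame; headings are in degrees.
    """
    dx = user_pos[0] - photo_pos[0]
    dy = user_pos[1] - photo_pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    # Shortest angular difference, handling wraparound at 360 degrees.
    dh = abs((user_heading - photo_heading + 180.0) % 360.0 - 180.0)
    return dist <= pos_tol_m and dh <= angle_tol_deg

def reveal_alpha(dist_m, max_hint_dist=30.0):
    """'Hot & Cold' hint: the 3D model fades in as the user approaches."""
    return max(0.0, min(1.0, 1.0 - dist_m / max_hint_dist))
```

Each frame, the app would recompute `reveal_alpha` from the tracked distance and render the reconstruction at that opacity until `photo_match` succeeds.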

4.17.2. AR Book Metaphor

AR requires users to hold their devices in front of an object for both visual recognition and the projection of the AR content. While recognition is usually instantaneous, the projection of the virtual content might last longer and could cause physical strain. A means to tackle this ergonomic issue is described in [19], in which the authors introduce a book metaphor interface that users can switch to after recognizing the artwork by rotating their portable device to a landscape orientation. The information that was previously delivered via AR over the physical artwork is then presented with a digital overlay on top of a photo of the respective artwork.

4.17.3. AR for Social Interactions

Location-based AR experiences can foster co-located social interactions between users. An exemplary case is described in [65], in which users own and manage a 3D kiosk-like art exhibition room with their digital artwork, which is displayed in a specified physical location via AR. Users visit these locations to see the presented artworks and they can physically meet the artist–owner, who is present on-site.

4.17.4. AR Augmented Human Body

This AR pattern pertains to overlaying digital content onto users' bodies or faces with the support of real-time body- and face-tracking algorithms. In the mobile AR app developed by Siang et al. (2019) [68], users can virtually wear a traditional lip plate and play with a virtual spinning top called a “gasing” via AR.

4.17.5. Public AR Annotations

Following a crowdsourcing practice, Ioannidi et al. (2017) [40] developed a novel mobile AR app that involves the creation and public sharing of user-generated AR content (text or photos) related to architectural heritage. The generated AR content is viewed through the app itself, which acts as a mobile guide.

4.17.6. AR Marker Manipulation

The educational AR board game presented by Plecher et al. (2019) [59] involves the economic simulation of a Celtic village. The goal is to efficiently manage resources, such as food, wood, coal, and ore, in order to sustain and expand the village. These resources are associated with various Celtic buildings that are overlaid as 3D models via marker-based AR onto printed cards. For instance, wood production is connected with the woodchopper’s house card. Players can adjust the production rates of the required resources by manipulating the distances between the printed cards, which are AR markers. The card distances represent the distances between the buildings of the village.
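The distance-to-production-rate mapping described for [59] can be sketched as a simple falloff function of marker distance. The sketch below is a hypothetical Python reconstruction with illustrative parameter values, not the game's actual formula:

```python
def production_rate(marker_distance_cm, base_rate=10.0,
                    optimal_cm=8.0, falloff_per_cm=0.1):
    """Resource production for one building card as a function of its
    physical distance from the village marker, as read by the AR tracker.

    Production is at full rate up to an optimal distance and falls off
    linearly beyond it; all parameter values are assumptions.
    """
    excess = max(0.0, marker_distance_cm - optimal_cm)
    return max(0.0, base_rate * (1.0 - falloff_per_cm * excess))
```

In a marker-based engine, `marker_distance_cm` would be derived from the tracked poses of the two printed cards, so physically sliding a card changes the simulated economy in real time.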

4.17.7. AR Object Transformation

In [62], users perform a variety of cultural rituals around everyday objects via AR (e.g., preparing Indian tea). At the beginning of the AR experience, they are asked to point the device’s camera at one of their own drinking utensils. After it is recognized, it is transformed via a 3D model overlay into drinking utensils from different cultures.

4.17.8. AR Projection Manipulation

Schaper et al. (2018) [63] used a handheld pico-projector to superimpose images and videos onto the physical surroundings inside a bomb shelter. For one AR activity, an image of a stretcher was split into two parts projected from two different pico-projectors. Two children were asked to synchronize their movements in order to transport a virtual person on the projected stretcher safely into the infirmary. Although the AR projection was a still image with no other digital interaction involved, the affordance of the handheld device shaped a gamified yet meaningful AR interaction.

5. Discussion

We organize the discussion of the suitability, adoption, advantages, and disadvantages of the current AR design patterns for mobile UX in CH with respect to two broad dimensions: (a) significant factors of system design and development, e.g., the AR tracking technology or approach, real-world augmentation effect, requirements for digital content production, and storytelling affordances, and (b) the type of environment or application genre, e.g., indoor/outdoor environments, (non-)location-based experiences, mobile guides, and mobile games.

5.1. AR Design Patterns for Mobile UX in CH: Considering Adoption in Terms of Factors of System Design and Development

The introduction and adoption of AR design patterns in the design and development process of mobile UX in CH depends on many factors, ranging from conceptual to technical. In Figure 12, we outline the interplay between the identified AR design patterns and selected dimensions that pertain to the interaction design and development phases of AR experiences in cultural heritage contexts: AR tracking (approach or technology), the real-world augmentation effect, and storytelling affordances.

5.1.1. The AR Tracking Technology or Approach Constrains the Possibilities of the AR Design Patterns’ Implementation

According to the current state of the art, the implementation of specific AR patterns depends on the approach or technology of AR tracking (which, in some cases, depends on the device capabilities). Thus, it is essential for designers and developers to cooperate closely to ensure that the introduction of a design pattern can be gracefully implemented.
Furthermore, regarding specific AR tracking approaches, we have found that marker-based approaches, such as 2D images and object recognition, can involve an actual cultural artifact as the marker for users to scan, thereby establishing a meaningful connection between the AR experience and tangible reality. The augmented access pattern, AR internal structure revelation, visual information gap, and spotlight effect are examples of this finding. In the absence of the respective exhibit (e.g., a painting, an ancient vase, an antique device with an intricate mechanism) within the users’ surrounding view, the engagement potential would be profoundly reduced.
Apart from marker-based approaches, the AR tricorder, along with specific implementations of the AR recreation of the past and AR digital character narrator patterns, could also be utilized via user location tracking. In these instances, the connection with reality emerges from a point of interest within the real world. In the absence of this contextual anchor with reality, any content delivery that includes location-based references might result in a confusing and disjointed experience.

5.1.2. Some AR Design Patterns Have a Greater Real-World Augmentation Effect Than Others

The AR design patterns that typically have the highest level of intervention within the physical surroundings are those that superimpose 3D models onto the physical surroundings, such as displaying buildings or parts of them in the AR recreation of the past pattern and the AR time travel effect. The AR internal structure revelation pattern mostly overlays small-scale objects with 3D models that demonstrate their interior structures. At the opposite end, design patterns such as the AR tricorder and 2D navigation annotation patterns allow more subtle interventions, as they visually supplement or enhance the real-world view without occupying a significant portion of the mobile device's screen. Lastly, design patterns that superimpose content using plane recognition techniques (e.g., AR 3D object manipulation) or designated AR markers, like the AR tabletop board pattern, lie between these extremes.

5.1.3. Some AR Design Patterns Have Greater Requirements for Content Production Than Others

Since AR is strongly associated with the presentation of high-fidelity visual content, such as 3D models, complicated animations, and videos, this dimension evaluates the extent to which a particular AR design pattern necessitates the production of such high-quality visual assets. The AR design patterns that tend to require high-quality 3D models, either produced via photogrammetry or designed from scratch in a computer-aided design (CAD) application, place the highest demands on this dimension: the AR 3D object manipulation, AR internal structure revelation, and AR time travel effect patterns. However, some implementations have demonstrated the ability to fulfill these content requirements through the careful crafting of images or videos. For example, the demanding AR recreation of the past pattern can also be implemented with videos featuring actors dressed in traditional clothes or with digitally edited still imagery that blends seamlessly with the live camera feed. The AR design patterns that require only text and images presented as AR annotations, such as the AR tricorder pattern, or those that do not actually involve AR content, such as the AR quiz and visual information gap patterns, have the lowest requirements for dedicated content production.

5.1.4. Some AR Design Patterns Have Stronger Storytelling Affordances Than Others

We refer to the notion of storytelling affordances in the sense of the expressive potential and narrative capabilities supported by each AR design pattern in terms of facilitating meaningful interactions. The AR design patterns with the highest level of storytelling affordances are those whose primary purpose is to deliver cultural content, either in verbal form, as in the AR digital character narrator pattern, or through visual representations, as in the AR recreation of the past pattern. The AR time travel effect, which combines aspects of the aforementioned patterns, could also serve as a backdrop for a storytelling-driven AR experience. The visual information gap pattern, while primarily used as stimulating navigational assistance for visitors, could also shape a compelling experience if interwoven with an appropriate narrative, as it gives meaning to visitors' movements and interactions. Similarly, the AR inventory pattern, when incorporated into a broader narrative, could transform the collection of digital items into an appealing exploratory experience.

5.1.5. Taking a Step Further to Associate AR Design Patterns and Storytelling: Exploring the Concept of Story–Pattern Mappings

If we are to realize the wider adoption of AR design patterns into the design thinking process, we need to associate them with storytelling approaches or techniques. In this regard, the concept of “story–pattern mapping”, drawing upon Don Norman’s (2013) [87] principle of mapping, can be explored further. “Story–pattern mapping” links AR design patterns with a given storytelling approach in a way that is meaningful within its context. Since a mobile AR experience might incorporate more than one design pattern throughout its progression [88], we could additionally introduce a “micro-story–pattern mapping” that includes AR interactions for a given sub-part of the overall storytelling experience.
One of the most frequent story–pattern mappings that we have identified lies in location-based AR applications and intends to provide meaning to users’ movements between the physical locations included in the AR experience. This is achieved by interweaving storytelling with (a) AR-assisted user navigation in the CH site and (b) digital content delivery. Regarding user navigation, we highlight the patterns of the visual information gap, AR inventory, AR digital character narrator, and AR time travel, which support storytelling in the following ways.
  • The protagonist–narrator of the story returns from the past magically (as a spirit or ghost), while this time travel has the consequence of weakened memory that can be progressively regained at each point of the tour (e.g., [4]).
  • The user acquires a role in which they need to accomplish a mission that requires visiting different locations, such as being a postman and delivering invitation letters [49] or an apprentice factory worker who solves location-based puzzles to learn about intangible cultural heritage (e.g., [17]).
  • The selection of incorporated locations is based on external, already established narrations, like biographies (e.g., [33]), historical events and facts (e.g., [58]), etc. One sub-category herein includes AR experiences in historical buildings that feature a storytelling approach based on the lives of their past residents (e.g., [32,70]) or people associated with them (e.g., [55]).
Regarding the interweaving of storytelling with digital content delivery, we have found that the design patterns of AR tricorder, AR digital character narrator, AR recreation of the past, and AR time travel can support micro-story–pattern mapping that follows the given storytelling approach. For example, the AR tricorder pattern has been utilized in [58] in order to display historical photographs and textual information, while the AR digital character narrator and AR recreation of the past patterns have been utilized in the mystery adventure in [32].
The aforementioned story–pattern mapping aligns well with the “reinforcing” AR storytelling strategy in Azuma’s framework [89]. The included AR design patterns interweave digital interactions with the real environment, resulting in an enhanced location-based experience that adds to the inherent value of cultural heritage settings. This is the reason that it is primarily used in mobile guide applications in CH, aiming to provide interpretive material in an engaging manner [88].

5.2. AR Design Patterns for Mobile UX in CH: Considering Adoption in Terms of Factors Related to Type of Environment or Application Genre

5.2.1. Some AR Design Patterns Are More Suitable Than Others in Indoor/Outdoor Environments

Depending on the environment in which a mobile AR application is intended to be used, certain AR design patterns may be more suitable or preferable than others.
Two indicative examples of design patterns suitable for outdoor settings are the AR tricorder and AR recreation of the past. Although the most common scenario uses the AR tricorder pattern to display navigational information, such as the distances and names of landmarks, some mobile AR apps present informative content mainly in short text form, with or without images. It is well documented that AR in outdoor settings faces several technical challenges that can negatively impact the overall UX, such as occlusion, tracking and registration problems, and varying lighting conditions. We believe that the key to mitigating these issues is to select AR content that does not require prolonged viewing or reading by users. In cases where the delivery of extensive interpretive content in visual form (e.g., text, images, videos) is required, a semantic connection from the AR-pointed object or building to a non-AR screen should be considered. For example, in [58], users can select a stumbling stone via an AR UI superimposed over the real ones and then open a non-AR screen to read the respective story. The AR recreation of the past pattern is, unsurprisingly, prevalent in outdoor sites, as the displayed AR content involves 3D models of buildings (in both contemporary urban settings and archaeological sites), and the recreated scenes are usually connected to public spaces such as squares or the landmarks of a city.
On the other hand, two AR design patterns predominantly utilized in indoor settings are the AR digital character narrator and 3D object manipulation patterns. Both require prolonged engagement, which is more comfortable, safer, and less prone to distraction in indoor spaces such as museums and historical buildings. Furthermore, as AR digital characters usually act as virtual guides, it is crucial that they be properly anchored to predefined exhibits or locations within the cultural space so that they can directly reference, interact with, or provide information about nearby real-world objects. Marker-based AR is commonly implemented in these location-based experiences, since it enables more accurate AR tracking and registration in the controlled environment of an interior space. Additionally, interpretive content is delivered either via text, which requires users to keep their eyes on the screen, or via audio narrations, which necessitate a relatively calm environment.
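The suitability observations in this subsection can be read as a simple mapping from environment type to preferred patterns. The following Python sketch (a hypothetical encoding of our own, purely for illustration; the names and structure do not come from any reviewed system) shows how a pattern-selection aid might record it:

```python
# Hypothetical encoding of the indoor/outdoor suitability observations
# discussed above; pattern names follow the terminology of this review.
PATTERN_SUITABILITY: dict[str, list[str]] = {
    "outdoor": ["AR tricorder", "AR recreation of the past"],
    "indoor": ["AR digital character narrator", "3D object manipulation"],
}

def suggest_patterns(environment: str) -> list[str]:
    """Return the AR design patterns noted as suitable for the given environment."""
    if environment not in PATTERN_SUITABILITY:
        raise ValueError(f"unknown environment: {environment!r}")
    return PATTERN_SUITABILITY[environment]
```

Such a lookup is, of course, only heuristic: as discussed above, the actual choice also depends on tracking conditions, the length of the interpretive content, and user comfort and safety.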

5.2.2. Some AR Design Patterns Can Support Mobile UX Independently of the User Location

Although most of the reviewed papers pertain to location-based experiences in either indoor or outdoor settings, a few of them (6 out of 64) are not dependent on the user’s physical location (Figure 2). Detached from any specific location, the AR experience is delivered either by employing the AR tabletop board pattern, which inherently involves marker-based AR, or by following a markerless AR approach, which aligns with the “reskinning” AR strategy in Azuma’s framework [89].
The AR tabletop board pattern relies on physical materials that are digitally supplemented with a variety of AR content, mostly small-scale 3D models of buildings, cultural artifacts, and virtual characters, and AR interactions such as 3D object manipulation and AR marker manipulation. From a UX perspective, the gameplay and the overall experience are akin to those of typical board games. A similar approach, but without gameplay elements, has been employed in the educational AR book by Ntagiantas et al. (2021) [54], which addresses intangible CH topics.
According to Azuma (2015) [89], the “reskinning” AR strategy focuses on virtual content that, when superimposed onto users’ surrounding environments, can shape an immersive storytelling experience. In this case, the virtual content possesses inherent value; thus, it does not rely on reality for contextualization. For example, in the work by Tan and Ng (2022) [71], users can view 3D-scanned traditional food dishes via AR, while, in another interactive AR activity, they can simulate a 3D scene of a ritual for the festivals of the Kristang community in their own space. Similarly, users of the mobile AR app designed by Sabie et al. (2023) [62] can perform various cultural rituals around everyday objects via AR. Apart from the interactive 3D objects that are superimposed onto the user’s environment, the app also supports the AR object transformation pattern, in which a real drinking utensil that the user points their camera at is recognized and transformed into a drinking utensil from a different culture.

5.2.3. AR Design Patterns for Mobile Guides

Almost half of the reviewed papers (35 out of 64) involve a mobile guide application with AR functionalities. The primary objective of these types of applications in cultural heritage contexts is to connect displayed exhibits, buildings, or locations of interest with interpretive multimedia content. This might explain the prevalence of AR design patterns that enable direct, location-based information delivery, such as the AR tricorder and AR digital character narrator patterns. To enhance the explanatory potential of mobile guide apps, the AR spotlight pattern can be employed to direct users’ attention towards specified areas of exhibits, while the AR internal structure revelation and AR invisible element revelation patterns can help to uncover hidden or faded details that are imperceptible to the naked eye. Similarly, the AR augmented access and 3D object manipulation patterns offer visitors the opportunity to interact with protected and fragile cultural heritage artifacts, thereby fostering a better understanding of their composition.
Moreover, mobile guide applications should provide navigational assistance along the guided tour path. The AR 2D navigation annotations pattern is commonly utilized to address this requirement, while the visual information gap pattern can make visitors’ self-navigation more engaging through gamification. Lastly, the AR tangible viewer pattern integrates tangible interfaces whose affordances can be leveraged in guided museum tours.
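The recurring design problems of mobile guides and the patterns addressing them, as described in this subsection, can be summarized in a small catalogue. The Python sketch below is a hypothetical organization of our own, used purely for illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GuideDesignProblem:
    """A recurring design problem in mobile guide apps and the patterns that address it."""
    problem: str
    patterns: tuple[str, ...]

# Hypothetical catalogue; problem labels and pattern names follow this review.
GUIDE_PROBLEMS = (
    GuideDesignProblem("information delivery",
                       ("AR tricorder", "AR digital character narrator")),
    GuideDesignProblem("attention direction", ("AR spotlight",)),
    GuideDesignProblem("revealing hidden details",
                       ("AR internal structure revelation",
                        "AR invisible element revelation")),
    GuideDesignProblem("interacting with fragile artifacts",
                       ("AR augmented access", "3D object manipulation")),
    GuideDesignProblem("navigation",
                       ("AR 2D navigation annotations", "visual information gap")),
)

def patterns_for(problem: str) -> tuple[str, ...]:
    """Look up the patterns recorded for a design problem (empty tuple if unknown)."""
    for entry in GUIDE_PROBLEMS:
        if entry.problem == problem:
            return entry.patterns
    return ()
```

Representations of this kind could feed the semi-automated authoring environments mentioned in our future work, although any such tool would need to weigh the environment- and genre-related constraints discussed throughout Section 5.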

5.2.4. AR Design Patterns for Mobile Games

Gamification elements were present in a considerable number of the reviewed papers, most commonly treasure-hunt-style location-based activities and non-AR multiple-choice quizzes. However, 12 out of 64 designs could be clearly categorized as mobile AR games. Game genres that stand out from conventional approaches include adventure games (escape rooms, mystery missions), simulation games, and resource management games. The AR design patterns that show potential in AR games are those that can be applied in location-based experiences that involve users traversing a walking tour. Such patterns, including the AR inventory, AR quiz, and visual information gap patterns, can foster visitors’ motivation by transforming the inevitable walking–touring activity into a more enjoyable experience. Since the game world and characters are crucial elements in game design, the AR recreation of the past and AR digital character narrator patterns can both contribute to the gameplay and storytelling aspects of mobile AR games in the CH domain. The former aims to create a historically relevant backdrop by superimposing digital cultural content over contemporary real-world settings, while the latter serves as a medium to communicate game challenges and storytelling content to the players, i.e., the visitors of a cultural heritage site. Lastly, the AR tabletop board pattern, which can be utilized in combination with many of the aforementioned AR design patterns, can transform a cultural heritage experience into an AR-enhanced yet classic type of game, akin to a board game, making it accessible to a wider audience regardless of their physical location.

6. Summary and Conclusions

This paper reviewed the current state of the art of mobile AR applications for CH with the aim of identifying the AR design patterns employed. We followed the PRISMA guidelines for this review; the methodology is depicted in Figure 1, and the characteristics of the identified studies are described in Section 2 (Methodology of the Review).
The first objective of the paper was to present the most important dimensions of the current state of mobile AR for CH concerning the purposes, contexts of use, and main technologies of application. We identified a corpus of 64 papers published within 2016–2023 for this purpose. The findings of this analysis were presented and discussed in Section 3 (General Dimensions of Mobile Augmented Reality Experiences for Cultural Heritage Storytelling) and are summarized in Figure 2. We reported on the generic dimensions of the works on mobile AR for CH: (a) the IDE + AR SDK (Unity is the most popular platform); (b) the OS platform (Android OS is the most common); (c) the environment, in the sense of the surrounding CH place or space (outdoor environments are the most common); (d) the user mode (single-user is the most common); (e) the AR tracking approach or technology (object-based is the most common); (f) the visual AR content (3D models are the most common); and (g) the non-AR content (maps are the most common). During the presentation of these dimensions, we provided selected examples of applications from the paper corpus.
The second objective of the paper was to identify and explain the main UX design patterns that have been proposed or adopted to materialize mobile AR experiences. This is presented in Section 4 (Design Patterns for Mobile User Experiences and Storytelling in Augmented Reality). After our analysis of the paper corpus, we identified a total of twenty-four (24) patterns, with sixteen (16) of them recurring in more than one paper. For each of the sixteen main patterns, we provided a title, an abstract illustration that indicates the main concept and its workings (where applicable), a table with concise information and references, and an analytical description with respect to examples from the paper corpus. A summary of the main mobile AR design patterns identified is provided in Table 3.
The third objective of the paper was to discuss the interaction challenges that are addressed by these design patterns and the degree to which these are addressed satisfactorily, indicating good practices and limitations, as well as areas for further research and development. This is presented in Section 5 (Discussion). We have discussed the identified design patterns in terms of the factors of system design and development (Section 5.1, summarized in Figure 12), i.e., the AR tracking constraints, the real-world augmentation effect, the requirements for content production, and the storytelling and interaction affordances, and the types of CH environments or application genres (Section 5.2), i.e., indoors/outdoors, location-based, mobile guides, and mobile games.
We can identify some limitations in our work. The first is the query method, which inevitably constrained the sample in ways that could not be easily assessed at the time of writing. To minimize the risk of omitting important work, we used multiple search terms and investigated the citation lists of the papers identified. Second, the selection of the papers was made in early 2024, with the consequence that papers published during or after this period would not have appeared in the search engines and digital libraries. Finally, the perspective of our review was constrained, to some degree, with respect to the full range of work on mobile AR for CH: we focused on empirical research (i.e., work including a section on the user-centered evaluation of the systems or approaches), which represents a subset of the whole corpus of work in the field. Naturally, there is more work on mobile AR interaction (i) beyond applications in CH, (ii) with the use of other wearable devices, and (iii) with a strong technical orientation that presents the implementation of innovative techniques and algorithms. We have not included this work in our review for reasons related to our own broader research plan; nonetheless, the body of work consisting of empirical research has its own characteristics and methods and is already quite large and still growing.
The main contribution of this paper is the identification and description of the mobile AR design patterns (Section 4) and their discussion with respect to several common issues (Section 5) that UX designers and developers are confronted with in their attempts to incorporate AR into CH applications. As explained earlier in the paper, this is the first review of mobile AR for CH from the perspective of design patterns. This work can contribute to a better understanding of the affordances and constraints of the design patterns in current and future scenarios and applications of mobile AR for CH. Furthermore, researchers and practitioners in mobile AR interaction design may be stimulated by the facts and ideas presented in this review, reflect on the issues identified, enrich their knowledge of the state of the art in UX design, and possibly rethink and improve their own work and practice. Our future work on mobile AR design patterns will include the further functional and technical analysis of selected patterns with the aim of incorporating these into semi-automated environments for the development of mobile AR experiences for CH in cooperation with educators and CH professionals.

Author Contributions

Conceptualization, A.N. and P.K.; methodology, A.N. and P.K.; validation, A.N. and P.K.; investigation, A.N. and P.K.; resources, A.N. and P.K.; data curation, A.N.; writing—original draft preparation, A.N.; writing—review and editing, P.K.; visualization, A.N.; supervision, P.K.; project administration, P.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

The authors would like to thank Triantafillos Triantafillou for his valuable assistance in illustrating the AR design patterns.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Bayle, E.; Bellamy, R.; Casaday, G.; Erickson, T.; Fincher, S.; Grinter, B.; Gross, B.; Lehder, D.; Marmolin, H.; Moore, B.; et al. Putting it all together: Towards a pattern language for interaction design. ACM SIGCHI Bull. 1998, 30, 17–23. [Google Scholar] [CrossRef]
  2. Björk, S.; Holopainen, J. Patterns in Game Design; Game Development Series; Charles River Media, Inc.: Needham, MA, USA, 2004. [Google Scholar]
  3. Borchers, J.O. A pattern approach to interaction design. In Proceedings of the 3rd Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques, New York, NY, USA, 17–19 August 2000; pp. 369–378. [Google Scholar]
  4. Spierling, U.; Kampa, A. An extensible system and its design constraints for location-based serious games with augmented reality. In Serious Games: Third Joint International Conference, JCSG 2017, Valencia, Spain, 23–24 November 2017; Proceedings 3; Springer International Publishing: Cham, Switzerland, 2017; pp. 60–72. [Google Scholar]
  5. Panou, C.; Ragia, L.; Dimelli, D.; Mania, K. An architecture for mobile outdoors augmented reality for cultural heritage. ISPRS Int. J. Geo-Inf. 2018, 7, 463. [Google Scholar] [CrossRef]
  6. Galatis, P.; Gavalas, D.; Kasapakis, V.; Pantziou, G.E.; Zaroliagis, C.D. Mobile Augmented Reality Guides in Cultural Heritage. In Proceedings of the MobiCASE’16: Proceedings of the 8th EAI International Conference on Mobile Computing, Applications and Services, Cambridge, UK, 30 November–1 December 2016; pp. 11–19. [Google Scholar]
  7. Goh, E.S.; Sunar, M.S.; Ismail, A.W. 3D object manipulation techniques in handheld mobile augmented reality interface: A review. IEEE Access 2019, 7, 40581–40601. [Google Scholar] [CrossRef]
  8. Cao, J.; Lam, K.Y.; Lee, L.H.; Liu, X.; Hui, P.; Su, X. Mobile augmented reality: User interfaces, frameworks, and intelligence. ACM Comput. Surv. 2023, 55, 1–36. [Google Scholar] [CrossRef]
  9. Boboc, R.G.; Băutu, E.; Gîrbacia, F.; Popovici, N.; Popovici, D.M. Augmented Reality in Cultural Heritage: An Overview of the Last Decade of Applications. Appl. Sci. 2022, 12, 9859. [Google Scholar] [CrossRef]
  10. Russo, M. AR in the Architecture Domain: State of the Art. Appl. Sci. 2021, 11, 6800. [Google Scholar] [CrossRef]
  11. Aliprantis, J.; Caridakis, G. A survey of augmented reality applications in cultural heritage. Int. J. Comput. Methods Herit. Sci. 2019, 3, 118–147. [Google Scholar] [CrossRef]
  12. Chatzopoulos, D.; Bermejo, C.; Huang, Z.; Hui, P. Mobile augmented reality survey: From where we are to where we go. IEEE Access 2017, 5, 6917–6950. [Google Scholar] [CrossRef]
  13. Challenor, J.; Ma, M. A review of augmented reality applications for history education and heritage visualisation. Multimodal Technol. Interact. 2019, 3, 39. [Google Scholar] [CrossRef]
  14. Kljun, M.; Geroimenko, V.; Čopič Pucihar, K. Augmented reality in education: Current status and advancement of the field. In Augmented Reality in Education: A New Technology for Teaching and Learning; Springer: Cham, Switzerland, 2020; pp. 3–21. [Google Scholar] [CrossRef]
  15. Gonzalez Vargas, J.C.; Fabregat, R.; Carrillo-Ramos, A.; Jové, T. Survey: Using augmented reality to improve learning motivation in cultural heritage studies. Appl. Sci. 2020, 10, 897. [Google Scholar] [CrossRef]
  16. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. Int. J. Surg. 2021, 88, 105906. [Google Scholar] [CrossRef] [PubMed]
  17. Koutsabasis, P.; Partheniadis, K.; Gardeli, A.; Vogiatzidakis, P.; Nikolakopoulou, V.; Chatzigrigoriou, P.; Vosinakis, S. Location-Based Games for Cultural Heritage: Applying the Design Thinking Process. In Proceedings of the CHI Greece 2021: 1st International Conference of the ACM Greek SIGCHI Chapter, Athens, Greece, 25–27 November 2021; pp. 1–8. [Google Scholar]
  18. Chiang, K.C.; Weng, C.; Rathinasabapathi, A.; Chen, H.; Su, J.H. Augmented Reality Supported Learning for Cultural Heritage of Taiwan in On-site and Off-site Environments: The case of a Daxi Old Street. ACM J. Comput. Cult. Herit. 2023, 16, 1–17. [Google Scholar] [CrossRef]
  19. Aitamurto, T.; Boin, J.B.; Chen, K.; Cherif, A.; Shridhar, S. The impact of augmented reality on art engagement: Liking, impression of learning, and distraction. In Virtual, Augmented and Mixed Reality: Applications in Health, Cultural Heritage, and Industry: 10th International Conference, VAMR 2018, Held as Part of HCI International 2018, Las Vegas, NV, USA, 15–20 July 2018; Proceedings, Part II 10; Springer International Publishing: Cham, Switzerland, 2018; pp. 153–171. [Google Scholar]
  20. Alkhafaji, A.; Cocea, M.; Crellin, J.; Fallahkhair, S. Guidelines for designing a smart and ubiquitous learning environment with respect to cultural heritage. In Proceedings of the 2017 11th International Conference on Research Challenges in Information Science (RCIS), Brighton, UK, 10–12 May 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 334–339. [Google Scholar]
  21. Blanco-Pons, S.; Carrión-Ruiz, B.; Lerma, J.L.; Villaverde, V. Design and implementation of an augmented reality application for rock art visualization in Cova dels Cavalls (Spain). J. Cult. Herit. 2019, 39, 177–185. [Google Scholar] [CrossRef]
  22. Boboc, R.G.; Duguleană, M.; Voinea, G.D.; Postelnicu, C.C.; Popovici, D.M.; Carrozzino, M. Mobile augmented reality for cultural heritage: Following the footsteps of Ovid among different locations in Europe. Sustainability 2019, 11, 1167. [Google Scholar] [CrossRef]
  23. Boboc, R.G.; Gîrbacia, F.; Duguleană, M.; Tavčar, A. A handheld Augmented Reality to revive a demolished Reformed Church from Braşov. In Proceedings of the Virtual Reality International Conference-Laval Virtual, Laval, France, 22–24 March 2017; pp. 1–4. [Google Scholar]
  24. Bousbahi, F.; Boreggah, B. Mobile augmented reality adaptation through smartphone device based hybrid tracking to support cultural heritage experience. In Proceedings of the 2nd International Conference on Smart Digital Environment, Rabat, Morocco, 18–20 October 2018; pp. 48–55. [Google Scholar]
  25. Čejka, J.; Zsíros, A.; Liarokapis, F. A hybrid augmented reality guide for underwater cultural heritage sites. Pers. Ubiquitous Comput. 2020, 24, 815–828. [Google Scholar] [CrossRef]
  26. Chen, C.-C.; Kang, X.; Li, X.-Z.; Kang, J. Design and Evaluation for Improving Lantern Culture Learning Experience with Augmented Reality. Int. J. Hum.–Comput. Interact. 2024, 40, 1465–1478. [Google Scholar] [CrossRef]
  27. Cushing, A.L.; Cowan, B.R. Walk1916: Exploring non-research user access to and use of digital surrogates via a mobile walking tour app. J. Doc. 2017, 73, 917–933. [Google Scholar] [CrossRef]
  28. Damala, A.; Hornecker, E.; Van der Vaart, M.; van Dijk, D.; Ruthven, I. The Loupe: Tangible augmented reality for learning to look at Ancient Greek art. Mediterr. Archaeol. Archaeom. 2016, 16, 73–85. [Google Scholar]
  29. De Paolis, L.T.; De Luca, V.; D’Errico, G. Augmented reality to understand the Leonardo’s Machines. In Augmented Reality, Virtual Reality, and Computer Graphics: 5th International Conference, AVR 2018, Otranto, Italy, 24–27 June 2018; Proceedings, Part II 5; Springer International Publishing: Cham, Switzerland, 2018; pp. 320–331. [Google Scholar]
  30. dela Cruz, D.R.; Sevilla, J.S.; San Gabriel, J.W.D.; Cruz, A.J.P.D.; Caselis, E.J.S. Design and Development of Augmented Reality (AR) Mobile Application for Malolos’ Kameztizuhan (Malolos Heritage Town, Philippines). In Proceedings of the 2018 IEEE Games, Entertainment, Media Conference (GEM), Galway, Ireland, 15–17 August 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–9. [Google Scholar]
  31. Duguleana, M.; Brodi, R.; Girbacia, F.; Postelnicu, C.; Machidon, O.; Carrozzino, M. Time-travelling with mobile augmented reality: A case study on the Piazza dei Miracoli. In Digital Heritage. Progress in Cultural Heritage: Documentation, Preservation, and Protection: 6th International Conference, EuroMed 2016, Nicosia, Cyprus, 31 October–5 November 2016; Proceedings, Part I 6; Springer International Publishing: Cham, Switzerland, 2016; pp. 902–912. [Google Scholar]
  32. Fazio, S.; Turner, J. Bringing empty rooms to life for casual visitors using an AR adventure game: Skullduggery at old government house. J. Comput. Cult. Herit. 2020, 13, 1–21. [Google Scholar] [CrossRef]
  33. Fenu, C.; Pittarello, F. Svevo tour: The design and the experimentation of an augmented reality application for engaging visitors of a literary museum. Int. J. Hum.-Comput. Stud. 2018, 114, 20–35. [Google Scholar] [CrossRef]
  34. Foukarakis, M.; Faltakas, O.; Frantzeskakis, G.; Ntafotis, E.; Zidianakis, E.; Kontaki, E.; Manoli, C.; Ntoa, S.; Partarakis, N.; Stephanidis, C. Connecting Historic Photographs with the Modern Landscape. In Proceedings of the International Conference on Human-Computer Interaction, Copenhagen, Denmark, 23–28 July 2023; Springer Nature: Cham, Switzerland, 2023; pp. 408–414. [Google Scholar]
  35. Galani, S.; Vosinakis, S. Connecting Intangible Cultural Heritage and Architecture through Mobile Augmented Reality Narratives and Scale Models. In Proceedings of the 2022 International Conference on Interactive Media, Smart Systems and Emerging Technologies (IMET), Limassol, Cyprus, 4–7 October 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–8. [Google Scholar]
  36. Gao, G.; Zhang, Y.; Cheng, C.; Bu, Y.; Shih, P.C. Designing to enhance student participation in campus heritage using augmented reality. In Proceedings of the 2018 3rd Digital Heritage International Congress (DigitalHERITAGE) Held Jointly with 2018 24th International Conference on Virtual Systems & Multimedia (VSMM 2018), San Francisco, CA, USA, 26–30 October 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–4. [Google Scholar]
  37. Garro, V.; Sundstedt, V.; Sandahl, C. Impact of Location, Gender and Previous Experience on User Evaluation of Augmented Reality in Cultural Heritage: The Mjällby Crucifix Case Study. Heritage 2022, 5, 1988–2006. [Google Scholar] [CrossRef]
  38. Hammady, R.; Ma, M.; Powell, A. User experience of markerless augmented reality applications in cultural heritage museums: ‘MuseumEye’ as a case study. In Augmented Reality, Virtual Reality, and Computer Graphics: 5th International Conference, AVR 2018, Otranto, Italy, 24–27 June 2018; Proceedings, Part II 5; Springer International Publishing: Cham, Switzerland, 2018; pp. 349–369. [Google Scholar]
  39. Hincapie, M.; Diaz, C.; Zapata, M.; Mesias, C. Methodological framework for the design and development of applications for reactivation of cultural heritage: Case study cisneros marketplace at Medellin, Colombia. J. Comput. Cult. Herit. 2016, 9, 1–24. [Google Scholar] [CrossRef]
  40. Ioannidi, A.; Gavalas, D.; Kasapakis, V. Flaneur: Augmented exploration of the architectural urbanscape. In Proceedings of the 2017 IEEE Symposium on Computers and Communications (ISCC), Heraklion, Greece, 3–6 July 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 529–533. [Google Scholar]
  41. Tian, J.; Cao, Y.; Feng, L.; Fu, D.; Yuan, K.; Qu, H.; Wang, Y.; Fan, M. PoeticAR: Reviving Traditional Poetry of the Heritage Site of Jichang Garden via Augmented Reality. Int. J. Hum.–Comput. Interact. 2023, 40, 1438–1454. [Google Scholar] [CrossRef]
  42. Kalmpourtzis, G.; Ketsiakidis, G.; Vrysis, L.; Xi, T.; Wang, X.L.; Dimoulas, C. Eliciting Educators’ Needs on the Design and Application of Augmented Reality Educational Board Games on Cultural Heritage: The case of CHARMap. In Proceedings of the 2021 IEEE Global Engineering Education Conference (EDUCON), Vienna, Austria, 21–23 April 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1282–1286. [Google Scholar]
  43. Koo, S.; Kim, J.; Kim, C.; Kim, J.; Cha, H.S. Development of an augmented reality tour guide for a cultural heritage site. J. Comput. Cult. Herit. 2019, 12, 1–24. [Google Scholar] [CrossRef]
  44. Kotsopoulos, K.I.; Chourdaki, P.; Tsolis, D.; Antoniadis, R.; Pavlidis, G.; Assimakopoulos, N. An authoring platform for developing smart apps which elevate cultural heritage experiences: A system dynamics approach in gamification. J. Ambient. Intell. Humaniz. Comput. 2019, 15, 1679–1695. [Google Scholar] [CrossRef]
  45. Lee, J.; Lee, H.K.; Jeong, D.; Lee, J.; Kim, T.; Lee, J. Developing museum education content: AR blended learning. Int. J. Art Des. Educ. 2021, 40, 473–491. [Google Scholar] [CrossRef]
  46. Lehto, A.; Luostarinen, N.; Kostia, P. Augmented reality gaming as a tool for subjectivizing visitor experience at cultural heritage locations—Case lights on! J. Comput. Cult. Herit. 2020, 13, 1–16. [Google Scholar] [CrossRef]
  47. Liang, H.; Ge, C.; Sun, Y.; Li, P.U.; Liang, F.; Wang, C. Prototype of porcelain safety display based on Augmented Reality Technology. In Proceedings of the 2020 9th International Conference on Networks, Communication and Computing, Tokyo, Japan, 18–20 December 2020; pp. 57–64. [Google Scholar]
  48. Liestøl, G. The photo positioning puzzle: Creating engaging applications for historical photographs by combining mobile augmented reality and gamification. In Proceedings of the 2018 3RD Digital Heritage International Congress (Digitalheritage) Held Jointly with 2018 24th International Conference on Virtual Systems & Multimedia (VSMM 2018), San Francisco, CA, USA, 26–30 October 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–8. [Google Scholar]
  49. Luiro, E.; Hannula, P.; Launne, E.; Mustonen, S.; Westerlund, T.; Häkkilä, J. Exploring local history and cultural heritage through a mobile game. In Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia, Pisa, Italy, 26–29 November 2019; pp. 1–4. [Google Scholar]
  50. Marques, D.; Costello, R. Concerns and challenges developing mobile augmented reality experiences for museum exhibitions. Curator: Mus. J. 2018, 61, 541–558. [Google Scholar] [CrossRef]
  51. Marto, A.; Gonçalves, A.; de Sousa, A.A. DinofelisAR: Users’ perspective about a mobile AR application in cultural heritage. In VR Technologies in Cultural Heritage: First International Conference, VRTCH 2018, Brasov, Romania, 29–30 May 2018; Revised Selected Papers 1; Springer International Publishing: Cham, Switzerland, 2019; pp. 79–92. [Google Scholar]
  52. Matviienko, A.; Günther, S.; Ritzenhofen, S.; Mühlhäuser, M. AR Sightseeing: Comparing Information Placements at Outdoor Historical Heritage Sites using Augmented Reality. Proc. ACM Hum.-Comput. Interact. MHCI 2022, 6, 1–17. [Google Scholar]
  53. Nisi, V.; Cesario, V.; Nunes, N. Augmented reality museum’s gaming for digital natives: Haunted encounters in the Carvalhal’s palace. In Entertainment Computing and Serious Games: First IFIP TC 14 Joint International Conference, ICEC-JCSG 2019, Arequipa, Peru, 11–15 November 2019; Proceedings 1; Springer International Publishing: Cham, Switzerland, 2019; pp. 28–41. [Google Scholar]
  54. Ntagiantas, A.; Manousos, D.; Konstantakis, M.; Aliprantis, J.; Caridakis, G. Augmented Reality children’s book for intangible cultural heritage through participatory content creation and promotion. Case study: The pastoral life of Psiloritis as a UNESCO World Geopark. In Proceedings of the 2021 16th International Workshop on Semantic and Social Media Adaptation & Personalization (SMAP), Corfu, Greece, 4–5 November 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1–4. [Google Scholar]
  55. O’Dwyer, N.; Zerman, E.; Young, G.W.; Smolic, A.; Dunne, S.; Shenton, H. Volumetric video in augmented reality applications for museological narratives: A user study for the Long Room in the Library of Trinity College Dublin. J. Comput. Cult. Herit. 2021, 14, 1–20. [Google Scholar] [CrossRef]
  56. Paliokas, I.; Patenidis, A.T.; Mitsopoulou, E.E.; Tsita, C.; Pehlivanides, G.; Karyati, E.; Stathopoulos, E.A.; Kokkalas, A.; Diplaris, S.; Meditskos, G.; et al. A gamified augmented reality application for digital heritage and tourism. Appl. Sci. 2020, 10, 7868. [Google Scholar] [CrossRef]
  57. Petrucco, C.; Agostini, D. Teaching cultural heritage using mobile augmented reality. J. e-Learn. Knowl. Soc. 2016, 12, 115–128. [Google Scholar] [CrossRef] [PubMed]
  58. Pittarello, F.; Carrieri, A.; Pellegrini, T.; Volo, A. Remembering the city: Stumbling stones, memory sites and augmented reality. In Proceedings of the 2022 International Conference on Advanced Visual Interfaces, Rome, Italy, 6–10 June 2022; pp. 1–9. [Google Scholar]
  59. Plecher, D.A.; Eichhorn, C.; Köhler, A.; Klinker, G. Oppidum: A serious AR game about Celtic life and history. In Games and Learning Alliance: 8th International Conference, GALA 2019, Athens, Greece, 27–29 November 2019; Proceedings 8; Springer International Publishing: Cham, Switzerland, 2019; pp. 550–559. [Google Scholar]
  60. Rizvić, S.; Bošković, D.; Okanović, V.; Kihić, I.I.; Prazina, I.; Mijatović, B. Time travel to the past of Bosnia and Herzegovina through virtual and augmented reality. Appl. Sci. 2021, 11, 3711. [Google Scholar] [CrossRef]
  61. Romano, M.; Díaz, P.; Ignacio, A.; D’Agostino, P. Augmenting smart objects for cultural heritage: A usability experiment. In Augmented Reality, Virtual Reality, and Computer Graphics: Third International Conference, AVR 2016, Lecce, Italy, 15–18 June 2016; Proceedings, Part II 3; Springer International Publishing: Cham, Switzerland, 2016; pp. 186–204. [Google Scholar]
  62. Sabie, D.; Sheta, H.; Ferdous, H.S.; Kopalakrishnan, V.; Ahmed, S.I. Be Our Guest: Intercultural Heritage Exchange through Augmented Reality (AR). In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–28 April 2023; pp. 1–15. [Google Scholar]
  63. Schaper, M.M.; Santos, M.; Malinverni, L.; Berro, J.Z.; Pares, N. Learning about the past through situatedness, embodied exploration and digital augmentation of cultural heritage sites. Int. J. Hum.-Comput. Stud. 2018, 114, 36–50. [Google Scholar] [CrossRef]
  64. Schönhofer, A.; Hubner, S.; Rashed, P.; Aigner, W.; Judmaier, P.; Seidl, M. Viennar: User-centered-design of a bring your own device mobile application with augmented reality. In Augmented Reality, Virtual Reality, and Computer Graphics: 5th International Conference, AVR 2018, Otranto, Italy, 24–27 June 2018; Proceedings, Part II 5; Springer International Publishing: Cham, Switzerland, 2018; pp. 275–291. [Google Scholar]
  65. Sekhavat, Y.A. KioskAR: An augmented reality game as a new business model to present artworks. Int. J. Comput. Games Technol. 2016, 2016, 7690754. [Google Scholar] [CrossRef]
  66. Shih, N.J.; Diao, P.H.; Chen, Y. ARTS, an AR tourism system, for the integration of 3D scanning and smartphone AR in cultural heritage tourism and pedagogy. Sensors 2019, 19, 3725. [Google Scholar] [CrossRef] [PubMed]
  67. Shin, J.E.; Park, H.; Woo, W. Connecting the dots: Enhancing the usability of indexed multimedia data for ar cultural heritage applications through storytelling. In Proceedings of the 15th International Workshop on Content-Based Multimedia Indexing, Florence, Italy, 19–21 June 2017; pp. 1–6. [Google Scholar]
  68. Siang, T.G.; Ab Aziz, K.B.; Ahmad, Z.B.; Suhaifi, S.B. Augmented reality mobile application for museum: A technology acceptance study. In Proceedings of the 2019 6th International Conference on Research and Innovation in Information Systems (ICRIIS), Johor Bahru, Malaysia, 2–3 December 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1–6. [Google Scholar]
  69. Střelák, D.; Škola, F.; Liarokapis, F. Examining user experiences in a mobile augmented reality tourist guide. In Proceedings of the 9th ACM International Conference on Pervasive Technologies Related to Assistive Environments, Corfu, Greece, 29 June–1 July 2016; pp. 1–8. [Google Scholar]
  70. Tan, K.L.; Lim, C.K. Digital heritage gamification: An augmented-virtual walkthrough to learn and explore historical places. In AIP Conference Proceedings, the 2nd International Conference on Applied Science and Technology 2017 (ICAST’17), Kedah, Malaysia, 3–5 April 2017; AIP Publishing: Melville, NY, USA, 2017; Volume 1891. [Google Scholar]
  71. Tan, S.N.; Ng, K.H. Gamified mobile sensing storytelling application for enhancing remote cultural experience and engagement. Int. J. Hum.–Comput. Interact. 2022, 40, 1383–1396. [Google Scholar] [CrossRef]
  72. Teixeira, N.; Lahm, B.; Peres, F.F.F.; Mauricio, C.R.M.; Xavier Natario Teixeira, J.M. Augmented Reality on Museums: The Ecomuseu Virtual Guide. In Proceedings of the Symposium on Virtual and Augmented Reality, Virtual, 18–21 October 2021; pp. 147–156. [Google Scholar]
  73. Tsai, T.H.; Lee, L.C. Location-based augmented reality for exploration of cultural heritage. ICIC Express Lett. Part B Appl. Int. J. Res. Surv. 2017, 8, 1577–1583. [Google Scholar]
  74. Tzima, S.; Styliaras, G.; Bassounas, A. Revealing hidden local cultural heritage through a serious escape game in outdoor settings. Information 2020, 12, 10. [Google Scholar] [CrossRef]
  75. Tzima, S.; Styliaras, G.; Bassounas, A. Augmented Reality in Outdoor Settings: Evaluation of a Hybrid Image Recognition Technique. J. Comput. Cult. Herit. 2021, 14, 1–17. [Google Scholar] [CrossRef]
  76. Voinea, G.D.; Girbacia, F.; Postelnicu, C.C.; Marto, A. Exploring cultural heritage using augmented reality through Google’s Project Tango and ARCore. In VR Technologies in Cultural Heritage: First International Conference, VRTCH 2018, Brasov, Romania, 29–30 May 2018; Revised Selected Papers 1; Springer International Publishing: Cham, Switzerland, 2019; pp. 93–106. [Google Scholar]
  77. Xu, N.; Liang, J.; Shuai, K.; Li, Y.; Yan, J. HeritageSite AR: An Exploration Game for Quality Education and Sustainable Cultural Heritage. In Proceedings of the Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–28 April 2023; pp. 1–8. [Google Scholar]
  78. Vosinakis, S. Digital characters in cultural heritage applications. Int. J. Comput. Methods Herit. Sci. 2017, 1, 1–20. [Google Scholar] [CrossRef]
  79. Lamantia, J. Inside Out: Interaction Design for Augmented Reality. UXmatters. 2009. Available online: https://www.uxmatters.com/mt/archives/2009/08/inside-out-interaction-design-for-augmented-reality.php (accessed on 13 January 2024).
  80. Emmerich, F.; Klemke, R.; Hummes, T. Design Patterns for Augmented Reality Learning Games. In Games and Learning Alliance: 6th International Conference, GALA 2017, Lisbon, Portugal, 5–7 December 2017; Proceedings; Lecture Notes in Computer Science; Dias, J., Santos, P., Veltkamp, R., Eds.; Springer: Berlin/Heidelberg, Germany, 2017; Volume 10653, pp. 161–172. [Google Scholar] [CrossRef]
  81. Nikolarakis, A.; Koutsabasis, P.; Gavalas, D. A location-based mobile guide for gamified exploration, audio narrative and visitor social interaction in cultural exhibitions. In Proceedings of the International Conference on Human-Computer Interaction, Virtual, 26 June–1 July 2022; Springer International Publishing: Cham, Switzerland, 2022; pp. 247–255. [Google Scholar]
  82. Anastopoulou, N.; Kokla, M.; Tomai, E.; Cheliotis, K.; Liarokapis, F.; Pastra, K.; Kavouras, M. Cartographic perspectives on spatial and thematic levels of detail in augmented reality: A review of existing approaches. Int. J. Cartogr. 2023, 9, 373–391. [Google Scholar] [CrossRef]
  83. Adams, E. Fundamentals of Game Design; Pearson Education: London, UK, 2014. [Google Scholar]
  84. Garris, R.; Ahlers, R.; Driskell, J.E. Games, motivation, and learning: A research and practice model. Simul. Gaming 2002, 33, 441–467. [Google Scholar] [CrossRef]
  85. Nilsson, S.; Arvola, M.; Szczepanski, A.; Bång, M. Exploring place and direction: Mobile augmented reality in the Astrid Lindgren landscape. In Proceedings of the 24th Australian Computer-Human Interaction Conference, Melbourne, Australia, 26–30 November 2012; pp. 411–419. [Google Scholar]
  86. Nikolakopoulou, V.; Vosinakis, S.; Nikopoulos, G.; Stavrakis, M.; Politopoulos, N.; Fragkedis, L.; Koutsabasis, P. Design and User Experience of a Hybrid Mixed Reality Installation that Promotes Tinian Marble Crafts Heritage. ACM J. Comput. Cult. Herit. 2022, 15, 1–21. [Google Scholar] [CrossRef]
  87. Norman, D.A. The Design of Everyday Things: Revised and Expanded Edition; Basic Books: New York, NY, USA, 2013. [Google Scholar]
  88. Chrysanthi, A.; Katifori, A.; Vayanou, M.; Antoniou, A. Place-based digital storytelling. the interplay between narrative forms and the cultural heritage space. In Proceedings of the International Conference on Emerging Technologies and the Digital Transformation of Museums and Heritage Sites, Nicosia, Cyprus, 2–4 June 2021; Springer International Publishing: Cham, Switzerland, 2021; pp. 127–138. [Google Scholar]
  89. Azuma, R. Location-based mixed and augmented reality storytelling. In Fundamentals of Wearable Computers and Augmented Reality, 2nd ed.; CRC Press: Boca Raton, FL, USA, 2015; pp. 259–276. [Google Scholar]
Figure 1. PRISMA flowchart diagram.
Figure 2. The taxonomy of the 64 reviewed papers regarding core technological and UX aspects: purpose, cultural heritage field, IDE and AR SDK, OS platform, environment, user mode, AR tracking method, visual AR content, non-AR content [4,5,6,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77].
Figure 3. Illustration of AR tricorder pattern.
Figure 4. Illustration of AR navigation pattern: 2D navigation annotations.
Figure 5. Illustration of AR internal structure revelation pattern.
Figure 6. Illustration of AR navigation pattern: visual information gap.
Figure 7. Illustration of AR augmented access pattern.
Figure 8. Illustration of AR tabletop board pattern.
Figure 9. Illustration of invisible element revelation pattern.
Figure 10. Illustration of AR quiz pattern.
Figure 11. Illustration of AR spotlight effect.
Figure 12. An overview of the sixteen (16) main AR design patterns critically presented, along with design and development-related dimensions: AR tracking, real-world augmentation effect, requirements for content production, and storytelling affordances.
Table 1. Years of publication of the paper corpus.
Year | # | %
2023 | 6 | 9%
2022 | 5 | 8%
2021 | 8 | 13%
2020 | 5 | 8%
2019 | 12 | 19%
2018 | 12 | 19%
2017 | 8 | 13%
2016 | 8 | 13%
Table 2. Publication venues of the paper corpus.
Publication Venue | # | %
Journals | 30 | 46.9%
ACM Journal of Computing and Cultural Heritage | 7 | 10.9%
International Journal of Human–Computer Interaction | 3 | 4.7%
International Journal of Human–Computer Studies | 2 | 3.1%
Applied Sciences | 2 | 3.1%
Other (one occurrence) | 16 | 24.6%
Conferences | 34 | 53.1%
Augmented and Virtual Reality (AVR) | 4 | 6.3%
HCI International (HCII) | 2 | 3.1%
Human Factors in Computing Systems (CHI) | 2 | 3.1%
Games and Learning Alliance (GALA) | 2 | 3.1%
Virtual Systems and Multimedia (VSMM) | 2 | 3.1%
Serious Games | 2 | 3.1%
Other (one occurrence) | 20 | 31.3%
Table 3. Occurrence of main AR design patterns in the paper corpus.
AR Design Pattern Name | # | %
AR Recreation of the Past | 18 | 28.1%
AR Tricorder Pattern | 14 | 21.8%
AR 3D Object Manipulation | 11 | 17.2%
AR Digital Character Narrator | 9 | 14.1%
AR Navigation Pattern: 2D Navigation Annotations | 6 | 9.4%
AR Internal Structure Revelation (X-ray effect) | 6 | 9.4%
AR Inventory | 5 | 7.8%
AR Navigation Pattern: Visual Information Gap | 4 | 6.3%
AR Tabletop Board | 4 | 6.3%
AR Time Travel Effect | 3 | 4.7%
Augmented Access | 3 | 4.7%
Invisible Element Revelation | 3 | 4.7%
AR Photo Capturing | 3 | 4.7%
AR Quiz Design | 2 | 3.1%
AR Spotlight Effect | 2 | 3.1%
AR Tangible Viewer Pattern | 2 | 3.1%
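The percentages in Tables 1–3 are shares of the 64-paper corpus; because several patterns often co-occur in a single application, the pattern counts in Table 3 need not sum to 64. A minimal sketch of the underlying arithmetic, using counts taken from Table 3 (note that a couple of the published percentages differ from straightforward rounding by 0.1, apparently due to truncation):

```python
# Occurrences of selected main AR design patterns (counts from Table 3)
# within the 64-paper corpus; patterns can co-occur in one application,
# so the counts are not mutually exclusive.
CORPUS_SIZE = 64

pattern_counts = {
    "AR Recreation of the Past": 18,
    "AR 3D Object Manipulation": 11,
    "AR Digital Character Narrator": 9,
    "AR Inventory": 5,
    "AR Time Travel Effect": 3,
}

for name, count in pattern_counts.items():
    pct = round(count / CORPUS_SIZE * 100, 1)  # share of the corpus
    print(f"{name}: {count} papers ({pct}%)")
```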
Table 4. AR recreation of the past pattern: overview.
Name: AR Recreation of the Past Pattern
Definition: An AR visual result shaped by blending, to a variable extent, virtual reconstructed elements and recreated events from historical eras with the contemporary real world
Use cases:
- AR recreation of a historical scene with real-world navigation
- AR recreation of a historical scene or event with 2D images, animated 3D objects, and photospheres with a 360-degree view
- Active AR scene recreation by users
References: [5,22,23,25,27,30,31,33,35,39,43,48,51,57,59,66,69,71]
Table 5. AR tricorder pattern: overview.
Name: AR Tricorder Pattern
Definition: Overlays informative AR content about a location or object of interest over the real-world setting
Use cases:
- Location-based AR information
- Augmenting printed labels
References: [18,24,37,40,41,43,52,56,58,63,67,74,75,77]
Table 6. AR 3D object manipulation: overview.
Name: AR 3D Object Manipulation
Definition: Users interact with virtual objects overlaid in the real environment with touch-based single-hand interactions
Use cases:
- Object-specific manipulation in AR scene
- Manipulation cases for 3D objects: food dishes, porcelain vases, monuments, buildings, exhibits
References: [17,38,43,45,47,53,54,56,60,62,71]
Table 7. AR digital character narrator pattern: overview.
Name: AR Digital Character Narrator Pattern
Definition: Digital characters presented via AR that play the role of a presenter or virtual guide for visitors of a cultural heritage site
Use cases:
- 2D image narrator with audio or text prompts
- 3D narrator with audio or text prompts
- Dialogs between actors through location-based videos
References: [4,32,33,46,54,55,64,70,73]
Table 8. AR navigation pattern: 2D navigation annotations—overview.
Name: AR Navigation Pattern: 2D Navigation Annotations
Definition: Digital overlays with navigational or informative content
Use cases:
- AR 2D navigation annotations: arrows, text information (e.g., distance), icon information (e.g., monument category)
- Non-AR radar interface
References: [5,6,24,43,44,48]
Table 9. AR internal structure revelation pattern: overview.
Name: AR Internal Structure Revelation Pattern (X-ray Effect)
Definition: Visualizes the hidden or obscured internal structures, components, and mechanisms of physical objects and demonstrates how they are constructed and function
Use cases:
- Inner structure presentation: active or passive
- AR artwork structure presentation and process of creation
References: [29,33,43,45,50,59]
Table 10. AR inventory pattern: overview.
Name: AR Inventory Pattern
Definition: Users collect digital items that are either superimposed onto the real world via AR or whose utilization can trigger a visual AR effect
Use cases:
- Capturing and carrying 3D objects to the user’s inventory
- Distributing items from inventory (reverse pattern)
References: [17,49,65,72,77]
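The collect-and-use mechanic of the AR inventory pattern can be outlined in a few lines. This is an illustrative sketch only, not taken from any reviewed application; the class, item identifiers, and the stubbed "AR effect" string are all hypothetical:

```python
# Sketch of the AR inventory pattern: items discovered in the AR scene
# are captured into a per-user inventory, and using a carried item
# triggers a (stubbed) visual AR effect. All names are hypothetical.

class ARInventory:
    def __init__(self):
        self.items = []

    def capture(self, item_id):
        """Collect a digital item superimposed onto the real world."""
        if item_id not in self.items:
            self.items.append(item_id)

    def use(self, item_id):
        """Use (or, in the reverse pattern, distribute) a carried item.

        Returns the AR effect to trigger, or None if the item is not carried.
        """
        if item_id in self.items:
            self.items.remove(item_id)
            return f"trigger AR effect for {item_id}"
        return None

inv = ARInventory()
inv.capture("amphora_3d")
print(inv.use("amphora_3d"))  # item is consumed on use
print(inv.use("amphora_3d"))  # already used: None
```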
Table 11. AR navigation pattern: visual information gap—overview.
Name: AR Navigation Pattern: Visual Information Gap
Definition: Conceals or selectively presents visual elements of an image or environment to provide a gamified navigational aid for visitors
Use cases:
- Locating exhibits or buildings based on their outlines (stencils)
- Matching silhouettes of exhibits or buildings
References: [4,28,43,53]
Table 12. AR time travel effect: overview.
Name: AR Time Travel Effect
Definition: Displays AR content from different time periods, allowing users to control the time
Use cases:
- Blending past and present with 3D models or 2D images
- Time travel effect with timeline UI
References: [31,34,36]
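For the timeline-UI variant of the time travel effect, the core logic is mapping a slider value to the closest historical period for which reconstructed AR content exists. A minimal sketch, in which the periods and overlay asset names are hypothetical:

```python
# Sketch of the timeline-UI variant of the AR time travel effect:
# the year chosen on a timeline slider selects the closest historical
# period with available AR content. Periods/assets are hypothetical.

PERIODS = {
    1850: "engraving_overlay",  # reconstructed 19th-century view
    1920: "photo_overlay",      # archival-photo-based view
    2000: "present_view",       # unaugmented present-day view
}

def overlay_for_year(year):
    """Pick the AR overlay whose period is closest to the slider year."""
    closest = min(PERIODS, key=lambda p: abs(p - year))
    return PERIODS[closest]

print(overlay_for_year(1900))  # -> photo_overlay
```

Snapping to the nearest available period keeps the slider continuous for the user even when content exists only for a few discrete eras.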
Table 13. Augmented access pattern: overview.
Name: Augmented Access Pattern
Definition: Protected displayed artifacts can be accessed virtually through realistic 3D models
References: [47,53,56]
Table 14. AR tabletop board pattern: overview.
Name: AR Tabletop Board Pattern
Definition: Involves a physical board and other printed materials that act as AR markers for superimposition of AR content
Use cases:
- All-seeing top-down perspective in AR scene
- AR physical tabletop (game board)
References: [35,42,59,76]
Table 15. Invisible element revelation pattern.
Name: Invisible Element Revelation Pattern
Definition: Superimposes AR overlays onto physical objects in order to restore faded colors and markings or to unveil invisible mechanics
Use cases:
- Overlay with perspective lines over artwork
- Recoloring of statues
- Revealing faded rock art paintings
References: [19,21,28]
Table 16. AR photo capturing pattern: overview.
Name: AR Photo Capturing Pattern
Definition: Capturing images with a blended view of a real-world scene overlaid with AR content
Use cases:
- User-generated content/sharing functionality
- AR photo capturing with historical figures or cultural artifacts
References: [36,47,68]
Table 17. AR quiz pattern: overview.
Name: AR Quiz Pattern
Definition: An AR quiz accepts visual input from a live camera feed as the user’s response
Use cases:
- Answering via scanning the correct exhibit
References: [56,71]
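In the scan-to-answer use case, the answer check reduces to comparing the image target recognized by the AR SDK against the target expected by the question. A minimal sketch under that assumption; the question data and target identifiers are hypothetical, and the scanned IDs below simulate the SDK's recognition events rather than a real camera feed:

```python
# Sketch of the AR quiz pattern's answer check: the user "answers" by
# pointing the camera at an exhibit, and the image target recognized
# by the AR SDK is compared with the expected one.

def check_answer(recognized_target_id, correct_target_id):
    """Return True if the scanned exhibit answers the quiz question."""
    return recognized_target_id == correct_target_id

question = {
    "prompt": "Scan the exhibit made of marble.",
    "correct_target": "marble_relief_07",  # hypothetical target ID
}

# Simulated recognition events standing in for the live camera feed:
for scanned in ["bronze_helmet_02", "marble_relief_07"]:
    print(scanned, "->", check_answer(scanned, question["correct_target"]))
```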
Table 18. AR spotlight effect: overview.
Name: AR Spotlight Effect
Definition: Aims to guide users’ gaze to a specified area of a physical or digital object
Use cases:
- Textual annotations over physical exhibits
- 3D animations over physical exhibits
References: [19,50]
Table 19. AR tangible viewer pattern: overview.
Name: AR Tangible Viewer Pattern
Definition: Small-scale, tangible interfaces and replicas of actual objects, embedded with AR features
References: [28,61]