Article

Real-Time Projection-Based Augmented Reality System for Dynamic Objects in the Performing Arts

Jaewoon Lee, Yeonjin Kim, Myeong-Hyeon Heo, Dongho Kim and Byeong-Seok Shin

1. Department of Digital Media, Graduate School, Soongsil University, 369 Sangdo-ro, Dongjak-gu, Seoul 156743, Korea
2. Department of Sports Information Technology, Graduate School, Soongsil University, 369 Sangdo-ro, Dongjak-gu, Seoul 156743, Korea
3. Department of Computer Science and Information Engineering, Inha University, 100 Inha-ro, Nam-gu, Incheon 402751, Korea
* Author to whom correspondence should be addressed.
Symmetry 2015, 7(1), 182-192; https://doi.org/10.3390/sym7010182
Submission received: 23 January 2015 / Revised: 10 February 2015 / Accepted: 13 February 2015 / Published: 27 February 2015
(This article belongs to the Special Issue Advanced Symmetry Modelling and Services in Future IT Environments)

Abstract

This paper describes a case study of applying projection-based augmented reality to dynamic objects in live performing shows such as plays, dances, or musicals. Our study aims to project imagery correctly inside the silhouettes of flexible objects, namely live actors and the surfaces of their costumes, whose silhouettes change shape frequently. To realize this, we implemented a special projection system based on a real-time masking technique: a real-time projection-based augmented reality system for dynamic objects in the performing arts. We installed the sets on a stage for live performance and rehearsed particular scenes of a musical. In live performance, projection-based augmented reality technology enhances technical and theatrical aspects that were not possible with existing video projection techniques. The images projected on the surface of an actor's costume not only express a particular scene of a performance more effectively, but also give the audience an extraordinary visual experience.

1. Introduction

In contemporary performing arts, we can find various attempts at furnishing the audience with a different visual experience through video projection. This approach is one of various ongoing attempts to create a new performance paradigm by converging digital technology with the analog sensibility of the classical performance genres [1]. Ordinary video projection in performing shows has a purpose similar to the matte-painting effects commonly employed in film, where background imagery is composited with the actors' actions. It is primarily used to depict the background of the stage, together with elaborately made sets and props, so that the audience can be immersed in the stage as an artificial place.

Recently, in various performing shows, more advanced projection-based augmented reality (AR) has enhanced visual aspects that were not possible with existing video projection techniques. While the projected area in previous video projection is limited to a rectangular region such as a screen, projection-based AR can project content such as images or videos onto unconstrained, nonplanar forms. This technique offers the audience an experience of visual immersion by representing the stage sets as a virtual environment or virtual world rather than simply showing the background of a stage on rectangular planes [2–4].

We conducted research on applying projection-based AR in a particular scene of an original Korean musical, "Turandot", produced by the Daegu International Musical Festival in Korea. The purpose of this research was to develop an effective performance technique fused with digital technology. In particular, we focused on applying projection-based AR to dynamic objects with flexible shapes, precisely projecting images inside the silhouettes of those objects. The existing techniques stated above have the constraint that only static objects can be used. Among the elements of a performance, there are continuously moving dynamic objects as well as static ones, and the dynamic elements influence the narrative of the performance more than the static ones do. The dynamic objects we consider in this paper are continuously moving actors and their costumes; in the performing arts, the costume is one of the important elements of a performance, describing the actor's character. Therefore, this paper proposes an advanced performing technique, which we call a real-time projection-based augmented reality system for dynamic objects in the performing arts, to provide the audience with a distinctive visual experience. With our system, the texture, patterns, and images on the surface of a moving actor's costume can be changed arbitrarily during the live performance, representing the character of a specific actor more effectively.

The remainder of this paper is organized as follows:

First, we briefly examine previous studies related to projection-based AR and define the costume as a dynamic object with a flexible shape in the performing arts. Second, we explain the technical approach and the configuration of our system for the performing arts, and present its advantages. Third, we illustrate how our system was applied to the performance, along with the results and the limitations of our work. Finally, we discuss the lessons learned from this research and directions for future work.

2. Related Works and Motivation

2.1. Projection-Based Augmented Reality

Projection-based AR is a video projection technique that extends and reinforces visual information by projecting images onto the surfaces of 3D objects or spaces; in a broad sense, it belongs to spatial augmented reality [5,6]. Using projection-based AR, it is easy to implement graphical representations that ordinary lighting techniques cannot express. Unlike general lighting, the technique can project high-definition images or video and visually change an object's apparent shape over time, showing visual imagery dynamically. This combination of imagery and real object allows the audience to perceive a visually extended space. In addition, unlike conventional, highly equipment-dependent AR [7], which restricts the audience's body, this technique has the advantage of improving immersion. These advantages are requirements for a better audience experience in a performance [8,9]. In the media art field, projection-based AR is called projection mapping, although projection mapping covers a narrower field than projection-based AR.

2.2. Projection Mapping on Moving Objects

Projection mapping is a technique that creates an optical illusion by analyzing a three-dimensional object, projecting images onto it, and precisely aligning them. It is widely used in various fields, such as building façades and sculptural objects, as well as in the performing arts. In these cases, images are usually projected onto fixed objects using manual alignment between the objects and the projected images. There has been much recent research on projecting onto dynamic objects with automatic alignment, but these approaches are applicable only to limited object shapes and movements and require heavy computation for image alignment [10–12]. This causes latency until the image is aligned with the 3D object: the greater the latency, the more slowly the projected image follows the movement of the object, generating a visual error that reduces the audience's immersion [13,14].

2.3. Feature of Costumes in Our System

When projection mapping is applied to the performing arts, its purpose is to augment the exterior of objects or the interior environment of the theater, such as sets and objects on the stage. Unfortunately, we were unable to apply the existing methods because of the following characteristics of costumes.

Flexible shape

A "flexible shape" is a shape such as an actor's costume. The existing techniques can be applied only to the surfaces of solid 3D objects, such as polyhedrons, cylinders, or spheres. In general, the shapes of stage objects may be complex or atypical, but they are not flexible. Flexible objects, however, have silhouettes that are difficult to predict, and their shapes change frequently. Moreover, if part of the costume is covered by occluding objects, the existing methods cannot be applied.

Dynamic object

Similarly, applying projection mapping to a dynamic object whose position moves or whose shape changes also poses difficult problems. For example, even when the actor performs simple movements, such as stepping from side to side or lifting an arm, projection mapping is difficult to apply. Various attempts have been made to solve these problems, such as producing a projection-mapping-like effect by merging certain pre-set motions with video content through extensive rehearsal [15], or synchronizing a pre-programmed robot's movement with video content [16,17]. However, it is evident that an actor cannot perform the given motions according to the scenario as exactly as a robot during a live performance.

3. Real-Time Projection Mapping System

The problems discussed in Section 2 are factors that degrade performance quality by distracting the attention of the audience. Thus, we implemented a real-time projection mapping system using a masking technique, so that images are automatically projected and aligned onto the dynamic object, the actor's costume with its flexible shape, in real-time. The process of the system is as follows: (1) tracking the actor's varying silhouette; (2) creating synthetic images by masking the silhouette and compositing it with video; (3) aligning the generated images with the actor's costume on stage through projection.
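To make this loop concrete, the following is a minimal per-frame sketch in Python with OpenCV. It is not the authors' implementation: the device index, content file name, window handling, and the simple threshold stand-in for silhouette extraction are all our assumptions; Section 3.2 details the actual mask-generation steps.

```python
import cv2

def make_mask(ir_frame):
    # Placeholder silhouette extraction; the real pipeline
    # (binarization, morphology, Gaussian blur) is described in Section 3.2.
    gray = cv2.cvtColor(ir_frame, cv2.COLOR_BGR2GRAY)
    return cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY)[1]

ir_cam = cv2.VideoCapture(0)               # (1) IR camera tracking the actor
content = cv2.VideoCapture("content.mp4")  # pre-authored video content

# Full-screen window on the display that drives the projector.
cv2.namedWindow("projector", cv2.WINDOW_NORMAL)
cv2.setWindowProperty("projector", cv2.WND_PROP_FULLSCREEN, cv2.WINDOW_FULLSCREEN)

while True:
    ok_ir, ir_frame = ir_cam.read()
    ok_v, video_frame = content.read()
    if not (ok_ir and ok_v):
        break
    # Assumes the camera and the content share one resolution for simplicity.
    mask = make_mask(ir_frame)                                  # (2) mask silhouette
    out = cv2.bitwise_and(video_frame, video_frame, mask=mask)  #     and composite
    cv2.imshow("projector", out)                                # (3) project
    if cv2.waitKey(1) == 27:                                    # ESC to stop
        break
```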

3.1. Hardware Configuration

Figure 1 illustrates the hardware configuration of the real-time projection mapping system. The main apparatus consists of infrared (IR) camera equipment, including IR lights, for extracting the actor's silhouette; a computing device with a graphics processing unit (GPU) for image processing; and a high-lumen projector for projecting images onto the surface of the actor's costume.

There have been previous attempts to extract the silhouettes of dynamic objects with flexible shapes using a depth camera, such as Microsoft's Kinect [18,19]. Such equipment can easily extract the masking image with the help of depth data. To produce a high-quality result, the center lines of the projector lamp and the IR camera lens have to be set up so that they are parallel and as close together as possible. However, the effective range of a depth camera is too short for use in a large theater. To solve this problem, we used an IR camera with a zoom function so that the actor could be captured in a large space as well. By capturing only the particular area reflecting IR light, the IR camera can precisely separate the actor from the background.

3.2. Projection Mapping Based on Real-Time Masking

The purpose of the real-time masking technique is to project images onto a certain part of the scene, or onto everything except that part, without delay. For instance, when we want to augment an actor or the surface of his costume using projection mapping, we can select only his silhouette area, excluding everything else, and then project with exact alignment. While this masking technique needs less computation than 3D object tracking such as [12], it can show a relatively high-quality result; in other words, it can respond rapidly to the actor's movement and align precisely. We perform the process illustrated in Figure 2 to generate masking images of the silhouette from the IR camera and to merge them with video content made in advance.

First, the actor is captured at a certain position by the IR camera, with IR light projected over a wide area at the back of the stage. Then, the masking image is generated from the silhouette using binarization, morphology, and Gaussian blurring of the captured images [20,21]. Finally, the masking images are combined with the frames of the video content. This process is presented in Figure 2. These steps are performed on the GPU, which enables parallel operation and computational speed-up, significantly reducing the latency of the image processing.
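A sketch of this mask-generation and compositing step is given below, again in Python with OpenCV, on the CPU for clarity. The threshold and kernel sizes are illustrative assumptions (in practice they would be tuned through the calibration functions of Section 3.3), and the authors run the equivalent operations on the GPU.

```python
import cv2
import numpy as np

def make_mask(ir_frame, thresh=60, morph_size=5, blur_size=21):
    """Soft silhouette mask from an IR frame; parameter values are illustrative."""
    gray = cv2.cvtColor(ir_frame, cv2.COLOR_BGR2GRAY)
    # Binarization: with IR light behind the stage, the actor and background
    # differ sharply in brightness, so a threshold separates the silhouette.
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    # Morphology: opening removes speckle noise, closing fills small holes.
    kernel = np.ones((morph_size, morph_size), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    # Gaussian blur: softens the mask edge so the projected boundary
    # looks natural on fabric.
    return cv2.GaussianBlur(mask, (blur_size, blur_size), 0)

def composite(video_frame, mask):
    """Multiply the content frame by the soft mask (an alpha matte)."""
    alpha = mask.astype(np.float32)[..., None] / 255.0
    return (video_frame.astype(np.float32) * alpha).astype(np.uint8)
```

OpenCV also provides GPU-accelerated counterparts of these filters (e.g., via its CUDA module), which is where the speed-up described above would come from.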

3.3. Precise Calibration of Mapping

In order to handle unexpected problems directly on the stage, we built the real-time mapping application with several adjustment functions, which allow a quick and effective setup before the performance. The application is designed to deal with various problems arising from the physical environment, lighting conditions, or projector positions in the theater. In addition, it includes functions that directly control the parameters of the image processing in real-time. These functions fall into three categories (a minimal parameter sketch in code follows the list):

  • Mask adjustment: adjusting the brightness value of the IR camera according to the lighting conditions and modifying the morphology and blurring parameters to extract an accurate, natural silhouette of the actor.

  • Projection transform: adjusting the size and location of the projected image to minimize mapping errors when theater conditions make it hard to place the projector optimally.

  • ROI exclusion: excluding unnecessary areas from the projection when projection mapping is applied to the actor in a live performance.
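Below is a minimal sketch of how these three groups of adjustments might be represented and applied; the parameter names, defaults, and dataclass structure are our illustrative assumptions, not the authors' application code.

```python
import cv2
import numpy as np
from dataclasses import dataclass, field

@dataclass
class CalibrationParams:
    """Operator-tunable parameters; names and defaults are illustrative."""
    ir_gain: float = 1.0      # mask adjustment: IR brightness scaling
    thresh: int = 60          # mask adjustment: binarization threshold
    morph_size: int = 5       # mask adjustment: morphology kernel size
    blur_size: int = 21       # mask adjustment: edge softening
    # Projection transform: a homography fitting the projected image to the
    # stage when the projector cannot be placed at the optimal position.
    homography: np.ndarray = field(
        default_factory=lambda: np.eye(3, dtype=np.float32))
    # ROI exclusion: (x, y, w, h) rectangles that must never be lit.
    excluded_rois: list = field(default_factory=list)

def apply_calibration(mask, params, out_size):
    """Warp the mask into projector space and blank the excluded regions."""
    mask = cv2.warpPerspective(mask, params.homography, out_size)
    for (x, y, w, h) in params.excluded_rois:
        mask[y:y + h, x:x + w] = 0
    return mask
```

In such a design, the operator adjusts these values live while watching the projected result, as described above.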

4. Implementation and Result

We selected a theater and installed the stage set, stage objects, and the real-time projection mapping system, as illustrated in Figure 3.

Then, particular musical scenes using our system were rehearsed. In the scene shown, two characters appear, and one character's dress was augmented by our system to look like a ghost's. Although the actress kept moving, the images were projected onto and combined with her dress accurately in real-time, as shown in Figures 4 and 5. The actress had no difficulty performing with our system, since she did not need to act differently and nothing had to be changed from the original staging.

Since the theater is large (about 27 m wide, with 1200 seats), the projection on the actress's costume looks different from each audience member's viewpoint. If the ghost character stands at the center of the stage and projection mapping is performed with a single projector, audience members sitting on the far left or right can see only half of her silhouette. To solve this problem, we expanded the projection mapping area by installing two projectors on the left and right sides, with their projection directions overlapping on the actress, as illustrated in Figure 3. This method gave the audience a better projected image than a single projector, wherever they sat in the theater, as illustrated in Figure 6.
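A minimal sketch of driving such a two-projector setup is shown below: the same composited frame is rendered through a separate projection transform per projector, one window per output. The homographies and window names are our assumptions (the values here are dummies), and the seam in the overlap is left untreated, as discussed in Section 5.

```python
import cv2
import numpy as np

# Hypothetical per-projector transforms, e.g. estimated once during setup
# with cv2.findHomography from marked stage points (identity dummies here).
H_LEFT = np.eye(3, dtype=np.float32)
H_RIGHT = np.eye(3, dtype=np.float32)
OUT_SIZE = (1920, 1080)  # assumed projector resolution

def render_two_projectors(composited_frame):
    # Each projector receives the same content, warped into its own
    # projection space so the two coverage areas overlap on the actress.
    left = cv2.warpPerspective(composited_frame, H_LEFT, OUT_SIZE)
    right = cv2.warpPerspective(composited_frame, H_RIGHT, OUT_SIZE)
    cv2.imshow("projector_left", left)
    cv2.imshow("projector_right", right)
```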

To find out how quickly our system responds to the actor's movement, we measured the time required to align the processed images with the actor's silhouette, as illustrated in Figure 7. The image series shows the test motion of lifting an arm in a controlled experimental environment for precise measurement.

Initially, while the actor stood at one position with his arms spread horizontally, we precisely calibrated the imagery on the actor's costume. This calibration performs image transformations by adjusting the size and location of the projection, and accurate silhouettes are then generated through the morphology and blurring operations. Next, a particular scene was rehearsed according to the planned direction while being recorded with a camera in front of the stage. Finally, the recorded video was analyzed: we selected the portion with the largest movement by the actor and, when the actor stopped moving, counted the frames until the images were aligned with the actor's costume.
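As an illustration of this measurement, the sketch below analyzes a recorded clip offline: starting from a manually identified frame where the actor stops, it counts frames until consecutive frames in the costume region stop changing, i.e. until the projection has settled. The function name, ROI, threshold, and frame rate are our assumptions, not the authors' measurement code.

```python
import cv2
import numpy as np

def alignment_latency(video_path, stop_frame, roi, settle_thresh=2.0, fps=30.0):
    """Frames (and seconds) from the actor stopping until the image settles."""
    x, y, w, h = roi
    cap = cv2.VideoCapture(video_path)
    cap.set(cv2.CAP_PROP_POS_FRAMES, stop_frame)
    ok, prev = cap.read()
    if not ok:
        raise IOError("cannot read the recorded video")
    prev = cv2.cvtColor(prev[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    frames = 0
    while True:
        ok, cur = cap.read()
        if not ok:
            break
        cur = cv2.cvtColor(cur[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        # Once the projected image stops catching up, consecutive frames
        # in the costume region become nearly identical.
        if np.mean(cv2.absdiff(cur, prev)) < settle_thresh:
            break
        prev, frames = cur, frames + 1
    return frames, frames / fps  # e.g., 5 frames -> ~0.167 s at 30 fps
```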

With this method, the results showed that the images are aligned exactly after five frames, which at 30 frames per second corresponds to approximately 0.167 s of latency. Averaged over all frames in the scene, our system shows even lower latency. Our system was tested on a personal computer (PC) with an Intel Core i7-3770 3.50 GHz CPU, an NVIDIA GTX 680 GPU, and 16 GB RAM (customized PC); a Point Grey FIREFLY® MV IR camera (Point Grey Research, Richmond, Canada); and an EPSON EB-G5950 high-lumen projector (Seiko Epson Corporation, Suwa, Japan).

5. Limitation

One limitation of this result is that our system was tuned specifically to express the ghost in this scene of the musical. If the actor performs extreme movements, such as running, jumping, or swinging the arms, the larger latency may expose visual errors to the audience. In addition, when the actor moves out of the projection area, images cannot be projected onto her silhouette.

The other limitation appears when multiple projectors are used. As described above, two projectors were used to cover the wide audience view during the performance. In this case, handling the seam in the overlapping region of the nonplanar surface is a problem. For now, we used video content that does not produce severe artifacts in the seam region. In future research, the overlapping regions should be blended seamlessly into one complete image to provide a better visual experience for the audience.

Nevertheless, experts in the musical field responded positively, noting that our research can express spectacular scenes despite the limitations explained above.

6. Conclusions and Future Work

In this paper, we presented a projection-based augmented reality technique for dynamic objects in the performing arts, which can be used effectively by merging performance costumes with digital technology. We equipped a theater with sets for the performance and showed the results of our system. Our research began as a way to provide an interesting visual experience to the audience. Our system augments a dynamic object with a flexible shape, the surface of an actor's costume, and can change the images in real-time according to the narrative of a play. A high-quality mapping result is obtained from the application that corrects errors generated in the process of projection mapping on stage.

In the future, we plan to improve our system so that it performs robustly across multiple genres and various scenes. To reduce the alignment latency, we are going to optimize the GPU-based algorithm, and we are currently researching a technique that generates the composed image in advance at estimated positions by anticipating the actor's future movement from previous frames. Furthermore, in order to show the audience seamless mapping results in a wide theater environment, we will address the handling of the regions where multiple projectors overlap.

Acknowledgments

This research was supported by the Next-Generation Information Computing Development Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (No. 2012M3C4A7032182).

Author Contributions

Jaewoon Lee and Dongho Kim designed the main idea of this research; Jaewoon Lee, Yeonjin Kim, and Myeong-Hyeon Heo developed the real-time projection-based augmented reality system and performed the experiments applying our system on the stage; Jaewoon Lee, Dongho Kim, and Byeong-Seok Shin designed the experimental environments for the error measurement of the projection; Yeonjin Kim and Myeong-Hyeon Heo performed the measurement; Jaewoon Lee, Yeonjin Kim and Dongho Kim wrote this paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Dixon, S. Digital Performance: A History of New Media in Theater, Dance, Performance Art, and Installation; MIT Press: Cambridge, MA, USA, 2007; pp. 37–45.
  2. Coniglio, M. Materials vs. Content in Digitally Mediated Performance. In Performance and Technology: Practices of Virtual Embodiment and Interactivity; Broadhurst, S., Machon, J., Eds.; Palgrave Macmillan: New York, NY, USA, 2006; pp. 78–84.
  3. Broadhurst, S. Digital Practices: Aesthetic and Neuroesthetic Approaches to Performance and Technology; Palgrave Macmillan: Basingstoke, UK, 2007.
  4. Marner, M.R.; Haren, S.; Gardiner, M.; Thomas, B.H. Exploring interactivity and augmented reality in theater: A case study of Half Real; pp. 81–86.
  5. Raskar, R.; Welch, G.; Low, K.L.; Bandyopadhyay, D. Shader lamps: Animating real objects with image-based illumination. In Proceedings of the 12th Eurographics Workshop on Rendering Techniques, Vienna, Austria, 25–27 June 2001; Gortler, S.J., Myszkowski, K., Eds.; Springer: London, UK, 2001; pp. 89–102.
  6. Bimber, O.; Raskar, R. Spatial Augmented Reality: Merging Real and Virtual Worlds; A. K. Peters, Ltd.: Natick, MA, USA, 2005.
  7. Feese, S.; Burscher, M.J.; Jonas, K.; Tröster, G. Sensing spatial and temporal coordination in teams using the smartphone. Hum. Centric Comput. Inf. Sci. 2014, 4, 1–18.
  8. Mine, M.R.; van Baar, J.; Grundhöfer, A.; Rose, D.; Bei, Y. Projection-based augmented reality in Disney theme parks. Computer 2012, 45, 32–40.
  9. Lee, J. A Research on the Extended Virtuality of 3D Projection Mapping. Master's Thesis, Soongsil University, Seoul, Korea, 2012.
  10. Jones, B.; Sodhi, R.; Murdock, M.; Mehra, R.; Benko, H.; Wilson, A.; Ofek, E.; MacIntyre, B.; Raghuvanshi, N.; Shapira, L. RoomAlive: Magical experiences enabled by scalable, adaptive projector-camera units. In Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology, Honolulu, HI, USA, 5–8 October 2014; ACM: New York, NY, USA, 2014; pp. 637–644.
  11. Benko, H.; Wilson, A.D.; Zannier, F. Dyadic projected spatial augmented reality. In Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology, Honolulu, HI, USA, 5–8 October 2014; ACM: New York, NY, USA, 2014; pp. 645–655.
  12. Hide'n'Seek: 3D Auto Calibration Tool for Projectors (technology demonstration). Available online: http://vimeo.com/53126679 (accessed on 12 January 2015).
  13. Nakamura, T.; Watanabe, A.; Hashimoto, N. Dynamic projection mapping. In Proceedings of SIGGRAPH 2012, Los Angeles, CA, USA, 5–9 August 2012.
  14. Sakamaki, S.; Hashimoto, N. Time-delay compensation for dynamic projection mapping. In Proceedings of the SIGGRAPH Asia 2013 Posters, Hong Kong, China, 19–21 November 2013; ACM: New York, NY, USA, 2013.
  15. PUMA L.I.F.T. Available online: http://www.youtube.com/watch?v=TM8DA830xng (accessed on 12 January 2015).
  16. Byrne, K.; Proto, J.; Kruysman, B.; Bitterman, M. The power of engineering, the invention of artists. In Robotic Fabrication in Architecture, Art and Design 2014; Springer: New York, NY, USA, 2014; pp. 399–405.
  17. Bot & Dolly. Available online: http://vimeo.com/75260457 (accessed on 12 January 2015).
  18. Kim, S.; Kim, J.; Jo, H.; Choi, B.; Im, J.; Sung, J. Dynamic projection mapping system for an augmented character in performing art. In Proceedings of the SIGGRAPH Asia 2012 Posters, Singapore, 28 November–1 December 2012; ACM: New York, NY, USA, 2012.
  19. Motta, T.; Loaiza, M.; Raposo, A.; Soares, L. Kinect projection mapping. SBC 2014, 5, 5.
  20. Bhajantri, N.; Nagabhushan, P. Discriminatory projection of camouflaged texture through line masks. J. Inf. Process. Syst. 2013, 9, 660–677.
  21. Yang, X.; Peng, G.; Cai, Z.; Zeng, K. Occluded and low resolution face detection with hierarchical deformable model. J. Converg. 2013, 4, 11–14. Available online: http://www.ftrai.org/joc/vol4no2/v04n02_A03.pdf (accessed on 12 January 2015).
Figure 1. The hardware configuration of the real-time projection mapping system.

Figure 2. Steps of our system process.

Figure 3. Installation of stage sets, stage objects, and the real-time projection mapping system.

Figure 4. Sample scenes applying our real-time projection-based AR system.

Figure 5. Image sequences of musical scenes applying our system in real-time.

Figure 6. Results of applying our system at various viewpoints: in order, left, center, and right.

Figure 7. Visual error in our system: (a) 0 frames; (b) +1 frame; (c) +2 frames; (d) +3 frames; (e) +4 frames; (f) +5 frames after the motion was stopped.
