Article

Communication with Self-Growing Character to Develop Physically Growing Robot Toy Agent

1
Graduate School of Convergence Science and Technology/Dept. of Earth Science Education, School of Education, Seoul National University, Seoul 08826, Korea
2
Department of Medicine, College of Medicine, Kyunghee University, Seoul 02447, Korea
3
Department of Medicine, College of Medicine, Seoul National University, Seoul 03080, Korea
4
Department of Urban and Regional Planning, Ryerson University, Toronto, ON M5B2K3, Canada
5
Department of Mechanical Engineering, Seoul National University, Seoul 08826, Korea
6
Graduate School of Engineering, Seoul National University, Seoul 08826, Korea
7
Graduate School of Engineering Practice, Seoul National University, Seoul 08826, Korea
8
Department of Craft and Design, Intermedia Lab, Seoul National University, Seoul 08826, Korea
*
Authors to whom correspondence should be addressed.
Appl. Sci. 2020, 10(3), 923; https://doi.org/10.3390/app10030923
Submission received: 18 December 2019 / Revised: 21 January 2020 / Accepted: 21 January 2020 / Published: 31 January 2020
(This article belongs to the Special Issue Swarm Robotics 2020)


Featured Application

A synthetic character implemented in a physically self-growing robot can be utilized as a teacher assistant for childhood education through its sympathetic communication.

Abstract

Robots for communication are being developed extensively, with an emphasis on sympathy. This study deals with the growth of a robot character and the control of its operation. A child has time to be alone with his/her robot friend and can exchange emotional expressions with other people through the robot. Step by step, the robot character grows as the child grows. Through design studies, qualitative processes such as customer experience audits, eye tracking, mental model diagrams, and semantic differentials were executed to obtain the results. A participatory behavior research approach through user journey mapping led from user insights to evidence-based design. This research considers how a synthetic character can be applied to the physical growth of a robot toy through the product design process. In developing the robot toy “Buddy”, two variations of the robot were made to achieve recognizable growth: (1) one-dimensional height scaling and (2) facial expression, including the distance between the two eyes on the screen. Observations captured children’s reactions when “Buddy” was released among them. The robot, with its designed functions, was recognized by the children as an independent synthetic character. Robots for training may require further experimentation.

1. Introduction

In recent years, a large amount of robot research has been devoted to replacing human work with robots. On the other hand, few studies have investigated the simulation of communication between humans and robots. Children are more likely to engage in interaction with robots because they perceive robots more positively and as more life-like [1,2]. Furthermore, robot design based on user research of personal service robots has become an important research issue [2].
Examples of “learning with Character” or “learning with RT” include a study of “PaPeRo” applied to edutainment [3] and “AIBO” applied as a dog friend that tells children storybooks like a story-telling robot [4]. So far, research on educational robots has focused on children’s reactions and learning effects when robots are applied to educational environments [5].
However, to utilize robots in the education field, simply introducing a robot is not enough. It is necessary to apply various existing pedagogical theories to the interaction design of the educational robot and verify them with children. Research shows that educational robots increase children’s focus, interest, and learning achievement in comparison with other traditional media [6]. Thus, in the teacher assistant role, using the relationship as a medium for inducing children’s learning motivation is more important than delivering educational content.
As shown in Figure 1a, children often get tired of the toys they played with when they were younger. As children grow up, they want new toys suited to their age. Unlike nature, which changes with the passage of time, dolls and existing robot toys retain their original appearance. Even with toys or serious games that have specific functions, children find other things to play with once the content has been consumed for a certain amount of time.
The purpose of this research is to motivate children to learn with a synthetic character in an entertainment robot, as shown in Figure 1b. In this paper, we design a teacher-assistant robot interaction system that maximizes empathy, for example by using advanced optimization techniques from [7,8] embedded within a suitable artificial neural network, to operate a robot that can induce children’s learning motivation and reduce the novelty effect. Based on the “Buddy” robot developed by the same researchers, research on the growing robot character and empathic interviews is used to verify that the robot can serve as an object of empathy acting in a teacher assistant [9] role.

2. Synthetic Character

Every developed robot has its own character. A synthetic character is a creature whose motivation is artificial and that can interact with a human in real time [5,9].

2.1. Prior Research on Synthetic Characters

2.1.1. Synthetic Character Resembling an Animal (Around Year 2000)

According to the Synthetic Characters group at the MIT Media Lab, the synthetic character [6] approach emphasizes:
  • “Every day common sense”
  • “The ability to learn”
  • “The sense of empathy”
Integrated approaches that implement adaptive and expressive virtual characters appeared in early projects on synthetic animal characters such as dogs. These efforts created characters that seem to have minds of their own in the context of their behavior; for example, multifaceted approaches that use biological systems as clues and design principles have been employed [10,11].

2.1.2. Synthetic Character in the Real-World Application

According to the synthetic character research of Rodrigues et al. (2009) [12], there are two common related concepts:
  • Empathy
  • Sympathy
Motivation for the synthetic character was introduced from the education industries. Plenty of user research is ongoing, and many companies produce toys, dolls, computer games, and mobile applications that use synthetic characters to motivate users toward work, study, and training objectives [13]. For children especially, eating and sleeping habits have been modified with the help of synthetic characters in various media such as TV, books, and games.
On the other hand, a synthetic character that looks like an animal, as shown in Figure 2, can decide its behavior by itself based on its own internal and external information within interactive learning approaches [14,15]. Unsupervised facial expression is available in front of a child, driven by the child’s input. In this project, we decided on a synthetic character with physical growth and facial expression in its design.

2.2. Design Process of Robot Character in Application with Robot “Buddy”

“Buddy” began with the need for a dynamic playground enabling emotional interchange, so that the robot can play with the child. This background starts with a child’s lack of peer experience, which is critical for development. The project was introduced in the context that half of the babies in South Korea face a lack of peer experience because their parents are too exhausted from work or childcare [16,17]. For lonely children and exhausted parents, the interactive robot toy “Buddy” was designed to become a friend for children. Requirements and needs for the robot toy were extracted from a series of design processes.

2.2.1. Diary Studies for Collecting User Insight

Diary studies are useful tools in exploratory research, preparing the designer for further research by contributing to an understanding of participant user groups [13]. Unlike traditional diary studies with paper and pen, digital photos uploaded to provided sites such as Facebook and Instagram were the main source of material in this study:
  • Facebook post: Message with the photo
  • Instagram post: Digital photo with description
In the context of the burden of care, the photos uploaded to the parents’ sites represent special memories that parents and children cannot frequently revisit. The specification of the robot toy therefore includes an additional memory function to remember specific events. As a result, the appearance of the robot toy is human-like, and it communicates with the child as a friend and teacher assistant.

2.2.2. The Goal of Design: Position behind the Uncanny Valley

According to the following qualitative studies, as shown in Figure 3, the target robot toy was positioned behind the uncanny valley [17,18,19]. Among humanoid robots, for example, BeatBo works as a dancing robot toy for babies, featuring dancing with movement, learning with games, and customized sing-along encouraging gross motor skills [20]. The requirement for the robot toy might therefore be a humanoid robot that plays like a stuffed animal in front of babies.

2.2.3. Total Design Process: From Design Research to Development and Implementation

The design process from planning to implementation is shown in Figure 4.

3. Results

In this research, we applied a growth system to the character for human-robot interaction and observed children’s recognition of and reaction to the robot. This section deals with the growing robot device and the method for controlling its operation.

3.1. Design Result: Function of Communication

3.1.1. Data Network of the Robot Toy

From the perspective of the operation module, the control terminal is an information processing apparatus that can be connected to a server (30) at a remote location via a network, or directly to another terminal or other information processing apparatuses. The control terminal can be provided with a client program for controlling the growth robot apparatus and can control the settings of the growth robot apparatus through the installed client program. The growth robot apparatus can communicate with the control terminal via the network. The control terminal (20) can be connected to the remote server (30) through the network (N) or to another terminal and the server (30). The robot system is networked with the control terminal, which controls the growth robot and the outside server, as shown in Figure 5.
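The three-node topology just described (growth robot, control terminal (20), server (30) over network (N)) can be sketched as a minimal message-routing model. The class names, the growth-level catalog, and the setting payloads below are illustrative assumptions; the paper does not specify a protocol or API.

```python
# Minimal sketch of the Figure 5 network: the control terminal mediates
# between the remote server (settings per growth level) and the growth
# robot. All names and payloads here are illustrative assumptions.

class Server:
    """Remote server (30): stores software settings per growth level."""
    def __init__(self):
        self.settings = {"infant": {"quiz": "colors"},
                         "elementary": {"quiz": "fruit"}}

    def fetch_setting(self, growth_level):
        return self.settings.get(growth_level)


class GrowthRobot:
    """Growth robot (10): accepts settings pushed by the control terminal."""
    def __init__(self):
        self.installed = None

    def install(self, setting):
        self.installed = setting


class ControlTerminal:
    """Control terminal (20): client program linking server and robot."""
    def __init__(self, server, robot):
        self.server, self.robot = server, robot

    def update_robot(self, growth_level):
        setting = self.server.fetch_setting(growth_level)
        if setting is not None:
            self.robot.install(setting)
        return setting


robot = GrowthRobot()
terminal = ControlTerminal(Server(), robot)
terminal.update_robot("elementary")
print(robot.installed)  # the setting fetched for the elementary level
```

In a real deployment the three classes would live on separate devices and talk over sockets or HTTP; the mediation logic would be the same.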

3.1.2. Communication between Robot and Child

Hereinafter, each component provided in the growth robot apparatus will be described in more detail. As shown in Figure 6, the camera provided in the growth robot apparatus senses objects located in the vicinity and photographs a person to identify the person’s facial expression. Multiple cameras can be included as needed. The camera can also measure ambient light conditions (day, night, bright, dark, etc.).
For example, the control terminal can control the software of the growth robot so that it changes according to the customized character and physical growth level of the person interacting with the growth robot apparatus; the software can be installed via downloading or updated autonomously. For instance, the growth robot apparatus can photograph a person with the camera equipped therein and store the photographed pictures or moving images. The growth robot apparatus can then provide the stored photographs or moving images to the control terminal or output them through the external screen, and a child user can interact by pointing out a product on the screen.
For another example, the control terminal can receive the height of a person from a camera image or a picture of a standing person. It can then control the hardware of the growth robot apparatus, which needs to change according to the input height. The growth robot apparatus may be divided into a head part, a body part, and a leg part.
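One plausible way to recover a standing person's height from a camera picture is the pinhole projection relation H = h_px · d / f, where h_px is the person's height in pixels, d the distance to the person, and f the focal length in pixels. The paper does not state how height is extracted from the image, so the function and all numbers below are assumptions for illustration.

```python
# Sketch: estimating a person's height from an image with the pinhole
# camera model H = h_px * d / f. The distance d could come from the
# robot's ultrasonic sensors. All parameter values are illustrative.

def estimate_height_cm(pixel_height, distance_cm, focal_length_px):
    """Return the real-world height in cm under the pinhole model."""
    return pixel_height * distance_cm / focal_length_px

# A child imaged 475 px tall at 200 cm with a 1000 px focal length:
height = estimate_height_cm(475, 200.0, 1000.0)
print(height)  # 95.0 cm, matching the 3-year-old entry in Table 1
```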
As shown in Figure 7, the head part of the growth robot apparatus is provided with a screen for displaying the facial expression changes of the growth robot apparatus. The liquid crystal display attached to the front of the robot displays its facial expression, as well as output images for the user from the robot’s processing signal, serving as the face screen. For example, the LCD can be controlled by a single LCD control signal, and a variety of reaction images can be generated internally by the robot. The control process uses an image processor that takes images as input through the camera and displays facial expressions on the screen as output. With an additional assembly of ultrasonic sensors for sensing multiple obstacles, the robot can detect a person or a product located in its vicinity. It can also determine the distance to a person from the differences among the distances reported by all available ultrasonic sensors. When a child asks a question about something, the robot apparatus recognizes the person and the distance to the person while preparing the answer.
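The multi-sensor ranging described above can be sketched as follows. The sensor count, bearings, and maximum range are assumptions, and a simplified nearest-echo rule stands in for the paper's difference-based computation.

```python
# Sketch of multi-sensor ranging: several ultrasonic sensors face
# different directions; the nearest valid echo gives the distance to
# the person, and the winning sensor's bearing gives the direction.
# Sensor layout and the 400 cm cutoff are illustrative assumptions.

MAX_RANGE_CM = 400  # readings at or beyond this count as "no echo"

def locate_person(readings_cm, bearings_deg):
    """Return (distance_cm, bearing_deg) of the nearest detected object,
    or None when no sensor sees anything in range."""
    valid = [(d, b) for d, b in zip(readings_cm, bearings_deg)
             if d < MAX_RANGE_CM]
    if not valid:
        return None
    return min(valid)  # tuple comparison: smallest distance wins

# Five sensors across the robot's front, person straight ahead:
readings = [400, 132, 120, 135, 400]
bearings = [-60, -30, 0, 30, 60]
print(locate_person(readings, bearings))  # (120, 0)
```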

3.2. Design Result: Function of Self-Growing

3.2.1. Growth Robot Implementation

The growth robot apparatus can perform hardware growth, in which the height of the growth robot apparatus is changed according to the change in the interacting person’s height. That is, the growth robot apparatus can change the length of its body portion according to the height of the person obtained from the control terminal. For example, the growth robot apparatus can acquire the person’s current height from the control terminal, and the lifting device provided in the body portion adjusts the robot to a height similar to the acquired height. The length of the body part can be changed by extending or retracting the tube that serves as a lifting device. In its appearance, the growth robot’s skin is covered with a stretchable material (Dragon Skin 10) that accommodates the change in body height.
The growth robot apparatus can grow in response to a change in the physical growth level and height of the interacting person, performing both software growth and hardware growth. For hardware growth, a lifting device is provided in the body portion of the growth robot apparatus. The lifting device may have a structure capable of varying its length, and the corrugated tube system appears like larva skin. The growth mechanism is implemented with a wrinkled tube system including a scissor jack inside, as shown in Figure 8.
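The scissor-jack geometry inside the wrinkled tube can be sketched with the standard relation h = n · L · sin(θ) for n crossed-arm stages of arm length L raised to angle θ. The stage count and arm length below are illustrative assumptions; the paper gives only the mechanism type.

```python
import math

# Sketch of scissor-jack lifting geometry: n crossed-arm stages of arm
# length L at angle theta stack to a height of n * L * sin(theta).
# Stage count and arm length are illustrative assumptions.

def jack_height_cm(stages, arm_cm, theta_deg):
    """Height contributed by the scissor stack at the given arm angle."""
    return stages * arm_cm * math.sin(math.radians(theta_deg))

def angle_for_extension(stages, arm_cm, target_cm):
    """Arm angle (degrees) needed for a target stack height (inverse)."""
    return math.degrees(math.asin(target_cm / (stages * arm_cm)))

# Raising a 3-stage jack with 20 cm arms to a 30 cm stack height:
theta = angle_for_extension(3, 20.0, 30.0)
print(round(theta, 1))  # 30.0 degrees, since sin(30 deg) = 0.5
```

The inverse function is what a controller would use: given the desired Δh from the height comparison, it computes the actuator angle to command.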

3.2.2. Growth Robot Operation

Since the body portion is coupled with the lower end of the head portion, the length of the body portion changes in the vertical direction so that the height of the growth robot apparatus varies. To this end, the body portion can be separated into an upper-end portion and a lower-end portion, with a variable-length lifting device provided between them. As the length of the lifting device changes, the length between the upper-end portion and the lower-end portion of the body portion changes, and so does the height of the growth robot apparatus.
Also, the leg portion may be coupled with the lower end of the body portion, and at least one wheel may be provided at the lower end of the leg portion so that the robot can move using the wheels. On the front part, a microphone senses a person’s voice and a speaker outputs sound. A method of controlling the growth of the robot toy through the control terminal display is described in Figure 7. The robot in Figure 8 can be settled according to the user as in Figure 9.
As shown in Figure 9a, the control terminal can receive from the user the intelligent growth level of the person interacting with the growth robot apparatus (S501). Then, the control terminal can search for a software setting corresponding to the input growth level and download the found software setting (S502). The control terminal can collect information about the software installed in the growth robot apparatus and determine whether it matches the growth level of the input user. When the growth level of the installed software differs from that of the person, the control terminal can search the software for the growth robot apparatus and request the settlement of software corresponding to the user’s growth level. Thereafter, the control terminal can control the growth robot apparatus so that the settled software is installed in it (S503).
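The S501–S503 flow above can be sketched as a small settlement function. The growth-level catalog and pack names are hypothetical; only the three-step flow comes from the paper.

```python
# Sketch of the software-growth flow in Figure 9a: receive the growth
# level (S501), compare the matching setting with what is installed
# (S502), and settle the new software only on a mismatch (S503).
# The catalog and pack names are illustrative assumptions.

CATALOG = {1: "infant-pack", 2: "elementary-pack", 3: "teen-pack"}

def settle_software(input_level, installed_pack):
    """Return the software pack that should end up installed."""
    wanted = CATALOG[input_level]     # S501/S502: level -> matching pack
    if installed_pack == wanted:      # already matches: nothing to do
        return installed_pack
    return wanted                     # S503: install the searched pack

print(settle_software(2, "infant-pack"))  # elementary-pack
```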
On the other hand, Figure 9b is a flowchart showing a method by which the control terminal controls the height of the growth robot apparatus. The control terminal can recognize the height of a person from a prepared picture of the user (S601). Then, the control terminal can operate the lifting apparatus provided in the growth robot apparatus to correspond to the input height (S602). The growth robot can then introduce itself at the customized, settled height in front of the user (S603).
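The S601–S603 flow can likewise be sketched as clamping the recognized height to the lifting device's travel range. The 65–122 cm limits echo the heights in Table 1 but are an assumption here, not a stated specification.

```python
# Sketch of the height-control flow in Figure 9b: recognize the user's
# height (S601), drive the lifting device toward it within the
# mechanism's travel limits (S602), and settle there (S603).
# The travel limits are an illustrative assumption.

MIN_CM, MAX_CM = 65.0, 122.0  # assumed travel range of the lifting device

def settle_height(person_height_cm):
    """Return the height the robot settles at for the given user height."""
    return max(MIN_CM, min(MAX_CM, person_height_cm))

print(settle_height(95.0))   # matches a 95 cm user exactly
print(settle_height(150.0))  # taller user: robot tops out at 122.0 cm
```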
The control method described with respect to Figure 10a,b can also be implemented in the form of a recording medium comprising instructions executable by program modules. One-dimensional height scaling logs and output files would be stored in computer-readable media, which can be any available media that can be accessed by a computer. Such media include both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for the storage of information, such as computer-readable instructions, data structures, program modules, or other data. Communication media typically include computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and include any information delivery media.

4. Use Case Discussion

The growth robot apparatus can ask a person a question and detect the person’s behavior in response. That is, the growth robot apparatus can ask a person the name of an object, recognize the person’s answer to the question, and recognize the person’s expression or movement through the camera [21]. Figure 11 describes a situation in which a growth robot apparatus according to the embodiment of the present invention [22] interacts with a child user, drawing on the elements that previously appeared in Figure 6, Figure 7, Figure 8, Figure 9 and Figure 10.

5. Conclusions

We introduced the concept of a synthetic character applied to a robot toy. The character has the name “Buddy” and the form of a humanoid robot with a physical growing function. The robot toy can be utilized as a teacher assistant. The robot was developed via design processes involving qualitative approaches, which resulted in a humanoid robot positioned behind the uncanny valley for the assistant function. To act as a teacher assistant, the primary communication function of the robot, via its camera, is to recognize what the user points out. Beyond this functionality, the robot toy also grows physically in sympathy with the child.
Further research is needed to compare responses to teacher assistant tasks by increasing the number of elementary school student subjects in various settings. It is also necessary to find an appropriate stimulus method for the teacher assistant task and to perform further analysis according to students’ grade, personality, age, and so forth.

6. Patents

This research is registered in patent application: Eune, J., Lee, M., Jeong, H., Kim, J., Lee, P., Lee, C., Pham, A.Y., Soe, T. (2016). Growing robot device and method for controlling operation thereof. Korean patent No. 10-2016-0183090. Daejeon: Korean intellectual property office [22].

Author Contributions

Conceptualization, H.J. and A.P.; formal analysis, H.J.; investigation, M.L., A.P. and C.L.; methodology, A.P.; project administration, M.L. and J.E.; resources, M.L., J.K., C.L., T.S. and P.L.; software, J.K., C.L., T.S. and P.L.; supervision, S.-W.K. and J.E.; visualization, H.J.; writing—original draft, M.L. and J.K.; writing—review & editing, S.-W.K. and J.E. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

We would like to acknowledge the significant contributions made by the entire Buddy team in the ICD/AI Robot class, without which this paper would not have been possible. Additional thanks to the idea factory at Seoul National University for the workshop space and materials. This study was mainly supported by a Brain Fusion Research Grant (600-20170008) from the GSCST (Graduate School of Convergence Science and Technology), Seoul National University. This research was partially supported by the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) support program (IITP-2019-2014-1-00743) supervised by the IITP (Institute for Information & Communications Technology Planning & Evaluation), and by the creative challenge program of Seoul National University.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Woods, S.; Dautenhahn, K.; Schulz, J. The Design Space of Robots: Investigating Children’s Views. In Proceedings of the 2004 IEEE International Workshop on Robot and Human Interactive Communication, Okayama, Japan, 20–22 September 2004; pp. 47–52. [Google Scholar]
  2. Woods, S.; Dautenhahn, K.; Schulz, J. Child and adults perspectives on robot appearance. In Proceedings of the Symposium on Robot Companions: Hard Problems and Open Challenges in Robot-Human Interaction, Hatfield, UK, 12–15 April 2005; pp. 126–132. [Google Scholar]
  3. Osada, J.; Ohnaka, S.; Sato, M. The scenario and design process of childcare robot, PaPeRo. In Proceedings of the 2006 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology, Hollywood, CA, USA, 14–16 June 2006. No. 80. [Google Scholar]
  4. Decuir, J.D.; Kozuki, T.; Matsuda, V.; Piazza, J. A friendly face in robotics: Sony’s AIBO entertainment robot as an educational tool. Comput. Entertain. 2004, 2, 14. [Google Scholar] [CrossRef]
  5. Kwak, S.S.; Lee, D.; Lee, M.; Han, J.; Kim, M. The Interaction Design of Teaching Assistant Robots based on Reinforcement Theory: With an Emphasis on the Measurement of Task Performance and Reaction rate. J. Korea Robot. Soc. 2006, 1, 142–150. [Google Scholar]
  6. Group Overview ‹ Synthetic Characters—MIT Media Lab. Available online: https://www.media.mit.edu/groups/synthetic-characters/overview/ (accessed on 15 May 2019).
  7. Al-Baali, M.; Caliciotti, A.; Fasano, G.; Roma, M. Quasi-Newton Based Preconditioning and Damped Quasi-Newton Schemes for Nonlinear Conjugate Gradient Methods. NAO 2017. In Numerical Analysis and Optimization; Al-Baali, M., Grandinetti, L., Purnama, A., Eds.; Springer: Muscat, Oman, 2018; pp. 1–21. ISBN 978-3-319-90025-4. [Google Scholar]
  8. Caliciotti, A.; Fasano, G.; Roma, M. Preconditioning strategies for nonlinear conjugate gradient methods, based on quasi-Newton updates. In Proceedings of the NUMTA-2016, Pizzo Calabro, Italy, 19–25 June 2016; p. 090007. [Google Scholar]
  9. Kwak, S.S.; Lee, D.; Lee, M.; Han, J.; Kim, M. The Interaction Design of Teaching Assistant Robots based on Reinforcement Theory-With an Emphasis on the Measurement of the Subjects’ Impressions and Preferences. J. Korean Soc. Des. Sci. 2007, 20, 97–106. [Google Scholar]
  10. Jagger, S. Affective learning and the classroom debate. Innov. Educ. Teach. Int. 2013, 50, 38–50. [Google Scholar] [CrossRef]
  11. Yoon, S.Y.; Blumberg, B.; Schneider, G.E. Motivation driven learning for interactive synthetic characters. In Proceedings of the 4th International Conference on Autonomous Agents, Barcelona, Spain, 3–7 June 2000; pp. 365–372. [Google Scholar]
  12. Rodrigues, S.H.; Mascarenhas, S.F.; Dias, J.; Paiva, A. “I can feel it too!”: Emergent empathic reactions between synthetic characters. In Proceedings of the 2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, Amsterdam, The Netherlands, 10–12 September 2009; pp. 1–7. [Google Scholar]
  13. Hanington, B.; Martin, B. Universal Methods of Design: 100 Ways to Research Complex Problems, Develop Innovative Ideas, and Design Effective Solutions; Rockport Publishers: Beverly, CA, USA, 2012; pp. 66–67. ISBN 978-1-59253-756-3. [Google Scholar]
  14. Kim, Y.D.; Kim, J.H.; Kim, Y.J. Behavior selection and learning for synthetic character. In Proceedings of the 2004 Congress on Evolutionary Computation, Portland, OR, USA, 19–23 June 2004; pp. 898–903. [Google Scholar]
  15. My Talking Hank. Available online: https://outfit7.com/apps/my-talking-hank/ (accessed on 18 May 2018).
  16. Kim, J.; Jeong, H.; Pham, A.; Lee, C.; Soe, T.; Lee, P.; Lee, M.; Kim, S.; Eune, J. Introduction of an Interactive Growing Robot/Toy for Babies. In Advances in Computer Science and Ubiquitous Computing; CUTIE 2017, CSA 2017; Lecture Notes in Electrical Engineering; Park, J., Loia, V., Yi, G., Sung, Y., Eds.; Springer: Singapore, 2017; pp. 120–125. ISBN 978-981-10-7604-6. [Google Scholar]
  17. Mori, M. Bukimi no tani [the uncanny valley]. Energy 1970, 7, 33–35. [Google Scholar]
  18. Seyama, J.I.; Nagayama, R.S. The uncanny valley: Effect of realism on the impression of artificial human faces. Presence Teleoper. Virtual Environ. 2007, 16, 337–351. [Google Scholar] [CrossRef]
  19. Mathur, M.B.; Reichling, D.B. Navigating a social world with robot partners: A quantitative cartography of the Uncanny Valley. Cognition 2016, 146, 22–32. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  20. Bright Beats Dance Move BeatBo. Available online: https://www.fisher-price.com/en_CA/products/Bright-Beats-Dance-and-Move-BeatBo/ (accessed on 18 May 2019).
  21. Kim, J.; Jeong, H.; Lee, C.; Pham, A.Y.; Soe, T.; Lee, P.; Lee, M.; Kim, S.W.; Eune, J. Buddy: Interactive Toy that can Play, Grow, and Remember with Baby. In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems, Amsterdam, NY, USA, 10–12 May 2017; p. 467. [Google Scholar]
  22. Eune, J.; Lee, M.; Jeong, H.; Kim, J.; Lee, P.; Lee, C.; Pham, A.Y.; Soe, T. Growing Robot Device and Method for Controlling Operation Thereof. Korean Patent No. 10-2016-0183090, 29 December 2016. [Google Scholar]
Figure 1. Synthetic character and robot: (a) robot toy abandoned; (b) target area/scope of the research.
Figure 2. Growth level example of the synthetic animal character’s scale levels: (a) first level, infant; (b) second level, elementary school student; (c) third level, teenager; and (d) fourth level, adult. In order for synthetic animals to progress through levels (a–d), they should be provided with food, play, bathing, and so on in their simulation.
Figure 3. Real design of interactive social robots, adapted from Mathur and Reichling (2016) [19], behind the uncanny valley, adapted from Mori (1970) [17].
Figure 4. The total design process of physically growing robot “Buddy”.
Figure 5. The network linked among the growth robot, the control terminal, and the server.
Figure 6. Design draft version: Use of the equipped camera to recognize both: (1) user height and (2) user position including the direction of the user and the distance between the robot and the user.
Figure 7. Facial expression control on the LCD of the robot head: (a) console window; (b) draft face; (c) smiling face; and (d) imagination of the special memory mode that parents and children can frequently load.
Figure 8. Robot growth implementation: (a) height h1 and intended extension Δh; (b) implementation of Δh with the wrinkled tube including the scissor jack structure inside; and (c) after physical growth of Δh up to h2.
Figure 9. Robot settlement: (a) S500s: settling to the target user; (b) S600s: reconfiguring to a new user in other modes. Flowcharts showing the method of controlling the software function of the growth robot apparatus.
Figure 10. Robot growth: (a) physical appearance of the robot before growth and (b) after growth, as shown in Table 1.
Figure 11. Example: Robot-child interaction of fruit quiz.
Table 1. Degree of robot growth over age (example). Adapted from [21].
Growth Over Age    4 Month    3 Year-Old    8 Year-Old (TBD)
Height (cm)        65 cm      95 cm         122 cm
Height (ft)        2.13 ft    3.11 ft       4.00 ft
