
Sustaining User Experience in a Smart System in the Retail Industry

Department of Management Information Systems, National Chengchi University, Taipei 11605, Taiwan
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(9), 5090; https://doi.org/10.3390/su13095090
Submission received: 28 March 2021 / Revised: 21 April 2021 / Accepted: 29 April 2021 / Published: 1 May 2021

Abstract

Retail enterprises are embracing new technologies to provide innovative services to customers and engage them. Unmanned retail stores offer a completely new seamless shopping experience for customers. Moreover, artificial intelligence (AI) technology and sophisticated customer behavior should be explored in depth to develop a smart system to serve customers. This study focused on achieving a sustainable user experience with a smart system installed in an autonomous store. The development of core functions and a rule-based knowledge set are regarded as the most important tasks in designing autonomous services. In the case study, the core functions were developed on the basis of the design science concept and the rule-based knowledge set was constructed using the action research approach. The developed smart system was optimized to understand in-store customer behavior by continuously observing and refining the user experience. The practical experience of this study provides insights into AI technology use in retailing and can guide enterprises in developing smart systems.

1. Introduction

With the rapid development of technology and changes in consumption patterns, the retail industry worldwide is attempting to develop new retail practices. Brick-and-mortar retailers continue to seek innovative approaches to attract consumers to their stores and sustain their revenue base. Digital technologies, such as mobile apps and mobile payment, have become indispensable tools for business competition.
Retail enterprises are seeking new technologies, and their competitiveness increasingly depends on technological progress. The ninth UN Sustainable Development Goal, “Industry, innovation and infrastructure,” describes the aim of building resilient infrastructure, promoting inclusive and sustainable industrialization, and fostering innovation [1]. The autonomous store, an example of new technology in retail, uses cameras and computer vision to create a fully automated, frictionless, self-guided shopping experience. By reducing friction and increasing productivity, the autonomous store plays a key role in introducing and promoting new technologies that improve customer experiences and enable a more efficient use of resources.
To provide a superior consumer experience, many retailers have adopted self-service technologies (SSTs) to save consumers the time spent in checkout queues. For example, in 2017, Amazon designed an unmanned retail store driven by artificial intelligence (AI) technology that provided its customers with a counterless shopping experience. By 2021, Amazon was expected to move beyond a mobile payment app to hand-recognition technology for identifying shoppers, tracking their behavior, and completing the checkout process.
In the new retail context, retailers have begun their journeys toward being more customer-centric to improve the retailer–consumer relationship by adopting new technologies, such as AI, big data, the Internet of Things, robots, social media, and virtual reality [2,3]. For example, they have been paying increasing attention to fields such as product digitalization and consumer behavior analysis. Knowing how technology innovations influence retailers and engage customers is the key to the future of retailing [4,5,6]. Moreover, understanding how user experience can be sustained and accumulated as a knowledge base is critical for establishing a successful retail service [7,8].
Therefore, the following research question is considered in this paper: how can a smart system be developed to support an autonomous store and sustain user experience in diverse situations? To answer this question, this paper proposes a model for designing the core functions involved in developing a self-learning system and for using rule-based knowledge derived from user experience to optimize the smart system. Finally, on the basis of the response strategies for predefined rules and observations, fine-grained customer behaviors are integrated into the autonomous store operation.
Several contributions are made by this research. First, it provides guidelines for developing an autonomous store and assists retail enterprises in understanding the implementation process. Second, this research is the first to analyze user experience by actively identifying in-store customer behaviors during the shopping journey. Furthermore, the data were collected from a real convenience store and validated by experienced experts, thereby increasing confidence in the results. Finally, we adopted design science concepts and the action research approach to iteratively verify the system’s learning process and achieve the desired result.
The remainder of this study is structured as follows. In the next section, we review the literature on the technological retail revolution and the in-store technologies that enhance and sustain the user experience. In the following section, we explain the design science and action research methodologies used in our research procedure. Next, we demonstrate and evaluate the smart system’s learning and its results. Finally, we discuss topics that can be addressed in future studies and conclude.

2. Conceptual Background

Stores, customers, and goods form the core elements of retailing. With advances in new technology, retailing has been evolving, redefining itself, and shifting into new paradigms, such as new retail or smart retail. However, the essence of retailing, which is a business activity that occurs during the interaction between consumers and goods, has not changed. New technologies continue to be a critical element of the retailing revolution [3,9]. It is critical to understand how technology (e.g., self-service devices, in-store kiosks, self-scanning devices, and various apps) can affect consumer experiences in the retail industry. We investigate how these autonomous technologies can benefit both consumers and businesses. Amazon’s unmanned retailing experience driven by computer vision technology highlights the critical enabling technologies for autonomous store construction. By reviewing their applicability in retailing, we provide an overview of the following technology-related topics: (1) SST, (2) computer vision, and (3) unmanned retail.

2.1. SST

To respond to increasing labor costs and address operational problems, retailers have developed and applied several SSTs. The implementation of an SST can simplify the task of shopping and result in efficient management [10,11,12]. The term SST is defined as “a technological interface that enables the customer to enjoy a service independent of direct service employee involvement” [13,14]. Some SST-related studies have identified the factors that influence the adoption of various SSTs and have justified the investment cost of SST adoption by highlighting the positive effects of SSTs on customer experience [15,16,17].
Retailers implementing SSTs can benefit from operational cost reduction, customer satisfaction improvement, and new customer segmentation. According to some customers, SSTs provide a superior experience in terms of a convenient shopping channel, enable time savings at checkout, and allow technology innovations, such as personalized recommendations. Nevertheless, some customers refuse to use SST-driven stores because of technology failures and poor user interfaces [7,14,18]. Thus, developing successful SSTs from the perspective of customer value addition, and not simply operational efficiency, can improve service quality and efficiency from both customers’ and employees’ viewpoints [13,19,20,21]. However, most relevant studies have focused on understanding the logic of customer behavior in specific activities or experiences; only a few have approached the topic from a system-centric point of view [22]. Moreover, few studies have investigated the retail technologies that retailers need to actively identify and understand customer behaviors throughout the shopping journey, and analyses of in-store human body actions still face practical limitations [23].

2.2. Computer Vision

Previous applications of video analytics to in-store retail included customer counting and determining traffic flow. With growing computing power, video analytics has evolved from customer tracking, fraud prevention, and the identification of customer interests to the analysis of customer–product interaction [24,25,26]. Retailers can use video analytics to better understand in-store customer behavior, gain insight into customer engagement, optimize the shop layout, and adjust the store’s marketing campaign strategy [25,27].
The analysis of customer behavior from computer vision is an important open topic for retailing [25,28,29]. Computer vision includes the tasks of acquiring, processing, and analyzing digital images and videos to understand and produce high-level information to aid human decisions [30]. With machine learning and autonomous planning in the field of AI, the information provided by computer-vision-based systems can support the retail service in various areas, such as people detection, product recognition, or people–product association.
Thus, retailers are adopting AI technologies to improve operational efficiency and enhance customer experience both online and offline [25]. However, computer-vision-based systems still have considerable scope for improvement in pattern recognition and machine learning [30,31]. To keep pace with the development of such systems, retailers are seeking best practices for demonstrating the capability and verifying the feasibility of an autonomous store built on a computer-vision-based system [22,32].
Although in-store customer motion can be detected using digital devices, diverse customer behaviors can currently be recognized only by experienced store staff. Moreover, limited research has been conducted on recognizing in-store customer behavior patterns across the entire shopping journey and inspecting the corresponding responses in the retail context. Therefore, to meet market requirements, a smart, effective, and sustainable system must be constructed for an autonomous store.

2.3. Unmanned Retail

With the growing economy and technological changes being incorporated into retail organizations, an increasing number of retailers are choosing to provide additional cost-effective services to consumers and employees to enhance their experience [3,20,22]. The convergence of new technologies with SSTs aims to provide customers with an intelligent system in the shopping context. Systems built around shopping channels, such as smart shelves, intelligent shopping carts, and unmanned stores, are being developed to attract customers [5,23].
In 2017, Amazon introduced its “grab-and-go” grocery shopping concept, adopting advanced technologies that enable shoppers to leave the store without going through a checkout line. Advanced technologies, such as computer vision, sensor fusion, and mobile payment systems, were integrated into the system to identify the transactions occurring between products and consumers. In addition, the potential for sustainability and industry impacts was investigated with the help of retail experts. However, only the self-checkout process has been widely accepted so far, and the costs of implementing and maintaining this operation remain unknown [22,28].
Grewal et al. [4] highlighted the future of retailing in the following five key areas: technology and tools that facilitate decision-making, visual displays and merchandise that aid in decision-making, consumption and engagement, big data collection and usage, and analytics and profitability. However, further exploration and evaluation must be conducted on how integrating technologies into the retail context can enhance customers’ shopping experience [33,34,35]. Understanding customer experience has always been one of the most important issues for retail enterprises. Customer experience is defined as “a multidimensional construct focusing on a customer’s cognitive, emotional, behavioral, sensorial, and social responses to a firm’s offerings during the customer’s entire purchase journey.” It mediates the retail success through marketing, financial, and social drivers [6,7,33,36].
Recognizing in-store shopping behavior is a relatively simple task for experienced operational staff, but customer behavior patterns are complicated to recognize within the closed loop of an unmanned store [23,37]. Moreover, little is known about how advanced technologies can aid in developing an unmanned retail system.

3. Methodology and Research Design

To develop a smart system with a sustainable user experience, we adopted a design science methodology to construct and evaluate the technological artifact required to meet the qualitative demands of an autonomous store. To extend this methodology to understanding in-store customer behavior, we used action research with semi-supervised learning to optimize the rule-based knowledge training of the customer behavior recognition system. This approach is suitable for increasing the maturity of an autonomous system [38,39].
As presented in Figure 1, a rule-based knowledge set was constructed within a smart system through human involvement and evaluated theoretically using action research from the stakeholder perspective. The smart system was developed using the design science methodology, in which artifacts are iteratively proposed and tested to achieve the desired result [39]. The processes of designing the smart system and developing the rule-based knowledge set are described in the following text.

3.1. System Design

We used a design science methodology to evaluate whether the smart system could satisfy the requirements for autonomous stores.
A smart system requires the creation and evaluation of sequential activities to ensure its utility and effectiveness for autonomous store operation. The system is intended to demonstrate ideas and technical capabilities as well as be applicable for meeting retail business requirements. In this study, a smart system in an autonomous store is regarded as an information technology (IT) artifact developed using the design science methodology. Design science requires the application of rigorous methods in the construction and evaluation of the IT artifact and plays an important role in disciplines oriented toward the creation of successful IT artifacts. This approach focuses on developing an artifact with the explicit intention of improving its functional performance [40,41].
We developed a system with five core functions by tracking a customer’s actions and detecting a product’s status in a store. Figure 2 presents the core functions of a smart system. We used the framework depicted in Figure 2 to design the related system interfaces and evaluate whether a person and an item can be associated when they interact. The developed smart system has five core functions: the shopping app and global tracking, which identify the person status; item recognition and inventory management, which identify the item status; and person–item association, which indicates the binding between a person and an item. These functions are defined in the following text.
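For illustration only (this is not the authors’ implementation), the five core functions can be viewed as components that exchange person events, item events, and bindings between the two. A minimal Python sketch, in which all class and field names are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class PersonEvent:            # produced by the shopping app + global tracking
    track_id: int             # anonymous in-store track assigned at the turnstile
    account_id: str           # shopper account bound to the track at check-in
    position: tuple           # (x, y) in the store's grid coordinate system
    timestamp: float

@dataclass
class ItemEvent:              # produced by item recognition on a smart shelf
    sku: str
    shelf_id: str
    delta: int                # +1 item returned to the shelf, -1 item removed
    position: tuple
    timestamp: float

@dataclass
class Basket:                 # maintained by the person-item association function
    account_id: str
    items: dict = field(default_factory=dict)   # sku -> quantity

    def apply(self, sku: str, delta: int) -> None:
        qty = self.items.get(sku, 0) - delta     # removal from the shelf adds to the basket
        if qty > 0:
            self.items[sku] = qty
        else:
            self.items.pop(sku, None)
```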

3.1.1. Shopping App

The shopping app is a digital booklet in the mobile application that maintains personal information, shopping lists, and preferences to provide assistance to consumers when shopping in an autonomous store.

3.1.2. Global Tracking

Global tracking is a holistic multicamera grid coordinate system that identifies customers through computer vision technology; it starts operating when a customer scans his or her smartphone at a turnstile. Global tracking detects customers’ interactions through visual sensing and identifies items that are exchanged between people or replaced on the shelf.
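A minimal sketch of the check-in hand-off described above, under the assumption that the nearest unassigned visual track at the turnstile is bound to the validated account; the position threshold, field names, and function name are illustrative, not values from the paper:

```python
import math
import time

TURNSTILE_XY = (0.0, 0.0)     # turnstile position in the store grid (assumed)
MAX_BIND_DISTANCE = 1.5       # metres within which a track can be bound (assumed)

def bind_account_to_track(account_id, unassigned_tracks):
    """Bind a validated account to the visual track closest to the turnstile."""
    candidates = [
        t for t in unassigned_tracks
        if math.dist(t["position"], TURNSTILE_XY) <= MAX_BIND_DISTANCE
    ]
    if not candidates:
        return None                               # no track near the gate yet
    track = min(candidates, key=lambda t: math.dist(t["position"], TURNSTILE_XY))
    track["account_id"] = account_id
    track["checked_in_at"] = time.time()
    return track
```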

3.1.3. Item Recognition

Item recognition refers to the convergence of gravity and visual sensing for the real-time tracking of items, shelf change events, and specific item information.
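As a hedged illustration of this kind of sensor fusion, the sketch below converts a shelf-lane weight change into an item-count delta and emits a shelf event only when the camera confirms a change; the parameter names and tolerance are assumptions:

```python
def shelf_event(sku, unit_weight_g, weight_before_g, weight_after_g,
                vision_confirms_change, tolerance=0.15):
    """Turn a gravity reading into an item-count delta, cross-checked by vision.

    All parameter names and the tolerance value are illustrative assumptions.
    """
    delta_weight = weight_after_g - weight_before_g
    units = round(delta_weight / unit_weight_g)
    residual = abs(delta_weight - units * unit_weight_g) / unit_weight_g
    if units == 0 or residual > tolerance or not vision_confirms_change:
        return None                       # ambiguous reading; defer to human review
    return {"sku": sku, "delta": units}   # +n returned to the shelf, -n removed

# Example: one 350 g item was taken from the lane and the camera saw the change.
print(shelf_event("cola-350", 350.0, 1400.0, 1050.0, True))  # {'sku': 'cola-350', 'delta': -1}
```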

3.1.4. Person–Item Association

The person–item association is a mechanism for identifying who picks up or puts back the items by using multiple sensors and a time stamp. Its functions cover the binding of people and items as well as automatic billing.
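A sketch of how such a binding could work: each shelf event is attributed to the tracked person closest to the shelf within a short time window around the event’s timestamp. The window size, distance threshold, and data layout are assumptions for illustration:

```python
import math

def associate(shelf_event, tracks, time_window_s=2.0, max_distance_m=1.2):
    """Return the account_id of the person most plausibly responsible for a shelf event.

    `tracks` is a list of dicts with "account_id", "position" (x, y), and "timestamp";
    the thresholds are illustrative, not values reported in the paper.
    """
    candidates = [
        t for t in tracks
        if abs(t["timestamp"] - shelf_event["timestamp"]) <= time_window_s
        and math.dist(t["position"], shelf_event["position"]) <= max_distance_m
    ]
    if not candidates:
        return None        # fall back to human-assisted review
    best = min(candidates, key=lambda t: math.dist(t["position"], shelf_event["position"]))
    return best["account_id"]
```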

3.1.5. Inventory Management

Inventory management refers to calculating the number of items required to trigger replenishment through a gravity sensor, where the position of each item to be picked is identified using a camera.
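As an assumed illustration of this replenishment logic, the sketch below derives on-shelf counts from lane weights and yields a refill task when a lane falls to a reorder level; all field names and thresholds are hypothetical:

```python
def replenishment_tasks(shelf_readings, reorder_level=3):
    """Yield (shelf_id, sku, units_needed) for lanes at or below the reorder level.

    `shelf_readings` maps (shelf_id, sku) to a dict with the lane's measured
    weight, the unit weight, and the lane capacity; all names are assumptions.
    """
    for (shelf_id, sku), lane in shelf_readings.items():
        on_shelf = int(lane["weight_g"] // lane["unit_weight_g"])
        if on_shelf <= reorder_level:
            yield shelf_id, sku, lane["capacity"] - on_shelf

readings = {("A1", "cola-350"): {"weight_g": 700.0, "unit_weight_g": 350.0, "capacity": 12}}
print(list(replenishment_tasks(readings)))   # [('A1', 'cola-350', 10)]
```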

3.2. Rule-Based Knowledge Design

We designed a rule-based knowledge set for the developed system to enable it to pre-recognize customer behavior through user experience. Human involvement can make the system robust to recognition accuracy risks and accelerate the time to market.
A participant observation-based case study was conducted to investigate the development process of the smart system. To interpret assumptions about the observations in detail and collaboratively explore scientific knowledge regarding customer experiences, the action research methodology was adopted. This methodology is an interactive process that aids in problem diagnosis and solution development through collaboration. In this approach, researchers and clients are involved as coparticipants in the inquiry and exchange experiences [42,43].
During the research process, the action research methodology was used in an iterative sequence of “planning–acting–observing–reflecting” to identify the underlying factors that help the developed system predict customer behavior in an autonomous store [44,45,46]. Stakeholders, such as experienced store staff and consumers, were involved in the cyclical process of preliminary diagnosis, action planning, action taking, evaluation, and reflection learning.
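A schematic rendering of this cycle as code may clarify the flow; the function names, arguments, and stopping condition below are illustrative assumptions, not the authors’ implementation:

```python
def action_research_cycle(rule_base, diagnose, plan, act, observe, reflect, max_cycles=3):
    """Run the plan-act-observe-reflect loop until stakeholders accept the rule base."""
    problem = diagnose(rule_base)
    for cycle in range(max_cycles):
        actions = plan(problem, rule_base)       # stakeholders propose new or updated rules
        evidence = act(actions)                  # rules applied to captured shopping journeys
        findings = observe(evidence)             # occurrences counted, behaviors annotated
        problem, accepted = reflect(findings, rule_base)  # lessons fed back into the rules
        if accepted:
            break
    return rule_base
```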

3.2.1. Preliminary Diagnosis

After the development of the core functions, the knowledge set was created through human involvement. Understanding user experience is crucial for retailers: an increased understanding of user experience in retailing can enhance retail operation and aid in achieving customer satisfaction. However, knowledge of user experience did not initially exist in the developed system. In-store user experience is a combination of knowledge and practice and requires human involvement. The developed system can obtain knowledge of user experience by continuously learning and reasoning in a manner similar to a human.

3.2.2. Action Planning

With the rapid development of innovative stores, we selected a multiformat retail enterprise that applied the smart system concept to its convenience store in China as the experimental environment. A convenience store has fewer aisles, fewer commodities, and shorter cashier lines than conventional grocery stores or supermarkets do. With an in-depth knowledge of customer experience and efficient operations, this retail enterprise began to explore the vaguely perceived concept of sustainable user experience by using new technologies.
The studied convenience store received approximately 300–400 customers over 24 h. We set a capacity limit of 30 people and 600 stock keeping units (SKUs) across the 40 shelves in the 160 m² store space.
The stakeholders were two senior store staff, three experienced consumers, and two AI technology engineers, who were invited to participate in the analysis of customer behavior observation and recognition. The store staff, consumers, and engineers played complementary roles in sharing in-store shopping experiences simultaneously and were involved in collaboratively assisting in practical customer behavior recognition and explanation. Table 1 describes each stakeholder’s role and responsibility.
In a brick-and-mortar store, the conventional shopping process involves customers entering the store, browsing, selecting products while grabbing a bite, and finally, visiting the cashier before leaving [6]. In an autonomous store, customers register their personal information through a mobile app in advance to be identified and admitted into the store.
The in-store customer behavior is captured, identified, tracked, and analyzed using computer vision technology. The usual activities followed during in-store shopping include check in, product browsing, product pick up, choice making, and check out.
Some activities are regarded as normal, such as products being exchanged between people or replaced with another choice. However, abnormal activities must also be considered by humans and recorded so that the machine can learn them as rules. At the end of shopping, customers receive a receipt and can request a refund if required.
The “during shopping” stage of the customer shopping process comprises the following four normal steps (Figure 3):
  • Check in: A customer checks in through a mobile app (QR code). The turnstile opens after a successful account validation.
  • Browse: The customer passes through the turnstile and starts browsing the products. Many customers can be identified and tracked in the store simultaneously.
  • Pick up: If a customer grabs a product, this product is added into a virtual basket. Moreover, if they place a product back on the shelf, the product is erased from the virtual basket.
  • Check out: A customer completes the purchase, passes the door line on the way out, and is billed for the picked-up products.
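A minimal sketch of the four-step journey above as a small state machine driving a virtual basket; the class, method, and event names are assumptions for illustration:

```python
class ShoppingJourney:
    """Check in -> browse -> pick up / put back -> check out, with a virtual basket."""

    def __init__(self, account_id):
        self.account_id = account_id
        self.state = "checked_in"        # turnstile opened after account validation
        self.basket = {}                 # sku -> quantity

    def pick_up(self, sku, qty=1):
        self.state = "shopping"
        self.basket[sku] = self.basket.get(sku, 0) + qty

    def put_back(self, sku, qty=1):
        remaining = self.basket.get(sku, 0) - qty
        if remaining > 0:
            self.basket[sku] = remaining
        else:
            self.basket.pop(sku, None)

    def check_out(self):
        """Called when the customer crosses the door line; bills the basket."""
        self.state = "checked_out"
        return dict(self.basket)         # handed to the billing / receipt service

journey = ShoppingJourney("acct-42")
journey.pick_up("sandwich"); journey.pick_up("cola-350"); journey.put_back("cola-350")
print(journey.check_out())               # {'sandwich': 1}
```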

3.2.3. Action Taking

We emphasized customer behavior and product state for global tracking during shopping. When developing rules, we assumed that the use cases were based on the shopping process. Sufficient video frames capturing customer actions were analyzed to determine what was happening and were assessed by the stakeholders. Thus, each captured and analyzed behavior was marked with a confidence level (CL) and annotated with a response strategy as a reference. The CL was upgraded according to the occurrence number of the recognized behavior. We denoted the response strategy as “human-assisted” if the occurrence number was 100 or fewer, which indicates that the system still had to learn to recognize different customer behaviors. The response strategy was annotated as “self-identified” once the occurrence number reached 101, which indicates that the system could recognize the customer behaviors by itself.
Table 2 presents the definition of the CL and its correspondence to the occurrence number and response strategy. The system was iteratively tested and assisted by human decisions until the behaviors could be self-identified using computer vision technology.
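The mapping in Table 2 from occurrence numbers to confidence levels and response strategies can be expressed directly; the sketch below mirrors the table’s thresholds (function names are assumed):

```python
def confidence_level(occurrences: int) -> int:
    """Map a rule's occurrence number to the confidence level defined in Table 2."""
    if occurrences <= 10:
        return 1
    if occurrences <= 100:
        return 2
    if occurrences <= 500:
        return 3
    if occurrences <= 1000:
        return 4
    return 5

def response_strategy(occurrences: int) -> str:
    """Human-assisted up to 100 occurrences, self-identified from 101 onward."""
    return "human-assisted" if confidence_level(occurrences) <= 2 else "self-identified"

assert response_strategy(57) == "human-assisted"
assert response_strategy(250) == "self-identified"
```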

3.2.4. Evaluation

From the stakeholder experience data collected during convenience store operation, we converged to a common view to obtain eight rules in Stage I. Then, through observation and recognition, we increased the total number of rules to 20 over Stage II and Stage III. The stakeholders viewed the daily customer shopping journeys through the captured video in a real convenience store environment during the 7-day test period. We formulated the Stage II observations in the first 3 days and the Stage III observations in the next 4 days and counted the occurrence number of each rule. The captured and detected customer behaviors were identified and annotated by the stakeholders as normal or abnormal. The behaviors and their response strategies were recognized incrementally to improve the system’s learning performance. Table 3 describes the rule generation plan in each stage.

4. Reflection Learning

A convenience store typically sells a range of prepared and ready-to-eat items and serves short shopping trips, with customers looking for a quick and easy purchasing experience. Efficiently satisfying customer demands can enhance the customer experience in a convenience store. Statistics have indicated that, on average, a customer takes less than 5 min to walk in, select and purchase items, and depart. Most of this time is spent looking for items and waiting in the checkout line. Customers often walk out of a store without making a purchase because the preferred item is unavailable or the checkout line is too long. Estimating the time spent standing in checkout lines is difficult.
Our study findings provide a preliminary understanding for developing a smart system for an autonomous store. We determined that a customer completed their shopping journey in the convenience store within 3–4 min. More than 90% of customers ended up buying something from the store if they spent more time browsing and making a choice. We completed the design and verification of 20 rules, comprising six rules for the check-in step, four rules for the browsing step, seven rules for the pick-up step, and three rules for the checkout step. In each step, we initialized two rules of the normal type according to the stakeholders’ experience. After performing the observation and recognition in a real environment, six rules were found to reach CL 5 and two rules each reached CLs 3 and 4. Table 4 presents the results of rule design and verification.
In the following sections, we discuss the findings of each step.

4.1. Check In

  • Most customers were permitted to enter the store by using the mobile app and following the check-in rules in sequence. Few customers were identified as sharing one mobile app, except for children and couples. However, in the future, the case in which customers do not own a mobile phone should also be considered.
  • If the system could not autonomously identify a customer’s characteristic through computer vision technology, the characteristic was regarded as an abnormal one.
  • Computer vision could precisely capture and identify each customer’s characteristics, with the key factor being that each customer would stop in front of the turnstile while scanning the mobile app.

4.2. Browse

  • We barely found any irregular behavior, such as stealing or eating food in the store. This was because the customers were already aware of the in-store surveillance operation before entering.
  • The action of a customer putting a selected product back in place, or leaving it elsewhere in the store, was considered regular. From the stakeholders’ perspective, such replacements often occurred because of the promotion of other products: after picking up a product, the customer would prefer to swap it for the promoted alternative.
  • A store staff member was expected to set the products back in order within a specified time. Thus, the autonomous store still involved manual operation for disordered products.

4.3. Pick Up

  • Through customer experience, the process in which a customer picked up a product and put it into the cart or passed it to other people was regarded as “selected.” At this moment, the person–item association was recognized. The product was simultaneously recognized as “sold” if it continuously remained in a specific customer’s cart.
  • We found that pick-up behaviors were not recognized from a single customer action; other sequential actions, such as putting products in a cart, exchanging products on the shelf, or handing products over to other people, also had to be considered (see the sketch after this list).
  • The customers browsed products through their eyes instead of touching them because of the in-store surveillance operation.
  • Customer behaviors were observed and recognized to be more complicated in the pick-up step because customers spent more time in selecting different products and making decisions.
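Reflecting the observation that a pick-up is confirmed by a sequence of actions rather than a single grab, the hedged sketch below marks a product as “selected” only after a confirming follow-up action; the action labels are assumptions:

```python
CONFIRMING_ACTIONS = {"put_in_cart", "put_in_bag", "hand_over"}   # assumed labels
CANCELLING_ACTIONS = {"put_back_on_shelf"}

def pick_up_status(action_sequence):
    """Classify a grab as 'selected', 'returned', or 'pending' from the actions that follow it."""
    for action in action_sequence:
        if action in CONFIRMING_ACTIONS:
            return "selected"
        if action in CANCELLING_ACTIONS:
            return "returned"
    return "pending"   # keep watching; human-assisted review if it stays ambiguous

print(pick_up_status(["grab", "inspect_label", "put_in_cart"]))   # selected
print(pick_up_status(["grab", "put_back_on_shelf"]))              # returned
```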

4.4. Check Out

  • In this system learning journey, each product was regarded as “paid” when it crossed the checkout line in association with a customer. The system synchronized the receipt service and inventory operation after the customer passed the checkout line.
  • We found that a few consumers went back into the store through the checkout line within 5 min and made repeated attempts to repurchase or replace products. This behavior was regarded as an abnormal action. However, from the perspective of the in-store staff, a product was regarded as sold once it crossed the checkout line, even if it was later put back in place.
  • Product quality degrades if such replacement behavior by consumers is not appropriately controlled, and some quality issues related to sold products cannot be resolved in the real environment. Manually issued refunds are suggested as an alternative.

5. Future Opportunities

This paper mainly discusses the development of a sustainable user experience in a smart system. A series of learning processes demonstrate the effectiveness of the proposed method and its performance. The following topics can be examined in future studies.

5.1. Understanding Complicated Behavior

The customer behavior in a convenience store is relatively simple and involves making quick choices and decisions in the shopping journey. We designed 20 rules for the developed smart system through user experience and recognized when a product is selected by a customer in terms of the person–item association status. However, unusual situations still exist; for example, a customer may perform irregular activities, such as unpacking a product to eat its contents, exchanging fake products, or disturbing the shelf facing while moving products, all of which cause recognition interference within the system. Some regular activities, such as carrying a similar product into the store, passing products to others, or people having similar body shapes or wearing similar clothes, may also lead to misjudgment.
Considerable scope exists for improving an autonomous store in terms of identifying customer characteristics and understanding customer behavior. Additional practical understanding of fine-grained customer behavior can sustain the experience of the smart system.

5.2. Diverse Retail Environment

Retail can be divided into categories, such as specialty shops, convenience stores, supermarkets, hypermarkets, department stores, shopping malls, and factory outlets.
In this research, we used a convenience store as the experimental environment. We limited the capacity of people, commodities, and shelves, and we made assumptions about the time customers tend to remain in such a store. The types of goods, store size, promotion methods, and number of checkout counters in convenience stores differ from those in other types of retail stores. For example, supermarkets sell fresh fruits and vegetables, department stores have a wider variety of goods, and shopping malls contain a variety of shops and customer experience areas. Customer behavior and the duration of store visits also differ across store types.
Further understanding of autonomous technologies is necessary before they can be applied to different types of retail stores. For example, more powerful computer vision computing capabilities, or IoT technologies that assist recognition, could be used. Future work can extend this research to improve user experience in autonomous stores.

5.3. Autonomous Store Maturity

At the time of writing, AI technology is already a major focus of discussion in the market. In the automotive domain, several car manufacturers have adopted AI technology to demonstrate self-driving systems according to the SAE International “Levels of Driving Automation” standard. This standard defines six levels of driving automation and provides an initial regulatory framework to guide manufacturers and other entities toward the safe design, development, testing, and deployment of highly automated vehicles [47].
In retailing, the technology behind an autonomous store is already evolving. In this study, we developed a smart system to detect and track products, recognize human action, and enable frictionless checkout. However, it is unclear how to verify whether an autonomous store is smart enough to operate on its own.
Brands and convenience stores have unveiled and demonstrated “smart shops” with cutting-edge technologies in the retail context. However, the “smart shop” terminology is used inconsistently in the retail industry; some of this inconsistency stems from equating single-point automation with connected, cognitive retailing. From the business perspective, developing a smart shop and ensuring its commercial value are difficult tasks. Different levels of operating automation must therefore be formulated, from no automation to full autonomy. “Automation” connotes control or operation by a machine, whereas “autonomy” connotes acting alone or making decisions independently [21]. The level of technology adoption can then be used to determine whether a store is sufficiently smart to operate with AI technology.
For example, at the lowest level (Level 1), a human with tools, retail operation is managed by human experience, with an emphasis on face-to-face daily customer care in the shopping experience. At the next level (Level 2), machines augmenting humans, management is assisted by simple handheld tools and promotional activity is recommended by the store staff; technology is adopted for single-point functionality in the store. At the following level (Level 3), collaborative machines, a system-based sales operation and partial sensor detection are adopted, and consumption data are used to increase the quality of customer service and improve operational efficiency. The next level (Level 4), humans augmenting machines, integrates robotics into retail systems to jointly optimize functions such as customer experience, forecasting, and inventory management. The highest level (Level 5), autonomous machines, fully uses AI technologies, such as computer vision, to introduce near-real-time intelligence to stores. At this stage, more context-aware services are unlocked, such as on-demand recommendation, unconscious checkout, and a seamless shopping experience.
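For illustration, these five levels can be captured as a simple enumeration analogous to the SAE driving-automation levels; the short labels below are paraphrases of the descriptions above, not an established industry standard:

```python
from enum import IntEnum

class StoreAutonomy(IntEnum):
    HUMAN_WITH_TOOLS = 1           # operation managed by human experience, face-to-face care
    MACHINES_AUGMENTING_HUMANS = 2 # simple handheld tools, single-point in-store technology
    COLLABORATIVE_MACHINES = 3     # system-based sales operation, partial sensor detection
    HUMANS_AUGMENTING_MACHINES = 4 # robotics integrated for forecasting, inventory, experience
    AUTONOMOUS_MACHINES = 5        # full AI/computer vision, near-real-time context-aware services

def is_frictionless(level: StoreAutonomy) -> bool:
    """Unconscious checkout and seamless shopping only appear at the highest level."""
    return level is StoreAutonomy.AUTONOMOUS_MACHINES
```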
The aforementioned levels help eliminate confusion by providing clarity in retailing. Autonomous stores exhibit characteristics that can be distinguished by the maturity of their technology adoption, and these characteristics allow autonomous stores to be understood accurately as they come into more widespread use.

6. Conclusions

Technology is playing an increasingly important role in retailing, which has led to the introduction of retail services in autonomous stores. To achieve lower labor costs and greater operational efficiency, retail enterprises are seeking sustainable approaches to deploying their technological capabilities. For retailing to be sustainable in the future, unmanned retailing is essential.
An autonomous store operates entirely or mostly without human intervention, and a smart system is critical to its success. A customer can walk into a store, quickly find and grab items, and directly walk out with a frictionless experience. To sustain this customer experience, a smart system can gradually learn customer behaviors across the shopping journey and improve the store’s daily operational efficiency.
In this study, we developed an IT artifact that can guide the design of a smart system in an autonomous store and designed rules to enable the system to learn and understand in-store customer behaviors. By adopting the action research methodology, experienced retail stakeholders were involved in enabling the system to recognize diverse actions from customers. The pick-up step comprised the highest number of complicated behaviors because it involved making choices or interactions with others.
This study conducted analyses of in-store human body actions by actively identifying and understanding customer behaviors during the shopping journey. By developing a smart system and designing a rule-based knowledge set, autonomous stores can be implemented that understand in-store customer shopping behaviors through user experience in a sustainable manner.
An autonomous store with computer vision technology still faces numerous challenges that remain to be addressed, such as unpredicted customer behaviors, customer interaction behaviors, or more complicated retail contexts. Other influencing factors from retail operation, such as increasing traffic and sales promotion, were not considered in this study.
This study also outlined the future work to be conducted in autonomous store development and implementation. We believe that computer vision technology will become a promising approach for developing a smart system in an autonomous store. Future studies should further investigate the creation of rule-based knowledge sets to improve the quality of computer vision.

Author Contributions

Conceptualization, S.-C.C.; methodology, S.S.C.S.; software, S.-C.C.; validation, S.-C.C.; formal analysis, S.S.C.S.; investigation, S.-C.C.; resources, S.-C.C.; data curation, S.-C.C.; writing—original draft preparation, S.-C.C.; writing—review and editing, S.S.C.S.; visualization, S.-C.C.; supervision, S.S.C.S.; project administration, S.-C.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. The UN Sustainable Development Goals 9: Industry, Innovation and Infrastructure. Available online: https://www.un.org/sustainabledevelopment/ (accessed on 20 April 2021).
  2. Bradlow, E.T.; Gangwar, M.; Kopalle, P.; Voleti, S. The Role of Big Data and Predictive Analytics in Retailing. J. Retail. 2017, 93, 79–95.
  3. Grewal, D.; Noble, S.M.; Roggeveen, A.L.; Nordfalt, J. The Future of In-store Technology. J. Acad. Mark. Sci. 2020, 48, 96–113.
  4. Grewal, D.; Roggeveen, A.L.; Nordfalt, J. The Future of Retailing. J. Retail. 2017, 93, 1–6.
  5. Inman, J.J.; Nikolova, H. Shopper-Facing Retail Technology: A Retailer Adoption Decision Framework Incorporating Shopper Attitudes and Privacy Concerns. J. Retail. 2017, 93, 7–28.
  6. Lemon, K.N.; Peter, C.V. Understanding Customer Experience throughout the Customer Journey. J. Mark. 2016, 80, 69–96.
  7. Andajani, E. Understanding Customer Experience Management in Retailing. Procedia Soc. Behav. Sci. 2015, 211, 629–633.
  8. Gentile, G.; Spiller, N.; Noci, G. How to Sustain the Customer Experience: An Overview of Experience Components that Co-create Value with the Customer. Eur. Manag. J. 2007, 25, 395–410.
  9. Willems, K.; Smolders, A.; Brengman, M.; Luyten, K.; Schoning, J. The Path-to-purchase is Paved with Digital Opportunities: An Inventory of Shopper-oriented Retail Technologies. Technol. Forecast. Soc. Chang. 2017, 124, 228–242.
  10. Bitner, M.; Ostrom, A.; Meuter, M. Implementing Successful Self-Service Technologies. Acad. Manag. Exec. 2002, 16, 96–108.
  11. Dabholkar, P.A. Consumer Evaluations of New Technology-based Self-options: An Investigation of Alternative Models of Service Quality. Int. J. Res. Mark. 1996, 13, 29–52.
  12. Grewal, D.; Levy, M.; Kumar, V. Customer Experience Management in Retailing: An Organizing Framework. J. Retail. 2009, 85, 1–14.
  13. Hsieh, C.T. Implementing Self-Service Technology to Gain Competitive Advantages. Commun. IIMA 2005, 5, 77–83.
  14. Meuter, M.; Ostrom, A.; Roundtree, R.; Bitner, M. Self-Service Technologies: Understanding Customer Satisfaction with Technology-Based Service Encounters. J. Mark. 2000, 64, 50–64.
  15. Curran, J.; Meuter, M. Encouraging Existing Customers to Switch to Self-Service Technologies: Put a Little Fun in Their Lives. J. Mark. Theory Pract. 2007, 15, 283–298.
  16. Lee, J.; Allaway, A. Effects of Personal Control on Adoption of Self-Service Technology Innovations. J. Serv. Mark. 2002, 16, 553–573.
  17. Walker, R.; Johnson, L. Why Consumers Use and Do Not Use Technology-Enabled Services. J. Serv. Mark. 2006, 20, 125–135.
  18. Roy, S.K.; Balaji, M.S.; Sadeque, S.; Nguyen, B.; Melewar, T.C. Constituents and Consequences of Smart Customer Experience in Retailing. Technol. Forecast. Soc. Chang. 2017, 124, 257–270.
  19. Lin, J.S.; Hsieh, P.L. The Influence of Technology Readiness on Satisfaction and Behavioral Intentions toward Self-service Technologies. Comput. Hum. Behav. 2007, 23, 1597–1615.
  20. Meuter, M.; Ostrom, A.; Bitner, M.; Roundtree, R. The Influence of Technology Anxiety on Consumer Use and Experiences with Self-service Technologies. J. Bus. Res. 2003, 56, 899–906.
  21. Wood, S.P.; Chang, J.; Healy, T.; Wood, J. The Potential Regulatory Challenges of Increasingly Autonomous Motor Vehicles. St. Clara Law Rev. 2012, 52, 1423–1502.
  22. Demirkan, H.; Spohrer, J. Developing a Framework to Improve Virtual Shopping in Digital Malls with Intelligent Self-service Systems. J. Retail. Consum. Serv. 2014, 21, 860–868.
  23. Guo, B.; Wang, Z.; Wang, P.; Xin, T.; Zhang, D.; Yu, Z. DeepStore: Understanding Customer Behavior in Unmanned Stores. IT Prof. 2020, 22, 55–63.
  24. Liu, J.; Gu, Y.; Kamijo, S. Customer Behavior Recognition in Retail Store from Surveillance Camera. In Proceedings of the 2015 IEEE International Symposium on Multimedia (ISM), Miami, FL, USA, 14–16 December 2015; pp. 154–159.
  25. Mora, D.; Nalbach, O.; Werth, D. How Computer Vision Provides Physical Retail with a Better View on Customers. In Proceedings of the 2019 21st IEEE Conference on Business Informatics (CBI), Moscow, Russia, 15–17 July 2019; pp. 462–471.
  26. Neman, A.J.; Daniel, K.; Oulton, D.P. New Insights into Retail Space and Format Planning from Customer-tracking Data. J. Retail. Consum. Serv. 2002, 9, 253–258.
  27. Singh, P.; Katiyar, N.; Verma, G. Retail Shoppability: The Impact of Store Atmospherics & Store Layout on Consumer Buying Patterns. Int. J. Sci. Technol. Res. 2014, 3, 15–23.
  28. Polacco, A.; Backes, K. The Amazon Go Concept: Implications, Applications, and Sustainability. J. Bus. Manag. 2018, 24, 79–92.
  29. Senior, A.W.; Brown, L.; Hampapur, A.; Shu, C.F.; Zhai, Y.; Feris, R.S.; Tian, Y.L.; Borger, S.; Carlson, C. Video Analysis for Retail. In Proceedings of the Advanced Video and Signal Based Surveillance (AVSS), London, UK, 5–7 September 2007; pp. 423–428.
  30. Ross, D.A.; Lim, J.; Lin, R.S.; Yang, M.H. Incremental Learning for Robust Visual Tracking. Int. J. Comput. Vis. 2008, 77, 125–141.
  31. Smeulders, A.W.; Chu, D.M.; Cucchiara, R.; Calderara, S.; Dehghan, A.; Shah, M. Visual Tracking: An Experimental Survey. IEEE Trans. Pattern Anal. Mach. Intell. 2014, 36, 1442–1468.
  32. Weber, F.; Schutte, R. A Domain-Oriented Analysis of the Impact of Machine Learning—The Case of Retailing. Big Data Cogn. Comput. 2019, 3, 11.
  33. Grewal, D.; Roggeveen, A.L. Understanding Retail Experiences and Customer Journey Management. J. Retail. 2020, 96, 3–8.
  34. Lo, C.H.; Wang, Y.W. Constructing an Evaluation Model for User Experience in an Unmanned Store. Sustainability 2019, 11, 4965.
  35. Roy, S.K.; Balaji, M.S.; Nguyen, B. Consumer-Computer Interaction and In-Store Smart Technology (IST) in the Retail Industry: The Role of Motivation, Opportunity, and Ability. J. Mark. Manag. 2020, 36, 299–333.
  36. Grewal, D.; Roggeveen, A.L.; Sisodia, R.; Nordfalt, J. Enhancing Customer Engagement through Consciousness. J. Retail. 2017, 93, 55–64.
  37. Wu, H.C.; Ai, C.H.; Cheng, C.C. Experiential Quality, Experiential Psychological States and Experiential Outcomes in an Unmanned Convenience Store. J. Retail. Consum. Serv. 2014, 51, 860–868.
  38. Cole, R.; Purao, S.; Rossi, M.; Sein, M.K. Being Proactive: Where Action Research meets Design Research. In Proceedings of the International Conference on Information Systems (ICIS), Las Vegas, NV, USA, 11–14 December 2005; pp. 325–336.
  39. Lee, J.; Wyner, G.; Pentland, B. Process Grammar as a Tool for Business Process Design. MIS Q. 2008, 32, 757–778.
  40. Hevner, A.R.; March, S.T.; Park, J. Design Science in Information Research. MIS Q. 2004, 28, 75–105.
  41. Peffers, K.; Tuunanen, T.; Rothenberger, M.A.; Chatterjee, S. A Design Science Research Methodology for Information Systems Research. J. Manag. Inf. Syst. 2007, 24, 45–77.
  42. Baskerville, R.L. Investigating Information Systems with Action Research. Commun. Assoc. Inf. Syst. 1999, 2, 2–32.
  43. Hult, M.; Lennung, S.Å. Towards a Definition of Action Research: A Note and Bibliography. J. Manag. Stud. 1980, 17, 241–250.
  44. Baskerville, R.L.; Wood-Harper, T. A Critical Perspective on Action Research as a Method for Information Systems Research. J. Inf. Technol. 1996, 11, 235–246.
  45. Lewin, K. Action Research and Minority Problems. J. Soc. Issues 1946, 2, 34–46.
  46. Sein, M.K.; Henfridsson, O.; Purao, S.; Rossi, M.; Lindgren, R. Action Design Research. MIS Q. 2011, 35, 37–56.
  47. SAE International. Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-road Motor Vehicles (SAE Standard J3016, Report No. J3016-201806); SAE International: Warrendale, PA, USA, 2018.
Figure 1. Design of core functions and the development of a rule-based knowledge set.
Figure 2. Core functions of a smart system.
Figure 3. Customer shopping process.
Table 1. Definition of stakeholders’ roles and responsibilities.
Role | Responsibility | Number
Store staff | To interpret the possible intention of shoppers in the shopping journey and provide operational experience in service, such as product offering, space design, and stock keeping unit (SKU) management | 2
Experienced consumer | To share in-store shopping experience and possible reactions in the browsing, selecting, exchange, and checkout processes | 3
Engineer | To provide technical practicability in product display, shopper characteristics, camera setting, and computing capability | 2
Table 2. Definition of the CL and response strategy.
Confidence Level (CL) | Occurrence Number | Response Strategy
1 | ≤10 | Human-assisted
2 | 11–100 | Human-assisted
3 | 101–500 | Self-identified
4 | 501–1000 | Self-identified
5 | >1000 | Self-identified
Table 3. Rule generation plan in each stage.
Stage | I | II | III
Test Period (Day) | 1 | 3 | 4
Rule Number | 8 | 8 | 4
Evaluation | Converge to a common view to obtain pre-defined rules | Identify and increment rules by detecting customer behaviors | Continue recognizing rules to improve the system’s learning performance
Table 4. Results of rule design and verification.
Step | Rule | Type | CL (Stage I) | CL (Stage II) | CL (Stage III)
Check in | If a customer is alone, they use the mobile app to be permitted to enter the store. | Normal | 1 | 5 | 5
Check in | If customers are wearing similar clothes, they are identified separately and considered overlapping in the store. | Normal | – | 2 | 3
Check in | If a group of friends has only one app, the app account owner identifies each person wanting to enter with their app and all actions are linked to the same account. | Normal | 1 | 2 | 3
Check in | If an employee is at work, they can be identified and permitted to enter the store through the mobile app but cannot shop. | Normal | – | 1 | 2
Check in | If a customer enters with a kid in the stroller or on the shoulder, they are identified and tracked using the same account. | Normal | – | 1 | 2
Check in | If a customer’s face is not visible, they are identified by their other body features instead. | Abnormal | – | 1 | 2
Browse | If a customer picks up and then puts back a product, the product is removed from the virtual basket. | Normal | 1 | 5 | 5
Browse | If a product is grabbed at one place and then put back at another place in the store, it is tracked. The product is removed from the customer’s virtual basket. | Normal | 1 | 3 | 4
Browse | If a customer is trying to steal or exchange fake products in an irregular behavior, they are identified to be tracked and annotated to the mobile app. | Abnormal | – | – | 1
Browse | If a customer eats a product and then puts back the packaging on the shelf, they are identified to be charged and annotated to the mobile app. | Abnormal | – | – | 1
Pick up | If a product is grabbed by a customer, it is added to their virtual basket. | Normal | 1 | 5 | 5
Pick up | If two or more items of the same product are grabbed by a customer, these are added to their virtual basket. | Normal | – | 1 | 3
Pick up | If the product is grabbed and validated in the customer’s hand or bag, it is added to their virtual basket. | Normal | 1 | 5 | 5
Pick up | If a customer passes on a product to another customer, it is identified as a transfer action and updated in the virtual basket if they are using different accounts. | Normal | – | 1 | 2
Pick up | If a customer grabs a product but not with his hands, this product is identified to be added to their virtual basket. | Abnormal | – | – | 1
Pick up | If a customer picks up a product lying on the floor and puts it back on the shelf, this product is not added to their virtual basket. | Abnormal | – | 1 | 2
Pick up | If a customer enters the store with a product that is also sold in the store, it is recognized while they are entering the store. | Abnormal | – | 1 | 3
Check out | If a customer passes through the store exit line, they are detected and recognized as leaving the store. | Normal | 1 | 5 | 5
Check out | If a customer passes through the store exit line, they can receive an invoice on the mobile app with the correct shopping item details and price within 3 min. | Normal | 1 | 5 | 5
Check out | If a customer passes through the store exit line and turns back to the exit line immediately, their shopping process is identified as ongoing. | Abnormal | – | – | 1