**4. Results**

In this section, we evaluate the proposed contextual mining of sedentary behaviour model and present the results of the performed experiments. Our approach belongs to the family of instance-based learning (i.e., *k*-nearest neighbour). Such approaches do not require optimizing classifier parameters: the model stores the training instances and classifies new data by computing its similarity to the stored instances. To obtain these training instances, we asked the participants to annotate their daily routines by performing short-duration trials and noting the start and end time of each context. The training dataset is labelled using this time-interval information. To assess the performance of our approach, we split the annotated dataset in a 60:10:30 ratio (i.e., training:validation:test). The dataset is balanced, containing an equal number of instances of each considered context, so the simple performance metric "accuracy" provides reliable information about the ability of the model.

Initially, we obtained an overall accuracy of 93% on the collected dataset. We then analysed the dataset and performed data pre-processing, discarding the first and last few instances of each recorded context. Our analysis showed that the start and end instances are not true representations of the class, so removing them enhanced the quality of the training instances in terms of context representation. In the same experimental setting with the same dataset, we then obtained an accuracy of 98%.

The real test setup consisted of six volunteer graduate students. The participants installed our developed application on their smartphones for two weeks, long enough to analyse the significance of the results and to perform a comparative analysis of sedentary behaviour. Examples of the scenes of context mining are shown in Figure 6.
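The evaluation pipeline described above (trimming the boundary instances of each trial, splitting 60:10:30, and classifying by nearest stored instances) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the trim width and the value of *k* are hypothetical.

```python
from collections import Counter
import math

def trim_boundaries(trial, n_trim=3):
    """Discard the first and last few instances of a recorded context
    trial, which do not represent the class well (n_trim is illustrative)."""
    return trial[n_trim:-n_trim] if len(trial) > 2 * n_trim else []

def split_60_10_30(data):
    """Split annotated instances into training/validation/test sets."""
    n = len(data)
    return data[:int(0.6 * n)], data[int(0.6 * n):int(0.7 * n)], data[int(0.7 * n):]

def knn_classify(train, query, k=5):
    """Instance-based classification: majority vote among the k stored
    training instances nearest (Euclidean distance) to the query features."""
    nearest = sorted(train, key=lambda inst: math.dist(inst[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]
```

Here each training instance is a `(feature_vector, context_label)` pair; because the dataset is balanced across contexts, plain accuracy on the held-out 30% is a meaningful score.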

**Figure 6.** Example photos for "short break", "active", "working on a PC", "watching TV" and "sedentary-context unknown".

**Figure 7.** Current progress of the user in the last hour.

In Figure 7, we can observe the progress of the last hour in real time, with each minute identified as either an active or a still context. In the "progress graph" (i.e., Figure 7), the *x*-axis presents the recognized context, while the *y*-axis provides the time stamp in minutes. Each point represents one minute of human behaviour, and the graph reports the information for the last 52 min. The annotation of the recognized context shows that the participant was walking from the dormitory to the campus. A user can also visualize the hourly status of the day's sedentary behaviour; we present this hourly status, the "today context", in Figure 8.

In Figure 8, each bubble presents a number of minutes, and the size of each bubble increases or decreases with the recognized context. For example, at 11:00, the person was in sedentary activity for the whole 60 min. Conversely, a value of 0.00 means that the person was not active for even a single minute.
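The hourly view of Figure 8 can be derived from the per-minute context stream by simple counting. The sketch below is illustrative only: the context label strings and the data layout are assumptions, not taken from the paper.

```python
from collections import defaultdict

# Contexts counted as sedentary for the hourly bubbles (illustrative set).
SEDENTARY = {"watching TV", "working on a PC", "sedentary-context unknown"}

def hourly_sedentary_minutes(minute_stream):
    """minute_stream: iterable of (hour_of_day, context_label) pairs,
    one per recognized minute. Returns sedentary minutes per hour (0-60)."""
    totals = defaultdict(int)
    for hour, context in minute_stream:
        if context in SEDENTARY:
            totals[hour] += 1
    return dict(totals)
```

An hour with 60 sedentary minutes yields the largest bubble, while an hour absent from the result corresponds to the 0.00 case.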

**Figure 8.** Hourly sedentary behaviour recognition.

In order to provide rich contextual information, we facilitate user awareness of the micro-contexts of sedentary behaviour, which explain how much time the user spent watching TV, working on a PC or in an unknown sedentary context. As discussed earlier, our micro-context recognition list is very limited due to the limited processing of environmental sound. Figure 9 shows the details of the micro-contexts that our model identifies by processing the environmental sound.

**Figure 9.** Micro-context recognition.

In Figure 9, the *x*-axis represents the time in hours, while the *y*-axis presents the recognized micro-context. All sedentary contexts other than watching TV and working on a PC are considered as sedentary-context unknown; in this unknown context, the user may be on a public bus, in a library, in a cafeteria, sleeping or in any other situation. We also present the entire week of behaviour in terms of recognized contexts and visualize it through our developed application. The user can query any specific recognized context to get information about the time spent in it. Figure 10 shows the total active hours of the user each day throughout the week.
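The weekly per-context queries described above reduce to the same counting idea applied over a day-indexed label log. The following sketch shows how such a query (e.g., time spent "active" per day) could be answered; the function name and data layout are hypothetical, for illustration only.

```python
def weekly_context_hours(day_logs, context):
    """day_logs: mapping of day name -> list of per-minute context labels.
    Returns the hours spent in the queried context on each day."""
    return {day: sum(1 for c in labels if c == context) / 60
            for day, labels in day_logs.items()}
```

The same function serves every weekly view (Figures 10 to 14) by changing only the queried context label.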

**Figure 10.** Total time spent during a week while being "active".

It is evident in Figure 10 that very limited activity is observed on Friday and Wednesday for the context "active", while the user was very active on Tuesday and Sunday. Figure 11 presents the total duration spent in short breaks for each day; our model recognizes the short breaks between the sedentary hours of a user. Along the *x*-axis, we place the time in hours, and the *y*-axis presents the days.

**Figure 11.** Total time spent during a week for "short breaks".

In Figure 11, the user took a small number of short breaks on Saturday, while a large number of short breaks can be seen on Monday. This information about the number of short breaks may help the subject avoid longer sedentary activity, as well as provide an abstraction for comparing different days.

In Figure 12, we present the recognized context information while the user is working on a PC. During the week, the user spent a maximum of 7 h on a PC, while zero hours were recognized on Wednesday.

**Figure 12.** Total time spent during a week for "working on a PC".

In Figure 13, we can observe the total time spent watching TV during leisure time. In the presented "weekly context", the *x*-axis presents the context of watching TV, while the *y*-axis presents the time spent in hours. A value of zero means that the user did not watch television on Monday, Tuesday, Wednesday and Thursday.

**Figure 13.** Total time spent during a week for "watching TV".

In Figure 14, we present the sedentary behaviour when the context is unknown. The unknown context includes sleeping time as well as any sedentary behaviour other than watching TV or working on a PC. The *y*-axis presents the context for the whole week, and each bar presents the number of hours spent in sedentary activity. We found that the SitCoach [27] application is aligned with the direction of our research: it monitors office workers' prolonged sitting routines and generates alerts, which may help to reduce sedentary behaviour, and its intervention successfully helps office workers. However, their application is restricted in terms of visualizing user behaviour and mining micro-contexts. In our proposed research, we provide rich information to the user about the recognized contexts. We found that self-awareness helps to reduce sedentary behaviour and motivates the user to avoid prolonged sitting. The following section provides more details about this.

**Figure 14.** Total time spent during a week for "sedentary-context unknown".
