*2.1. Concept*

Previous studies demonstrate that body expressions are as powerful as facial expressions in conveying emotions [13,69,70], and that they play a more important role in non-verbal communication than previously thought [69,71]. Studies have also utilised body expressions to detect stress [72]. According to Wallbott et al. [73], changes in body posture provide a strong indication of changes in affective states. Body expressions are also reported to reveal deception better than facial expressions [68], and some affective expressions may even be more pronounced in body posture than in facial expressions [74]. Recent advances in ubiquitous computing enable the recognition of these body expressions. For instance, gaming platforms such as the Nintendo Wii and Microsoft Kinect [13,75] not only utilise body movements as a means to control the game but also capture the emotional and cognitive performance of the player. The majority of previous work on affective recognition systems utilising body postures and body expressions relies on vision-based techniques that analyse motion data captured by RGB and depth cameras [75,76]. For instance, Kleinsmith et al. [75] classified four affective states (concentration, defeat, frustration and triumph) in people playing a sports game on the Nintendo Wii, achieving an accuracy of 50%. Another work demonstrated the detection of sadness, joy, anger and fear with 84–94% accuracy using a 6-camera Vicon motion tracking system. Although vision-based systems seem promising for identifying body postures, they are impractical to integrate into one's daily routine when aiming to identify emotions, such as those induced by acute stress. As an alternative to vision-based approaches, inferring stress from posture has been explored previously by embedding pressure sensors in chairs [77]. In this work, the authors identified fast movements of the Centre of Pressure as a valuable feature for detecting stress. A similar approach was used to detect the interest levels of students [78]. This approach required instrumenting chairs with high-resolution pressure sensors, which may not be scalable. However, we believe that sitting posture variations should also be reflected in foot posture and motion variations. In the literature, smart insoles have been used to detect postures unobtrusively [23,79]. Hence, a smart insole-based solution could be a practical and unobtrusive way to identify stress reflected in body language, such as through tracking leg movements and posture characteristics.
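The Centre-of-Pressure feature mentioned above can be illustrated with a minimal sketch. This is not the implementation used in the cited works; it merely assumes a hypothetical grid of pressure sensors with known coordinates (as might be found in a smart insole or instrumented chair) and computes the pressure-weighted centroid per frame, whose frame-to-frame speed would capture the "fast movements" used as a stress indicator:

```python
import numpy as np

def centre_of_pressure(pressures, coords):
    """Pressure-weighted centroid of the sensor readings.

    pressures: (n_sensors,) array of pressure values
    coords:    (n_sensors, 2) array of sensor x/y positions
    """
    total = pressures.sum()
    if total == 0:
        return np.array([np.nan, np.nan])  # no load on any sensor
    return (pressures[:, None] * coords).sum(axis=0) / total

def cop_speed(pressure_frames, coords, fs):
    """Frame-to-frame Centre-of-Pressure speed (coordinate units per second).

    pressure_frames: (n_frames, n_sensors) time series of sensor readings
    fs: sampling rate in Hz
    """
    cop = np.array([centre_of_pressure(p, coords) for p in pressure_frames])
    # Euclidean displacement between consecutive frames, scaled by the rate
    return np.linalg.norm(np.diff(cop, axis=0), axis=1) * fs

# Toy example: a 2x2 sensor grid with the load shifting from left to right
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
frames = np.array([[1.0, 0.0, 1.0, 0.0],   # weight on the left sensors
                   [0.0, 1.0, 0.0, 1.0]])  # weight on the right sensors
speed = cop_speed(frames, coords, fs=50)   # one speed value per frame pair
```

Summary statistics of this speed signal (mean, variance, peak counts over a time window) are the kind of movement feature a classifier could then consume; the sensor layout, sampling rate, and windowing here are illustrative assumptions only.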
