Article

A Probabilistic Model of Human Activity Recognition with Loose Clothing †

1
Centre for Robotics Research, Department of Engineering, King’s College London, London WC2R 2LS, UK
2
Centre for Human and Applied Physiological Sciences, King’s College London, London SE1 1UL, UK
*
Author to whom correspondence should be addressed.
This paper is an extended version of our paper published in 2023 International Conference on Robotics and Automation.
Sensors 2023, 23(10), 4669; https://doi.org/10.3390/s23104669
Submission received: 7 February 2023 / Revised: 24 April 2023 / Accepted: 5 May 2023 / Published: 11 May 2023
(This article belongs to the Special Issue Biosignal Sensing Analysis (EEG, MEG, ECG, PPG))

Abstract

Human activity recognition has become an attractive research area with the development of on-body wearable sensing technology. Textile-based sensors have recently been used for activity recognition. With the latest electronic textile technology, sensors can be incorporated into garments so that human motion can be recorded comfortably over long periods. However, recent empirical findings suggest, surprisingly, that clothing-attached sensors can actually achieve higher activity recognition accuracy than rigidly attached sensors, particularly when predicting from short time windows. This work presents a probabilistic model that explains the improved responsiveness and accuracy of fabric sensing as arising from the increased statistical distance between recorded movements. The accuracy of the comfortable fabric-attached sensor can be up to 67% higher than that of rigidly attached sensors when the window size is 0.5 s. Simulated and real human motion capture experiments with several participants confirm the model’s predictions, demonstrating that this counterintuitive effect is accurately captured.

1. Introduction

Human motion analysis is important in many different fields of study, for instance, human–robot interaction [1] and physical rehabilitation and medical care [2]. The recent development of electronic textiles (e-textiles) has made it possible to incorporate sensors into garments [3]. This has significant advantages, such as the ability to capture natural behavior and to ensure the wearer’s comfort through unobtrusive sensing, and the sensors can be attached to any position on the clothing [4].
However, one of the issues with clothing-embedded sensing is the additional movement of the fabric with respect to the body (see Figure 1). The prevailing view is that this motion needs to be treated as an unwanted artifact that should be eliminated. For this purpose, several approaches to remove or limit it have been suggested, such as (i) ensuring a rigid attachment between sensor and body [5], (ii) supervised errors-in-variables regression [6], (iii) unsupervised latent space learning [7] and (iv) difference mapping distributions [8]. However, several recent works focus on fabric motion: (i) Michael and Howard [9] found that fabric motion may actually assist human motion analysis, particularly in activity recognition (AR); several factors could contribute to this (e.g., uncertainty in the fabric movement itself, movement frequency, etc.). (ii) Jayasinghe et al. [10] discovered that the movement of clothing can be useful for describing human daily activities (walking, running, sitting and riding a bus) and gait analysis [11]. However, this phenomenon so far lacks a satisfactory theoretical explanation or model.
To this end, this paper proposes a probabilistic framework as a basis for understanding this phenomenon. A probabilistic model is introduced in which statistical distance measures, such as the Kolmogorov–Smirnov (KS) statistic, show that stochastic fabric movements lead to greater discriminative ability. The predictions of the model are verified in a set of simulated and real human motion capture experiments, where it is evident that sensors loosely attached to fabric yield greater accuracy than rigidly attached ones, especially when making predictions under time pressure. Despite its simplicity, the model is surprisingly accurate at capturing this phenomenon in a variety of conditions. This suggests that it could be a useful tool in the design and analysis of motion capture systems based on ordinary garments, retaining their comfort and user acceptability.
The remainder of this paper is organised as follows. Section 2 reviews previous work in machine learning applied to fabric-based sensing. Section 3 introduces a probabilistic model to predict the effect of fabric movement on statistical AR. Section 4 evaluates the model predictions as applied to physical motion recognition tasks. Section 5 provides a summary and discussion of the findings.

2. Related Work

There are numerous research papers focusing on human motion analysis based on wearable sensing [12]. In the majority of these studies, sensors are rigidly mounted on the body using tape, glue, straps, etc. Data analysis is strictly focused on body movement [10]. Relatively few studies investigate the case of motion signals arising from loose-clothing-attached sensors. Table 1 compares a selection of studies that involve fabric/clothing-based movement sensing and analysis. Some studies [13,14,15,16] focus on how to embed sensors into clothes using different e-textile technologies. Others [9,10,11,17] apply machine learning algorithms to the sensor readings. However, none of these directly compares the AR performance of body-attached and clothing-attached sensors during human movement, and they lack probabilistic models with which to understand the AR performance of sensors with the two different attachments.
Jayasinghe et al. [10] investigated and quantified the extent to which data obtained from clothing-attached sensors can be used to characterize activity compared with body-attached sensor data. Three pairs of sensors (accelerometers) were attached to three segments of the body (waist, thigh and ankle) and clothing (slacks, pencil skirt and loose frock) in similar positions and used to record daily activities (e.g., walking, running and sitting). The results show that clothing sensor data are strongly correlated with body-attached sensor data. In particular, the signal collected from the pencil skirt attached tightly at the waist had the highest correlation, whereas that collected from the loose frock had low correlation because motion artifacts formed a greater part of the signal. However, the authors did not analyze how those artifacts might contribute toward enhancing AR accuracy. Moreover, the study focused on activities that are relatively easy to distinguish (walking, running and sitting). Recently, Jayasinghe et al. [11] extended this work to focus on the analysis of gait-related movements (e.g., standing, sitting, sit-to-stand, stand-to-sit, leg raises and walking back and forth) using sensorized trousers with similar findings reported. Jayasinghe et al. [17] recently further explored the practicability of clothing-attached sensors to classify static postures using K-nearest neighbors (KNN) and achieved a promising result.
Michael and Howard [9] directly compared the performance of AR between rigid- and fabric-attached sensors using an instrumented pendulum. Three types of fabric (i.e., denim, roma and jersey) were affixed to the tip of a pendulum and inertial sensors were mounted such that one measured the movement of the tip of the pendulum (i.e., measurement from a rigidly attached sensor), one measured the movement of the middle of the fabric and a third measured movement at the edge of the fabric. Data from these were used to train support vector machine (SVM) and discriminative regression machine (DRM) classifiers to distinguish between movements with small differences in frequency. Surprisingly, the results show higher accuracy in the classifiers trained on the data from the fabric-attached sensors. This study was the first to suggest that fabric movement may be beneficial to AR but lacked a sound theoretical model as to the cause of this effect. This lack of a model makes it hard to predict the accuracy of AR under different situations, such as how and when AR may benefit from fabric movement. Moreover, these findings relied on a pendulum as the experimental platform, so they lacked verification that the effect is reproduced in more realistic settings, such as the AR of humans wearing real clothing.
This paper proposes a probabilistic model and verifies it in a physical realization and a real human AR task. The experimental results show that this model can be used to predict the performance of AR in different situations. This study is also the first to demonstrate that fabric movement can be beneficial to AR in a real human task.

3. Probabilistic Modeling Framework

The following introduces the proposed framework and presents an example of its application to a simple movement classification task.

3.1. Problem Definition

Activity recognition is defined as the classification of movements into a number of discrete categories of activity (e.g., walking, running, etc.) based on motion data. For simplicity, in the following, it is assumed that the motion data Y consist of N measurements of the absolute position of a point p_y on a garment, collected over an extended duration of time at a regular sampling frequency. In AR, each data point is also associated with a label c ∈ {0, 1} corresponding to the category of movement (throughout the paper, without loss of generality, the class labels are assumed to be binary), so Y = {(y₁, c₁), …, (y_N, c_N)}. This is contrasted with so-called rigid data U, recorded under the same conditions, except that the sensor is rigidly attached to the moving body. The goal of AR is to train a classifier on Y such that, when presented with (previously unseen) movement data y*, the corresponding class label c* can be accurately predicted.
As discussed in Section 2, previous studies have provided empirical evidence that, contrary to expectations, AR performance is improved when using data from loose fitting garments [9]. This paper proposes a probabilistic model that predicts this effect.

3.2. Probabilistic Model of Fabric Motion

The effect of fabric motion on the data Y is subject to a high degree of uncertainty, arising from challenges in estimating the fabric’s physical properties and the resultant movement complexity. To deal with this, it is proposed to model the data generation process through stochastic methods.
Specifically, the position y of the point p y on the fabric at any given time is modeled as the stochastic process consisting of the corresponding position u of a point p u on the rigid body plus a random offset δ introduced by the fabric motion. In the univariate case, this can be written as
Y = U + Δ,    (1)
where y(t) ∼ Y, u(t) ∼ U and δ(t) ∼ Δ.
The key to determining whether the fabric movement is beneficial to AR is to understand the effect of Δ on the statistical distance between movement classes. A condition for improved classification performance is a greater statistical distance in data distribution between movement classes in Y compared with U , i.e.,
D(Y_{c=0}, Y_{c=1}) > D(U_{c=0}, U_{c=1}),    (2)
where Y c = 0 is the distribution of fabric data for movement class c = 0 , Y c = 1 is the same for movement class c = 1 , U c = 0 and U c = 1 are the equivalent distributions for rigid data and D ( · , · ) is a suitable metric in probability space. For the latter, several choices are available (e.g., Kullback-Leibler (KL) divergence or Jensen-Shannon (JS) divergence). In the following, the KS metric is used [21]
D(Y_{c=0}, Y_{c=1}) = sup |F(Y_{c=0}) − F(Y_{c=1})|,  D(U_{c=0}, U_{c=1}) = sup |F(U_{c=0}) − F(U_{c=1})|,    (3)
where F(·) denotes the CDF of a random variable. Note that the KS statistic is chosen since the range of possible fabric positions may differ depending on the movement, meaning that Y_{c=0} and Y_{c=1} may occupy different probability spaces, preventing measures such as KL or JS from being computed. In the next section, a simple working example is presented to illustrate the effect of fabric motion on activity recognition predicted by this model.
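As a concrete illustration of condition (2), the model can be simulated directly. The sketch below (an illustration, not the paper's code; it assumes NumPy/SciPy, a fabric length L = 1 and the frequency-dependent offset width z = 1 − exp(−ω²) introduced in Section 3.3) draws rigid and fabric positions for two movement frequencies and compares the two-sample KS statistics.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
n, L = 100_000, 1.0                     # sample count and fabric length (assumed)

def rigid_samples(omega):
    # u = sin(omega*t + psi) sampled at a random phase: the marginal
    # distribution is the arcsine law on [-1, 1], independent of omega.
    return np.sin(rng.uniform(-np.pi, np.pi, n))

def fabric_samples(omega):
    # y = u + delta, with delta ~ Uniform(-zL, zL) and z = 1 - exp(-omega^2)
    # (the excitation model of Section 3.3).
    z = 1.0 - np.exp(-omega**2)
    return rigid_samples(omega) + rng.uniform(-z * L, z * L, n)

# Two-sample KS statistics between the movement classes (omega = 1 vs 2 rad/s):
D_rigid, _ = ks_2samp(rigid_samples(1.0), rigid_samples(2.0))
D_fabric, _ = ks_2samp(fabric_samples(1.0), fabric_samples(2.0))
print(D_rigid, D_fabric)                # fabric distance exceeds the rigid one
```

Here the rigid-data distance is pure sampling noise (the arcsine law does not depend on ω), while the fabric data acquire a genuine, frequency-dependent separation.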

3.3. Example: Oscillatory Motion

Consider the problem of movement analysis for a one-dimensional scotch yoke mechanism using data from a sensor mounted on an inextensible piece of fabric attached to the mechanism (see Figure 2a). Here, an AR task might involve classifying a set of movements of the rigid body (e.g., those with different frequencies) from raw positional data.
In this system, the horizontal position of the rigid body (point p u ) over time is given by
u(t) = a sin(ωt + ψ),    (4)
where a is the amplitude, ω is the frequency (equivalently, the angular speed of the rotary wheel of the mechanism) and ψ is the phase (equivalently, the starting angle of the wheel); (without loss of generality) it is assumed that a = 1 and 0 < L ≤ 1, where L denotes the length of the fabric strip introduced below. In this system, samples of the rigid body position follow a beta distribution
U ∼ B(1/2, 1/2)  (i.e., the arcsine distribution, rescaled to the support [−1, 1]).    (5)
As the rigid body moves, the fabric will move alongside it but undergo additional displacement due to the complex fabric dynamics according to (1). In general, the nature of these displacements will depend on (i) its physical properties (e.g., mass distribution, stiffness, fiber structure, length, width and orientation) and (ii) the movement pattern of the rigid body (e.g., amplitude, frequency). Noting that here the fabric is inextensible, the maximum possible displacement δ of the point p y from p u is L, suggesting that (in the absence of other prior information) a simple choice of its distribution could be
Δ ∼ 𝒰(−L, L).    (6)
However, this would imply that any displacement −L ≤ δ ≤ L is equally likely, whereas in practice, δ tends to be greater when there is greater excitation of the fabric by the movement of the yoke, e.g., for higher-frequency movements (see Figure 3a,b). Therefore, the following assumes
Δ ∼ 𝒰(−zL, zL),    (7)
where
z = 1 − exp(−ω²).    (8)
Note that this respects the constraint that −L ≤ δ ≤ L (since (8) ensures 0 ≤ z < 1), while capturing the tendency for δ to increase at higher frequencies up to a saturation point. Note also that the modeling approach is easily extensible to take account of more factors according to their relevance; for instance, wind of a constant direction and speed would introduce a bias in the fabric movements, which could be included by introducing an offset to the expectation of Δ.
According to this model, the CDF of U is [22]
F(u) = { 0, u < −1;  sin⁻¹(u)/π + 1/2, −1 ≤ u ≤ 1;  1, u > 1 }    (9)
and it can be shown (a derivation is provided in Appendix A.1.1 and Appendix A.1.2) that the CDF of Y is dependent on z L , with two possible cases.
Case 1 ( 0 < z L < 1 ):
F(y) = { 0, y < −1 − zL;  F₁(y), −1 − zL ≤ y < −1 + zL;  F₂(y), −1 + zL ≤ y ≤ 1 − zL;  F₃(y), 1 − zL < y ≤ 1 + zL;  1, y > 1 + zL }    (10)
where
F₁(y) = [π z₊ + 2√(1 − z₊²) + 2 z₊ sin⁻¹(z₊)] / (4 zL π),    (11)
F₂(y) = 1/2 + [√(1 − z₊²) − √(1 − z₋²) + z₊ sin⁻¹(z₊) − z₋ sin⁻¹(z₋)] / (2 zL π),    (12)
F₃(y) = [3π zL + π y − 2 z₋ sin⁻¹(z₋) − 2√(1 − z₋²)] / (4 zL π),    (13)
Case 2 ( z L = 1 ):
F(y) = { 0, y < −2;  F₁(y), −2 ≤ y < 0;  F₂(y), y = 0;  F₃(y), 0 < y ≤ 2;  1, y > 2 }    (14)
where
F₁(y) = [π(1 + y) + 2√(1 − (1 + y)²) + 2(1 + y) sin⁻¹(1 + y)] / (4π),    (15)
F₂(y) = 1/2,    (16)
F₃(y) = [3π + π y − 2(1 − y) sin⁻¹(1 − y) − 2√(1 − (1 − y)²)] / (4π),    (17)
and z₊ = zL + y, z₋ = zL − y. The PDF and CDF of U and Y are shown in Figure 4a,b, respectively. As can be seen in Figure 4a, the range of fabric positions is larger than that of the rigid body, meaning they occupy different probability spaces.
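The Case 1 CDF can be checked against the generative model by Monte Carlo simulation. The sketch below is an illustration (zL = 0.5 is an assumed value): it samples Y = U + Δ directly and compares the empirical CDF with the closed-form branches F₁–F₃.

```python
import numpy as np

rng = np.random.default_rng(1)
zL = 0.5                                  # assumed product z*L (Case 1)

# Monte-Carlo samples of Y = U + Delta: arcsine-distributed rigid position
# plus a uniform fabric offset of half-width zL.
u = np.sin(rng.uniform(-np.pi, np.pi, 500_000))
y = u + rng.uniform(-zL, zL, u.size)

def F_case1(yv, zL):
    # Closed-form Case-1 CDF, with z_plus = zL + y and z_minus = zL - y.
    zp, zm = zL + yv, zL - yv
    if yv < -1 - zL:
        return 0.0
    if yv < -1 + zL:                      # branch F1
        return (np.pi * zp + 2 * np.sqrt(1 - zp**2)
                + 2 * zp * np.arcsin(zp)) / (4 * zL * np.pi)
    if yv <= 1 - zL:                      # branch F2
        return 0.5 + (np.sqrt(1 - zp**2) - np.sqrt(1 - zm**2)
                      + zp * np.arcsin(zp) - zm * np.arcsin(zm)) / (2 * zL * np.pi)
    if yv <= 1 + zL:                      # branch F3
        return (3 * np.pi * zL + np.pi * yv - 2 * zm * np.arcsin(zm)
                - 2 * np.sqrt(1 - zm**2)) / (4 * zL * np.pi)
    return 1.0

for yv in (-1.2, -0.8, 0.0, 0.8, 1.2):
    print(yv, round((y <= yv).mean(), 3), round(F_case1(yv, zL), 3))
```

The empirical and closed-form values agree to within Monte Carlo error at points drawn from all three branches.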
From (9), it is apparent that the CDF of U is independent of ω, meaning that if the AR task is to classify movements of different frequencies (e.g., ω₁ and ω₂), the rigid data distributions provide no statistical distance with which to discriminate between classes, i.e., D(U_{ω₁}, U_{ω₂}) = 0. However, the dependency of the CDF of Y on ω is apparent in (10) and (14), leading to a statistical distance between movements of different frequency (to simplify the computation, ω₁ and z₁ represent the low frequency, and ω₂ and z₂ the high frequency). The largest statistical distance occurs at y = −1 + z₁L and is given by
D(Y_{ω₁}, Y_{ω₂}) = [π(z₂₁L − 1) + 2√(1 − (z₂₁L − 1)²) − 2(1 − z₂₁L) sin⁻¹(z₂₁L − 1)] / (4 z₂ L π),    (18)
where z₂₁ = z₂ − z₁.
As D(Y_{ω₁}, Y_{ω₂}) > 0, condition (2) is met, suggesting that AR based on the motion of the fabric will lead to higher classification performance. Moreover, D(Y_{ω₁}, Y_{ω₂}) increases monotonically with L, as can be seen by examining its derivative
dD(Y_{ω₁}, Y_{ω₂})/dL = [π − 2(√(z₂₁L(2 − z₂₁L)) + sin⁻¹(1 − z₂₁L))] / (4π z₂ L²),    (19)
where z₂₁ = z₂ − z₁. Noting that 0 < z₂₁L ≤ 1 and that max{2(√(z₂₁L(2 − z₂₁L)) + sin⁻¹(1 − z₂₁L))} = π is attained only as z₂₁L → 0, it is clear that dD(Y_{ω₁}, Y_{ω₂})/dL > 0 for 0 < z₁L < z₂L ≤ 1. This suggests that the looser the fabric, the greater the statistical distance. To the authors’ knowledge, this is the first analytical model to capture and explain the empirical finding that data from loose clothing can lead to enhanced AR. An empirical study of this example is provided in the next section.
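The monotonicity claim can be checked numerically. The sketch below evaluates the closed-form distance D(Y_{ω₁}, Y_{ω₂}) over a range of fabric lengths (ω₁ = 1 and ω₂ = 2 rad s⁻¹ are illustrative choices, with z = 1 − exp(−ω²) as above).

```python
import numpy as np

def D_analytic(L, omega1=1.0, omega2=2.0):
    # Closed-form KS distance between the two movement classes, where
    # z21 = z2 - z1 and z = 1 - exp(-omega^2).
    z21 = np.exp(-omega1**2) - np.exp(-omega2**2)   # = z2 - z1
    z2 = 1.0 - np.exp(-omega2**2)
    w = z21 * L - 1.0
    return (np.pi * w + 2 * np.sqrt(1 - w**2)
            - 2 * (1 - z21 * L) * np.arcsin(w)) / (4 * z2 * L * np.pi)

Ls = np.linspace(0.1, 1.0, 10)
Ds = D_analytic(Ls)
print(Ds)          # strictly increasing: looser fabric, larger distance
```

The distance vanishes as z₂₁L → 0 (identical excitation) and grows with L, consistent with the sign of the derivative above.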

4. Activity Recognition Via Statistical Methods

As noted in Section 3.2, the extent to which fabric motion helps AR in practice will depend on both the complexity of the movement and the physical properties of the fabric. In this section, three empirical case studies are presented to test the model’s predictions when using a well-established statistical machine learning approach for AR with fabric-induced motion data. The cases considered are (i) a numerical simulation of the example described in Section 3.3, (ii) its physical realization and (iii) a real human AR task (data and source code for these experiments are available online at https://doi.org/10.18742/22182358 (accessed on 9 May 2023)).

4.1. Case Study 1: Simple Harmonic Motion

This evaluation aims to verify the predictions of the proposed model using a numerical simulation.

4.1.1. Materials and Methods

Data were collected from a numerical simulation of the system shown in Figure 2a implemented in MATLAB R2019b (MathWorks, Natick, MA, USA), consisting of trajectories of length T = 2π s generated at a sampling rate of f_s = 10 Hz using (1), from random initial angles ψ ∈ (−π, π) with a = 1 m. Each data set contains N = 400 trajectories, with 200 trajectories from the yoke running at low frequency (i.e., ω₁ = 1 rad s⁻¹) and 200 at high frequency (ω₂ = 2 rad s⁻¹). The same procedure was used to collect fabric movement data Y for lengths L ∈ {1/3, 2/3, 1} m (i.e., F₁, F₂, F₃) and rigid body data U (i.e., R₁). The data were split into equal-sized training and test sets and used to train an SVM classifier (LIBSVM toolbox [23]) to perform AR with Gaussian radial basis functions (RBFs) as the kernel function. The SVM was trained to predict the mapping
φ_n ↦ c_n    (20)
in an online fashion, where φ_n is a fragment of the nth trajectory and c_n ∈ {0, 1} is the corresponding class label (c = 0 for ω₁, c = 1 for ω₂). Specifically, following [9], each trajectory is segmented into overlapping windows of size i (where i < K and K = T f_s is the number of samples per trajectory), i.e.,
Φ := {φ₁, φ₂, …, φ_N} = {(y₁, …, y_i)ᵀ, (y₂, …, y_{i+1})ᵀ, (y₃, …, y_{i+2})ᵀ, …}.    (21)
The procedure was repeated 100 times for each condition and the classification accuracy computed.
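The data-generation and training pipeline can be sketched as follows. This is an illustration, not the paper's code: scikit-learn's SVC (which wraps LIBSVM) stands in for the MATLAB LIBSVM toolbox, trajectory counts are reduced, and the train/test split is by whole trajectories, all assumptions made here for brevity.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
fs, T, L = 10, 2 * np.pi, 1.0          # sampling rate (Hz), duration (s), fabric length (m)

def trajectory(omega):
    # One fabric trajectory: rigid sinusoid with random phase plus a uniform
    # fabric offset of half-width z*L, with z = 1 - exp(-omega^2).
    t = np.arange(0, T, 1 / fs)
    psi = rng.uniform(-np.pi, np.pi)
    z = 1 - np.exp(-omega**2)
    return np.sin(omega * t + psi) + rng.uniform(-z * L, z * L, t.size)

def windows(traj, i):
    # Overlapping windows of length i with stride 1 (the online segmentation).
    return np.lib.stride_tricks.sliding_window_view(traj, i)

i = 5                                   # window size: 0.5 s at 10 Hz
X_tr, y_tr, X_te, y_te = [], [], [], []
for label, omega in enumerate((1.0, 2.0)):
    for k in range(50):                 # 50 trajectories per class (reduced)
        w = windows(trajectory(omega), i)
        (X_tr if k < 25 else X_te).append(w)
        (y_tr if k < 25 else y_te).append(np.full(len(w), label))

# RBF-kernel SVM trained on one half of the trajectories, tested on the other:
clf = SVC(kernel="rbf").fit(np.vstack(X_tr), np.concatenate(y_tr))
acc = clf.score(np.vstack(X_te), np.concatenate(y_te))
print(acc)                              # typically well above chance (0.5)
```

Even with this short 0.5 s window, the fabric data alone support above-chance classification of the two frequencies, in line with Figure 5a.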

4.1.2. Results

Figure 5a shows the overall accuracy of AR using the SVM classifier with different window sizes. As can be seen, the accuracy advantage of the fabric-attached sensors over the rigid-attached sensor is greatest when the window sizes are small. Moreover, among the fabric-attached sensors, accuracy is higher when L is greater, in line with the prediction of the model. As the window size increases (i.e., the classifier is given more of the trajectory history), the overall accuracy increases up to 100%, and the advantage of the fabric-attached sensors over the rigid-attached sensor gradually disappears.

4.2. Case Study 2: Scotch Yoke

This evaluation aims to validate the proposed framework in a physical system.

4.2.1. Materials and Methods

To ensure accurate and repeatable data collection, the experiment reported here uses a physical realization of the system shown in Figure 2a (i.e., an actuated, instrumented scotch yoke) as a data acquisition device. The experimental setup is shown in Figure 3c. The mechanism consists of a sliding yoke with rigid rods affixed to either side and a rotating disk with a diameter of 20 cm mounted on two bearing blocks, driven by a DC motor with encoder (30:1, 37D gearmotor, Pololu Corporation, North Las Vegas, NV, USA) at the fulcrum. The motion of the disk and yoke are coupled via a sliding pin, ensuring a pure sinusoidal movement of the yoke. Affixed to the latter, 10 cm away from the fulcrum, was a 30 cm × 5 cm strip of woven cotton fabric, upon which were mounted three sensors (NDI Aurora magnetic tracking device, NDI, Canada) that synchronously record the horizontal position at 40 Hz with an accuracy of approximately 0.09 mm. These were attached along the length of the fabric at (i) 20 cm (F₂), (ii) 30 cm (F₃) and (iii) 40 cm from the fulcrum (F₄) (i.e., at the tip of the fabric). A further sensor (R₁) was rigidly attached to the yoke at the fabric attachment point (see Figure 3c). The error in the yoke movement against the sinusoidal reference is 0.05π rad s⁻¹ (details of the error estimation process are given in Appendix A.2).
With this setup, data were collected from the device driven at the desired speed for each experimental condition (see below) (note that at high speeds, the NDI device occasionally loses track of its sensors, resulting in missing data (i.e., gaps) within trajectories; these are filled using piecewise cubic spline interpolation). Specifically, the following reports the effect of varying (i) the window size 0.025 ≤ i/f_s ≤ 2.5 s, where ω₁ = 1.05π rad s⁻¹ and ω₂ = 1.48π rad s⁻¹, and (ii) the difference in frequencies (i.e., |ω₂ − ω₁|), where i/f_s = 0.025 s and ω₁ = 1.05π rad s⁻¹. For this, 30 sample trajectories of length T = 5 s at each speed were recorded. These data were segmented for online learning through a procedure similar to that described in Section 4.1.1, randomly split into equal-sized training and test sets and used to train an SVM classifier to perform AR. All data were standardized using the z-score. The performance of the classifier was assessed by computing its accuracy and the KS test statistic via (3). This process was repeated for 100 trials of every experimental condition tested.
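The gap-filling and standardization steps can be sketched as follows (illustrative values only; SciPy's CubicSpline stands in for the piecewise cubic interpolation applied to the NDI recordings).

```python
import numpy as np
from scipy.interpolate import CubicSpline

# 40 Hz, 5 s recording of a 1.05*pi rad/s sinusoid with a simulated drop-out.
t = np.arange(0, 5, 1 / 40)
y = np.sin(1.05 * np.pi * t)
y[80:90] = np.nan                       # simulated tracking gap (0.25 s)

# Fill the gap with a piecewise cubic spline fitted to the valid samples:
valid = ~np.isnan(y)
y_filled = y.copy()
y_filled[~valid] = CubicSpline(t[valid], y[valid])(t[~valid])

# z-score standardization, as applied before training the classifier:
y_std = (y_filled - y_filled.mean()) / y_filled.std()
print(np.isnan(y_std).any())            # False: the gap has been filled
```

For gaps that are short relative to the movement period, the spline reconstruction stays close to the underlying sinusoid.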

4.2.2. Results

Figure 5b shows the accuracy of AR using the SVM classifier for different window sizes. As can be seen, a similar but more pronounced trend is observed here compared with the simulation in Section 4.1: the accuracy is higher for fabric-attached sensors at small window sizes (and higher for sensors at greater L). As the window size increases, the accuracy converges toward the same value, regardless of the sensor used.
Figure 5c,d shows the classification accuracy and KS test statistic D for discriminating between ω 1 = 1.05 π rad s 1 and the different ω 2 when i / f s = 0.025 s . As can be seen, for the fabric-mounted sensors, the statistical distance D is larger. Moreover, the larger the difference between movement frequencies, the greater the statistical distance and therefore the better the performance of AR (Pearson’s correlation coefficient (PCC) between D and accuracy is 0.85 , indicating a strong positive relationship). For the rigidly attached sensor, there is no obvious increase in D or AR performance. This is consistent with the prediction of (2).

4.3. Case Study 3: Human Activity Recognition

In this section, the predictions of the proposed framework are evaluated in a human motion recognition task. The AR task chosen for this experiment is that of the recognition of constrained periodic movements (such as operating a crank, winch, or ratchet system) from loose, sensorized clothing (the experiments reported here were conducted with the ethical approval of King’s College London, UK: MRPP-21/22-33739).

4.3.1. Hypothesis

The null hypothesis in this experiment is that the classification accuracy of the wrist-attached and sleeve-attached sensors does not differ significantly. The alternative hypothesis is that there is a significant difference in classification accuracy between the wrist-attached and sleeve-attached sensors.

4.3.2. Experimental Procedure

The experimental setup (see Figure 6) consists of a hand winch with a 15 cm rotating crank handle that experimental subjects must operate at prespecified speeds while wearing a sensorized shirt.
Specifically, participants were asked to remove or roll up the sleeve of any clothing on their arms and to wear an instrumented shirt ( 96% woven cotton, 4% spandex) with one Aurora sensor attached to the cuff of the right-side sleeve (with a maximum possible displacement from the wrist of ± 10 cm ). They were then directed to operate the winch at low ( ω 1 = 1.25 π rad s 1 ) and high frequency ( ω 2 = 2.5 π rad s 1 ) while their movement was recorded.
All data collection was conducted in an isolated room with no visual or audible distractions. To control the speed, a display screen was used to show the target and actual position of the hand winch in real time (the latency in the display is 0.0078 s) as red and blue points, respectively. To accustom the participants to the setup, the experimenter demonstrated the desired movement in 1 trial, and subjects were directed to perform the target movement for 10 trials at each frequency prior to the experiment. Before data collection, all participants’ sitting positions were standardized. The chair’s height was adjusted to ensure the wrist/hand extension angle was zero when the crank handle moved parallel to the ground. Sleeves were adjusted to ensure that the cuff was aligned flush with the wrist, using clips/tape where necessary.
After the above-mentioned preparatory work was completed, an equal number of sample trajectories (thirty) of length T = 8 s at high and low frequency were recorded from each participant. During data collection, participants were asked to keep their palms parallel to the ground and to rotate the hand winch to follow the target as closely as possible. Participants were given a 20 s rest between each recording. The order of collecting high and low frequencies was alternated between participants (i.e., the first participant performed low-frequency trials first, followed by high-frequency ones; the second participant performed the high-frequency trials first, and so on).
The first 3 s of each sample trajectory was discarded to allow for the time it took the participant to adjust to follow the target accurately. The remaining data were used for AR using the method described in Section 4.2 with a window size of 0.2 s .

4.3.3. Participants

In the results reported below, 13 healthy volunteers recruited from the local community (6 males, 7 females; mean ± standard deviation; age years old; height 168 ± 11 cm; mass; arm length; wrist circumference 15.3 ± 0.1 cm; laterality index assessed by the Edinburgh handedness questionnaire [24]) took part in this experiment. An information sheet was given to all participants before visiting the laboratory, and a consent form was signed by all participants before data collection. To ensure a good, comfortable fit of the instrumented shirt, all participants had an arm length not longer than 78 cm. As the shirt was sensorized on the right-side sleeve, only right-handed participants were recruited.

4.3.4. Results

The relative AR accuracy when using the sleeve-attached sensors as compared with the wrist-attached ones ranged from −0.64% to 8.1% (median 2.1%). Greater accuracy when using the sleeve-attached sensor was seen for 11/13 participants. Using a Wilcoxon matched-pairs signed-rank test, a statistically significant difference in AR accuracy between the wrist- and sleeve-attached sensors was found (p < 0.05); therefore, the null hypothesis is rejected. These results indicate a modest average increase in accuracy in this experiment, as might be expected given the relatively small L in this task (see Section 3).
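The statistical test can be reproduced with standard tooling. The sketch below uses hypothetical paired per-participant accuracies (illustrative values, not the study's data) to show the Wilcoxon matched-pairs signed-rank procedure.

```python
from scipy.stats import wilcoxon

# Hypothetical paired AR accuracies for 13 participants (illustration only):
wrist  = [0.81, 0.79, 0.84, 0.77, 0.80, 0.82, 0.78, 0.83, 0.76, 0.80, 0.79, 0.81, 0.82]
sleeve = [0.83, 0.81, 0.86, 0.78, 0.82, 0.85, 0.77, 0.85, 0.79, 0.82, 0.81, 0.82, 0.84]

# Two-sided Wilcoxon matched-pairs signed-rank test on the paired differences:
stat, p = wilcoxon(sleeve, wrist)
print(p < 0.05)   # True for these values: a significant paired difference
```

The test is appropriate here because the paired accuracy differences are not assumed to be normally distributed and the sample is small (n = 13).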

5. Discussion

This work presents a framework with which to understand the effect of textile motion on AR, including how it may enhance performance compared with the use of rigidly attached sensors. By taking a statistical modeling approach, it is seen how stochasticity in the fabric motion can amplify the statistical distance between movement signals, enhancing AR performance. The predictions of this model were verified through numerical and physical evaluations, including human motion capture, for simple oscillatory movements, with the findings that (i) performance improves as the fabric becomes looser (L increases) and that (ii) the discrepancy with rigid sensor use is most pronounced at small window sizes. The latter suggests that the use of fabric-mounted sensors may enable faster and more accurate predictions in the context of online AR. The probabilistic model introduced in this paper is the first to predict this effect, and the strong positive correlation between D and accuracy in the numerical analysis and empirical findings (Section 4) indicates that the proposed probabilistic model is a good starting point for predicting AR performance in real-world applications. More broadly, the fact that these effects can be captured analytically with a relatively simple model opens up the possibility of enhanced design and analysis of motion capture systems that use ordinary garments, thereby enjoying a high level of comfort and user acceptance.
In experiments involving real human movements (see Section 4.3), it is seen that AR based on data from sleeve-attached sensors is more accurate for the majority of participants, albeit the average increase is modest and there is some variability. Possible reasons for this include the relatively small L in relation to the amplitude of the movement. It is notable that the increase in AR accuracy for sensor F₂ (similarly located 10 cm from sensor R₁) in the experiment reported in Section 4.2 is also modest, in line with the numerical predictions of the proposed model. It is also possible that several other factors play a role, such as the mass of the sleeve material (relatively heavy due to its multilayer design) or the complexity of its geometry (leading to complex internal forces and motion constraints), as compared with the lightweight, simple strip of fabric studied in the scotch yoke experiments. Future work will examine how such factors may be incorporated into the proposed modeling approach to verify whether its predictions can be improved further. Other areas of future work include extending the model to account for external factors that might affect fabric movement, such as wind, humidity, impact of the fabric with objects in the environment or different fabric materials (e.g., polyester). Moreover, the analytical modeling can also be extended to multidimensional movements, such as may be required for full-body motion analysis. These, in turn, may open up many applications in robotics and automation, such as analyzing workers’ behavior in manufacturing lines [25,26] and human–robot interaction (e.g., for control of exoskeletons [27] or prostheses [28]).

Author Contributions

Conceptualization, T.S., I.D.G. and M.H.; methodology, T.S.; software, T.S.; validation, T.S.; formal analysis, T.S., I.D.G. and M.H.; investigation, T.S., I.D.G. and M.H.; resources, T.S., I.D.G. and M.H.; data curation, T.S.; writing—original draft preparation, T.S.; writing—review and editing, T.S., I.D.G. and M.H.; visualization, T.S.; supervision, I.D.G. and M.H.; project administration, M.H.; funding acquisition, T.S., I.D.G. and M.H. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by King’s College London, the China Scholarship Council and the Engineering and Physical Sciences Research Council (EP/M507222/1). For the purpose of open access, the authors have applied a Creative Commons Attribution (CC BY) license to any Accepted Manuscript version arising.

Institutional Review Board Statement

This study received the ethical approval of King’s College London, UK: MRPP-21/22-33739.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

All data and source code in this paper are available online at https://doi.org/10.18742/22182358.

Acknowledgments

We thank all participants involved in the experiments of Section 4.3 for their time and patience.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Appendix A.1. Simulation of Scotch Yoke

Appendix A.1.1. The PDF of Fabric Position

The following equations give the PDFs of the rigid position U [22] and the artifact Δ:

$$ f(u) = \begin{cases} \dfrac{1}{\pi\sqrt{1-u^{2}}}, & -1 < u < 1,\\[4pt] 0, & \text{otherwise}, \end{cases} $$

$$ f(\delta) = \begin{cases} \dfrac{1}{2z_{L}}, & -z_{L} < \delta < z_{L},\\[4pt] 0, & \text{otherwise}. \end{cases} $$

The PDF for the sum of U and Δ is given by the convolution [29]

$$ f_{U+\Delta}(y) = (f_{U} * f_{\Delta})(y) = \int_{-\infty}^{+\infty} f(u)\,f(y-u)\,\mathrm{d}u = \int \frac{1}{\pi\sqrt{1-u^{2}}}\cdot\frac{1}{2z_{L}}\,\mathrm{d}u, $$

where the limits of integration are determined by the overlap of the two densities at a given fabric position y. Note that the antiderivative of (A4) is

$$ F_{u+\delta} = \frac{1}{2z_{L}\pi}\sin^{-1}(u). $$
Firstly, flip f(δ) about the vertical axis. Then, starting at y = −∞, slide it towards +∞. To derive an explicit solution, two conditions need to be considered: (i) 0 < z_L < 1; (ii) z_L = 1.
Condition (i) (0 < z_L < 1): The two functions first overlap at y + z_L = −1, and the next change of regime occurs at y − z_L = −1. The range of the convolution integral is from −1 to y + z_L.

For −1 − z_L ≤ y < −1 + z_L,

$$ f(y) = \frac{1}{2z_{L}\pi}\Bigl[\sin^{-1}(u)\Bigr]_{-1}^{y+z_{L}} = \frac{1}{2z_{L}\pi}\left(\sin^{-1}(y+z_{L}) + \frac{\pi}{2}\right). $$

Continue to move y towards +∞. The next case begins when y reaches y − z_L = −1 and ends at y + z_L = 1. The range of the convolution integral is from y − z_L to y + z_L.

For −1 + z_L ≤ y ≤ 1 − z_L,

$$ f(y) = \frac{1}{2z_{L}\pi}\Bigl[\sin^{-1}(u)\Bigr]_{y-z_{L}}^{y+z_{L}} = \frac{1}{2z_{L}\pi}\left(\sin^{-1}(y+z_{L}) - \sin^{-1}(y-z_{L})\right). $$

Continue to move y towards +∞. The overlap of the two functions vanishes when y − z_L = 1. The range of the convolution integral is from y − z_L to 1.

For 1 − z_L < y ≤ z_L + 1,

$$ f(y) = \frac{1}{2z_{L}\pi}\Bigl[\sin^{-1}(u)\Bigr]_{y-z_{L}}^{1} = \frac{1}{2z_{L}\pi}\left(\frac{\pi}{2} - \sin^{-1}(y-z_{L})\right). $$
With a similar process, condition (ii) can be computed as below.

Condition (ii) (z_L = 1):

For −2 ≤ y < 0,

$$ f(y) = \frac{1}{2\pi}\Bigl[\sin^{-1}(u)\Bigr]_{-1}^{y+1} = \frac{1}{2\pi}\left(\sin^{-1}(y+1) + \frac{\pi}{2}\right). $$

For y = 0,

$$ f(y) = \frac{1}{2\pi}\Bigl[\sin^{-1}(u)\Bigr]_{-1+y}^{1+y} = \frac{1}{2}. $$

For 0 < y ≤ 2,

$$ f(y) = \frac{1}{2\pi}\Bigl[\sin^{-1}(u)\Bigr]_{y-1}^{1} = \frac{1}{2\pi}\left(\frac{\pi}{2} - \sin^{-1}(y-1)\right). $$
In summary, the PDF of the fabric position Y when 0 < z_L < 1 is

$$ f(y) = \begin{cases} \dfrac{\sin^{-1}(y+z_{L}) + \frac{\pi}{2}}{2z_{L}\pi}, & -1-z_{L} \le y < -1+z_{L},\\[6pt] \dfrac{\sin^{-1}(y+z_{L}) - \sin^{-1}(y-z_{L})}{2z_{L}\pi}, & -1+z_{L} \le y \le 1-z_{L},\\[6pt] \dfrac{\frac{\pi}{2} - \sin^{-1}(y-z_{L})}{2z_{L}\pi}, & 1-z_{L} < y \le z_{L}+1,\\[6pt] 0, & \text{otherwise}. \end{cases} $$

The PDF of the fabric position Y when z_L = 1 is

$$ f(y) = \begin{cases} \dfrac{\sin^{-1}(y+1) + \frac{\pi}{2}}{2\pi}, & -2 \le y < 0,\\[6pt] \dfrac{1}{2}, & y = 0,\\[6pt] \dfrac{\frac{\pi}{2} - \sin^{-1}(y-1)}{2\pi}, & 0 < y \le 2,\\[6pt] 0, & \text{otherwise}. \end{cases} $$
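As a sanity check, the piecewise PDF derived above can be compared against a Monte Carlo simulation of the fabric position Y = sin(θ) + δ, with θ uniform over a cycle and δ the uniform artifact. The following sketch is illustrative only; the choice z_L = 0.3, the bin width and the function name are ours.

```python
import math
import random

def fabric_pdf(y, zL):
    """Piecewise PDF of the fabric position Y = U + Delta (0 < zL < 1)."""
    if -1 - zL <= y < -1 + zL:
        return (math.asin(y + zL) + math.pi / 2) / (2 * zL * math.pi)
    if -1 + zL <= y <= 1 - zL:
        return (math.asin(y + zL) - math.asin(y - zL)) / (2 * zL * math.pi)
    if 1 - zL < y <= 1 + zL:
        return (math.pi / 2 - math.asin(y - zL)) / (2 * zL * math.pi)
    return 0.0

random.seed(0)
zL = 0.3
# Y = sin(theta) + delta: arcsine-distributed rigid position plus uniform artifact
samples = [math.sin(random.uniform(0, 2 * math.pi))
           + random.uniform(-zL, zL) for _ in range(200_000)]

# fraction of samples falling in a narrow bin around y = 0,
# compared with the analytic probability mass of that bin
h = 0.05
frac = sum(-h < s < h for s in samples) / len(samples)
print(frac, 2 * h * fabric_pdf(0.0, zL))  # the two values should agree closely
```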

Appendix A.1.2. The CDF of Fabric Position

F(y) is computed by integrating f(y).

Condition (i) (0 < z_L < 1):

For y < −1 − z_L,

$$ F(y) = 0. $$

For −1 − z_L ≤ y < −1 + z_L,

$$ F(y) = \int_{-1-z_{L}}^{y} \frac{\sin^{-1}(y'+z_{L}) + \frac{\pi}{2}}{2z_{L}\pi}\,\mathrm{d}y' = \frac{\pi(z_{L}+y) + 2\sqrt{1-(z_{L}+y)^{2}} + 2(z_{L}+y)\sin^{-1}(z_{L}+y)}{4z_{L}\pi}. $$

For −1 + z_L ≤ y ≤ 1 − z_L,

$$ F(y) = \int_{-1-z_{L}}^{-1+z_{L}} \frac{\sin^{-1}(y'+z_{L}) + \frac{\pi}{2}}{2z_{L}\pi}\,\mathrm{d}y' + \int_{-1+z_{L}}^{y} \frac{\sin^{-1}(y'+z_{L}) - \sin^{-1}(y'-z_{L})}{2z_{L}\pi}\,\mathrm{d}y' = \frac{\pi z_{L} + \sqrt{1-(y+z_{L})^{2}} - \sqrt{1-(y-z_{L})^{2}} + (y-z_{L})\sin^{-1}(z_{L}-y) + (y+z_{L})\sin^{-1}(y+z_{L})}{2z_{L}\pi}. $$

For 1 − z_L < y ≤ z_L + 1,

$$ F(y) = \int_{-1-z_{L}}^{-1+z_{L}} \frac{\sin^{-1}(y'+z_{L}) + \frac{\pi}{2}}{2z_{L}\pi}\,\mathrm{d}y' + \int_{-1+z_{L}}^{1-z_{L}} \frac{\sin^{-1}(y'+z_{L}) - \sin^{-1}(y'-z_{L})}{2z_{L}\pi}\,\mathrm{d}y' + \int_{1-z_{L}}^{y} \frac{\frac{\pi}{2} - \sin^{-1}(y'-z_{L})}{2z_{L}\pi}\,\mathrm{d}y' = \frac{\frac{3\pi z_{L} + \pi y}{2} + (y-z_{L})\sin^{-1}(z_{L}-y) - \sqrt{1-(z_{L}-y)^{2}}}{2z_{L}\pi}. $$

For y > z_L + 1,

$$ F(y) = 1. $$
Condition (ii) (z_L = 1):

For y < −2,

$$ F(y) = 0. $$

For −2 ≤ y < 0,

$$ F(y) = \int_{-2}^{y} \frac{\sin^{-1}(y'+1) + \frac{\pi}{2}}{2\pi}\,\mathrm{d}y' = \frac{\pi(1+y) + 2\sqrt{1-(1+y)^{2}} + 2(1+y)\sin^{-1}(1+y)}{4\pi}. $$

For y = 0,

$$ F(y) = \int_{-2}^{0} \frac{\sin^{-1}(y'+1) + \frac{\pi}{2}}{2\pi}\,\mathrm{d}y' = \frac{1}{2}. $$

For 0 < y ≤ 2,

$$ F(y) = \frac{1}{2} + \int_{0}^{y} \frac{\frac{\pi}{2} - \sin^{-1}(y'-1)}{2\pi}\,\mathrm{d}y' = \frac{\frac{3\pi + \pi y}{2} + (y-1)\sin^{-1}(1-y) - \sqrt{1-(1-y)^{2}}}{2\pi}. $$

For y > 2,

$$ F(y) = 1. $$
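The closed-form CDF for condition (i) can likewise be checked numerically: it should rise monotonically from 0 at y = −1 − z_L to 1 at y = 1 + z_L and, by symmetry, equal 1/2 at y = 0. A minimal sketch (z_L = 0.5; the function name is ours):

```python
import math

def fabric_cdf(y, zL):
    """CDF of the fabric position Y for 0 < zL < 1, as derived above."""
    pi = math.pi
    if y < -1 - zL:
        return 0.0
    if y < -1 + zL:
        s = y + zL
        return (pi * s + 2 * math.sqrt(1 - s * s)
                + 2 * s * math.asin(s)) / (4 * zL * pi)
    if y <= 1 - zL:
        return (pi * zL + math.sqrt(1 - (y + zL) ** 2)
                - math.sqrt(1 - (y - zL) ** 2)
                + (y - zL) * math.asin(zL - y)
                + (y + zL) * math.asin(y + zL)) / (2 * zL * pi)
    if y <= 1 + zL:
        return ((3 * pi * zL + pi * y) / 2 + (y - zL) * math.asin(zL - y)
                - math.sqrt(1 - (zL - y) ** 2)) / (2 * zL * pi)
    return 1.0

# sanity checks: F is 0 and 1 at the edges of the support, 1/2 at the
# centre, and non-decreasing across the whole range
grid = [i / 100 - 2 for i in range(401)]
vals = [fabric_cdf(y, 0.5) for y in grid]
print(vals[0], fabric_cdf(0.0, 0.5), vals[-1])
```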

Appendix A.2. The Error of Yoke Movement Estimation

The frequency at which the yoke moves in each cycle varies slightly due to minor factors, such as friction and play in the mechanism. The discrepancy between the desired and actual frequency is estimated as

$$ e = \frac{1}{n}\sum_{i=1}^{n}\left|\frac{\arg\min_{t_{i+1}}(y_{t_{i+1}}) - \arg\min_{t_{i}}(y_{t_{i}})}{f_{s}} - \frac{2\pi}{\omega}\right|, $$

where n is the number of rotating cycles, f_s is the sampling frequency (i.e., 40 Hz), ω is the desired frequency (i.e., ω₁ = 1.05π rad s⁻¹ for low frequency and ω₂ = 1.48π rad s⁻¹ for high frequency), 2π/ω is the corresponding desired cycle period, and argmin_{t_i}(y_{t_i}) and argmin_{t_{i+1}}(y_{t_{i+1}}) are the time indices of the minimum sensor reading in the i-th cycle and the following one, respectively.
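The estimate above can be implemented by segmenting the sensor signal into nominal cycles and differencing the time indices of successive per-cycle minima. The following sketch illustrates this on an ideal sinusoidal yoke signal, for which the error should be near zero; the segmentation by the nominal period and all names are our assumptions, not the authors' code.

```python
import math

def frequency_error(y, fs, omega, n_cycles):
    """Mean absolute discrepancy between the measured cycle period (time
    between successive per-cycle minima) and the desired period 2*pi/omega."""
    period = int(round(fs * 2 * math.pi / omega))  # nominal samples per cycle
    # time index of the minimum sensor reading within each nominal cycle
    argmins = []
    for i in range(n_cycles + 1):
        seg = y[i * period:(i + 1) * period]
        argmins.append(i * period + min(range(len(seg)), key=seg.__getitem__))
    errs = [abs((argmins[i + 1] - argmins[i]) / fs - 2 * math.pi / omega)
            for i in range(n_cycles)]
    return sum(errs) / n_cycles

fs, omega = 40, 1.05 * math.pi          # sampling rate (Hz), desired speed (rad/s)
t = [k / fs for k in range(fs * 10)]    # 10 s of samples
y = [math.sin(omega * tk) for tk in t]  # ideal yoke signal
print(frequency_error(y, fs, omega, 4))  # small: limited only by sampling
```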

References

1. Anagnostis, A.; Benos, L.; Tsaopoulos, D.; Tagarakis, A.; Tsolakis, N.; Bochtis, D. Human activity recognition through recurrent neural networks for human–robot interaction in agriculture. Appl. Sci. 2021, 11, 2188.
2. Meng, Z.; Zhang, M.; Guo, C.; Fan, Q.; Zhang, H.; Gao, N.; Zhang, Z. Recent progress in sensing and computing techniques for human activity recognition and motion analysis. Electronics 2020, 9, 1357.
3. Castano, L.M.; Flatau, A.B. Smart fabric sensors and e-textile technologies: A review. Smart Mater. Struct. 2014, 23, 053001.
4. Yang, K.; Isaia, B.; Brown, L.J.; Beeby, S. E-Textiles for Healthy Ageing. Sensors 2019, 19, 4463.
5. Slyper, R.; Hodgins, J.K. Action Capture with Accelerometers. In Proceedings of the 2008 ACM SIGGRAPH/Eurographics Symposium on Computer Animation, Dublin, Ireland, 7–9 July 2008; pp. 193–199.
6. Michael, B.; Howard, M. Eliminating motion artifacts from fabric-mounted wearable sensors. In Proceedings of the 2014 IEEE-RAS International Conference on Humanoid Robots, Madrid, Spain, 18–20 November 2014; pp. 868–873.
7. Michael, B.; Howard, M. Learning predictive movement models from fabric-mounted wearable sensors. IEEE Trans. Neural Syst. Rehabil. Eng. 2015, 24, 1395–1404.
8. Lorenz, M.; Bleser, G.; Akiyama, T.; Niikura, T.; Stricker, D.; Taetz, B. Towards Artefact Aware Human Motion Capture using Inertial Sensors Integrated into Loose Clothing. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Philadelphia, PA, USA, 23–27 May 2022; pp. 1682–1688.
9. Michael, B.; Howard, M. Activity recognition with wearable sensors on loose clothing. PLoS ONE 2017, 12, 10.
10. Jayasinghe, U.; Hwang, F.; Harwin, W.S. Comparing Clothing-Mounted Sensors with Wearable Sensors for Movement Analysis and Activity Classification. Sensors 2019, 20, 82.
11. Jayasinghe, U.; Hwang, F.; Harwin, W.S. Comparing loose clothing-mounted sensors with body-mounted sensors in the analysis of walking. Sensors 2022, 22, 6605.
12. Lara, O.D.; Labrador, M.A. A survey on human activity recognition using wearable sensors. IEEE Commun. Surv. Tutor. 2012, 15, 1192–1209.
13. Bello, H.; Zhou, B.; Suh, S.; Lukowicz, P. Mocapaci: Posture and gesture detection in loose garments using textile cables as capacitive antennas. In Proceedings of the 2021 International Symposium on Wearable Computers, Virtual, 21–26 September 2021; pp. 78–83.
14. Cha, Y.; Kim, H.; Kim, D. Flexible piezoelectric sensor-based gait recognition. Sensors 2018, 18, 468.
15. Skach, S.; Stewart, R.; Healey, P.G. Smart arse: Posture classification with textile sensors in trousers. In Proceedings of the 20th ACM International Conference on Multimodal Interaction, Boulder, CO, USA, 16–20 October 2018; pp. 116–124.
16. Lin, Q.; Peng, S.; Wu, Y.; Liu, J.; Hu, W.; Hassan, M.; Seneviratne, A.; Wang, C.H. E-jacket: Posture detection with loose-fitting garment using a novel strain sensor. In Proceedings of the 2020 19th ACM/IEEE International Conference on Information Processing in Sensor Networks (IPSN), Sydney, NSW, Australia, 21–24 April 2020; pp. 49–60.
17. Jayasinghe, U.; Janko, B.; Hwang, F.; Harwin, W.S. Classification of static postures with wearable sensors mounted on loose clothing. Sci. Rep. 2023, 13, 131.
18. Tang, W.; Fu, C.; Xia, L.; Lyu, P.; Li, L.; Fu, Z.; Pan, H.; Zhang, C.; Xu, W. A flexible and sensitive strain sensor with three-dimensional reticular structure using biomass Juncus effusus for monitoring human motions. Chem. Eng. J. 2022, 438, 135600.
19. Lu, D.; Liao, S.; Chu, Y.; Cai, Y.; Wei, Q.; Chen, K.; Wang, Q. Highly durable and fast response fabric strain sensor for movement monitoring under extreme conditions. Adv. Fiber Mater. 2022, 5, 1–12.
20. Xu, D.; Ouyang, Z.; Dong, Y.; Yu, H.Y.; Zheng, S.; Li, S.; Tam, K.C. Robust, Breathable and Flexible Smart Textiles as Multifunctional Sensor and Heater for Personal Health Management. Adv. Fiber Mater. 2022, 5, 1–14.
21. Justel, A.; Peña, D.; Zamar, R. A multivariate Kolmogorov–Smirnov test of goodness of fit. Stat. Probab. Lett. 1997, 35, 251–259.
22. Shin, K.; Hammond, J. Fundamentals of Signal Processing for Sound and Vibration Engineers; John Wiley & Sons: Hoboken, NJ, USA, 2008.
23. Chang, C.C.; Lin, C.J. LIBSVM: A library for support vector machines. ACM Trans. Intell. Syst. Technol. 2011, 2, 1–27.
24. Oldfield, R.C. The assessment and analysis of handedness: The Edinburgh inventory. Neuropsychologia 1971, 9, 97–113.
25. Koskimaki, H.; Huikari, V.; Siirtola, P.; Laurinen, P.; Roning, J. Activity recognition using a wrist-worn inertial measurement unit: A case study for industrial assembly lines. In Proceedings of the 2009 17th Mediterranean Conference on Control and Automation, Thessaloniki, Greece, 24–26 June 2009; pp. 401–405.
26. Forkan, A.R.M.; Montori, F.; Georgakopoulos, D.; Jayaraman, P.P.; Yavari, A.; Morshed, A. An industrial IoT solution for evaluating workers’ performance via activity recognition. In Proceedings of the 2019 International Conference on Distributed Computing Systems, Dallas, TX, USA, 7–10 July 2019; pp. 1393–1403.
27. Mai, J.; Yi, C.; Ding, Z. Human Activity Recognition of Exoskeleton Robot with Supervised Learning Techniques. 20 December 2021. Available online: https://doi.org/10.21203/rs.3.rs-1161576/v1 (accessed on 6 February 2023).
28. Pitou, S.; Wu, F.; Shafti, A.; Michael, B.; Stopforth, R.; Howard, M. Embroidered electrodes for control of affordable myoelectric prostheses. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 1812–1817.
29. Domínguez, A. A history of the convolution operation [Retrospectroscope]. IEEE Pulse 2015, 6, 38–49.
Figure 1. New sensing technologies have led to the potential to capture human movement from clothing-embedded sensors Y instead of relying on those rigidly attached to the body U. However, use of ordinary garments exposes the former to additional and unpredictable artifacts in the signal.
Figure 2. (a) A scotch yoke mechanism with a piece of fabric (red) attached. Sensors are affixed at p_u (the tip of the sliding yoke) and p_y (the tip of the fabric). (b) Simulated signals of the sensors from this model (z_L = 0.3).
Figure 3. The scotch yoke mechanism with a piece of fabric attached moving at (a) low and (b) high frequency. (c) Front view of the experimental setup. Sensors are placed (i) at equidistant intervals along the fabric strip (F2–F4) and (ii) rigidly at the attachment point (R1).
Figure 4. The (a) PDF and (b) CDF of the rigid body and fabric position (z_L = 0.5, 1).
Figure 5. Accuracy in recognizing movements of different frequencies in the (a) simulated and (b) physical scotch yoke when varying window size; (c,d) test statistic D for different pairs of frequencies with window size 0.025 s. Reported are the mean ± s.d. (the shaded area) over 100 trials.
Figure 6. Experimental setup for the crank task. The participant operates a hand winch at prespecified speeds while their motion is captured with a wrist-attached sensor R and a garment-mounted sensor F.
Table 1. Selected state-of-the-art sensing on clothes research.
| Reference | Sensor Type | Sensor Placement | Type of Clothes or Fabric | Activities | Method | Strength/Finding | Limitation |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Jayasinghe et al. [10] | accelerometers | waist, thigh, ankle and clothes in similar position | slacks, skirt and frock | four daily activities | correlation coefficients, decision tree | clothing- and body-worn sensor data are correlated | sensors are heavy |
| Jayasinghe et al. [11] | IMU | waist, thigh, lower shank and clothes in similar position | daily clothes | gait cycle | correlation coefficients | clothing-worn sensor data capture key points in the gait cycle | classification accuracy of the sensors has not been investigated |
| Jayasinghe et al. [17] | IMU | waist, thigh, ankle and clothes in similar position | daily clothes | 4 static and 2 dynamic activities | KNN | clothing-worn sensor gives good posture classification | the number of participants is limited |
| Michael et al. [9] | accelerometers | rigid pendulum and a piece of fabric attached | denim, jersey and roma | low and high swing speed | SVM, DRM | fabric sensors have higher AR accuracy | lacks a theoretical model |
| Bello et al. [13] | capacitive | four antennas covering the chest, shoulders, back and arms | loose blazer | 20 postures/gestures | conv2D | sensor is not affected by muscular strength | affected by conductors |
| Cha et al. [14] | piezoelectric | clothing near knee, hip | loose trousers | gait | rule-based algorithm | feasibility of gait detection | gender of participants is not balanced |
| Skach et al. [15] | pressure | clothing near thigh | loose trousers | 19 postures | random forest | sensor can detect human postures | the upper body has not been tested |
| Lin et al. [16] | strain | clothing near shoulder, elbow, waist and abdomen | loose jacket | daily activities, postures and slouch | CNN-LSTM | sensor can detect human postures | gender of participants is not balanced |
| Tang et al. [18] | strain | several positions on the body | Juncus effusus fiber | several daily activities | gauge factor | sensitive, stretchable | the maximum sensing range is limited |
| Lu et al. [19] | strain | human joints | conductive PSKF@rGO | exercise monitoring | gauge factor | useful under extreme conditions | the strain range is limited |
| Xu et al. [20] | strain | various human body parts | composite fiber | language recognition, pulse diagnosis | gauge factor | sensitivity, stability and durability | the number of participants is limited |
| Our approach | magnetic | scotch yoke with a piece of fabric attached; wrist and sleeve | woven cotton | various rotating frequencies | SVM | | |

Share and Cite

Shen, T.; Di Giulio, I.; Howard, M. A Probabilistic Model of Human Activity Recognition with Loose Clothing. Sensors 2023, 23, 4669. https://doi.org/10.3390/s23104669
