Proceeding Paper

Analysing the Contributing Factors to Activity Recognition with Loose Clothing †

Department of Engineering, King’s College London, London WC2R 2LS, UK
* Author to whom correspondence should be addressed.
Presented at the 5th International Conference on the Challenges, Opportunities, Innovations and Applications in Electronic Textiles, Ghent, Belgium, 14–16 November 2023.
Eng. Proc. 2023, 52(1), 10; https://doi.org/10.3390/engproc2023052010
Published: 15 January 2024
(This article belongs to the Proceedings of Eng. Proc., 2023, E-Textiles 2023)

Abstract

The integration of sensors into garments has paved the way for human activity recognition (AR), enabling extended recordings of human motion. The inherent fluidity of loose clothing allows it to mirror the wearer’s movements. From a statistical standpoint, clothing captures additional valuable insights beyond rigid body motions, improving AR. This work demonstrates how a fabric’s orientation, layering, and width contribute to the enhanced performance of AR with clothing in periodic motion. Experiments are reported in which a scotch yoke and a KUKA robot manipulator are used to induce the periodic motion of fabric cloth at different frequencies. These reveal that clothing-attached sensors achieve higher frequency classification accuracy than rigidly attached sensors, with improvements of 27% for perpendicular-oriented fabric, 18% for triple-layered fabric, and 9% for large-width fabric.

1. Introduction

Human activity recognition (AR) is the process of interpreting human motion based on data [1]. In other words, it is the ability to classify human movements into discrete categories. For instance, basic movements can be categorised into walking, running, sitting, and standing. Motion data acquisition for human AR can be either vision-based, via video recordings, or sensor-based, via the attachment of sensors [1]. Activity recognition with loose clothing may hold potential for accelerating the deployment of e-textile technology. For instance, extended motion recordings outside the lab could be made possible with loose clothing while preserving the comfort of ordinary textile garments [2]. Loose hospital gowns could be fitted with sensors for human AR to monitor the physical activity of patients [3]. This may help in the detection of Parkinson’s disease and Essential Tremor via changes in balance [4] and rhythmic movements [5], respectively.
Numerous studies [6] of sensor-based human AR involve on-body wearables that are fixed with straps, tape, or tight clothing. Loosely attaching sensors to clothing for human AR is uncommon, as it is perceived to be accompanied by undesirable artefacts of fabric motion that corrupt the recordings of actual body movement [7]. Yet, textile garments are not limited to tight clothing and frequently consist of loose clothing in a variety of fabric materials and styles, owing to its added comfort over tight-fitting clothing.
The few studies [2,8] that have used data recorded from sensors on loose clothing have reported a surprising degree of success in human AR. Among these is a recent study [9] that expressed motion via a probabilistic model composed of rigid body motion and an offset due to fabric movement. That offset appeared to improve AR, revealing higher accuracy in distinguishing between different movements of an object via sensors loosely attached to cloth compared to rigidly attached ones.
The probabilistic model incorporated the length of the fabric as a contributing factor in the predicted accuracy of AR [9]. However, several other, as-yet unexplored factors may play a role in introducing the added fabric offset. For instance, environmental factors including air currents and humidity could affect AR. Factors arising from the fabric itself include its weight, length, width, orientation, layering, stiffness, and elasticity. These properties affect the movement of the cloth during its interaction with air, which in turn may affect the performance of activity recognition. This paper investigates the impact of the fabric’s orientation, layering, and width on accuracy when classifying movements of different frequencies. Moreover, it analyses how these factors can be tuned to enhance activity recognition with loose clothing.

2. Materials and Methods

2.1. Experimental Setup

The experimental setup of this study is shown in Figure 1a. It was composed of a scotch yoke, woven cotton fabric cloth, a magnetic tracking device, and development computers. The scotch yoke is a reciprocating mechanism that converts rotary motion into linear motion and is used to induce simple harmonic motion along a single axis. It consists of laser-cut acrylic sheets assembled into the design shown in Figure 1a and is driven by a DC motor (30:1, 37D gear-motor, Pololu Corporation, Las Vegas, NV, USA). The motor is operated by an L298N driver, which is supplied with 10 V from a DC power supply and outputs a voltage that drives the motor at a constant velocity, as commanded by an Arduino Uno via pulse width modulation. The motor angular velocity signalled from the built-in encoder is monitored via a serial interface and fixed at 3.1 rad/s for low-frequency movement and 4.4 rad/s for high-frequency movement.
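As a rough illustration of the motion this mechanism induces, the following Python sketch generates the ideal scotch-yoke displacement x(t) = r sin(ωt) at the two drive speeds; the crank radius is an assumed placeholder (the yoke geometry is not reported here), while the sampling rate and duration match the recordings described in Section 2.3.

```python
import numpy as np

def yoke_displacement(omega_rad_s, radius_m=0.05, fs_hz=40, duration_s=10.0):
    """Ideal scotch-yoke hanger displacement x(t) = r * sin(omega * t).

    radius_m is an assumed placeholder; fs_hz and duration_s match the
    40 Hz, 10 s trajectories recorded with the Aurora tracker.
    """
    t = np.arange(0.0, duration_s, 1.0 / fs_hz)
    return t, radius_m * np.sin(omega_rad_s * t)

# Drive speeds reported for the experiments (rad/s).
t, x_low = yoke_displacement(3.1)   # low-frequency movement
_, x_high = yoke_displacement(4.4)  # high-frequency movement
print(len(t))                       # 400 samples per 10 s trajectory
```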
The fabric used in the experiments consisted of woven cotton cloth cut into strips of fixed length (30 cm) and different widths: small (5 cm), medium (10 cm), and large (15 cm). Double- and triple-layered fabric strips were formed by joining fabric strips together with basting stitches around their perimeter using a sewing machine. The cloth was attached to the movement device by means of a hanger mounted on the yoke (see Figure 1a).
To measure the fabric motion, an NDI Aurora Magnetic Tracking Device (Waterloo, ON, Canada) was set up with four sensors (R1, F2, F3, and F4). These were mounted along the cloth at 0, 10, 20, and 30 cm, respectively, as shown in Figure 1b. The rigidly attached sensor (R1) was mounted on the hanger and the clothing-attached sensors (F2, F3, and F4) were mounted on the loose fabric cloth. The sensors recorded the absolute position of the mechanism and cloth at a frequency of 40 Hz.

2.2. Experiments

Three experiments were executed (summarised in Table 1) to study the effect of fabric orientation, layering, and width on activity recognition. Following [9], the activity recognition task was simply to distinguish between low- versus high-frequency movements of the yoke. Figure 1c,d show the fabric cloth in perpendicular and parallel orientation to the scotch yoke, respectively.

2.3. Data Collection

While the device was in motion, positional tracking data were recorded for 10 s at a sampling rate of 40 Hz using the Aurora Software (NDI Toolbox 5.002.022), resulting in 400 datapoints recorded per trajectory. This was repeated such that 30 trajectories were recorded for each movement type (low and high frequency) in each experimental condition. A total of 120 trajectories were collected for Experiment 1, and 180 trajectories were collected for each of experiments 2 and 3.
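As a quick check of these counts, a short sketch (using the condition counts from Table 1) reproduces the trajectory totals reported above:

```python
samples_per_trajectory = 40 * 10    # 40 Hz for 10 s -> 400 datapoints
trials_per_movement = 30            # trajectories per movement type
movement_types = 2                  # low- and high-frequency motion

# Levels of the independent variable in each experiment (see Table 1).
levels = {"Experiment 1 (orientation)": 2,
          "Experiment 2 (layering)": 3,
          "Experiment 3 (width)": 3}

for name, k in levels.items():
    print(name, k * movement_types * trials_per_movement)  # 120, 180, 180
```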

2.4. Frequency Classification

Frequency classification was performed using Support Vector Machines (SVMs), implemented with the LIBSVM library (version 3.31) [10] in MATLAB R2022 (MathWorks, Natick, MA, USA), following the approach reported in [9]. The default SVM configuration and parameters were used, with non-linearity in the dataset accommodated through the Gaussian radial basis function (RBF) kernel. As in [8,9], the classification model was trained based on the mapping $\varphi_i \mapsto l_i$ such that
$$\boldsymbol{\varphi} := (\varphi_1, \varphi_2, \varphi_3, \ldots) = \left( (d_1, d_2, \ldots, d_n)^\top,\ (d_2, d_3, \ldots, d_{n+1})^\top,\ (d_3, d_4, \ldots, d_{n+2})^\top,\ \ldots \right),$$
where $\varphi_i$ denotes a segment of the motion trajectory $\boldsymbol{\varphi}$, $l_i$ denotes the corresponding class label in {0, 1}, $d$ denotes a datapoint in the trajectory (in time-ordered sequence), and $n$ denotes the segment size. Each trajectory is sliced into overlapping segments of size $n$ (40 Hz × window size). The first segment is passed to train the classifier and make decisions; the window is then shifted forward by one time step to form the second segment, which is passed for training, and the process repeats until the end of the trajectory is reached. This is repeated for 100 trials while varying the sample of trajectories chosen for testing.
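The segmentation and classification procedure can be sketched in Python as follows. The study itself used LIBSVM in MATLAB; this sketch substitutes scikit-learn’s SVC (which is built on LIBSVM) with default parameters and an RBF kernel, trains on all overlapping segments at once rather than passing them incrementally, and uses synthetic sine trajectories purely as stand-ins for the recorded Aurora position traces.

```python
import numpy as np
from sklearn.svm import SVC

def segment(trajectory, n):
    """Slice a 1-D trajectory into overlapping windows of n datapoints,
    shifting forward by one time step (the mapping phi_i -> l_i)."""
    return np.array([trajectory[i:i + n] for i in range(len(trajectory) - n + 1)])

def build_dataset(trajectories, labels, window_s=1.0, fs_hz=40):
    """Stack windowed segments from several trajectories with class labels."""
    n = int(fs_hz * window_s)        # e.g. 40 datapoints for a 1.0 s window
    X, y = [], []
    for traj, label in zip(trajectories, labels):
        segs = segment(np.asarray(traj), n)
        X.append(segs)
        y.append(np.full(len(segs), label))
    return np.vstack(X), np.concatenate(y)

# Synthetic stand-ins for low (label 0) and high (label 1) frequency traces.
t = np.arange(400) / 40.0
low = np.sin(3.1 * t) + 0.05 * np.random.randn(400)
high = np.sin(4.4 * t) + 0.05 * np.random.randn(400)

X, y = build_dataset([low, high], labels=[0, 1])
clf = SVC(kernel="rbf").fit(X, y)    # default parameters, Gaussian RBF kernel
print(clf.score(X, y))               # training accuracy on this toy dataset
```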

3. Results

Figure 2a–c show frequency classification accuracy plots with error bars of one standard deviation for the four sensors at a selected window size (1.0 s) when using different fabric orientations, layering, and widths, respectively. Clothing-attached sensors (F4, F3, and F2) demonstrate higher classification accuracies than the rigidly attached sensor (R1) in most of the experimental conditions shown in Figure 2.
Considering orientation, as can be seen in Figure 2a, the classification accuracy of sensor F4 on perpendicular-oriented fabric was, on average, 5% greater than that of R1, while the parallel-oriented fabric revealed almost no difference. Looking at Figure 2b, the triple-layered fabric resulted in a difference in accuracy of 18%, followed by the double-layered (15%) and the single-layered fabric (5%). The unforeseen increase in the accuracy of R1 in the triple-layered fabric experiment, despite the fixed drive frequencies, may be due to the added weight of layering affecting the movement of the scotch yoke, in turn increasing the overall accuracy for all sensors.
Considering fabric width (Figure 2c), the greatest difference in accuracy was seen between F3 and R1. For the large-width fabric, the difference was 9%, followed by the medium-width (4%) and the small-width fabric (1%). It is noticeable that F4 no longer gave the highest accuracy among the sensors with medium- and large-width fabric. This suggests that larger-width fabrics exhibit more complex dynamics, which shifts the optimal sensor position to an intermediate point on the fabric’s surface.
To extend the analysis, Experiment 1 was repeated with the KUKA LBR iiwa robot manipulator (Augsburg, BY, Germany) in place of the scotch yoke (Figure 1e) to determine the effect of a different movement pattern. The robot manipulator was programmed with basic spline linear (SLIN) motion commands in KUKA Sunrise Workbench (KUKA Sunrise.OS 1.11) to induce the periodic motion with a trapezoidal velocity profile. The peak velocity was set to 150 mm/s for the low frequency and 300 mm/s for the high frequency.
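The commanded motion can be pictured with a minimal sketch of a symmetric trapezoidal velocity profile for one stroke; only the peak velocities (150 and 300 mm/s) come from the setup described above, and the acceleration and stroke length are assumed placeholders.

```python
import numpy as np

def trapezoidal_velocity(peak_mm_s, accel_mm_s2=600.0, stroke_mm=150.0, fs_hz=40):
    """Velocity samples for one stroke of a symmetric trapezoidal profile.

    accel_mm_s2 and stroke_mm are assumed placeholders; only the peak
    velocities are taken from the experiment description.
    """
    t_ramp = peak_mm_s / accel_mm_s2                        # accel/decel time
    d_ramp = 0.5 * peak_mm_s * t_ramp                       # distance per ramp
    t_cruise = max(stroke_mm - 2.0 * d_ramp, 0.0) / peak_mm_s
    t_total = 2.0 * t_ramp + t_cruise
    t = np.arange(0.0, t_total, 1.0 / fs_hz)
    v = np.where(t < t_ramp, accel_mm_s2 * t,
        np.where(t < t_ramp + t_cruise, peak_mm_s,
                 peak_mm_s - accel_mm_s2 * (t - t_ramp - t_cruise)))
    return t, v

t_low, v_low = trapezoidal_velocity(150.0)    # low-frequency stroke
t_high, v_high = trapezoidal_velocity(300.0)  # high-frequency stroke
```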
Figure 2d,e show the classification accuracy across different window sizes for the two fabric orientations. Both graphs reveal increases in sensor accuracy with greater window sizes until converging to 100% accuracy at a window size of 2.5 s (100 datapoints). The rate at which the sensors reached maximum accuracy varied in the ascending order R1, F2, F3, F4; this supports the findings in [9], with clothing-attached sensors exhibiting higher accuracies than rigidly attached ones. Moreover, as in the case of simple harmonic motion, the difference was more pronounced when the fabric was oriented perpendicularly to the motion (Figure 2d) than in parallel (Figure 2e). At a window size of 1.0 s, the difference in classification accuracy between R1 and F4 was 27% using fabric with a perpendicular orientation, compared to 10% using parallel-oriented fabric.

4. Discussion

The improved classification accuracy of perpendicular-oriented fabric over parallel-oriented fabric may be explained via their interaction with air. Perpendicular-oriented fabric is “head on” to the direction of airflow, such that incoming air pushes against it rather than sliding past. Similarly, enlarging fabric width exposes more surface area to contact with air. Hence, a greater range of motion is achieved in both cases, improving AR.
The increased classification accuracy of multi-layered fabric may be due to the added weight in motion. An object’s momentum is the product of its mass and velocity. Given the same velocity, the triple-layered fabric exhibited higher momentum in periodic motion than the single-layered fabric due to its tripled mass, making it swing further. In turn, this enlarged the separation between high- and low-frequency data for clothing-attached sensors.
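Written out, the argument is simply that at a common velocity $v$, tripling the fabric mass triples its momentum:
$$p = mv, \qquad p_{\text{triple}} = (3m)\,v = 3\,p_{\text{single}},$$
so the heavier fabric carries more momentum into each swing and travels further before being arrested by air resistance.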
These findings may be used in the design of garments considering the desired AR task. For instance, garments constructed of heavier textiles may perform better if sensors are positioned away from the extremities.

5. Conclusions

In conclusion, tuning fabric parameters such as orientation, layering, and width can contribute to and enhance AR with loose clothing. It is speculated that fabric that is perpendicularly oriented, triple-layered, and of large width demonstrates higher classification accuracy because this set of parameters allows the fabric to experience more air turbulence during motion. These findings may help refine the probabilistic model of fabric motion introduced in earlier studies, contributing to the development of human AR with loose electronic textiles.

Author Contributions

Conceptualisation, R.A. and M.H.; methodology, R.A.; software, R.A. and T.S.; validation, R.A. and M.H.; formal analysis, R.A.; investigation, R.A.; resources, M.H.; data curation, R.A.; writing—original draft preparation, R.A.; writing—review and editing, R.A., T.S. and M.H.; visualisation, R.A.; co-supervision, T.S.; supervision, M.H.; project administration, R.A.; funding acquisition, R.A., T.S. and M.H. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by King’s College London and the Engineering and Physical Sciences Research Council (EP/M507222/1). The first author acknowledges funding from the Saudi Ministry of Education and KACST (King Abdulaziz City for Science and Technology) for sponsoring her MSc studies while completing this research. The second author acknowledges funding from the China Scholarship Council for his PhD studies.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are publicly available through the following URL: https://github.com/renad-allagani/Dataset-of-Factors-Contributing-to-AR-with-Loose-Clothing (accessed on 26 November 2023).

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Isakava, T. Human Activity Recognition: Everything You Should Know. InData Labs. Available online: https://indatalabs.com/blog/human-activity-recognition (accessed on 21 August 2023).
2. Jayasinghe, U.; Hwang, F.; Harwin, W.S. Comparing Loose Clothing-Mounted Sensors with Body-Mounted Sensors in the Analysis of Walking. Sensors 2022, 22, 6605.
3. Chesser, M.; Jayatilaka, A.; Visvanathan, R.; Fumeaux, C.; Sample, A.; Ranasinghe, D.C. Super Low Resolution RF Powered Accelerometers for Alerting on Hospitalized Patient Bed Exits. In Proceedings of the 2019 IEEE International Conference on Pervasive Computing and Communications (PerCom), Kyoto, Japan, 11–15 March 2019; IEEE: New York, NY, USA, 2019; pp. 1–10.
4. Parkinson’s Disease—Symptoms. NHS. Available online: https://www.nhs.uk/conditions/parkinsons-disease/symptoms/ (accessed on 23 October 2023).
5. Essential Tremor—Symptoms and Causes. Mayo Clinic. Available online: https://www.mayoclinic.org/diseases-conditions/essential-tremor/symptoms-causes/syc-20350534 (accessed on 21 August 2023).
6. Lara, O.D.; Labrador, M.A. A Survey on Human Activity Recognition using Wearable Sensors. IEEE Commun. Surv. Tutor. 2013, 15, 1192–1209.
7. Michael, B.; Howard, M. Eliminating motion artifacts from fabric-mounted wearable sensors. In Proceedings of the 2014 IEEE-RAS International Conference on Humanoid Robots, Madrid, Spain, 18–20 November 2014; pp. 868–873.
8. Michael, B.; Howard, M. Activity recognition with wearable sensors on loose clothing. PLoS ONE 2017, 12, e0184642.
9. Shen, T.; Di Giulio, I.; Howard, M. A Probabilistic Model of Human Activity Recognition with Loose Clothing. Sensors 2023, 23, 4669.
10. Chang, C.-C.; Lin, C.-J. LIBSVM—A Library for Support Vector Machines. Available online: http://www.csie.ntu.edu.tw/~cjlin/libsvm/ (accessed on 21 August 2023).
Figure 1. (a) Scotch yoke experiment setup. (b) Attachment of motion tracking sensors. Fabric in (c) perpendicular and (d) parallel orientation. (e) KUKA robot manipulator experiment setup.
Figure 2. Scotch yoke classification accuracy plots at window 1.0 s for different fabric (a) orientations, (b) layering, and (c) widths. KUKA manipulator classification accuracy across different window sizes for fabric in (d) perpendicular and (e) parallel orientation. Shown are means ± s.d. over 100 trials.
Table 1. Experimental variables.

Experiment 1: Independent variable: fabric orientation (perpendicular, parallel). Dependent variable: frequency classification. Controlled variables: single layer, small width.
Experiment 2: Independent variable: fabric layering (single, double, triple). Dependent variable: frequency classification. Controlled variables: perpendicular orientation, small width.
Experiment 3: Independent variable: fabric width (small, medium, large). Dependent variable: frequency classification. Controlled variables: perpendicular orientation, single layer.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
