Search Results (3)

Search Parameters:
Keywords = joint angle snapping

17 pages, 5876 KB  
Article
Optimization of Knitted Strain Sensor Structures for a Real-Time Korean Sign Language Translation Glove System
by Youn-Hee Kim and You-Kyung Oh
Sensors 2025, 25(14), 4270; https://doi.org/10.3390/s25144270 - 9 Jul 2025
Viewed by 426
Abstract
Herein, an integrated system is developed based on knitted strain sensors for real-time translation of sign language into text and audio voices. To investigate how the structural characteristics of the knit affect the electrical performance, the position of the conductive yarn and the presence or absence of elastic yarn are set as experimental variables, and five distinct sensors are manufactured. A comprehensive analysis of the electrical and mechanical performance, including sensitivity, responsiveness, reliability, and repeatability, reveals that the sensor with a plain-plated-knit structure, no elastic yarn included, and the conductive yarn positioned uniformly on the back exhibits the best performance, with a gauge factor (GF) of 88. The sensor exhibited a response time of less than 0.1 s at 50 cycles per minute (cpm), demonstrating that it detects and responds promptly to finger joint bending movements. Moreover, it exhibits stable repeatability and reliability across various angles and speeds, confirming its optimization for sign language recognition applications. Based on this design, an integrated textile-based system is developed by incorporating the sensor, interconnections, snap connectors, and a microcontroller unit (MCU) with built-in Bluetooth Low Energy (BLE) technology into the knitted glove. The complete system successfully recognized 12 Korean Sign Language (KSL) gestures in real time and output them as both text and audio through a dedicated application, achieving a high recognition accuracy of 98.67%. Thus, the present study quantitatively elucidates the structure–performance relationship of a knitted sensor and proposes a wearable system that accounts for real-world usage environments, thereby demonstrating the commercialization potential of the technology.
(This article belongs to the Section Wearables)
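The gauge factor reported above is the standard figure of merit for a strain sensor: the relative resistance change divided by the applied strain. A minimal sketch of the calculation, using illustrative numbers chosen to reproduce a GF of 88 (not values taken from the paper):

```python
def gauge_factor(r0: float, r: float, strain: float) -> float:
    """Gauge factor GF = (ΔR / R0) / ε, where ΔR = R - R0 and ε is strain."""
    return ((r - r0) / r0) / strain

# Illustrative values only: a sensor at 10% strain whose resistance
# rises from 100 Ω to 980 Ω has GF = (880/100) / 0.10 = 88.
gf = gauge_factor(100.0, 980.0, 0.10)
print(gf)  # 88.0
```

A higher GF means a larger resistance swing per unit of stretch, which is why the plain-plated-knit structure's GF of 88 translates into easily detectable finger-bending signals.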

19 pages, 2949 KB  
Article
Sensor Fusion-Based Anthropomorphic Control of a Robotic Arm
by Furong Chen, Feilong Wang, Yanling Dong, Qi Yong, Xiaolong Yang, Long Zheng, Yi Gao and Hang Su
Bioengineering 2023, 10(11), 1243; https://doi.org/10.3390/bioengineering10111243 - 24 Oct 2023
Cited by 7 | Viewed by 4414
Abstract
The main goal of this research is to develop a highly advanced anthropomorphic control system utilizing multiple sensor technologies to achieve precise control of a robotic arm. Combining Kinect and IMU sensors, together with a data glove, we aim to create a multimodal sensor system for capturing rich information of human upper body movements. Specifically, the four angles of upper limb joints are collected using the Kinect sensor and IMU sensor. In order to improve the accuracy and stability of motion tracking, we use the Kalman filter method to fuse the Kinect and IMU data. In addition, we introduce data glove technology to collect the angle information of the wrist and fingers in seven different directions. The integration and fusion of multiple sensors provides us with full control over the robotic arm, giving it flexibility with 11 degrees of freedom. We successfully achieved a variety of anthropomorphic movements, including shoulder flexion, abduction, rotation, elbow flexion, and fine movements of the wrist and fingers. Most importantly, our experimental results demonstrate that the anthropomorphic control system we developed is highly accurate, real-time, and operable. In summary, the contribution of this study lies in the creation of a multimodal sensor system capable of capturing and precisely controlling human upper limb movements, which provides a solid foundation for the future development of anthropomorphic control technologies. This technology has a wide range of application prospects and can be used for rehabilitation in the medical field, robot collaboration in industrial automation, and immersive experience in virtual reality environments.
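The core of the Kinect/IMU fusion described above is a Kalman update: two noisy estimates of the same joint angle are combined, weighted inversely by their variances. A minimal scalar sketch of that single fusion step (the paper's actual filter state, noise models, and variable names are not given here; all values below are hypothetical):

```python
def fuse(angle_kinect: float, var_kinect: float,
         angle_imu: float, var_imu: float) -> tuple[float, float]:
    """Fuse two noisy estimates of one joint angle (scalar Kalman update).

    The Kalman gain k weights the correction toward the lower-variance
    sensor; the fused variance is always <= either input variance.
    """
    k = var_kinect / (var_kinect + var_imu)            # Kalman gain
    angle = angle_kinect + k * (angle_imu - angle_kinect)
    var = (1.0 - k) * var_kinect                       # fused variance
    return angle, var

# Hypothetical readings: Kinect says 42° (variance 4), IMU says 40° (variance 1).
angle, var = fuse(42.0, 4.0, 40.0, 1.0)
print(angle, var)  # 40.4 0.8 — pulled toward the more trusted IMU reading
```

A full tracker would iterate this update per frame alongside a motion-prediction step, but the weighting logic is the same.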

36 pages, 19792 KB  
Article
Broadacre Crop Yield Estimation Using Imaging Spectroscopy from Unmanned Aerial Systems (UAS): A Field-Based Case Study with Snap Bean
by Amirhossein Hassanzadeh, Fei Zhang, Jan van Aardt, Sean P. Murphy and Sarah J. Pethybridge
Remote Sens. 2021, 13(16), 3241; https://doi.org/10.3390/rs13163241 - 15 Aug 2021
Cited by 18 | Viewed by 4288
Abstract
Accurate, precise, and timely estimation of crop yield is key to a grower’s ability to proactively manage crop growth and predict harvest logistics. Such yield predictions typically are based on multi-parametric models and in-situ sampling. Here we investigate the extension of a greenhouse study to low-altitude unmanned aerial systems (UAS). Our principal objective was to investigate snap bean crop (Phaseolus vulgaris) yield using imaging spectroscopy (hyperspectral imaging) in the visible to near-infrared (VNIR; 400–1000 nm) region via UAS. We aimed to solve the problem of crop yield modelling by identifying spectral features that explain yield and evaluating the earliest time period at which accurate yield prediction is possible. We introduced a Python library, named Jostar, for spectral feature selection. Embedded in Jostar, we proposed a new ranking method for selected features that reaches an agreement between multiple optimization models. Moreover, we implemented a well-known denoising algorithm for the spectral data used in this study. This study benefited from two years of remotely sensed data, captured at multiple instances over the summers of 2019 and 2020, with 24 plots and 18 plots, respectively. Two harvest stage models, early and late harvest, were assessed at two different locations in upstate New York, USA. Six varieties of snap bean were quantified using two components of yield, pod weight and seed length. We used two different vegetation detection algorithms, the Red-Edge Normalized Difference Vegetation Index (RENDVI) and the Spectral Angle Mapper (SAM), to subset the fields into vegetation vs. non-vegetation pixels. Partial least squares regression (PLSR) was used as the regression model. Among nine different optimization models embedded in Jostar, we selected the Genetic Algorithm (GA), Ant Colony Optimization (ACO), Simulated Annealing (SA), and Particle Swarm Optimization (PSO) and their resulting joint ranking.
The findings show that pod weight can be explained with a high coefficient of determination (R2 = 0.78–0.93) and low root-mean-square error (RMSE = 940–1369 kg/ha) for two years of data. Seed length yield assessment resulted in higher accuracies (R2 = 0.83–0.98) and lower errors (RMSE = 4.245–6.018 mm). Among the optimization models used, ACO and SA outperformed the others, and the SAM vegetation detection approach showed improved results compared to the RENDVI approach when dense canopies were being examined. Wavelengths at 450, 500, 520, 650, 700, and 760 nm were identified in almost all data sets and harvest stage models used. The period between 44 and 55 days after planting (DAP) was the optimal time period for yield assessment. Future work should involve transferring the learned concepts to a multispectral system for eventual operational use; further attention should also be paid to seed length as a ground truth data collection technique, since this yield indicator is far more rapid and straightforward.
(This article belongs to the Section Remote Sensing in Agriculture and Vegetation)
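The Spectral Angle Mapper used above to separate vegetation from non-vegetation pixels computes the angle between a pixel's spectrum and a reference spectrum; a small angle indicates a similar material regardless of illumination-driven brightness differences. A minimal sketch with hypothetical four-band reflectance values (the paper's actual reference spectra and thresholds are not given here):

```python
import math

def spectral_angle(pixel: list[float], reference: list[float]) -> float:
    """Spectral Angle Mapper: angle (radians) between two spectra.

    Insensitive to overall brightness (vector magnitude); only the
    spectral shape matters. Smaller angle = more similar material.
    """
    dot = sum(p * r for p, r in zip(pixel, reference))
    norm_p = math.sqrt(sum(p * p for p in pixel))
    norm_r = math.sqrt(sum(r * r for r in reference))
    # Clamp to guard against floating-point values slightly above 1.0.
    return math.acos(min(1.0, dot / (norm_p * norm_r)))

# Hypothetical reflectance at 4 bands (blue, green, red, NIR):
veg_ref = [0.05, 0.08, 0.04, 0.50]   # vegetation: strong NIR plateau
pixel   = [0.06, 0.09, 0.05, 0.48]
print(spectral_angle(pixel, veg_ref))  # small angle → classify as vegetation
```

In practice a pixel is labeled vegetation when its angle to the reference falls below a chosen threshold, which is how the fields were subset before PLSR modelling.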
