Smart Textile Gloves Powered by Machine Learning for Accurate Hand Movement Capture

In an article recently published in the journal Nature Machine Intelligence, researchers proposed machine learning (ML)-powered stretchable smart textile gloves to effectively and accurately capture complex object interactions and hand movements.

Study: Smart Textile Gloves Powered by Machine Learning for Accurate Hand Movement Capture. Image credit: FOTOGRIN/Shutterstock

Limitations of camera-based systems

Real-time hand-movement tracking has notable applications in robotics, sports training, rehabilitation, telesurgery, augmented reality and the metaverse, electronic gaming, and human-computer interaction. Recent advances in flexible electronics and ML have enabled significant progress in gesture recognition and tracking using both wearable and computer vision (CV) technologies.

Expensive, fixed marker-based motion-capture camera systems are commonly used for detailed, articulated finger and hand tracking. CV-based solutions, which use one or several cameras placed at a fixed location or attached to a user’s headset, serve as low-cost consumer alternatives. However, both approaches are spatially constrained to the cameras’ field of view and face significant challenges from occlusion by hands or objects, background noise, and poor lighting.

Modern wearable technologies for gesture recognition mostly take the form of arm sleeves, wristbands, or gloves. Various sensors, including inertial measurement units (IMUs) and surface electromyography electrodes, can be integrated into these devices.

However, most of these devices have not effectively addressed washability, accuracy, and reliability, and can identify only specific gestures with limited accuracy, which restricts their practical usability. Overall, capturing realistic hand movements remains challenging because of the hand’s many articulations and degrees of freedom.

The proposed solution

In this study, researchers proposed stretchable, washable, accurate, and multi-modal ML-powered smart textile gloves with embedded stretchable helical sensor yarns (HSYs), IMUs, and interconnects to dynamically and precisely track finger and hand movements and grasp forces during object interactions.

The smart textile glove’s key functionalities were joint-angle estimation and the detection of grasp pressure during object interactions. The stretchable HSYs embedded in the smart glove consisted of an elastic core yarn helically wrapped with metal-coated nanofibres.

Additionally, the HSY structure was finished with a protective polydimethylsiloxane coating and a silicone outer shell. An ML model, designated GlovePoseML, was developed to dynamically estimate the joint angles of the wrist and all finger joints with high accuracy. GlovePoseML formed the core of the algorithm and was supplemented at the output with neural network (NN)-based models that adjust the response for specific demonstrations and applications.

For GlovePoseML development and training, a large dataset of more than 3,000,000 frames, sampled at 20 Hz, was collected from five participants with various hand sizes using the smart textile glove with motion-capture markers attached.

Specifically, the participants performed complex transient hand movements across different tasks, including switching between multiple gestures, grasping objects, and random finger movements. The GlovePoseML model comprised a two-layer stacked bidirectional long short-term memory (Bi-LSTM) recurrent neural network (RNN), two activation layers, and two fully connected (FC) layers, as sketched below.
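A minimal PyTorch sketch of such an architecture is given below. The sensor channel count, hidden width, and joint count are illustrative assumptions for this sketch, not the paper’s published hyperparameters.

```python
import torch
import torch.nn as nn

class GlovePoseModel(nn.Module):
    def __init__(self, n_sensors=16, hidden=128, n_joints=21):
        super().__init__()
        # Two stacked bidirectional LSTM layers over the sensor time series.
        self.lstm = nn.LSTM(input_size=n_sensors, hidden_size=hidden,
                            num_layers=2, batch_first=True, bidirectional=True)
        # Two fully connected layers, each preceded by an activation layer.
        self.head = nn.Sequential(
            nn.ReLU(),
            nn.Linear(2 * hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_joints),   # one regressed angle per joint
        )

    def forward(self, x):                  # x: (batch, time, n_sensors)
        out, _ = self.lstm(x)              # out: (batch, time, 2 * hidden)
        return self.head(out[:, -1, :])    # angles at the most recent frame

model = GlovePoseModel()
window = torch.randn(8, 40, 16)            # 2 s of history at 20 Hz = 40 frames
angles = model(window)                     # shape (8, 21): estimated angles
```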

The researchers used this regression architecture to estimate tactile information and hand-joint angles from incoming data; for training, the model took a 2 s history of normalized motion-capture and sensor data as input. For inter-participant cross-validation, one participant’s data was held out as the test set while the remaining participants’ data formed the training set; this step was repeated for all participants before the results were averaged, as in the sketch below.
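This leave-one-participant-out protocol can be sketched as follows; the train_fn and eval_fn helpers and the participant-keyed data dictionary are hypothetical stand-ins for the actual training and evaluation pipeline.

```python
import numpy as np

def inter_participant_cv(data_by_participant, train_fn, eval_fn):
    """Hold out each participant in turn and average the test errors."""
    errors = []
    for held_out, test_data in data_by_participant.items():
        train_data = {p: d for p, d in data_by_participant.items()
                      if p != held_out}
        model = train_fn(train_data)          # fit on the other participants
        errors.append(eval_fn(model, test_data))
    return float(np.mean(errors))             # average error across all folds
```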

For intra-participant cross-validation, tenfold cross-validation was performed over each participant’s data and the results averaged. After the core model was trained, the test data was fed to the model, and the estimated joint angles were streamed over a WebSocket for three-dimensional (3D) visualization in Unity, along the lines of the sketch below.
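A minimal sketch of this streaming step is shown below, using the Python websockets library. The port, message schema, and the get_next_joint_angles() stub are assumptions for illustration, not the authors’ actual code.

```python
import asyncio
import json
import websockets

def get_next_joint_angles():
    # Stub standing in for the trained model's per-frame output (21 angles).
    return [0.0] * 21

async def stream_angles(websocket, *_):      # *_ absorbs the legacy `path` arg
    while True:
        angles = get_next_joint_angles()
        await websocket.send(json.dumps({"joints": angles}))
        await asyncio.sleep(0.05)             # match the 20 Hz sampling rate

async def main():
    # A Unity client would connect to ws://localhost:8765 and animate a hand.
    async with websockets.serve(stream_angles, "localhost", 8765):
        await asyncio.Future()                # serve until cancelled

asyncio.run(main())
```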

Significance of the study

Reliable tracking of complex finger and hand movements during object interactions was realized using the ML-powered stretchable smart textile gloves. The insulated, lightweight sensor yarns demonstrated a high dynamic range, responding to strains as high as 155% and as low as 0.005%, along with high stability and low hysteresis over extensive washing and use cycles.

Additionally, the ML-powered smart gloves displayed high accuracy in estimating hand-joint angles. Using the multi-stage ML pipeline, the average joint-angle estimation root mean square errors (RMSEs) were below 1.45° and 1.21° for inter- and intra-participant cross-validation, respectively, matching the accuracy of expensive motion-capture cameras without their field-of-view and occlusion limitations.
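For reference, the reported error metric can be computed as in the sketch below; the array shapes are assumptions for illustration.

```python
import numpy as np

def joint_rmse(estimated, ground_truth):
    """Average RMSE (degrees) between estimated and motion-capture angles.

    Both inputs are (n_frames, n_joints) arrays of joint angles in degrees.
    """
    err = np.asarray(estimated) - np.asarray(ground_truth)
    per_joint = np.sqrt(np.mean(err ** 2, axis=0))   # RMSE for each joint
    return per_joint.mean()                          # average across joints
```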

Moreover, a data augmentation technique was demonstrated that enhances robustness to sensor noise and variation; a sketch of this idea follows below. Based on these results, the researchers demonstrated complex potential applications of the proposed ML-powered smart gloves, including object recognition from grasp patterns, static gesture recognition, highly accurate dynamic gesture recognition, accurate typing on a mock paper keyboard, and dynamic tracking of finger and hand movements.
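A hedged sketch of sensor-noise augmentation of this kind, jittering training windows with additive Gaussian noise and a small per-channel gain drift, is shown below; the exact augmentation used in the paper may differ.

```python
import numpy as np

def augment_window(window, noise_std=0.01, gain_std=0.02, rng=None):
    """window: (time, n_sensors) array of normalized sensor readings."""
    rng = rng or np.random.default_rng()
    # Simulate per-channel sensor variation as a small multiplicative gain.
    gain = 1.0 + rng.normal(0.0, gain_std, size=(1, window.shape[1]))
    # Simulate measurement noise as additive Gaussian jitter.
    noise = rng.normal(0.0, noise_std, size=window.shape)
    return window * gain + noise
```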

To summarize, the findings of this study demonstrated that the proposed ML-powered smart gloves can be used for tracking dexterous hand movements and object interactions with high accuracy and without the limitations of camera-based systems.


Written by

Samudrapom Dam

Samudrapom Dam is a freelance scientific and business writer based in Kolkata, India. He has been writing articles related to business and scientific topics for more than one and a half years. He has extensive experience in writing about advanced technologies, information technology, machinery, metals and metal products, clean technologies, finance and banking, automotive, household products, and the aerospace industry. He is passionate about the latest developments in advanced technologies, the ways these developments can be implemented in real-world situations, and how these developments can positively impact common people.

