Datasets and Code
Dual-HGR: IMU-sEMG based Dual Hand Gesture Recognition Dataset for Virtual Reality [Dataset Download]
IMU-sEMG DATASET
This repository contains components utilized in the research study entitled "Dual-HGR: IMU-sEMG based Dual Hand Gesture Recognition Dataset for Virtual Reality".
Overview
The purpose of this research project is to develop and assess a dual hand gesture recognition system that uses surface electromyography (sEMG) and inertial measurement unit (IMU) sensors. By leveraging the complementary strengths of the two modalities, the system aims to accurately identify and classify hand gestures performed simultaneously by both hands. Key research objectives include recording a comprehensive dataset from diverse participants; designing algorithms to process and filter raw sensor data; extracting meaningful features; training and evaluating machine learning models for gesture classification; integrating these models into a real-time system; and evaluating its performance in applications such as human-computer interaction, rehabilitation, virtual reality, and assistive technologies. The ultimate goal is to advance human-computer interaction through robust methods for recognizing complex hand gestures, with potential applications in virtual reality, gaming, robotics, and medical rehabilitation.
Dataset details
This study was approved by the Office of the Defence Institute of Advanced Technology, Pune, India. The dataset was collected at the Sensors and Signal Intelligence (SenSigI) Lab, Department of Electronics Engineering, Defence Institute of Advanced Technology, Pune, India. The experimental setup consists of a Salus 4C EMG machine (i.e., a 4-channel EMG machine) and IMU wristbands. Two electrodes are placed on each of the left and right forearms of a subject, over two muscles: the extensor digitorum and the flexor carpi ulnaris. An IMU wristband is worn on each wrist.

Twenty able-bodied subjects (7 females and 13 males, S1-S20) between the ages of 20 and 28 years participated in the sEMG-IMU data collection. During the experimental session, each subject was seated on a chair with their hands in a resting position. Before data acquisition, each subject was briefed on the detailed experimental procedure and its potential risks, and was asked to fill in a consent form.

All subjects performed 14 dual hand gestures in a defined order: single finger select (SFS), double finger select (DFS), focused zoom in (FZI), focused zoom out (FZO), global zoom in (GZI), global zoom out (GZO), right hand vertical drag (RHVD), right hand horizontal drag (RHHD), right finger hold-and-scroll down (RFHSD), right finger hold-and-scroll up (RFHSU), left hand horizontal drag (LHHD), left hand vertical drag (LHVD), left finger hold-and-scroll down (LFHSD), and left finger hold-and-scroll up (LFHSU). Each subject performed each of the 14 gestures 10 times (i.e., 10 trials), giving a total of 2800 trials recorded from each sensor. Each gesture is performed for 4-5 seconds, followed by 6-7 seconds of rest.
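As a quick sanity check, the stated trial count follows directly from the collection protocol (a minimal arithmetic sketch; no dataset files are needed):

```python
# Trial count implied by the protocol described above.
subjects = 20            # S1-S20
gestures = 14            # SFS ... LFHSU
trials_per_gesture = 10  # 10 trials per gesture per subject

total_trials_per_sensor = subjects * gestures * trials_per_gesture
print(total_trials_per_sensor)  # 2800, matching the stated total
```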
Data arrangement
The data is organized into two folders, IMU_data and EMG_data. The IMU_data folder contains two sub-folders, “left” and “right”, for the left-hand and right-hand IMU data. Each of these sub-folders contains IMU data files named “si_gj_segment_tk.csv”. The EMG_data folder contains 20 sub-folders, one per subject, in which all gesture data is saved in .csv format as “Si_Gj_Tk.csv”. In both naming schemes, i is the subject number (i = 1, 2, 3, …, 20), j is the gesture number (j = 1, 2, 3, …, 14), and k is the trial number (k = 1, 2, 3, …, 10).
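For illustration, a small helper (hypothetical, not part of the released code) can recover the subject, gesture, and trial indices from either file-naming scheme:

```python
import re

# Patterns for the two naming schemes described above:
#   EMG: "Si_Gj_Tk.csv"           IMU: "si_gj_segment_tk.csv"
EMG_PATTERN = re.compile(r"S(\d+)_G(\d+)_T(\d+)\.csv", re.IGNORECASE)
IMU_PATTERN = re.compile(r"s(\d+)_g(\d+)_segment_t(\d+)\.csv", re.IGNORECASE)

def parse_trial(filename):
    """Return (subject, gesture, trial) parsed from a data file name."""
    for pattern in (IMU_PATTERN, EMG_PATTERN):
        match = pattern.fullmatch(filename)
        if match:
            return tuple(int(g) for g in match.groups())
    raise ValueError(f"Unrecognized file name: {filename}")

print(parse_trial("S3_G7_T2.csv"))            # (3, 7, 2)
print(parse_trial("s1_g14_segment_t10.csv"))  # (1, 14, 10)
```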
References
Pattajoshi, A., Sharma, S., and Sharma, R. R., Dual-HGR: Dual Hand Gesture Recognition for Virtual Reality Through IMU-sEMG Utilizing Transformer-based Multi-Sensor Fusion Network. (Submitted to Information Fusion, Elsevier)
IIITM FACE EMOTION DATASET
The IIITM Face Emotion dataset originates from the IIITM Face Data. It consists of a total of 1928 images collected from 107 participants (87 male and 20 female). The images were recorded in three vertical orientations (Front, Up, and Down) while the participants exhibited six facial expressions (Smile, Surprise, Surprise with Mouth Open, Neutral, Sad, and Yawning). The original IIITM Face dataset also provides information about various other attributes, such as gender, presence of a mustache, beard, or eyeglasses, clothes worn by the subjects, and the density of their hair. The original dataset has been modified to study facial expressions in different orientations: in the IIITM Face Emotion dataset, only the facial region is segmented for each subject, and all images are resized to a fixed dimension of 800 x 1000 pixels with an aspect ratio of 4:5. This ensures a uniform scale across varying face positions for the same subject.
The nomenclature of each image is as follows: 'SUB XX EE O'
where,
XX -> denotes the subject ID
EE -> denotes the expressed emotion ( Smile -> SM, Surprise -> SU, Surprise with mouth open -> SO, Neutral -> NE, Sad-> SA, Yawning -> YN)
O -> denotes the orientation (Front -> F, Down -> D, Up -> U)
For example, 'SUB1NEF' denotes subject 1 depicting the Neutral emotion in the Front-facing orientation.
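A minimal parser for this naming scheme might look as follows (the helper name is illustrative, not part of the dataset):

```python
import re

# Codes defined by the nomenclature above.
EMOTIONS = {"SM": "Smile", "SU": "Surprise", "SO": "Surprise with mouth open",
            "NE": "Neutral", "SA": "Sad", "YN": "Yawning"}
ORIENTATIONS = {"F": "Front", "D": "Down", "U": "Up"}

NAME_PATTERN = re.compile(r"SUB(\d+)([A-Z]{2})([FDU])")

def parse_image_name(name):
    """Return (subject_id, emotion, orientation) from a 'SUBXXEEO' name."""
    match = NAME_PATTERN.fullmatch(name)
    if not match:
        raise ValueError(f"Unrecognized image name: {name}")
    subject, emotion, orientation = match.groups()
    return int(subject), EMOTIONS[emotion], ORIENTATIONS[orientation]

print(parse_image_name("SUB1NEF"))  # (1, 'Neutral', 'Front')
```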
For using The Dataset, please cite the following papers:
[1] U. Sharma, K. N. Faisal, R. R. Sharma, and K. V Arya, “Facial Landmark-Based Human Emotion Recognition Technique for Oriented Viewpoints in the Presence of Facial Attributes,” SN Comput. Sci., vol. 4, no. 3, p. 273, 2023. https://doi.org/10.1007/s42979-023-01727-y
[2] Arya, K. V., Verma, S., Gupta, R. K., Agarwal, S., & Gupta, P. (2020). IIITM Face: A Database for Facial Attribute Detection in Constrained and Simulated Unconstrained Environments. In Proceedings of the 7th ACM IKDD CoDS and 25th COMAD (pp. 185-189).
Abstract
This study aims to create a robust hand grasp recognition system using surface electromyography (sEMG) data collected from four electrodes. The grasps used in this study are the cylindrical, spherical, tripod, lateral, hook, and pinch grasps. The proposed system seeks to address common challenges, such as electrode shift, inter-day differences, and individual differences, which have historically hindered the practicality and accuracy of sEMG-based systems. By addressing these issues, the researchers intend to develop a more reliable and user-friendly interface for applications in prosthetic devices, rehabilitation systems, and human-computer interaction.

Electrode shift refers to the slight movement or repositioning of electrodes on the skin, which can occur due to natural muscle contractions or external factors and leads to variations in the sEMG signals. Similarly, inter-day differences arise from variations across different recording days, and individual differences introduce variability between users through factors such as body mass and the thickness of surface muscles. These factors contribute to inconsistencies in the sEMG signals, making it challenging to achieve stable and repeatable gesture recognition. Furthermore, a system that is independent of the specific electrode locations on the arm is more adaptable and easier to use, allowing greater flexibility in real-world applications.

This project employs advanced signal processing and machine learning techniques to mitigate the effects of these variables, ensuring high accuracy and reliability in hand grasp recognition regardless of electrode position, recording day, or individual differences. The ultimate goal is a versatile sEMG-based control system that can be seamlessly integrated into practical applications, enhancing the quality of life for users.
Ni-HGr DATASET
This repository contains components utilized in the research study entitled "Ni-HGr: Non-Ideal Analysis for sEMG-Based Hand Grasp Recognition using Multi-Band Mutual Information". The article has been submitted for possible publication to IEEE Transactions on Instrumentation and Measurement.
Dataset details
This study was approved by the Office of the Defence Institute of Advanced Technology, Pune, India. The dataset was collected at the Sensors and Signal Intelligence (SenSigI) Lab, Department of Electronics Engineering, Defence Institute of Advanced Technology, Pune, India. The experimental setup consists of a Salus 4C EMG machine (i.e., a 4-channel EMG machine) and a fixed-size bracelet of circumference 23 cm and diameter 7.32 cm for angular estimation. The sEMG electrodes are mounted on an elbow support band to maintain their initial alignment.

Six able-bodied participants (2 females and 4 males) between the ages of 20 and 27 years took part in the EMG data collection under non-ideal conditions, namely electrode shift, individual difference, and inter-day difference. During the experimental session, each participant was seated in a chair with their hands in a resting position. EMG data was collected over 5 different days. On each day, the 6 participants each performed 5 experiments at 5 different angular positions (0°, 5°, 10°, 355°, and 350°). At every position, participants performed 6 hand grasp movements, namely the cylindrical, spherical, tripod, lateral, hook, and pinch grasps, with 5 trials per grasp type. Each grasp was executed for 4-5 seconds, followed by an inter-grasp interval of 6-7 seconds, divided into an approximately 2-second return phase, a 2-second rest phase, and a 2-second reach phase.
Data arrangement
The data collected on each day is stored in separate cells in .mat format. The data is indexed as emg_data{1,v}{1,o}{1,c}{1,m}{1,n}, where v is the day number (v = 1, 2, ..., 5), o is the subject number (o = 1, 2, ..., 6), c indexes the angular positions in the order 0°, 10°, 350°, 355°, and 5°, m indexes the six hand grasp movements in the order cylindrical (CY), hook (HO), lateral (LA), pinch (PI), spherical (SP), and tripod (TR), and n is the trial number (n = 1, 2, ..., 5).
Note: One trial of SP grasp from 0° angular position of subject 6 from day 4 is excluded from the dataset.
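Assuming the cell orderings above, a small helper can translate human-readable labels into the 1-based cell indices (v, o, c, m, n). This is a hypothetical sketch; loading the .mat file itself (e.g., with scipy.io.loadmat) is done separately and not shown:

```python
# Orderings of the c and m cell dimensions as described above.
ANGLE_ORDER = [0, 10, 350, 355, 5]                  # angular positions (degrees)
GRASP_ORDER = ["CY", "HO", "LA", "PI", "SP", "TR"]  # grasp abbreviations

def cell_indices(day, subject, angle_deg, grasp, trial):
    """Return 1-based (v, o, c, m, n) indices into emg_data."""
    return (day, subject,
            ANGLE_ORDER.index(angle_deg) + 1,
            GRASP_ORDER.index(grasp) + 1,
            trial)

# The excluded trial noted above: SP grasp, 0° position, subject 6, day 4.
print(cell_indices(day=4, subject=6, angle_deg=0, grasp="SP", trial=1))
# (4, 6, 1, 5, 1)
```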
Reference:
Sharma, S. and Sharma, R. R., Ni-HGr: Non-Ideal Analysis for sEMG-Based Hand Grasp Recognition using Multi-Band Mutual Information. (Submitted to IEEE Transactions on Instrumentation and Measurement)