Landing Signalman gesture recognition dataset
These two gesture datasets accompany the article submitted to the Computers & Graphics journal: "Helicopter visual signaling simulation: Integrating VR and ML into a low-cost solution to optimize Brazilian Navy training". Both files contain 17 two-handed helicopter visual signals, with each gesture performed 25 times. They were created with MiVRy – 3D Gesture Recognition in the Unity environment, and each file contains the raw relative positions of every gesture sample, the trained neural network itself, and the metadata. The file "HVSS_gest_contr.dat" was recorded using VIVE(TM) controllers for the virtual nighttime environment, and "HVSS_gest_tracker.dat" was recorded using wrist-mounted VIVE(TM) Trackers for the virtual daytime environment. The gestures are illustrated and explained in "Landing Signalman dataset.pdf".
Steps to reproduce
The files can be reproduced with the Unity asset MiVRy – 3D Gesture Recognition by MARUI-PlugIn, using its Gesture Manager plugin.