Hand Sign Interpretation through Virtual Reality Data Processing
Dublin Core
Title
Hand Sign Interpretation through Virtual Reality Data Processing
Subject
Sign Language, Quaternion, Padded, RNN
Description
The research lays the groundwork for further advancements in VR technology, aiming to develop devices capable of interpreting sign language into speech via intelligent systems. The uniqueness of this study lies in utilizing the Meta Quest 2 VR device to gather primary hand sign data, which are subsequently classified using machine learning techniques to evaluate the device's proficiency in interpreting hand signs. The initial stages emphasized collecting hand sign data from the VR device and processing the data to understand sign patterns and characteristics. A total of 1,021 data points, comprising ten distinct hand sign gestures, were collected using a simple application developed with the Unity Editor. Each data point contained 14 parameters from both hands, aligned with the headset so that body rotation would not affect the recorded hand movements and the data accurately reflected the user's facing direction. Data processing involved padding techniques to standardize the varied data lengths that resulted from differing recording periods. Interpretation algorithm development employed Recurrent Neural Networks tailored to the characteristics of the data. Evaluation metrics encompassed accuracy, validation accuracy, loss, validation loss, and a confusion matrix. Over 15 epochs, validation accuracy stabilized at 0.9951, showing consistent performance on unseen data. This research serves as a foundation for further studies on the development of VR devices or other wearable gadgets that can function as sign language interpreters.
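The abstract notes that each recording kept the hand parameters aligned with the headset so that body rotation would not distort the data, and the subject keywords mention quaternions. A minimal sketch of that idea in Python, assuming position-plus-quaternion hand poses and using SciPy for the rotation math; the function name and parameter layout are illustrative and not taken from the paper:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def hand_pose_relative_to_headset(hand_pos, hand_quat, head_pos, head_quat):
    """Express a hand's position and orientation in the headset's frame.

    Quaternions are in (x, y, z, w) order; positions are world
    coordinates. Because the result is headset-relative, rotating the
    whole body (headset and hands together) leaves the recorded values
    unchanged, so the data reflect the user's facing direction.
    """
    head_rot = R.from_quat(head_quat)
    # Position: translate to the headset origin, then undo the headset rotation.
    rel_pos = head_rot.inv().apply(np.asarray(hand_pos) - np.asarray(head_pos))
    # Orientation: compose the hand rotation with the inverse headset rotation.
    rel_quat = (head_rot.inv() * R.from_quat(hand_quat)).as_quat()
    return rel_pos, rel_quat
```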
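The processing pipeline the abstract describes, zero-padding variable-length recordings to a common length and classifying them with a recurrent network, might look like the following Keras sketch. The feature count (14 parameters from both hands), the ten classes, and the 15 training epochs come from the abstract; the maximum sequence length, LSTM width, and optimizer are assumptions, as the record does not specify the paper's architecture:

```python
import tensorflow as tf

NUM_FEATURES = 14   # per the abstract: 14 parameters from both hands
NUM_CLASSES = 10    # ten distinct hand sign gestures
MAX_LEN = 120       # assumed maximum frames per recording
EPOCHS = 15         # training length reported in the abstract

def pad_recordings(recordings):
    """Zero-pad variable-length recordings (lists of per-frame feature
    vectors) to a common length, as the abstract's padding step describes."""
    return tf.keras.preprocessing.sequence.pad_sequences(
        recordings, maxlen=MAX_LEN, dtype="float32", padding="post")

def build_model():
    """A small LSTM classifier; the layer sizes are illustrative only."""
    model = tf.keras.Sequential([
        # Masking tells the RNN to skip the zero-padded timesteps.
        tf.keras.layers.Masking(mask_value=0.0,
                                input_shape=(MAX_LEN, NUM_FEATURES)),
        tf.keras.layers.LSTM(64),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Usage (x_* are lists of [frames, NUM_FEATURES] arrays, y_* integer labels):
# model = build_model()
# history = model.fit(pad_recordings(x_train), y_train, epochs=EPOCHS,
#                     validation_data=(pad_recordings(x_val), y_val))
```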
Creator
Teja Endra Eng Tju, Muhammad Umar Shalih
Source
http://dx.doi.org/10.21609/jiki.v17i2.1280
Publisher
Faculty of Computer Science, Universitas Indonesia
Date
2024-06-04
Contributor
Sri Wahyuni
Rights
e-ISSN: 2502-9274; printed ISSN: 2088-7051
Format
PDF
Language
English
Type
Text
Coverage
Jurnal Ilmu Komputer dan Informasi (Journal of Computer Science and Information)
Citation
Teja Endra Eng Tju, Muhammad Umar Shalih, “Hand Sign Interpretation through Virtual Reality Data Processing,” Repository Horizon University Indonesia, accessed May 22, 2025, https://repository.horizon.ac.id/items/show/8885.