Abstract
In recent years, several software and hardware solutions have been proposed for object detection, motion tracking, and gesture identification. However, these solutions have failed to identify gestures and track their motion efficiently within a stipulated time range. To overcome these challenges, we propose a novel sign language translator application that uses the MediaPipe Holistic model together with a Long Short-Term Memory (LSTM) neural network to build the proposed translator model. The model picks up gestures from different angles quickly and accurately and translates them into the appropriate text. The work is divided into two parts: the first covers research and data collection, and the second covers training and testing the collected data for real-time implementation. The model has been trained and tested on real-time data collected from a Leap Motion controller and cameras, and it yields appreciable results.
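The pipeline the abstract describes, per-frame keypoints (e.g. from MediaPipe Holistic) fed as a time sequence into an LSTM, can be sketched in miniature. The block below is an illustrative, dependency-free forward pass of a single LSTM cell over a sequence of flattened landmark vectors; it is not the authors' implementation, and the random weight initialization, dimensions, and the `run_sequence` helper are assumptions made purely for the sketch.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_cell(x, h_prev, c_prev, W, b):
    """One LSTM time step over a single frame's feature vector x.

    W maps each gate ('i', 'f', 'o', 'g') to a weight matrix applied to
    the concatenation [h_prev, x]; b maps each gate to its bias vector.
    """
    concat = h_prev + x  # concatenate previous hidden state and input

    def affine(gate):
        return [sum(w * v for w, v in zip(row, concat)) + b[gate][i]
                for i, row in enumerate(W[gate])]

    i_g = [sigmoid(v) for v in affine("i")]    # input gate
    f_g = [sigmoid(v) for v in affine("f")]    # forget gate
    o_g = [sigmoid(v) for v in affine("o")]    # output gate
    g   = [math.tanh(v) for v in affine("g")]  # candidate cell state

    c = [f * cp + i * gv for f, cp, i, gv in zip(f_g, c_prev, i_g, g)]
    h = [o * math.tanh(cv) for o, cv in zip(o_g, c)]
    return h, c

def run_sequence(frames, hidden):
    """Run the cell over a gesture clip (list of landmark vectors).

    The final hidden state summarizes the sequence and would feed a
    classifier head in a full model; weights here are random for
    illustration, not trained parameters.
    """
    random.seed(0)
    n_in = len(frames[0])
    dims = hidden + n_in
    W = {g: [[random.uniform(-0.1, 0.1) for _ in range(dims)]
             for _ in range(hidden)] for g in "ifog"}
    b = {g: [0.0] * hidden for g in "ifog"}

    h, c = [0.0] * hidden, [0.0] * hidden
    for x in frames:
        h, c = lstm_cell(x, h, c, W, b)
    return h
```

In a real system, each frame vector would hold the hand, pose, and face landmarks that MediaPipe Holistic produces, and the LSTM weights would be learned; the per-gate recurrence above is what lets the model accumulate motion information across frames rather than classifying each frame independently.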
Original language | English |
---|---|
Pages (from-to) | 425-430 |
Number of pages | 6 |
Journal | Procedia Computer Science |
Volume | 224 |
DOIs | |
Publication status | Published - 2023 |
Event | 18th International Conference on Future Networks and Communications, FNC 2023 / 20th International Conference on Mobile Systems and Pervasive Computing, MobiSPC 2023 / 13th International Conference on Sustainable Energy Information Technology, SEIT 2023 - Halifax, Canada Duration: Aug 14 2023 → Aug 16 2023 |
Keywords
- Sign language
- artificial intelligence (AI)
- convolutional neural networks (CNN)
- hand gesture recognition
- machine learning
ASJC Scopus subject areas
- General Computer Science