TY - GEN
T1 - GLoT
T2 - 2024 IEEE Future Networks World Forum, FNWF 2024
AU - Shahin, Nada
AU - Ismail, Leila
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Machine Translation has played a critical role in reducing language barriers, but its adaptation for Sign Language Machine Translation (SLMT) has been less explored. Existing works on SLMT mostly use the Transformer neural network, which exhibits low performance due to the dynamic nature of sign language. In this paper, we propose a novel Gated-Logarithmic Transformer (GLoT) that captures the long-term temporal dependencies of sign language as time-series data. We perform a comprehensive evaluation of GLoT against the Transformer and Transformer-fusion models as baselines for Sign-to-Gloss-to-Text translation. Our results demonstrate that GLoT consistently outperforms the other models across all metrics. These findings underscore its potential to address the communication challenges faced by the Deaf and Hard of Hearing community.
AB - Machine Translation has played a critical role in reducing language barriers, but its adaptation for Sign Language Machine Translation (SLMT) has been less explored. Existing works on SLMT mostly use the Transformer neural network, which exhibits low performance due to the dynamic nature of sign language. In this paper, we propose a novel Gated-Logarithmic Transformer (GLoT) that captures the long-term temporal dependencies of sign language as time-series data. We perform a comprehensive evaluation of GLoT against the Transformer and Transformer-fusion models as baselines for Sign-to-Gloss-to-Text translation. Our results demonstrate that GLoT consistently outperforms the other models across all metrics. These findings underscore its potential to address the communication challenges faced by the Deaf and Hard of Hearing community.
KW - Artificial intelligence
KW - Deep learning
KW - Natural language processing
KW - Neural machine translation
KW - Neural Network
KW - Sign language translation
KW - Time-Series data
KW - Transformers
UR - https://www.scopus.com/pages/publications/105009036724
U2 - 10.1109/FNWF63303.2024.11028758
DO - 10.1109/FNWF63303.2024.11028758
M3 - Conference contribution
AN - SCOPUS:105009036724
T3 - 2024 IEEE Future Networks World Forum, FNWF 2024
SP - 885
EP - 890
BT - 2024 IEEE Future Networks World Forum, FNWF 2024
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 15 October 2024 through 17 October 2024
ER -