TY - GEN
T1 - Sign Language Recognition of Autistic Children using Wearable Sensors Signals Features Selection
AU - Ullah, Farman
AU - Isa, Mohamed
AU - Alhammadi, Ahmed Abdulraheim
AU - Alyassi, Abdalla Ahmed Ali
AU - Yaqoob, Shumayla
AU - Jadoon, Ihtesham
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Children with autism spectrum disorder (ASD) often face difficulties in social interaction, communication, and eye contact due to the underlying neurological disorder. This disorder can cause limited social interaction, sensory processing issues, repetitive behaviours, trouble adapting to change, and executive functioning problems. This article proposes a gesture recognition framework for autistic children using wearable sensor signals and feature selection-based machine learning techniques. The system records and analyzes individuals' gestures, providing valuable data for understanding their behaviour. A dataset of 24 gestures from 10 children with ASD was collected, and various machine learning models, such as KNN-Manhattan, KNN-Euclidean, neural networks, and random forests, were trained using time- and frequency-domain features extracted from the sensor data. Cross-validation techniques were applied to evaluate the models' accuracy, with most classifiers achieving around 88% and the highest accuracy of about 91.5%. This approach demonstrates the potential of wearable technology and machine learning in improving the understanding and support of individuals with ASD.
AB - Children with autism spectrum disorder (ASD) often face difficulties in social interaction, communication, and eye contact due to the underlying neurological disorder. This disorder can cause limited social interaction, sensory processing issues, repetitive behaviours, trouble adapting to change, and executive functioning problems. This article proposes a gesture recognition framework for autistic children using wearable sensor signals and feature selection-based machine learning techniques. The system records and analyzes individuals' gestures, providing valuable data for understanding their behaviour. A dataset of 24 gestures from 10 children with ASD was collected, and various machine learning models, such as KNN-Manhattan, KNN-Euclidean, neural networks, and random forests, were trained using time- and frequency-domain features extracted from the sensor data. Cross-validation techniques were applied to evaluate the models' accuracy, with most classifiers achieving around 88% and the highest accuracy of about 91.5%. This approach demonstrates the potential of wearable technology and machine learning in improving the understanding and support of individuals with ASD.
KW - Autism
KW - Features Selection
KW - Machine Learning
KW - Wearable Sensors
UR - http://www.scopus.com/inward/record.url?scp=85216420406&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85216420406&partnerID=8YFLogxK
U2 - 10.1109/CommNet63022.2024.10793290
DO - 10.1109/CommNet63022.2024.10793290
M3 - Conference contribution
AN - SCOPUS:85216420406
T3 - Proceedings - 7th International Conference on Advanced Communication Technologies and Networking, CommNet 2024
BT - Proceedings - 7th International Conference on Advanced Communication Technologies and Networking, CommNet 2024
A2 - El Bouanani, Faissal
A2 - Ayoub, Fouad
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 7th International Conference on Advanced Communication Technologies and Networking, CommNet 2024
Y2 - 4 December 2024 through 6 December 2024
ER -