TY - GEN
T1 - EmoNet
T2 - 5th International Conference on Intelligent Computing, Communication, Networking and Services, ICCNS 2024
AU - Salloum, Said
AU - Tahat, Khalaf
AU - Mansoori, Ahmed
AU - Alfaisal, Raghad
AU - Tahat, Dina
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - In the burgeoning field of natural language processing, emotion classification from textual data has emerged as a critical task with applications ranging from sentiment analysis to mental health assessment. This paper explores the utilization of Convolutional Neural Networks (CNNs), traditionally dominant in image processing, for classifying emotions in text. Our proposed CNN model leverages the inherent hierarchical structure of language to identify and learn emotion-specific features, with an emphasis on capturing contextual n-grams through convolutional filters. The approach is substantiated by a comprehensive dataset, subjected to rigorous preprocessing and vectorization via TF-IDF to convert text into a numerical format suitable for deep learning. The model's architecture is meticulously crafted, incorporating convolutional layers followed by global max pooling and dense layers, culminating in a softmax activation function tailored for multi-class classification. Our findings demonstrate the model's robustness, achieving a notable accuracy of 96.08% on the test set. This high level of precision is further corroborated by the Receiver Operating Characteristic (ROC) analysis, revealing exceptional area under the curve (AUC) values across various emotion categories. The results suggest that CNNs hold significant promise for emotion recognition tasks in textual data, providing an effective framework for future explorations in the domain.
KW - Convolutional Neural Networks (CNNs)
KW - Emotion Classification
KW - Natural Language Processing (NLP)
KW - Receiver Operating Characteristic (ROC)
UR - http://www.scopus.com/inward/record.url?scp=85215325814&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85215325814&partnerID=8YFLogxK
U2 - 10.1109/ICCNS62192.2024.10776275
DO - 10.1109/ICCNS62192.2024.10776275
M3 - Conference contribution
AN - SCOPUS:85215325814
T3 - 2024 International Conference on Intelligent Computing, Communication, Networking and Services, ICCNS 2024
SP - 302
EP - 305
BT - 2024 International Conference on Intelligent Computing, Communication, Networking and Services, ICCNS 2024
A2 - Jararweh, Yaser
A2 - Alsmirat, Mohammad
A2 - Aloqaily, Moayad
A2 - Salameh, Haythem Bany
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 24 September 2024 through 27 September 2024
ER -