Diverse distant-students deep emotion recognition and visualization

Ala'a Harb, Abdalla Gad, Maha Yaghi, Marah Alhalabi, Huma Zia, Jawad Yousaf, Adel Khelifi, Kilani Ghoudi, Mohammed Ghazal

Research output: Contribution to journal › Article › peer-review


Distance learning through online platforms has recently emerged as a significant addition to the educational system, supporting and enhancing traditional delivery modes in schools. Instructors need visual feedback to monitor students’ engagement. This research investigates the emotional state of culturally diverse students throughout a lecture and provides teachers with an interactive dashboard for monitoring students’ emotional states during a particular lecture. We collected a specialized dataset of Arab students posing various emotions specifically related to lecture scenarios. A face detection and alignment algorithm is then applied. Finally, we use Neural Architecture Search (NAS) to optimize the emotion classification architecture. The overall accuracy of the model is 86.29% on our collected dataset and 98.84% on the CK+ dataset. Live testing of the system shows the emotions clearly with real-time feedback. The AI-powered dashboard gives instructors insight into student engagement during distance learning, enabling data-driven decisions that optimize teaching strategies.
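The abstract describes a three-stage pipeline: face detection and alignment, followed by classification with a NAS-discovered network. The sketch below illustrates only the shape of that pipeline, not the authors' implementation: the alignment step, the label set, and the linear classifier standing in for the NAS-found CNN are all hypothetical placeholders.

```python
import numpy as np

# Hypothetical label set; the paper's actual emotion classes are not listed here.
EMOTIONS = ["angry", "happy", "neutral", "sad", "surprised"]

def align_face(face, landmarks):
    """Toy alignment: shift the crop so the mean of the first two landmarks
    (eye centres, by assumption) lands at the image centre. The paper applies
    a real detection/alignment algorithm; this only marks where it fits."""
    eye_mid = landmarks[:2].mean(axis=0)
    shift = np.round(np.array(face.shape[:2]) / 2 - eye_mid).astype(int)
    return np.roll(face, tuple(shift), axis=(0, 1))

def classify_emotion(face, weights, bias):
    """Linear stand-in for the NAS-optimized classifier: flatten, project,
    softmax, return the top label and the probability vector."""
    logits = face.flatten() @ weights + bias
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return EMOTIONS[int(probs.argmax())], probs

rng = np.random.default_rng(0)
face = rng.random((48, 48))                       # one grayscale face crop
landmarks = rng.random((5, 2)) * 48               # 5 detected landmarks (x, y)
W = rng.standard_normal((48 * 48, len(EMOTIONS))) * 0.01
b = np.zeros(len(EMOTIONS))

aligned = align_face(face, landmarks)
label, probs = classify_emotion(aligned, W, b)
print(label, probs.round(3))
```

In a live dashboard of the kind described, `classify_emotion` would be called per detected face per frame, and the resulting label distribution aggregated across students for the instructor's view.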

Original language: English
Article number: 108963
Journal: Computers and Electrical Engineering
Publication status: Published - Nov 2023
Externally published: Yes


Keywords

  • Computer vision
  • Data augmentation
  • Deep learning
  • Face alignment
  • Facial emotion recognition
  • Neural Architecture Search (NAS)

ASJC Scopus subject areas

  • Control and Systems Engineering
  • General Computer Science
  • Electrical and Electronic Engineering

