Title
Multimodal Engagement Analysis From Facial Videos in the Classroom
Authors
Keywords
-
Journal
IEEE Transactions on Affective Computing
Volume 14, Issue 2, Pages 1012-1027
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Online
2021-11-13
DOI
10.1109/TAFFC.2021.3127692
References
Related references
Note: Only part of the references are listed.
- Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2
- (2021) Michal Tölgyessy et al. Sensors
- Leveraging Recent Advances in Deep Learning for Audio-Visual Emotion Recognition
- (2021) Liam Schoneveld et al. Pattern Recognition Letters
- Biosensors Show Promise as a Measure of Student Engagement in a Large Introductory Biology Course
- (2020) Karen S. McNeal et al. CBE-Life Sciences Education
- Automated gaze-based mind wandering detection during computerized learning in classrooms
- (2019) Stephen Hutt et al. User Modeling and User-Adapted Interaction
- A Computer-Vision Based Application for Student Behavior Monitoring in Classroom
- (2019) Bui Ngoc Anh et al. Applied Sciences-Basel
- Affective Computing for Large-scale Heterogeneous Multimedia Data
- (2019) Sicheng Zhao et al. ACM Transactions on Multimedia Computing, Communications, and Applications
- Student engagement, assessed using heart rate, shows no reset following active learning sessions in lectures
- (2019) Diana K. Darnell et al. PLoS One
- Automatic Detection of Mind Wandering from Video in the Lab and in the Classroom
- (2019) Nigel Bosch et al. IEEE Transactions on Affective Computing
- Brain-to-Brain Synchrony and Learning Outcomes Vary by Student–Teacher Dynamics: Evidence from a Real-world Classroom Electroencephalography Study
- (2018) Dana Bevilacqua et al. Journal of Cognitive Neuroscience
- Deep learning for affective computing: Text-based emotion recognition in decision support
- (2018) Bernhard Kratzwald et al. Decision Support Systems
- Predicting students’ attention in the classroom from Kinect facial and body features
- (2017) Janez Zaletelj et al. EURASIP Journal on Image and Video Processing
- EEG in the classroom: Synchronised neural recordings during video presentation
- (2017) Andreas Trier Poulsen et al. Scientific Reports
- AffectNet: A Database for Facial Expression, Valence, and Arousal Computing in the Wild
- (2017) Ali Mollahosseini et al. IEEE Transactions on Affective Computing
- Multimodal Human-Human-Robot Interactions (MHHRI) Dataset for Studying Personality and Engagement
- (2017) Oya Celiktutan et al. IEEE Transactions on Affective Computing
- Automated Detection of Engagement Using Video-Based Estimation of Facial Expressions and Heart Rate
- (2017) Hamed Monkaresi et al. IEEE Transactions on Affective Computing
- Students' LMS interaction patterns and their relationship with achievement: A case study in higher education
- (2016) Rebeca Cerezo et al. Computers & Education
- The Faces of Engagement: Automatic Recognition of Student Engagement from Facial Expressions
- (2014) Jacob Whitehill et al. IEEE Transactions on Affective Computing
- Capturing the Stream of Behavior
- (2012) Ivana Lizdek et al. Social Science Computer Review
- The development and evaluation of a survey to measure user engagement
- (2009) Heather L. O'Brien et al. Journal of the American Society for Information Science and Technology