Title
Multimodal Data Fusion in Learning Analytics: A Systematic Review
Journal
Sensors
Volume 20, Issue 23, Article 6856
Publisher
MDPI AG
Online
2020-11-30
DOI
10.3390/s20236856
References
Related references
Note: Only part of the references are listed.
- Melissa Bond (2020). Facilitating student engagement through the flipped classroom approach in K-12: A systematic review. Computers & Education
- Bertrand Schneider (2020). A Methodology for Capturing Joint Visual Attention Using Mobile Eye-Trackers. JoVE (Journal of Visualized Experiments)
- Irena Papadopoulos et al. (2020). A systematic review of socially assistive robots in pre-tertiary education. Computers & Education
- Masaya Okada et al. (2020). Multimodal Analytics to Understand Self-Regulation Process of Cognitive and Behavioral Strategies in Real-World Learning. IEICE Transactions on Information and Systems
- Kshitij Sharma et al. (2020). Utilizing Interactive Surfaces to Enhance Learning, Collaboration and Engagement: Insights from Learners’ Gaze and Speech. Sensors
- Mark Michael Diacopoulos et al. (2020). A systematic review of mobile learning in social studies. Computers & Education
- Helen Crompton et al. (2020). Mobile learning and pedagogical opportunities: A configurative systematic review of PreK-12 research using the SAMR framework. Computers & Education
- Danial Hooshyar et al. (2020). Open learner models in supporting self-regulated learning in higher education: A systematic literature review. Computers & Education
- Tian Gan et al. (2019). A Multi-sensor Framework for Personal Presentation Analytics. ACM Transactions on Multimedia Computing, Communications, and Applications
- Stan van Ginkel et al. (2019). Fostering oral presentation competence through a virtual reality-based task for delivering feedback. Computers & Education
- Bibeg Hang Limbu et al. (2019). Can You Ink While You Blink? Assessing Mental Effort in a Sensor-Based Calligraphy Trainer. Sensors
- Daniele Di Mitri et al. (2019). Detecting Mistakes in CPR Training with Multimodal Data and Neural Networks. Sensors
- Hector Cornide-Reyes et al. (2019). Introducing Low-Cost Sensors into the Classroom Settings: Improving the Assessment in Agile Practices with Multimodal Learning Analytics. Sensors
- Gaoxia Zhu et al. (2019). Exploring emotional and cognitive dynamics of Knowledge Building in grades 1 and 2. User Modeling and User-Adapted Interaction
- Fabian Riquelme et al. (2019). Using multimodal learning analytics to study collaboration on discussion groups. Universal Access in the Information Society
- Schneider et al. (2019). Beyond Reality—Extending a Presentation Trainer with an Immersive VR Module. Sensors
- Felipe Roque et al. (2019). Using Depth Cameras to Detect Patterns in Oral Presentations: A Case Study Comparing Two Generations of Computer Engineering Students. Sensors
- Gianluca Romano et al. (2019). Dancing Salsa with Machines—Filling the Gap of Dancing Learning Solutions. Sensors
- Bui Ngoc Anh et al. (2019). A Computer-Vision Based Application for Student Behavior Monitoring in Classroom. Applied Sciences
- Mutlu Cukurova et al. (2018). The NISPI framework: Analysing collaborative problem-solving from students' physical interactions. Computers & Education
- Luis P. Prieto et al. (2018). Orchestration Load Indicators and Patterns: In-the-Wild Studies Using Mobile Eye-Tracking. IEEE Transactions on Learning Technologies
- Sree Aurovindh Viswanathan et al. (2018). Using the Tablet Gestures and Speech of Pairs of Students to Classify Their Collaboration. IEEE Transactions on Learning Technologies
- Lauri Ahonen et al. (2018). Biosignals reflect pair-dynamics in collaborative work: EDA and ECG study of pair-programming in a classroom environment. Scientific Reports
- Wei-Long Zheng et al. (2018). EmotionMeter: A Multimodal Framework for Recognizing Human Emotions. IEEE Transactions on Cybernetics
- Roberto Martinez-Maldonado et al. (2017). Collocated Collaboration Analytics: Principles and Dilemmas for Mining Multimodal Interaction Data. Human-Computer Interaction
- Panagiotis Tzirakis et al. (2017). End-to-End Multimodal Emotion Recognition Using Deep Neural Networks. IEEE Journal of Selected Topics in Signal Processing
- Urban Burnik et al. (2017). Video-based learners’ observed attention estimates for lecture learning gain evaluation. Multimedia Tools and Applications
- Beate Grawemeyer et al. (2017). Affective learning: improving engagement and enhancing learning with affect-aware feedback. User Modeling and User-Adapted Interaction
- Janez Zaletelj et al. (2017). Predicting students’ attention in the classroom from Kinect facial and body features. EURASIP Journal on Image and Video Processing
- Beat A. Schwendimann et al. (2017). Perceiving Learning at a Glance: A Systematic Literature Review of Learning Dashboard Research. IEEE Transactions on Learning Technologies
- Hamed Monkaresi et al. (2017). Automated Detection of Engagement Using Video-Based Estimation of Facial Expressions and Heart Rate. IEEE Transactions on Affective Computing
- I-Han Hsiao et al. (2017). Integrating Programming Learning Analytics Across Physical and Digital Space. IEEE Transactions on Emerging Topics in Computing
- Ali Asadipour et al. (2016). Visuohaptic augmented feedback for enhancing motor skills acquisition. The Visual Computer
- David Moher et al. (2009). Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement. PLOS Medicine