Multimodal semi-automated affect detection from conversational cues, gross body language, and facial features

Authors
Keywords
-
Journal
USER MODELING AND USER-ADAPTED INTERACTION
Volume 20, Issue 2, Pages 147-187
Publisher
Springer Nature
Online
2010-05-03
DOI
10.1007/s11257-010-9074-4
