Professor Sharon Oviatt is internationally known for her work on human-centered interfaces, multimodal-multisensor interfaces, mobile interfaces, educational interfaces, the cognitive impact of computer input tools, and behavioral analytics. Her research is known for its pioneering and multidisciplinary style at the intersection of Computer Science, Psychology, Linguistics, and the Learning Sciences. She has published a large volume of high-impact papers, including the recent books The Design of Future Educational Interfaces (2013), The Paradigm Shift to Multimodality in Contemporary Computer Interfaces (2015), and the multi-volume Handbook of Multimodal-Multisensor Interfaces (co-edited with B. Schuller, P. Cohen, A. Krueger, G. Potamianos and D. Sonntag, 2017-2019). Sharon has been the recipient of the inaugural ACM-ICMI Sustained Accomplishment Award, a National Science Foundation Special Creativity Award, and the ACM-SIGCHI CHI Academy Award, and she is an ACM Fellow. She has also delivered over 100 keynotes, invited talks, and tutorials worldwide at conferences, universities, and corporate events.
Talk: Multimodal behavioral analytics and interface tools: advancing human learning
Multimodal-multisensor data afford a deeply human-centered foundation for detecting human behavioral states, and then designing user-centered adaptive systems based on them. For example, analysis of human communication and movement patterns is proving particularly apt for assessing human intention (e.g., deception), mental load and cognition (e.g., attentional load, domain expertise), motivation and emotion (e.g., task engagement), and related health and mental health status (e.g., anxiety, neurodegenerative disease). In this lecture, I'll focus on what multimodal behavioral analytics is revealing about human learning. I'll also describe how expressively rich interface tools, including multimodal ones based on writing and speech, can stimulate human learning.