Khiet P. Truong
University of Twente
Abstract: Listening to more than words
Since it is becoming increasingly common to talk to a device, the need for methods that make this interaction smoother, more enjoyable, and more natural is growing. Spoken language is more than just words. The way people talk not only reveals information about their age, sex, or the region they are from; it also reveals information about their socio-affective, mental, and physical state. If agents can automatically extract this kind of information from the way the user talks, it will help regulate human-agent interaction and open up opportunities for innovative talking agents.
In this talk, I will present an overview of our work on human-agent interaction and how we endow agents with (social) human-like capabilities such as backchanneling and detecting engagement. Acknowledging that you are listening to someone talking, by nodding and saying "uh-huh," comes naturally to humans but, as we will see, is rather challenging for virtual agents. Similarly, detecting engagement in a group of children is challenging for a machine that has to deal with naturalistic observations.
Khiet Truong is an assistant professor in the Human Media Interaction group, University of Twente. Her interests lie in the automatic analysis and understanding of verbal and nonverbal (vocal) behaviors in human-human and human-machine interaction, and in the design of socially interactive technology to support human needs. Taking an interdisciplinary approach within the realms of affective computing and social signal processing, she aims to develop socially and affectively intelligent interfaces (e.g. virtual conversational agents, social robots) that can recognize and display social and affective signals, and to study how humans interact with this new kind of technology. Coming from a background in (computational) paralinguistics and speech analysis, her main focus is on analysing the vocal modality of expression, in addition to the visual (e.g. facial expressions, eye gaze) and physiological (e.g. heart rate, galvanic skin response) modalities in social interaction.
Differences in the Intentionality Bias when Judging Human and Robotic Action
Behavioral Characteristics of Humanoid Robot to Suppress Bullying in School
Suggestion and Evaluation of a Model which Express Career Behavior and Social Network using Multi-Agent Simulation
Toward a Design of Human-Computer Co-drawing System: Preliminary Experiments on Effects of Imitation and Interference