Defining Cognitive Science
Multimodal Social Signal Processing for Human-Robot Interaction
Dr. Angelica Lim, Simon Fraser University
Date: Tuesday, November 5th, 1:00pm - 2:00pm
Location: RCB 6152
Abstract: Science fiction has long promised us interfaces and robots that interact with us as smoothly as humans do: Rosie the Robot from The Jetsons, C-3PO from Star Wars, and Samantha from Her. Today, interactive robots and voice user interfaces are moving us closer to effortless, human-like interactions in the real world. In this talk, I will discuss the opportunities and challenges in finely analyzing, detecting, and generating non-verbal communication in context, including gestures, gaze, auditory signals, and facial expressions. Specifically, I will discuss how we might allow robots and virtual agents to understand human social signals (including emotions, mental states, and attitudes) across cultures, as well as recognize and generate expressions with controllability, transparency, and diversity in mind.
Biography: Dr. Angelica Lim is the Director of the Rosie Lab and an Assistant Professor in the School of Computing Science at Simon Fraser University (SFU). Previously, she led the Emotion and Expressivity teams for the Pepper humanoid robot at SoftBank Robotics. She received her B.Sc. in Computing Science with Artificial Intelligence Specialization from Simon Fraser University, and a Ph.D. and M.Sc. in Computer Science (Intelligence Science) from Kyoto University, Japan. She and her team have received Best Paper awards in Entertainment Robotics and Cognitive Robotics at IROS 2011 and 2022, and Best Demo and Late-Breaking Report awards at HRI 2021 and 2023. She has been featured on the BBC and TEDx, hosted a TV documentary on robotics, and was recently named one of Forbes' 20 Leading Women in AI. Her research interests include multimodal machine learning, affective computing, and human-robot interaction.