Title
Human-Centric Embodied AI
Abstract
A primary focus of Embodied AI is to develop robots that assist humans with various tasks. However, most existing tasks and benchmarks in this domain do not account for human interaction, assuming that robots operate in isolation. Adding humans to the equation introduces complex challenges often overlooked in traditional tasks, such as interpreting dynamic human intent, planning over long horizons in the presence of a highly capable, dynamic human agent, and handling ambiguous human language and instructions. In this talk, I will cover Habitat 3.0, a fast simulator for humans and robots performing collaborative tasks. I will discuss how this simulator facilitates data collection at scale, which is essential for effectively training recent large-scale models. Additionally, I will demonstrate our approach to automatically generating human and object motions based on textual descriptions of specific activities. Lastly, I will discuss human-centric Embodied AI tasks. These tasks present unique challenges that are absent in conventional Embodied AI studies, revealing the limitations of state-of-the-art models in human-robot interaction.
Short Bio
Roozbeh Mottaghi is a Senior Research Scientist Manager at FAIR and an Affiliate Associate Professor in the Paul G. Allen School of Computer Science and Engineering at the University of Washington. Prior to joining FAIR, he was the Research Manager of the Perceptual Reasoning and Interaction Research (PRIOR) group at the Allen Institute for AI (AI2). He obtained his PhD in Computer Science in 2013 from the University of California, Los Angeles. After his PhD, he joined the Computer Science Department at Stanford University as a post-doctoral researcher. His research mainly focuses on embodied AI, reasoning via perception, and learning via interaction, and his work on large-scale Embodied AI received the Outstanding Paper Award at NeurIPS 2022.