IDS Guest Seminar - by Dr. Xingyu Lin

Title: A Bottom-Up Approach towards Generalizable Robot Learning

Speaker: Xingyu LIN, Postdoctoral Scholar, University of California, Berkeley

Date: May 22, 2024
Time: 10:00am – 11:00am
Venue: HKU IDS Office, P307, Graduate House / Zoom

Mode: Hybrid. Seats for on-site participants are limited. A confirmation email will be sent to participants who have successfully registered. 


The rise of data-driven methods in robotics has significantly enhanced a robot’s capacity for perception, reasoning, and acting. However, the challenge and expense of collecting diverse datasets with robots prevent learning control policies that generalize across various settings and tasks. Alternatively, while data sources such as videos and robot play data are scalable, they are often not directly applicable due to domain gaps and the absence of optimal action labels. In this talk, I will discuss my research on learning visual representations, particle trajectory models, and particle dynamics models from these data sources to learn generalizable low-level policies. These structured representations enable the learned policies to generalize to novel objects and configurations. I will conclude by demonstrating how these low-level skills can be assembled to tackle long-horizon and novel tasks.


Dr. Xingyu Lin
Postdoctoral Scholar @ University of California, Berkeley

Xingyu Lin is a postdoctoral researcher at the University of California, Berkeley, working with Pieter Abbeel. His research lies at the intersection of computer vision, machine learning, and robotics, with a focus on learning robust manipulation skills that generalize to novel objects and tasks, including deformable objects. Xingyu holds a PhD from the Robotics Institute at Carnegie Mellon University, where he was advised by David Held. Prior to that, he received his undergraduate degree in computer science from Peking University. His research has been published at top conferences, including CoRL, RSS, NeurIPS, and ICLR. He was also selected as an RSS (Robotics: Science and Systems) 2022 Pioneer.

For information, please contact: