A Celebration Of Black Folks Who Make Things
Odest Chadwicke Jenkins is my friend and neighbor. If I really wanted to split hairs, Chad is more a scientist than a Maker. I will overlook that small technicality because the work that Chad does is profoundly awesome and has mega impact in the world of robotics, which affects every Maker working with robots. Chad and I share many interests including video games and robots, and I find it interesting that he runs the lab previously occupied by Leslie Kaelbling, one of my former mentors, who is now at MIT. Chad played rugby in college, so you don’t want to mess with him! Chad is so cool, he was recently named as one of the “Brilliant 10” by Popular Science.
I’ll let Chad speak for himself:
I am an Associate Professor of Computer Science at Brown University. My research group, Robotics, Learning and Autonomy at Brown (RLAB), explores topics related to human-robot interaction and robot learning, with a specific focus on robot learning from human demonstration and robot software systems. My work strives towards realizing robots and autonomous systems as effective collaborators for humans in real-world tasks. Reproducibility and interoperability are critical facets of my research and development work, as exemplified by my group’s ROS repository.
My research into robot learning from demonstration, or robot LfD, centers on the automated discovery of the processes underlying human movement and decision making. In recent years, robot LfD has emerged as a compelling alternative to explicit programming: robots are programmed implicitly from a user’s demonstrations rather than explicitly through code written in a programming language. Robot LfD allows users and developers to focus on the usage and applications of robots without the burden of acquiring task-unrelated technical skills. My research focuses on developing algorithms and software capable of estimating and autonomously executing a human user’s intended robot behavior from demonstrated examples.
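To give a flavor of the idea, here is a toy sketch of learning from demonstration. It is not drawn from Chad’s actual research code; it simply illustrates the core notion that a policy is estimated from demonstrated (state, action) pairs rather than hand-coded, using a hypothetical nearest-neighbor policy over made-up sensor states and action names:

```python
# Toy illustration of robot learning from demonstration (LfD):
# the robot's behavior is estimated from demonstrated
# (state, action) pairs instead of being explicitly programmed.
# Here the learned "policy" is simple 1-nearest-neighbor lookup.

def nearest_neighbor_policy(demonstrations):
    """Build a policy that, given a new state, returns the action
    from the demonstration whose state is closest (squared
    Euclidean distance)."""
    def policy(state):
        def dist(pair):
            demo_state, _ = pair
            return sum((a - b) ** 2 for a, b in zip(demo_state, state))
        _, action = min(demonstrations, key=dist)
        return action
    return policy

# Hypothetical demonstrated examples: each state is a pair of
# obstacle-sensor readings, and the action is what the human
# demonstrator did in that situation.
demos = [
    ((1.0, 0.0), "steer_left"),
    ((0.0, 1.0), "steer_right"),
    ((0.5, 0.5), "go_straight"),
]

policy = nearest_neighbor_policy(demos)
print(policy((0.9, 0.1)))  # nearest demo is (1.0, 0.0) -> steer_left
```

Real LfD systems estimate far richer models of intent from time-series demonstrations, but the shape of the problem is the same: demonstrations in, executable behavior out.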
My earlier dissertation work studied robot LfD from the perspective of imitation learning for humanoid robots, with an emphasis on manifold learning from time-series data for estimating dynamical motion primitives. My work has also ventured into computer vision, for projects involving physics-based motion tracking using motion primitives and volumetric markerless motion capture, and computer animation, for real-time control of physically simulated humanoids.