Date: Tuesday, May 12
Start Time: 2:40 pm
End Time: 3:10 pm
Humanoid robots have unique perception and compute requirements relative to other robots because of their dynamic stability, their dexterity, and their need to operate safely in human-centric spaces. In this talk, we examine the sensing and compute challenges unique to humanoids. We'll map applications (walking, mapping/localization, detection, classification, and manipulation) to sensing needs and discuss failure modes such as self-occlusion, changing viewpoints, and the demand for continuous human detection. We'll survey sensing modalities (depth, visible, and thermal cameras; radar; ultrasonic; inertial; force/torque; and pressure), cover what today's sensors do well or poorly, and offer a "wish list" of next-generation capabilities. We'll then turn to compute: latency-critical control loops, asynchronous workloads, and why tooling (model conversion, profiling, deployment, and platform integrations) often limits performance more than raw TOPS. Attendees will leave with a framework for specifying sensor suites and compute stacks for humanoids.
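The application-to-sensing mapping the talk describes can be sketched as a simple lookup that unions the modalities a chosen workload set requires. This is a minimal illustration only: the specific pairings below are assumptions drawn from the modalities listed in the abstract, not the speaker's actual framework.

```python
# Hypothetical mapping from humanoid applications (as listed in the abstract)
# to the sensing modalities they plausibly rely on. Pairings are illustrative
# assumptions, not a definitive specification.
SENSOR_NEEDS = {
    "walking":              {"inertial", "force/torque", "pressure", "depth camera"},
    "mapping/localization": {"depth camera", "visible camera", "inertial"},
    "detection":            {"visible camera", "thermal camera", "radar"},
    "classification":       {"visible camera", "thermal camera"},
    "manipulation":         {"force/torque", "depth camera", "visible camera"},
}

def required_modalities(applications):
    """Return the union of sensing modalities needed by a set of applications."""
    needed = set()
    for app in applications:
        needed |= SENSOR_NEEDS[app]
    return needed

# Example: a robot that must walk and manipulate objects.
suite = required_modalities(["walking", "manipulation"])
```

A real sensor-suite specification would also weigh the failure modes the talk raises (self-occlusion, viewpoint changes) and redundancy for continuous human detection; the sketch only captures the first step of mapping workloads to modalities.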

