Humanoid robots have unique perception and compute requirements relative to wheeled or fixed-base robots, driven by their need for dynamic stability, dexterity, and safe operation in human-centric spaces. In this talk, we examine the sensing and on-device compute challenges unique to humanoids. We’ll map key applications (walking, mapping/localization, detection and classification, and safe manipulation) to concrete sensing needs and discuss common failure modes such as self-occlusion, changing viewpoints from bending and tilting, and the demand for continuous, safety-critical human detection. We’ll survey practical sensing modalities (depth, visible, and thermal cameras; radar; ultrasonic; inertial; force/torque; and pressure) and what today’s off-the-shelf sensors do well or poorly, followed by a “wish list” of next-generation capabilities. We’ll then turn to compute: latency-critical control loops, asynchronous workloads, and why tooling (model conversion, profiling, deployment, and platform integrations) often limits real-world performance more than raw TOPS. Attendees will leave with a system-level framework for specifying sensor suites and compute stacks for deployable humanoids.

