Where: Mission City M1-M3
Date: Day 2
Start Time: 2:20 pm
End Time: 2:50 pm
How do you design robots that are aware of their unstructured environments at a consumer price point? Excellent sensing is required, but it has to come from low-cost sensors. By fusing data from multiple sensors and making sure every sensor does multiple jobs (like the hidden sensors behind your camera's ISP), we can achieve rolling shutter correction, drivable-space understanding, and other applications. Careful sensor selection with a focus on ultimate customer utility, a data bus architecture planned with fusion in mind, and a full understanding of the inner workings of each sensor make these fusion tasks easier and reduce computational complexity. This talk will cover the pragmatic approach we use for hardware design and some of the design choices we made across multiple generations of products to enable sensor fusion. We will also give an overview of how information from multiple sensors is fused to make robots that feel alive and aware of their surroundings.
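
To make the idea concrete, here is a minimal sketch of the kind of low-cost fusion the abstract alludes to, not the speakers' actual pipeline: a complementary filter that blends a cheap gyroscope's fast-but-drifting rate integration with an accelerometer's noisy-but-absolute tilt reading. All function names, parameters, and readings below are illustrative assumptions.

    import math

    def fuse_tilt(prev_angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
        """Complementary filter fusing two inexpensive sensors (illustrative only).

        prev_angle : last fused tilt estimate (rad)
        gyro_rate  : angular rate from a low-cost gyro (rad/s); fast but drifts
        accel_x/z  : accelerometer axes (m/s^2); noisy but drift-free at rest
        dt         : time step (s)
        alpha      : weight given to the gyro path versus the accel path
        """
        gyro_angle = prev_angle + gyro_rate * dt    # short term: integrate the gyro
        accel_angle = math.atan2(accel_x, accel_z)  # long term: gravity gives absolute tilt
        return alpha * gyro_angle + (1.0 - alpha) * accel_angle

    # Hypothetical 100 Hz loop with made-up readings
    angle = 0.0
    for gyro_rate, ax, az in [(0.02, 0.10, 9.80), (0.01, 0.12, 9.79)]:
        angle = fuse_tilt(angle, gyro_rate, ax, az, dt=0.01)

The same pattern, letting each sensor cover the other's weakness, scales up to the richer fusion problems the talk describes, such as combining wheel odometry, IMU, and camera data for drivable-space understanding.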