Date: Wednesday, May 24
Start Time: 11:25 am
End Time: 11:55 am
Always-sensing cameras are becoming a common AI-enabled feature of consumer devices, much like always-listening voice assistants such as Siri and Google Assistant. They can enable a more natural and seamless user experience, such as automatically locking and unlocking a device based on whether the owner is looking at the screen or is within the camera's view. But the complexity of cameras, and the quantity and richness of the data they produce, mean that an always-sensing camera requires far more processing than listening for a wake word. Without careful attention to neural processing unit (NPU) design, an always-sensing camera can wind up consuming excessive power or performing poorly, leading to an unsatisfactory user experience. In this talk, we'll explore the architecture of a neural processor in the image signal path, discuss use cases, and offer tips for how OEMs, chipmakers, and system architects can successfully evaluate, specify, and deploy an NPU in an always-sensing camera.
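
To make the data-volume gap concrete, here is a rough back-of-envelope comparison in Python. The sample rate, resolution, and frame rate below are illustrative assumptions chosen for the sketch, not figures from the talk; even these conservative numbers show why an always-sensing camera demands far more NPU throughput than wake-word listening.

# Back-of-envelope comparison of raw data rates: wake-word audio vs. an
# always-sensing camera. All figures are illustrative assumptions.

# Wake-word audio: 16 kHz mono, 16-bit (2-byte) samples.
audio_bytes_per_sec = 16_000 * 2                 # 32 KB/s

# Low-power camera stream: 640x480 grayscale at 5 frames per second.
frame_bytes = 640 * 480                          # 1 byte per pixel
camera_bytes_per_sec = frame_bytes * 5           # ~1.5 MB/s

print(f"Audio:  {audio_bytes_per_sec / 1e3:.0f} KB/s")
print(f"Camera: {camera_bytes_per_sec / 1e6:.2f} MB/s")
print(f"Ratio:  ~{camera_bytes_per_sec / audio_bytes_per_sec:.0f}x more data")

Under these assumptions the camera produces roughly 48 times the raw data of the audio stream, and that is before considering higher resolutions, color channels, or frame rates.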