Where: Mission City B1-B5
Date: Day 2
Start Time: 2:20 pm
End Time: 2:50 pm
Efficient deep neural networks are increasingly important in the age of AIoT (AI + IoT), in which people hope to deploy intelligent sensors and systems at scale. However, optimizing neural networks to achieve both high accuracy and efficient resource use on different target devices is difficult, since each device has its own idiosyncrasies. In this talk, we introduce differentiable neural architecture search (DNAS), an approach for hardware-aware neural network architecture search. We show that, using DNAS, the computation cost of the search itself is two orders of magnitude lower than that of previous approaches, while the models found by DNAS are optimized for target devices and surpass the previous state of the art in efficiency and accuracy. We also explain how we used DNAS to find a new family of efficient neural networks called FBNets.
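To give a flavor of the idea, below is a minimal, illustrative PyTorch sketch of a generic differentiable NAS layer: candidate operations are mixed with Gumbel-softmax weights over learnable architecture logits, and an expected-latency term from a per-op lookup table is added to the training loss. The op choices, latency values, and trade-off weight here are hypothetical and are not the FBNet search space or implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixedOp(nn.Module):
    """One searchable layer: a differentiable mix of candidate operations."""

    def __init__(self, channels, op_latencies):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),   # 3x3 conv
            nn.Conv2d(channels, channels, 5, padding=2),   # 5x5 conv
            nn.Identity(),                                  # skip connection
        ])
        # Architecture parameters: one logit per candidate op.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))
        # Measured latency of each op on the target device (hypothetical values).
        self.register_buffer("latency", torch.tensor(op_latencies))

    def forward(self, x, tau=1.0):
        # Gumbel-softmax yields differentiable, nearly one-hot op weights.
        weights = F.gumbel_softmax(self.alpha, tau=tau)
        out = sum(w * op(x) for w, op in zip(weights, self.ops))
        # Expected latency of this layer under the current architecture weights.
        expected_latency = (weights * self.latency).sum()
        return out, expected_latency


# Toy usage: one mixed layer, a loss trading off task error against latency.
layer = MixedOp(channels=8, op_latencies=[3.0, 5.0, 0.1])
x = torch.randn(2, 8, 16, 16)
out, lat = layer(x)
target = torch.randn_like(out)
loss = F.mse_loss(out, target) + 0.1 * lat   # 0.1 is an arbitrary trade-off weight
loss.backward()  # gradients reach both the op weights and the architecture logits
```

Because the latency term is differentiable in the architecture logits, gradient descent can steer the search toward operations that run efficiently on the target device, which is the hardware-aware aspect of the approach described in the abstract.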