Date: Wednesday, May 21
Start Time: 1:30 pm
End Time: 2:00 pm
In the fast-changing world of artificial intelligence, the industry is deploying more AI compute at the edge. But the growing diversity and data footprint of transformer-based models, such as large language models and large multimodal models, put a spotlight on memory performance and data storage capacity as key bottlenecks. Enabling the full potential of AI in industries such as manufacturing, automotive, robotics and transportation will require us to find efficient ways to deploy this new generation of complex models. In this presentation, we’ll explore how memory and storage are responding to this need and solving complex issues in the AI market. We’ll examine the storage capacity and memory bandwidth requirements of edge AI use cases, ranging from tiny devices with severe cost and power constraints to edge servers, and we’ll explain how new memory technologies such as LPDDR5, LPCAMM2 and multi-port SSDs are helping system developers meet these challenges.
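To make the bandwidth challenge concrete, here is a rough back-of-envelope sketch (not part of the session abstract itself). It assumes a hypothetical 7-billion-parameter model quantized to INT8 and a memory-bound decode in which each generated token streams all of the model weights from memory, then compares the resulting bandwidth demand with the peak rate of a single 64-bit LPDDR5-6400 interface.

```python
# Rough back-of-envelope estimate of the memory bandwidth needed to run an
# LLM at the edge. Assumes a hypothetical 7B-parameter model quantized to
# 8 bits, and that each generated token requires streaming all weights from
# memory (memory-bound decode).

params = 7e9                 # model parameters (hypothetical example)
bytes_per_param = 1          # INT8 quantization
tokens_per_second = 10       # target generation rate

model_bytes = params * bytes_per_param
required_bw_gbs = model_bytes * tokens_per_second / 1e9

# LPDDR5-6400 on a 64-bit bus peaks at roughly 6400 MT/s * 8 B = 51.2 GB/s.
lpddr5_peak_gbs = 6.4 * 8

print(f"Approximate bandwidth needed: {required_bw_gbs:.0f} GB/s")
print(f"LPDDR5-6400 x64 peak:         {lpddr5_peak_gbs:.1f} GB/s")
```

Under these assumed numbers, the model needs roughly 70 GB/s of sustained bandwidth, exceeding the peak of a single 64-bit LPDDR5-6400 channel, which illustrates why memory performance becomes a central constraint for edge AI systems.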