In the fast-changing world of artificial intelligence, the industry is deploying more AI compute at the edge. But the growing diversity and data footprint of transformer-based models, such as large language models and large multimodal models, put a spotlight on memory performance and data storage capacity as key bottlenecks. Enabling the full potential of AI in industries such as manufacturing, automotive/ADAS, robotics and transportation will require efficient ways to deploy this new generation of complex models. In this presentation, we'll explore how memory and storage are responding to this need and solving complex issues in the AI market. We will cover various usage models, model requirements and how DRAM and storage solutions can help.