Date: Thursday, May 23
Start Time: 2:40 pm
End Time: 3:10 pm
OEMs, brands and cloud providers want to move LLMs to the edge, especially for vision applications. What are the benefits and challenges of doing so? In this talk, we explore how edge AI is evolving to accommodate rapidly growing LLM model sizes, the use cases for local LLMs, and the performance, power and chip-area trade-offs that system architects should weigh when deploying vision-based LLMs.