

Research improves AI for smartphones

By: Alan Flurry

As machine learning tools become increasingly commonplace, so does the need to run them on mobile devices. On the application side, the possibilities extend well beyond common speech and image recognition, and the growing computational capacity of smartphones creates the necessary headroom for improved hand-held AI.

Researchers at the University of Georgia have recently developed an innovative software framework called SmartMem, designed to make artificial intelligence (AI) applications run faster and more efficiently on mobile devices. Large language models such as ChatGPT and advanced image-processing applications often struggle to run smoothly on smartphones because they require costly data layout transformations that slow down performance. SmartMem addresses this issue by eliminating most of those transformations, allowing AI to run quickly even on everyday mobile devices.
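To see why layout transformations are expensive, consider a small illustration (not code from the paper): a neural-network tensor stored in NCHW order (batch, channels, height, width) must be physically rearranged in memory before it can feed an operator that expects NHWC order. The sketch below uses NumPy to show that the values are identical but every element moves.

```python
import numpy as np

# A toy activation tensor in NCHW layout (batch, channels, height, width).
x_nchw = np.arange(2 * 3 * 4 * 4, dtype=np.float32).reshape(2, 3, 4, 4)

# Converting to NHWC means physically rearranging every element in memory --
# the kind of costly transformation SmartMem works to eliminate.
x_nhwc = np.transpose(x_nchw, (0, 2, 3, 1)).copy()

# Same data, different memory order: element (n, c, h, w) in NCHW
# is element (n, h, w, c) in NHWC.
assert x_nhwc.shape == (2, 4, 4, 3)
assert x_nchw[1, 2, 3, 0] == x_nhwc[1, 3, 0, 2]
```

On a phone-sized model, transformations like this can occur between many consecutive operators, which is why reducing them yields large end-to-end speedups.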

The software is described in a 2024 paper published in the proceedings of ASPLOS '24, a leading computer systems conference.

"Smartphones are becoming the most common platform for using AI in daily life. Our research helps ensure that everyone can experience fast, reliable AI without needing expensive equipment or constant internet connectivity," said Wei Niu, assistant professor in the UGA School of Computing and lead author of the study.

The novel framework improves the performance of deep learning and deep neural networks (DNNs) on mobile devices in part by categorizing DNN operators into four groups based on their input/output layouts and computations, considering combinations of producer-consumer edges between the operators, and searching for optimized layouts with multiple carefully designed methods. The team conducted experiments with 18 leading-edge models to show significant speedups compared to existing mobile DNN compilers.
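The intuition behind this approach can be sketched in a few lines. The example below is a deliberate simplification, not SmartMem's actual algorithm or API: it splits operators into layout-sensitive and layout-agnostic groups, lets layout-agnostic operators inherit their producer's layout, and counts how many producer-consumer edges still need a transformation.

```python
# Hypothetical sketch (names are illustrative, not SmartMem's API):
# layout-sensitive ops demand a specific layout; layout-agnostic ops
# can adopt their producer's layout, removing a transformation.
LAYOUT_SENSITIVE = {"conv": "NCHW", "matmul": "NHWC"}
LAYOUT_AGNOSTIC = {"relu", "add"}

def assign_layouts(chain, default="NCHW"):
    """Greedily assign a layout to each operator in a linear chain;
    agnostic ops inherit the previous operator's layout."""
    layouts, current = [], default
    for op in chain:
        current = LAYOUT_SENSITIVE.get(op, current)
        layouts.append(current)
    return layouts

def count_transforms(layouts):
    """Count producer->consumer edges whose layouts disagree."""
    return sum(a != b for a, b in zip(layouts, layouts[1:]))

chain = ["conv", "relu", "add", "matmul", "relu"]
layouts = assign_layouts(chain)
print(layouts)                    # ['NCHW', 'NCHW', 'NCHW', 'NHWC', 'NHWC']
print(count_transforms(layouts))  # 1 -- only one transformation survives
```

A naive compiler that converted every tensor to each operator's default layout would insert a transformation on most edges; propagating layouts through the agnostic operators leaves only the single unavoidable one. SmartMem's real search works over richer operator categories and full producer-consumer graphs, but the goal is the same.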

"This advancement doesn't just speed up AI applications—it also enhances user privacy," Niu said.

Looking ahead, this work paves the way for further research in optimizing large language models (LLMs), which have far more parameters and operators, for mobile devices.

"Because SmartMem helps complex AI run directly on your phone, it reduces the need to send sensitive data to cloud servers," said Gagan Agrawal, director of the UGA School of Computing and co-author. "This protects your privacy and lets AI applications work reliably even without an internet connection."

"Running AI locally on devices ensures personal data remains private by minimizing its transmission over the internet," he said. "Additionally, local AI enables consistent performance, especially in scenarios where internet access is unreliable or unavailable." Niu and Agrawal were recently awarded a National Science Foundation grant to support follow-up work on the project.

Evaluation results demonstrate that SmartMem significantly outperforms existing frameworks, achieving up to 7.9 times faster performance compared to popular mobile AI compiler frameworks. It also reduces memory usage and cache misses, demonstrating its effectiveness in optimizing AI for mobile devices.

"SmartMem's improvements mean that future AI experiences on mobile devices will be quicker, safer, and more accessible to everyone," Niu said.

The study, "SmartMem: Layout Transformation Elimination and Adaptation for Efficient DNN Execution on Mobile," is available online.



