
FARM: Fast Acceleration of Random forests via in-Memory processing
Description
Artificial Intelligence (AI) has become omnipresent, influencing a multitude of applications in our daily lives and expanding into sensitive domains such as medical diagnosis and autonomous driving. Many systems built on Machine Learning (ML) models, e.g., neural networks, function as black boxes and lack interpretability, while users increasingly seek greater transparency in how AI-driven decisions are made. Random Forests (RF) have emerged as a key model offering this interpretability, but their performance falls far short of the needs of time-critical applications.

In this work, we design a specialized hardware solution to accelerate inference tasks on RFs. We first identify data retrieval inefficiencies by analyzing several RF inference algorithms, and then design a processing-in-memory (PIM) architecture called FARM. FARM performs key inference tasks directly within the HBM memory banks and coalesces bursts of memory activity to reduce unnecessary data movement. Our evaluation shows that FARM delivers an average 8.8x performance improvement and an 87% reduction in energy consumption, with no loss in predictive accuracy, compared to a state-of-the-art GPU coupled with HBM memory.
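As background on the workload FARM targets: RF inference is dominated by irregular, data-dependent tree traversals, in which each node visit issues a memory access whose address depends on the outcome of the previous one. The sketch below is a minimal illustration in C of that access pattern, not FARM's actual implementation; the `Node` layout, field names, and majority-vote logic are assumptions made for illustration.

```c
#include <stddef.h>

/* Assumed node layout: internal nodes hold a feature index and a
   threshold; leaves hold a class label. Children are stored as
   indices into a per-tree node array. */
typedef struct Node {
    int   feature;     /* feature to test, or -1 for a leaf       */
    float threshold;   /* split threshold                         */
    int   label;       /* predicted class (valid only at a leaf)  */
    int   left, right; /* child indices into the nodes array      */
} Node;

/* Walk one tree root-to-leaf. Each iteration loads nodes[cur],
   a data-dependent access: this is the irregular memory traffic
   that an in-memory processing design aims to keep inside the
   memory banks instead of moving across the memory bus. */
static int predict_tree(const Node *nodes, const float *sample) {
    int cur = 0;
    while (nodes[cur].feature >= 0) {
        cur = (sample[nodes[cur].feature] <= nodes[cur].threshold)
                  ? nodes[cur].left
                  : nodes[cur].right;
    }
    return nodes[cur].label;
}

/* Majority vote over all trees in the forest. */
int predict_forest(const Node *const *trees, size_t num_trees,
                   const float *sample, int *votes, int num_classes) {
    for (int c = 0; c < num_classes; c++) votes[c] = 0;
    for (size_t t = 0; t < num_trees; t++)
        votes[predict_tree(trees[t], sample)]++;
    int best = 0;
    for (int c = 1; c < num_classes; c++)
        if (votes[c] > votes[best]) best = c;
    return best;
}
```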
Event Type
Networking
Work-in-Progress Poster
Time
Monday, June 23, 6:00pm - 7:00pm PDT
Location
Level 2 Lobby