Presentation
Edge Continual Learning with Mixed-Signal Gaussian Mixture-based Bayesian Neural Networks
Description
Continual learning (CL) enables offline-trained models to adapt to new environments and unseen data, a critical feature for edge-deployed models. However, CL often suffers from significant data and hardware overhead or performance degradation, such as catastrophic forgetting (CF). To mitigate these challenges, this work proposes a hardware-algorithm co-design for Gaussian Mixture-based Bayesian Neural Networks (GM-BNNs). The proposed GM-BNN approach enables CL by identifying uncertain out-of-distribution (OOD) data to minimize retraining data volume, and mitigates CF by integrating old and new knowledge within a unified GM framework of multiple distributions, each addressing a distinct task. To address the high computational overhead of Bayesian sampling, we design a custom in-memory Gaussian mixture computation circuit, enabling efficient and scalable CL. Leveraging shared Gaussian random number generation inside the multi-distribution memory words and near-memory distribution selection achieves 10.9× and 1.97× improvements in energy and area, respectively, compared to a state-of-the-art baseline. Furthermore, the uncertainty-aware, minimal-retraining GM-BNN algorithm reduces the retraining data required to achieve iso-accuracy by 5×.
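The sketch below illustrates the general idea behind the abstract, not the authors' implementation: a Bayesian layer whose weights follow a Gaussian mixture with one component per task, a shared Gaussian noise source reused across components (mirroring the shared random number generation described above), and an uncertainty filter that keeps only OOD-looking samples for retraining. All names (GMBayesianLinear, select_ood_for_retraining, the thresholds and dimensions) are illustrative assumptions.

```python
# Hedged sketch of a Gaussian-mixture Bayesian layer with per-task components
# and uncertainty-based selection of retraining data. Illustrative only.
import numpy as np

class GMBayesianLinear:
    """Weights ~ sum_k pi_k * N(mu_k, sigma_k^2); one mixture component per task."""
    def __init__(self, in_dim, out_dim, seed=0):
        self.rng = np.random.default_rng(seed)
        self.in_dim, self.out_dim = in_dim, out_dim
        self.mus, self.sigmas = [], []  # per-task mean/std arrays, shape (out_dim, in_dim)

    def add_task_component(self, sigma=0.1):
        """Append a new Gaussian component when a new task/environment is learned."""
        self.mus.append(0.01 * self.rng.standard_normal((self.out_dim, self.in_dim)))
        self.sigmas.append(np.full((self.out_dim, self.in_dim), sigma))

    def sample_weights(self, task_probs):
        """Draw one weight sample; the Gaussian noise is shared across components,
        analogous to the shared in-memory Gaussian random number generation."""
        eps = self.rng.standard_normal((self.out_dim, self.in_dim))  # shared noise
        w = np.zeros((self.out_dim, self.in_dim))
        for p, mu, sigma in zip(task_probs, self.mus, self.sigmas):
            w += p * (mu + sigma * eps)
        return w

    def predict(self, x, task_probs, n_samples=16):
        """Monte Carlo prediction; spread across samples is the uncertainty signal."""
        logits = np.stack([x @ self.sample_weights(task_probs).T
                           for _ in range(n_samples)])
        return logits.mean(axis=0), logits.std(axis=0)

def select_ood_for_retraining(layer, xs, task_probs, threshold=0.05):
    """Keep only inputs whose predictive uncertainty exceeds the threshold,
    so retraining touches a small, uncertain subset of incoming data."""
    keep = []
    for x in xs:
        _, std = layer.predict(x[None, :], task_probs)
        if std.mean() > threshold:
            keep.append(x)
    return np.array(keep)

if __name__ == "__main__":
    layer = GMBayesianLinear(in_dim=8, out_dim=4)
    layer.add_task_component()  # component for the old task
    layer.add_task_component()  # component for the new environment
    xs = np.random.default_rng(1).standard_normal((32, 8))
    subset = select_ood_for_retraining(layer, xs, task_probs=[0.5, 0.5])
    print(f"retraining on {len(subset)}/{len(xs)} uncertain samples")
```

Under these assumptions, reusing one noise draw across mixture components and gating retraining on predictive uncertainty are the software analogues of the shared-RNG memory words and the reduced retraining data volume claimed in the abstract.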
Event Type
Networking
Work-in-Progress Poster
Time
Monday, June 23, 6:00pm - 7:00pm PDT
Location
Level 2 Lobby