
TRAM: Training Approximate Multiplier Structures for Low-Power AI Accelerators

Chang Meng, Hanyu Wang, Yuyang Ye, Mingfei Yu, Wayne Burleson, Giovanni De Micheli

arXiv:2605.08231v1

Abstract: Reducing power consumption in AI accelerators is increasingly important. Approximate computing can cut power while keeping accuracy loss small. Because multipliers are power-hungry components in AI accelerators, this paper focuses on synthesizing low-power approximate multipliers (AxMs). Unlike prior work, which designs AxMs separately from AI model training, we present TRAM, which jointly optimizes the AxM structure and the AI model parameters to lower power with small accuracy loss. Experiments show that, compared to state-of-the-art AxMs, TRAM reduces AxM power by up to 25.05% on CNNs trained on CIFAR-10 and by up to 27.09% on vision transformers trained on ImageNet.
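The abstract stays at a high level, so the following is only a speculative sketch of what jointly optimizing a multiplier structure and model parameters can look like, not TRAM's actual algorithm. The layer name AxLinear, the gated bit-plane weight surrogate, the straight-through estimators, and the power_proxy penalty are all illustrative assumptions, written in PyTorch.

```python
# Speculative sketch only -- the abstract gives no implementation detail, so
# nothing here is TRAM's actual method. AxLinear, the bit-plane gates, and
# power_proxy are invented for illustration; assumes PyTorch is installed.
import torch
import torch.nn as nn

class AxLinear(nn.Module):
    """Linear layer whose weights pass through a gated bit-plane surrogate of
    an approximate multiplier: dropping bit plane k mimics pruning the row of
    partial products at significance 2^k in a hardware multiplier."""

    def __init__(self, in_features: int, out_features: int, bits: int = 8):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.02)
        # One learnable gate per bit plane, initialized near "keep everything".
        self.gate_logits = nn.Parameter(torch.full((bits,), 2.0))
        self.bits = bits

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        levels = 2.0 ** self.bits - 1.0
        scale = self.weight.abs().max().detach().clamp(min=1e-8) / levels
        q = torch.round(self.weight / scale).detach()   # integer weight grid
        sign, mag = torch.sign(q), q.abs()

        # Hard 0/1 gates in the forward pass, soft sigmoid gradients (STE).
        soft = torch.sigmoid(self.gate_logits)
        gates = (soft > 0.5).float() + soft - soft.detach()

        # Rebuild |w| from gated bit planes: sum_k gates[k] * bit_k(|q|) * 2^k.
        w_hard = torch.zeros_like(mag)
        for k in range(self.bits):
            plane = torch.floor(mag / 2.0 ** k) % 2.0   # bit k of |q|
            w_hard = w_hard + gates[k] * plane * 2.0 ** k
        w_hard = sign * w_hard * scale

        # Straight-through again: the forward pass sees the approximate
        # weights while the full-precision weights get identity gradients.
        w_ax = w_hard + self.weight - self.weight.detach()
        return x @ w_ax.t()

    def power_proxy(self) -> torch.Tensor:
        # Crude power stand-in: fraction of bit planes kept active.
        return torch.sigmoid(self.gate_logits).mean()

# Hypothetical usage: penalize the power proxy alongside the task loss so
# gradient descent trades model accuracy against multiplier cost.
layer = AxLinear(16, 4)
x = torch.randn(32, 16)
loss = layer(x).pow(2).mean() + 0.01 * layer.power_proxy()
loss.backward()  # gradients reach both the weights and the gate logits
```

The straight-through tricks are what make a sketch like this trainable end to end: hard gate decisions and quantized weights appear in the forward pass, yet both the structure gates and the full-precision weights still receive usable gradients, which is the general mechanism any joint structure-plus-weights scheme needs.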