
ConquerNet: Convolution-Smoothed Quantile ReLU Neural Networks with Minimax Guarantees

stat.ML updates on arXiv.org
Tianpai Luo, Fangwei Wu, Weichi Wu

arXiv:2605.06265v1 Announce Type: new

Abstract: Quantile regression is a fundamental tool for distributional learning, but the non-smoothness of the pinball loss poses significant optimization challenges for deep models. We propose ConquerNet, a class of convolution-smoothed quantile ReLU neural networks, which yields smooth objectives while preserving the underlying quantile structure. We establish general nonasymptotic risk bounds for ConquerNet under mild conditions, providing minimax guarantees over Besov function classes. In numerical studies, the proposed approach outperforms standard quantile neural networks at multiple quantile levels, with improved estimation accuracy and training efficiency across the board and particularly pronounced gains at high and low quantiles.
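The abstract does not specify the smoothing kernel ConquerNet uses, but convolution smoothing of the pinball loss has a well-known closed form under a Gaussian kernel. The sketch below is illustrative only, not the paper's implementation: it compares the non-smooth pinball (check) loss with its Gaussian-convolution-smoothed counterpart, whose everywhere-differentiable gradient is what makes gradient-based training of deep quantile models easier. The bandwidth `h` is a hypothetical tuning parameter.

```python
import math

def pinball(u, tau):
    # Standard (non-smooth) pinball/check loss: u * (tau - 1{u < 0}).
    # Its gradient jumps from tau - 1 to tau at u = 0.
    return u * (tau - (1.0 if u < 0 else 0.0))

def smoothed_pinball(u, tau, h):
    # Gaussian-kernel convolution smoothing with bandwidth h:
    #   E[rho_tau(u - h*Z)],  Z ~ N(0, 1),
    # which has the closed form
    #   h * phi(u/h) + u * (tau + Phi(u/h) - 1),
    # where phi/Phi are the standard normal pdf/cdf. This objective is
    # everywhere differentiable, with gradient tau + Phi(u/h) - 1.
    z = u / h
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return h * phi + u * (tau + Phi - 1.0)
```

As `h -> 0` the smoothed loss recovers the pinball loss, and for residuals with `|u| >> h` the two already agree closely; near zero the smoothed version replaces the kink with a strictly positive, differentiable minimum, so standard SGD-style optimizers see a smooth objective.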