

Communication Dynamics Neural Networks: FFT-Diagonalized Layers for Improved Hessian Conditioning at Reduced Parameter Count

cs.LG updates on arXiv.org
Lurong Pan

arXiv:2605.08171v1 Announce Type: new

Abstract:

Background and motivation. The Communication Dynamics (CD) framework, introduced in two earlier papers on atomic-energy prediction and field-induced superconductivity, treats each physical channel as a (2l+1)-vertex polygon whose discrete Fourier transform yields its energy spectrum. This paper applies the same circulant-spectral machinery to neural-network design.

Layer construction. CDLinear is a block-circulant linear layer with block size B = 2l+1 and 1/B the parameter count of a dense layer of equal input/output dimensions. Three properties follow from the construction. (i) The Hessian of the mean-squared loss with respect to the weights is diagonalized by the discrete Fourier transform, with eigenvalues |F[X_j](k)|^2 read directly from the input statistics (Theorem 1). (ii) Under input pre-whitening, the population Hessian condition number satisfies kappa = 1 exactly, and the empirical condition number is bounded by 1 + O(sqrt(B/N)) on N samples (Theorem 2). (iii) The Shannon noise rate alpha_CD = 0.0118, calibrated in the parent CD papers from the Na D-doublet, specifies a transferable, non-arbitrary dropout rate.

Empirical evaluation. A CDLinear MLP at B = 4 achieves 97.50% +/- 0.23% test accuracy with 2,380 parameters, versus 98.15% +/- 0.47% for an architecture-matched dense MLP with 8,970 parameters: a 3.8x parameter reduction at a 0.65% accuracy cost, within one standard deviation of the seed-to-seed spread. The CD-MLP's mean Hessian condition number kappa = 1.9x10^4 is 310x smaller than the dense baseline's kappa = 5.9x10^6, in quantitative agreement with Theorem 2.
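The block-circulant construction can be sketched concretely. The NumPy snippet below is an illustrative sketch, not the paper's implementation: each B x B block of the weight matrix is a circulant matrix stored by its first column, so the layer carries 1/B the parameters of an equally sized dense layer, and each block's matrix-vector product reduces to an elementwise multiply in the DFT domain (the same diagonalization that underlies Theorem 1). The names `circulant_matvec` and `cdlinear_forward` are hypothetical.

```python
import numpy as np

def circulant_matvec(c, x):
    """Multiply the B x B circulant matrix with first column c by x.

    Uses the DFT diagonalization of circulant matrices: the product
    equals ifft(fft(c) * fft(x)), i.e. circular convolution of c and x,
    costing O(B log B) instead of O(B^2).
    """
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

def cdlinear_forward(W, x):
    """Forward pass of a block-circulant linear layer (sketch).

    W has shape (out_blocks, in_blocks, B): the first column of each
    B x B circulant block. A dense layer of the same input/output
    dimensions would need out_blocks * in_blocks * B * B weights,
    i.e. B times as many parameters.
    """
    out_blocks, in_blocks, B = W.shape
    xb = x.reshape(in_blocks, B)
    y = np.zeros((out_blocks, B))
    for i in range(out_blocks):
        for j in range(in_blocks):
            # Accumulate each circulant block's contribution via the FFT.
            y[i] += circulant_matvec(W[i, j], xb[j])
    return y.reshape(-1)
```

The forward pass agrees with the equivalent dense matrix built by expanding each stored first column into its full circulant block, while storing only a 1/B fraction of the weights.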
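The conditioning claim can also be checked numerically under simple assumptions. The sketch below (illustrative, not from the paper) treats i.i.d. standard-normal inputs as already pre-whitened, estimates the DFT-domain Hessian eigenvalues as the averaged input power spectrum in the spirit of Theorem 1, and observes an empirical condition number close to the population value kappa = 1, with the excess shrinking on the order of sqrt(B/N) as Theorem 2 predicts.

```python
import numpy as np

rng = np.random.default_rng(0)
B, N = 4, 10_000

# Pre-whitened inputs: i.i.d. standard normal, so the population power
# spectrum E|fft(x)[k]|^2 = B is flat in k and the population Hessian
# condition number is exactly 1.
X = rng.standard_normal((N, B))

# Per Theorem 1 (as stated in the abstract), the MSE Hessian of a
# circulant layer is diagonal in the DFT basis, with eigenvalues given
# by the input power spectrum |F[X](k)|^2 averaged over samples.
hessian_spectrum = (np.abs(np.fft.fft(X, axis=1)) ** 2).mean(axis=0)

kappa = hessian_spectrum.max() / hessian_spectrum.min()
# Theorem 2 bounds the excess over 1 by O(sqrt(B/N)); here sqrt(B/N) = 0.02,
# so kappa should land very close to 1.
```

The same computation on un-whitened inputs (e.g. strongly correlated features) would show a flat spectrum no longer holds and kappa grows accordingly, which is why the pre-whitening hypothesis matters for the theorem.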