
A PyTorch Library of Turing-Complete Neural Networks

cs.LG updates on arXiv.org
Jonathan Bates

arXiv:2605.08150v1

Abstract: We present a PyTorch package that compiles neural networks and their weights directly from Turing machine descriptions, producing models that exactly simulate the specified machine without any training. Given a transition function and a set of terminal states, the package constructs a model whose forward pass corresponds to one step of the Turing machine. Two architectures are implemented, each realizing a different theoretical result: (1) a transformer with self-attention, cross-attention, and feedforward layers, following Wei, Chen, and Ma (2021), and (2) a recurrent network, following Siegelmann and Sontag (1995), that encodes the stack as a point in a Cantor set. We develop the constructions from first principles, showing how ReLU networks implement Boolean circuits (AND, OR, NOT, and XOR gates, and their composition into DNF formulas and binary adders) and how hard attention implements positional lookup on the tape. The package serves as a concrete, runnable reference for the symbolic-neural bridge, and as a foundation for future work on the stability of constructed solutions under gradient-based optimization. Code is available at https://github.com/jonrbates/turing.
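
As a concrete illustration of the compilation idea, the following sketch (independent of the package's actual API, and using a hypothetical two-state, two-symbol transition table) packs one Turing machine step into a single linear layer acting on one-hot state and symbol codes:

    import torch

    # One Turing machine step as a linear layer over one-hot codes.
    # delta: (state, symbol) -> (new_state, written_symbol, head_move).
    Q, S = 2, 2                       # toy machine: 2 states, 2 symbols (an assumption)
    delta = {
        (0, 0): (0, 1, +1),
        (0, 1): (1, 0, +1),
        (1, 0): (1, 1, -1),
        (1, 1): (0, 0, -1),
    }

    # Pack the table into weight matrices indexed by the one-hot (state, symbol) pair.
    Wq = torch.zeros(Q, Q * S)        # selects the new state
    Ws = torch.zeros(S, Q * S)        # selects the written symbol
    Wm = torch.zeros(1, Q * S)        # selects the head move
    for (q, a), (q2, a2, m) in delta.items():
        col = q * S + a
        Wq[q2, col] = 1.0
        Ws[a2, col] = 1.0
        Wm[0, col] = float(m)

    def step(q_onehot, a_onehot):
        # outer product + flatten gives the one-hot code of the pair (q, a)
        pair = torch.outer(q_onehot, a_onehot).reshape(-1)
        return Wq @ pair, Ws @ pair, Wm @ pair

    q = torch.tensor([1.0, 0.0])      # state q0
    a = torch.tensor([0.0, 1.0])      # reading symbol 1
    print(step(q, a))                 # -> state q1, write 0, move +1

Because every quantity is one-hot, the matrix-vector products are exact table lookups, which is why the simulation requires no training.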
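
The claim that ReLU networks implement Boolean circuits can be seen directly: on {0,1}-valued inputs, each gate reduces to an exact piecewise-linear expression. A minimal PyTorch sketch (these helper functions are illustrative, not the package's modules):

    import torch
    import torch.nn.functional as F

    def AND(x, y):
        # x + y - 1 is positive only when both inputs are 1
        return F.relu(x + y - 1)

    def OR(x, y):
        # clips x + y at 1
        return F.relu(x + y) - F.relu(x + y - 1)

    def NOT(x):
        return F.relu(1 - x)

    def XOR(x, y):
        # triangle function: 0 at x + y in {0, 2}, 1 at x + y = 1
        return F.relu(x + y) - 2 * F.relu(x + y - 1)

    def full_adder(a, b, c):
        # composing the gates above yields a 1-bit adder with carry,
        # the building block of the binary adders mentioned in the abstract
        s = XOR(XOR(a, b), c)
        carry = OR(AND(a, b), AND(c, XOR(a, b)))
        return s, carry

    # truth-table check over {0,1}^2
    for u in (0.0, 1.0):
        for v in (0.0, 1.0):
            x, y = torch.tensor(u), torch.tensor(v)
            print(u, v, AND(x, y).item(), OR(x, y).item(), XOR(x, y).item())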
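
Hard attention as positional lookup also admits a compact sketch: with one-hot positional codes as keys and a one-hot head position as the query, a softmax with a large scale concentrates (to machine precision) on the head's cell, so the attention output is the tape symbol under the head. The tensors below are illustrative assumptions, not the package's encoding:

    import torch

    T = 8                                        # tape length
    pos = torch.eye(T)                           # keys: one-hot positional codes
    tape = torch.randint(0, 2, (T, 3)).float()   # values: symbol vectors on the tape

    head = 5
    q = pos[head]                                # query: one-hot head position

    beta = 50.0                                  # large scale -> effectively hard attention
    attn = torch.softmax(beta * (pos @ q), dim=0)
    read = attn @ tape                           # ~= tape[head]
    print(torch.allclose(read, tape[head], atol=1e-6))   # True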
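
Finally, the Siegelmann-Sontag stack encoding maps a binary stack to a point of a Cantor set: each bit becomes a base-4 digit in {1, 3}, so the gap between digit values survives the saturated-linear units that read the top of the stack. A sketch under those conventions (the digit choice and helper names are assumptions; the theoretical construction also presumes exact arithmetic, whereas floats are only exact here for shallow stacks):

    import torch

    def sat(x):
        # saturated-linear unit min(1, max(0, x)), written with ReLUs
        return torch.relu(x) - torch.relu(x - 1)

    def encode(bits):
        # stack [b0 (top), b1, ...] -> sum_i (2*b_i + 1) / 4**(i+1)
        s = torch.tensor(0.0)
        for b in reversed(bits):
            s = (s + 2 * b + 1) / 4
        return s

    def push(s, b):
        return (s + 2 * b + 1) / 4

    def top(s):
        # leading base-4 digit is 1 (bit 0) or 3 (bit 1); 4s - 2 lands in
        # [-1, 0) or [1, 2), so the saturation reads the bit exactly
        return sat(4 * s - 2)

    def pop(s):
        b = top(s)
        return 4 * s - 2 * b - 1, b      # strip the leading digit

    s = encode([1, 0, 1])
    s, b = pop(s)
    print(b.item(), s.item())            # 1.0 and encode([0, 1]) = 0.4375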