The Local Substrate

Backpropagation requires a global error signal that propagates backward through the entire network. Every layer must wait for every downstream layer to compute its gradient. The algorithm is sequential, non-local, and biologically implausible — yet it is how nearly all neural networks learn.

Oh builds a digital hardware implementation of predictive coding that learns without backpropagation. Each neural core maintains its own activity, prediction error, and synaptic weights. It communicates only with adjacent layers through hardwired connections. Learning happens through local update rules — each core adjusts its own weights based on its own prediction errors, without knowing what any distant core is doing.
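A minimal sketch of one such core's local rule, assuming a linear predictor for simplicity. The `Core` class, its sizes, and its learning rate are illustrative stand-ins, not the paper's actual hardware design; the point is that the error and the weight update use only quantities the core itself holds:

```python
import numpy as np

rng = np.random.default_rng(0)

class Core:
    """One neural core: its own activity, its own weights, its own error.

    Hypothetical sketch. The core predicts the activity of the adjacent
    layer below through its weights W, and adjusts W from its own
    prediction error. No global error signal is ever consulted.
    """

    def __init__(self, n_below, n_here, lr=0.05):
        self.W = rng.normal(scale=0.1, size=(n_below, n_here))
        self.x = np.zeros(n_here)  # this core's activity
        self.lr = lr

    def error(self, x_below):
        # Prediction error for the layer below: a purely local quantity.
        return x_below - self.W @ self.x

    def learn(self, x_below):
        # Local, Hebbian-style update: this core's error times this
        # core's activity. No other core's weights are needed.
        e = self.error(x_below)
        self.W += self.lr * np.outer(e, self.x)
        return e
```

Run repeatedly against a fixed neighbor activity, the update shrinks the error geometrically, since each step multiplies it by (1 − lr·‖x‖²); the core learns to predict its neighbor using nothing but what it can see locally.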

The system does not execute instructions. It evolves under fixed local dynamics, with task structure imposed through connectivity, parameters, and boundary conditions. A clamping mechanism sets boundary values; the substrate relaxes toward a state that minimizes prediction error throughout. The architecture is a complete synthesizable digital design — not a simulation of predictive coding but a direct hardware instantiation.
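The clamp-and-relax behavior can be sketched with a linear three-layer chain; `W1`, `W2`, the layer sizes, and the step size `gamma` are illustrative assumptions standing in for the hardwired connectivity and fixed dynamics. The boundary layers are clamped, the middle layer evolves, and the total prediction error (the energy) falls at every step:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical three-layer chain: layer 1 predicts layer 0,
# layer 2 predicts layer 1.
W1 = rng.normal(scale=0.3, size=(5, 4))
W2 = rng.normal(scale=0.3, size=(4, 3))

x0 = rng.normal(size=5)   # clamped boundary ("input")
x2 = rng.normal(size=3)   # clamped boundary ("context/target")
x1 = np.zeros(4)          # free layer: relaxes under fixed dynamics

def energy(x1):
    """Total squared prediction error across the chain."""
    e0 = x0 - W1 @ x1
    e1 = x1 - W2 @ x2
    return 0.5 * (e0 @ e0 + e1 @ e1)

gamma = 0.1               # relaxation step size (assumed)
history = [energy(x1)]
for _ in range(200):
    e0 = x0 - W1 @ x1     # error just below x1
    e1 = x1 - W2 @ x2     # error at x1
    # The update reads only the two adjacent errors: a local rule,
    # yet it performs gradient descent on the global energy.
    x1 = x1 + gamma * (W1.T @ e0 - e1)
    history.append(energy(x1))
```

With the boundaries held fixed, `history` decreases monotonically until the free layer reaches the state that best reconciles what its two neighbors demand of it, which is the relaxation-to-minimum-error behavior described above.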

The structural insight: removing the global error signal does not prevent learning. It changes what learning looks like. Instead of a centralized optimizer computing gradients for every parameter, a distributed substrate settles into solutions through local negotiation. The loss of global coordination is compensated by the gain of parallelism and locality. The hardware does not need to know the whole problem: each core solves its own piece, and the solution emerges from the pieces agreeing with their neighbors.
