A biologically plausible neural network for local supervision in cortical microcircuits

The backpropagation algorithm is an invaluable tool for training artificial neural networks; however, because of its weight-sharing requirement, it does not provide a plausible model of brain function. Here, in the context of a two-layer network, we derive an algorithm for training a neural network that avoids this problem by not requiring explicit error computation and backpropagation. We find that our algorithm empirically performs comparably to backprop on a number of datasets. Furthermore, our algorithm maps onto a neural network that bears a remarkable resemblance to the connectivity structure and learning rules of the cortex.
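To make the weight-sharing requirement concrete, the following is a minimal sketch (not the paper's algorithm) of standard backpropagation in a two-layer network. The hidden-layer error signal is computed with the transpose of the same forward weight matrix `W2`, which is the biologically implausible weight reuse the abstract refers to; all variable names and the toy data are illustrative assumptions.

```python
import numpy as np

# Illustrative two-layer network trained with one step of standard backprop.
# The point of interest is the backward pass, where W2 is reused (transposed).
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3)) * 0.1   # input -> hidden weights
W2 = rng.standard_normal((1, 4)) * 0.1   # hidden -> output weights

def relu(z):
    return np.maximum(z, 0.0)

def forward(W1, W2, x):
    h = relu(W1 @ x)
    return h, W2 @ h

# Toy data (hypothetical, just to run one update)
x = rng.standard_normal((3, 1))
y = np.array([[1.0]])

h, y_hat = forward(W1, W2, x)
loss_before = 0.5 * float((y_hat - y) ** 2)

# Backward pass: the hidden-layer error uses W2.T -- the SAME weights as
# the forward pass. This "weight sharing" (also called the weight transport
# problem) is the step a biologically plausible algorithm must avoid.
err = y_hat - y
delta_h = (W2.T @ err) * (h > 0)         # explicit backpropagated error

lr = 0.1
W2 -= lr * err @ h.T
W1 -= lr * delta_h @ x.T

h2, y_hat2 = forward(W1, W2, x)
loss_after = 0.5 * float((y_hat2 - y) ** 2)
```

The paper's contribution, per the abstract, is to train such a network with local rules that never form `delta_h` from transported forward weights; this sketch only exhibits the problematic step.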