Generalization in a two-layer neural network

Abstract
Generalization in a fully connected two-layer neural network with N input nodes, M hidden nodes, a single output node, and binary weights is studied in the annealed approximation. When the number of examples is of the order of N, the generalization error approaches a plateau and the system is in a permutation-symmetric phase. When the number of examples is of the order of MN, the system undergoes a first-order phase transition to perfect generalization and the permutation symmetry is broken. Results of computer simulations show good agreement with the analytic calculation.
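The architecture described above can be sketched in a few lines. This is a minimal illustration, not the paper's simulation code: the abstract does not specify the transfer functions, so the sketch assumes a common choice for binary-weight two-layer models, a committee machine with ±1 first-layer weights, sign activations in the hidden layer, and a majority vote at the output. The sizes N and M are arbitrary illustrative values (chosen odd so no sign ever evaluates to zero).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes, not the paper's parameters; odd values avoid ties.
N, M = 99, 5

# Binary (+/-1) weights from each of the N inputs to each of the M hidden
# nodes; the output node is assumed to take a simple majority vote.
W = rng.choice([-1, 1], size=(M, N))

def output(x):
    """Network output on a +/-1 input vector x: sign of the sum of the
    M hidden sign units (committee-machine assumption)."""
    hidden = np.sign(W @ x)          # M hidden activations in {-1, +1}
    return np.sign(hidden.sum())     # majority vote of the hidden layer

x = rng.choice([-1, 1], size=N)
print(output(x))  # +/-1
```

Generalization error in this setting would be measured as the probability that a student network with its own binary weights disagrees with such a teacher on a random input.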