This lecture covers the integration over hidden variables in the replica solution and the role of hidden units in the committee machine model, exploring the associated graphical model and the computational gap in learning two-layer neural networks. The instructor presents a rigorous justification of heuristic tools from statistical physics for the committee machine and introduces an approximate message passing (AMP) algorithm. The lecture also discusses the specialization transition, the analysis of the computational gap, and the challenge of determining the optimal generalization error in a perceptron with two extensive layers.
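Since the committee-machine AMP algorithm is only mentioned here in summary, the following is a minimal sketch of the generic AMP iteration it builds on, shown for the simpler linear estimation problem y = Ax + noise with a soft-threshold denoiser. This is not the lecture's algorithm: the committee-machine version (a GAMP-style scheme) replaces the scalar denoiser with channel- and prior-dependent update functions. All function names and parameter values below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(r, lam):
    """Scalar denoiser: soft thresholding (proximal map of the L1 norm)."""
    return np.sign(r) * np.maximum(np.abs(r) - lam, 0.0)

def amp(A, y, lam=0.5, n_iter=30):
    """Generic AMP loop for y = A x + noise, A with i.i.d. N(0, 1/m) entries.

    Illustrates the two ingredients shared with the committee-machine AMP:
    an iterative denoising step and an Onsager correction to the residual.
    """
    m, n = A.shape
    delta = m / n                      # measurement ratio
    x = np.zeros(n)                    # current estimate of the signal
    z = y.copy()                       # Onsager-corrected residual
    for _ in range(n_iter):
        r = x + A.T @ z                # effective noisy observation of x
        x_new = soft_threshold(r, lam)
        # Onsager term: empirical average of the denoiser's derivative
        onsager = np.mean(np.abs(r) > lam) / delta
        z = y - A @ x_new + onsager * z
        x = x_new
    return x

# Toy usage: estimate a sparse signal from Gaussian measurements.
rng = np.random.default_rng(0)
n, m, k = 500, 250, 25
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
A = rng.normal(scale=1.0 / np.sqrt(m), size=(m, n))
y = A @ x_true + 0.01 * rng.normal(size=m)
x_hat = amp(A, y)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

The Onsager correction is what distinguishes AMP from plain iterative thresholding: it decorrelates the residual from past iterates so that the effective observation r behaves like the signal plus Gaussian noise, which is also what makes the algorithm's performance trackable by state evolution, the property the lecture's rigorous analysis exploits.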