We use a neural-network ansatz originally designed for the variational optimization of quantum systems to study dynamical large deviations in classical ones. Specifically, we use recurrent neural networks to describe the large deviations of the dynamical activity of model glasses: kinetically constrained models in two dimensions. We present the first finite-size scaling analysis of the large-deviation functions of the two-dimensional Fredrickson-Andersen model, and explore the spatial structure of the high-activity sector of the South-or-East model. These results provide a new route to the study of dynamical large-deviation functions, and highlight the broad applicability of the neural-network state ansatz across domains in physics.
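As a concrete illustration of the kind of ansatz referred to above (a minimal sketch, not the authors' implementation), the code below builds an autoregressive recurrent network in Python/NumPy that assigns a normalized probability to each binary lattice configuration and can be sampled exactly, which is the basic ingredient such neural-network state ansaetze provide. The lattice size, hidden-state dimension, site ordering, and parameter initialization are illustrative assumptions; in an actual large-deviation calculation the parameters would be optimized variationally rather than left random.

```python
import numpy as np

# Minimal autoregressive RNN over binary occupation variables n_1 ... n_N
# of an L x L lattice read in raster order (illustrative assumptions).
rng = np.random.default_rng(0)
L = 4                 # linear lattice size (assumed, for illustration)
N = L * L             # number of sites
d_h = 16              # hidden-state dimension (assumed)

# Randomly initialized parameters; a variational calculation would optimize
# these against a large-deviation (tilted) objective instead.
W_h = rng.normal(scale=0.1, size=(d_h, d_h))
W_x = rng.normal(scale=0.1, size=(d_h, 2))   # input: one-hot previous occupation
b_h = np.zeros(d_h)
W_o = rng.normal(scale=0.1, size=(2, d_h))   # output logits for n_i in {0, 1}
b_o = np.zeros(2)

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def sample_configuration():
    """Sample a lattice configuration and return it with its exact log-probability."""
    h = np.zeros(d_h)          # recurrent hidden state
    x = np.zeros(2)            # "start token": no previous site yet
    config = np.zeros(N, dtype=int)
    logp = 0.0
    for i in range(N):
        h = np.tanh(W_h @ h + W_x @ x + b_h)
        p = softmax(W_o @ h + b_o)            # conditional p(n_i | n_1 ... n_{i-1})
        n_i = rng.choice(2, p=p)
        config[i] = n_i
        logp += np.log(p[n_i])
        x = np.eye(2)[n_i]                    # feed the sampled site back in
    return config.reshape(L, L), logp

conf, logp = sample_configuration()
print(conf)
print("log P(configuration) =", logp)
```

Because the conditionals are normalized site by site, the product over sites is an exactly normalized distribution, so configurations can be drawn without Markov-chain sampling and their probabilities evaluated directly, which is what makes this ansatz convenient for variational estimates of large-deviation functions.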