How do people rapidly learn rich, structured concepts from sparse input? Recent approaches to concept learning have found success by integrating rules and statistics. We describe a hierarchical model in this spirit in which the rules are stochastic, generative processes, and the rules themselves arise from a higher-level stochastic, generative process. We evaluate this probabilistic language-of-thought model with data from an abstract rule learning experiment carried out with adults. In this experiment, we find novel generalization effects, and we show that the model gives a qualitatively good account of the experimental data. We then discuss the role of this kind of model in the larger context of concept learning.