Language is ordered in time, and an incremental processing system encounters temporary ambiguity in the middle of sentence comprehension. An optimal incremental processing system must solve two computational problems: on the one hand, it has to maintain multiple possible interpretations without choosing one over the others; on the other hand, it must reject interpretations that are inconsistent with the context. We propose a recurrent neural network model of incremental processing that performs stochastic optimization of a set of soft, local constraints to successfully build a globally coherent structure. Bifurcation analysis of the model makes clear when and why the model parses a sentence successfully and when and why it does not; the garden path and local coherence effects are discussed. Our model provides neurally plausible solutions to the computational problems arising in incremental processing.