Deterministic parsing promises to (almost) never backtrack. Neural network technology promises competition and learning capabilities. The marriage of these two ideas is being investigated in an experimental natural language parsing system that combines some of the best features of each. The result is a deterministic parser that learns, generalizes, and supports competition among structures and lexical interpretations. The performance of the parser is being evaluated on predicted as well as unpredicted sentence forms. Several mildly ungrammatical sentences have been successfully processed into structures judged reasonable when compared to their grammatical counterparts. Lexical ambiguities can create problems for traditional parsers, or at least require additional backtracking. With the use of neural networks, such ambiguities can be resolved through the wider syntactic context. The results demonstrate the potential of this approach to parsing.
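To make the idea of competition among lexical interpretations concrete, the following is a minimal sketch, not the system described here, of how evidence from the wider syntactic context can settle an ambiguity without backtracking. All function names, parameter values, and the toy example are illustrative assumptions in the spirit of interactive-activation competition.

```python
# Hypothetical illustration only: competing lexical interpretations of an
# ambiguous word receive support from the syntactic context, and mutual
# inhibition lets a single winner emerge without backtracking.

def compete(interpretations, context_support, steps=20, decay=0.1, gain=0.3):
    """Run a simple winner-take-all competition.

    interpretations : dict of interpretation name -> initial activation
    context_support : dict of interpretation name -> evidence from context
    """
    act = dict(interpretations)
    for _ in range(steps):
        total = sum(act.values())
        new_act = {}
        for name, a in act.items():
            excitation = gain * context_support.get(name, 0.0)
            inhibition = decay * (total - a)   # rival readings suppress each other
            new_act[name] = max(0.0, a + excitation - inhibition)
        act = new_act
    return max(act, key=act.get), act


if __name__ == "__main__":
    # "saw" is ambiguous between a noun and a past-tense verb; a preceding
    # pronoun subject ("I saw ...") lends more support to the verb reading.
    winner, final = compete(
        interpretations={"saw/NOUN": 0.5, "saw/VERB": 0.5},
        context_support={"saw/NOUN": 0.1, "saw/VERB": 0.6},
    )
    print(winner, final)
```

In this toy setting the verb reading accumulates activation while inhibiting the noun reading, so a single interpretation is committed to deterministically rather than explored and abandoned through backtracking.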