We present modifications to Recursive Auto-Associative Memory (RAAM) that increase its robustness and storage capacity. These improvements are achieved by introducing an extra layer into the compressor and reconstructor networks, employing integer rather than real-valued representations, pre-conditioning the weights and presetting the representations to be compatible with them, and using a quickprop modification. Initial studies have shown this method to be reliable for data sets with up to three hundred subtrees.
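
The sketch below is a minimal illustration, not the authors' implementation, of the basic structural change described above: a RAAM-style compressor and reconstructor each given one extra hidden layer. The layer widths, sigmoid activations, and representation size are illustrative assumptions rather than values taken from the paper.

```python
# Minimal sketch (assumed details, not the paper's implementation) of a
# RAAM-style compressor/reconstructor pair, each with one extra hidden layer.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TwoLayerNet:
    """Feed-forward net with a single hidden layer (the 'extra' layer)."""
    def __init__(self, n_in, n_hidden, n_out):
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, x):
        h = sigmoid(x @ self.W1 + self.b1)        # extra hidden layer
        return sigmoid(h @ self.W2 + self.b2)     # output layer

REP = 16        # width of a node representation (assumed)
HIDDEN = 32     # hidden-layer width (assumed)

# The compressor maps two child representations to one parent representation;
# the reconstructor maps a parent representation back to its two children.
compressor = TwoLayerNet(2 * REP, HIDDEN, REP)
reconstructor = TwoLayerNet(REP, HIDDEN, 2 * REP)

# One untrained compress/reconstruct cycle on a random pair of children.
left, right = rng.random(REP), rng.random(REP)
parent = compressor.forward(np.concatenate([left, right]))
left_hat, right_hat = np.split(reconstructor.forward(parent), 2)
print("reconstruction error:", np.mean((left_hat - left) ** 2 + (right_hat - right) ** 2))
```

In an actual RAAM, the two networks are trained jointly as an auto-associator so that reconstructed children match the originals; the sketch only shows the forward pass through the deeper architecture.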