Modern Hopfield networks (HNs) exhibit the properties of a Content Addressable Memory (CAM) that can store and retrieve a large number of memories. They also provide a basis for modelling associative memory in humans. However, the implementation of these networks is often not biologically plausible, as they assume symmetric synaptic connection strengths and rely on functions that require many-body synapses. More biologically realistic versions of Modern HNs have been proposed, although these implementations often still rely on the softmax function. Computing the softmax for a single neuron requires knowledge of the activity of all other neurons, and thus remains biologically implausible to a degree. We present a Modern HN that uses a version of the softmax that can be computed in a more biologically realistic way, and hence moves us closer to a biologically sound model of memory. We also show that the proposed network can learn its connection weights using a local learning rule derived from gradient descent on the energy function. Finally, we verify that the proposed biological network behaves like a Modern HN and explore some of its other interesting properties.
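
For context, the sketch below is a minimal illustration (not taken from this paper) of the standard, non-biological Modern HN retrieval step, in which the state is updated as a softmax-weighted combination of the stored patterns. It shows why computing the softmax for any one unit needs the overlaps with every stored memory at once, which is the non-local computation the abstract refers to; the function names, the inverse temperature `beta`, and the toy data are illustrative assumptions.

```python
import numpy as np

def softmax(z, beta=1.0):
    # Softmax over all entries: each output depends on every input,
    # which is the global (non-local) step noted in the abstract.
    e = np.exp(beta * (z - z.max()))
    return e / e.sum()

def modern_hopfield_update(state, memories, beta=8.0):
    # One retrieval step of a standard Modern Hopfield network:
    # state <- M @ softmax(beta * M.T @ state),
    # where the columns of `memories` are the stored patterns.
    similarities = memories.T @ state          # overlap with each memory
    attention = softmax(similarities, beta)    # needs all overlaps at once
    return memories @ attention

# Toy usage: recover a stored pattern from a noisy cue (illustration only).
rng = np.random.default_rng(0)
memories = rng.standard_normal((64, 5))                   # 5 patterns, dim 64
query = memories[:, 2] + 0.3 * rng.standard_normal(64)    # noisy version of pattern 2
retrieved = modern_hopfield_update(query, memories)
print(np.argmax(memories.T @ retrieved))                  # expected: 2
```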