We complement our previous work (Kropff & Treves, 2007) with the full (nondiluted) solution describing the stable states of an attractor network that stores correlated patterns of activity. The new solution provides a good fit to simulations of a model network storing the feature norms of McRae, Cree, Seidenberg, and McNorgan (2005), experimentally obtained combinations of features representing concepts in semantic memory. We discuss three ways to improve the storage capacity of the network: adding uninformative neurons, removing informative neurons, and introducing popularity-modulated Hebbian learning. We show that if the strength of synapses is modulated by an exponential decay of the popularity of the presynaptic neuron, any distribution of patterns can be stored and retrieved with approximately optimal storage capacity, C_min ∝ I_f p; that is, the minimum number of connections per neuron needed to sustain the retrieval of a pattern is proportional to the information content of the pattern multiplied by the number of patterns stored in the network.
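The popularity-modulated Hebbian rule described above can be illustrated with a minimal sketch. The setup below is hypothetical (random binary patterns rather than the McRae et al. feature norms, and an arbitrary decay constant `beta`); it shows only the qualitative idea that a covariance-style Hebbian weight is scaled by an exponential decay of the presynaptic neuron's popularity, so synapses from highly popular (uninformative) neurons are weakened.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 200, 50          # neurons, stored patterns (hypothetical sizes)
f = 0.1                 # mean sparseness of the patterns

# Random binary patterns standing in for correlated feature norms
patterns = (rng.random((P, N)) < f).astype(float)

# Popularity a_i: fraction of patterns in which neuron i is active
a = patterns.mean(axis=0)

# Exponential popularity modulation of the presynaptic factor
# (assumed functional form; beta is an arbitrary decay constant)
beta = 5.0
g = np.exp(-beta * a)   # close to 1 for rare neurons, small for popular ones

# Covariance (Hebbian) learning rule, with g applied presynaptically
J = np.zeros((N, N))
for mu in range(P):
    J += np.outer(patterns[mu] - a, (patterns[mu] - a) * g)
np.fill_diagonal(J, 0.0)  # no self-connections
```

Because the modulation acts only on the presynaptic side, the resulting weight matrix is generally asymmetric; this is one simple way to realize the exponential-decay modulation, not the paper's exact prescription.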