A little more on how memories are stored, with notes on homeostasis.
Neuroscience is in a golden age, where age-old problems are yielding to new technologies and accumulated insight. Where are memories stored? Well, in the brain; we have known that for a long time (but not forever!). Then came the beautiful discovery of neurons, as investigated and drawn by Santiago Ramón y Cajal. More recently came the realization that memories reside in "engrams": patterns of neurons created by coordinated firing, which can later be re-activated in full by firing only some fraction of the participating neurons. Does that mean memories are stored in individual synapses, the connections between neurons? Not really. Synapses apparently turn over quite frequently, so long-term memories need a more stable substrate, such as a set of neurons whose synaptic connections, while plastic and periodically replaced, are kept generally consistent, perhaps by periodic re-activation of the engram or by other means.
A neuron, in the artistry of Santiago Ramón y Cajal, complete with plenty of dendrites studded with spines (tiny nubs), the sites where synapses are constructed to receive incoming signals. The main output (axon) is probably the slightly thicker process going upward.
A recent paper explored by simulation how such a system would work. The reigning paradigm is Hebbian, per Donald Hebb: neurons that fire together strengthen their connections, while others do not. Out of a random matrix of neurons, this rule would create engram-like assemblies with stronger mutual connections, which get re-activated when the original stimulus comes along. But there is a problem: under such a simple rule, activation would be unbounded, creating runaway assemblies that fire constantly and gradually take over the whole brain. There has to be a homeostatic mechanism that balances out the firing characteristics so that the memory is "hidden", in the sense that it does not produce unusual activation unless its specific activating inputs come along.
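To make the runaway concrete, here is a minimal sketch, in Python, of pure Hebbian plasticity in a small rate-based network. This is my own toy, not the paper's model, and every parameter value is an arbitrary choice for the demonstration:

```python
# Pure Hebbian plasticity with nothing to oppose it: weights only grow.
# A toy illustration, not the model from the paper discussed here.
import numpy as np

rng = np.random.default_rng(0)
N = 100                          # neurons
W = rng.random((N, N)) * 0.01    # weak random connectivity
np.fill_diagonal(W, 0.0)         # no self-connections
eta = 0.01                       # learning rate

rates = rng.random(N) * 0.1
for t in range(501):
    drive = W @ rates + rng.normal(0.0, 0.1, N)   # recurrent input plus noise
    rates = np.tanh(drive).clip(0)                # bounded, non-negative firing rates
    W += eta * np.outer(rates, rates)             # Hebb: fire together, wire together
    np.fill_diagonal(W, 0.0)
    if t % 100 == 0:
        print(f"t={t:3d}  mean rate={rates.mean():.3f}  mean weight={W.mean():.4f}")
```

Run it and the feedback loop is obvious: stronger weights drive higher rates, which strengthen the weights further, until the whole network is firing flat-out.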
That is what this paper explores, modeling a neuron/synapse system that is notionally auto-balancing: even as it increases the internal connectivity of the engram neurons to represent the memory, it reduces their connections with outside neurons. The net effect is that spontaneous firing over the (modeled) tissue as a whole, driven by random inputs, is unchanged, even though a memory trace is resident and can be called up at any time with a specially coordinated set of inputs.
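The compensating step can be caricatured as synaptic normalization. The sketch below is my own illustration of the idea, not the rule the paper actually implements: within-engram weights are potentiated, and then each engram neuron's inputs are rescaled so that its total incoming weight is unchanged.

```python
# Hebbian potentiation within an engram, followed by a homeostatic
# rescaling that conserves each neuron's total synaptic input.
# An illustration of the concept, not the paper's actual learning rule.
import numpy as np

rng = np.random.default_rng(1)
N = 100
W = rng.random((N, N)) * 0.01            # W[i, j] = weight from neuron j to neuron i
np.fill_diagonal(W, 0.0)

engram = rng.choice(N, size=10, replace=False)   # neurons tagged by coordinated firing
inside = np.ix_(engram, engram)

totals_before = W.sum(axis=1)            # total incoming weight per neuron
W[inside] *= 3.0                         # Hebbian strengthening within the engram

# Homeostatic step: rescale each engram neuron's inputs so its summed drive
# matches the pre-learning total. Internal synapses keep their relative
# advantage; connections to the outside world shrink to pay for it.
for i in engram:
    W[i] *= totals_before[i] / W[i].sum()

outside = np.setdiff1d(np.arange(N), engram)
print("mean weight, engram -> engram: ", W[inside].mean())
print("mean weight, outside -> engram:", W[np.ix_(engram, outside)].mean())
print("total input per engram neuron conserved:",
      np.allclose(W[engram].sum(axis=1), totals_before[engram]))
```

Qualitatively, this reproduces the picture described above: internal connectivity up, connectivity to the rest of the network down, total drive (and hence spontaneous activity) unchanged.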
A modeled set of neurons, including an engram. At top, each neuron is represented as a dot, with its activation plotted over time on the X axis; the result looks a bit like old-fashioned TV snow, as do most brain signals. The black bars mark the activation of a memory engram, whose neurons fire a bit more than average (green line) and become slightly better connected to each other. This is immediately followed by a compensating decrease in connectivity, which kicks in to isolate the engram somewhat from the surrounding network, rebalancing overall activity back to the snow pattern. Panel C shows an overall decrease in synapses as the compensation mechanism operates, and Panel D shows connectivity: increased among the engram cells (green), decreased slightly between the engram and other cells (gray), and unchanged on average overall (blue).
The researchers find this a pretty realistic portrayal and model of what might be going on IRL, as it were. The mechanism, as stated at the top, is not a single synapse but a whole network of connectivity among some (small) set of neurons, carried out by transcriptional programs, new protein synthesis, and the development of new synapses here and the loss of synapses there, among many connected neurons. It is maintained by repeated recall and re-activation, as happens during dreaming, and conversely can degrade and be forgotten in the absence of such reinforcement, as new engrams are written into the same tissue. What is the size of such engram ensembles, and how are neurons selected to participate? Hebbian theory posits that the most strongly active neurons would be pulled into the network automatically at the moment of memory creation. Given a rather even, snow-like pattern of default activity in neural tissue, such recruitment would naturally be distributed evenly as well, avoiding existing engrams (due to their reduced outside connectivity) unless some explicit similarity were present in the new pattern. The researchers' model had 10,000 neurons, of which a random 1,000 were used to activate memory ensembles (the model also had 2,500 inhibitory neurons interlaced, for homeostasis).
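Pattern completion from a partial cue, the defining trick of an engram, can be shown with an even simpler toy: a Hopfield-style caricature, nothing like the paper's 10,000-neuron model. Cueing under a third of the engram's neurons is enough to light up all of them, while the rest of the network stays quiet:

```python
# Toy pattern completion: activate a fraction of an engram and let
# recurrent excitation recruit the rest. A Hopfield-style caricature.
import numpy as np

rng = np.random.default_rng(2)
N = 200
engram = rng.choice(N, size=20, replace=False)

W = np.zeros((N, N))
W[np.ix_(engram, engram)] = 1.0          # strong mutual connections within the engram
np.fill_diagonal(W, 0.0)

state = np.zeros(N)
cue = rng.choice(engram, size=6, replace=False)   # re-activate ~30% of the engram
state[cue] = 1.0

theta = 3.0                              # firing threshold
for _ in range(5):
    state = (W @ state > theta).astype(float)     # fire if input exceeds threshold

print("engram neurons active:    ", int(state[engram].sum()), "of", len(engram))
print("non-engram neurons active:", int(state.sum() - state[engram].sum()))
```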
Lastly, there is the fascinating question of generalization. Memories are useful only if they are not too specific. If they can be recalled only under the precise conditions of their creation, there is hardly much point to keeping them. If they are triggered by every other, or loosely related, experience, however, you have PTSD, which is also not good. The system needs to strike a balance so that a memory can be recalled by a sufficiently similar experience or situation, such that it can usefully inform some choice the organism faces in a novel environment, or be recalled at a dinner party without too much re-enactment being required. This is another area of homeostasis, by which the brain keeps things humming by carefully balancing competing needs and processes. A bit of recent work found two genes that are expressed in different cells within memory engram collections, marking the apparently competing subsets as either prone to generalization (Fos), by receiving excitatory inputs, or prone to restriction/specialization (Npas4), by receiving inhibitory inputs. Thus each engram is composed of competing components, all in the ultimate interest of balance, so that we have useful access to important memories, forget unimportant ones, and are not overwhelmed with traumatic ones.
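The same toy makes the generalization trade-off visible. Here the firing threshold stands in, very crudely, for the excitatory/inhibitory balance described above (my illustration only; the Fos/Npas4 biology is surely subtler). A permissive threshold lets a vague, partial cue recall the whole memory; a strict one demands a near-exact match:

```python
# Sweep how complete a cue must be before the full engram re-activates,
# at a permissive and at a strict firing threshold. A crude stand-in for
# the excitation/inhibition balance, not a model of Fos/Npas4 themselves.
import numpy as np

rng = np.random.default_rng(3)
N, K = 200, 20
engram = rng.choice(N, size=K, replace=False)
W = np.zeros((N, N))
W[np.ix_(engram, engram)] = 1.0
np.fill_diagonal(W, 0.0)

def recalled(n_cue, theta):
    """Cue n_cue engram neurons; return True if the full engram re-activates."""
    state = np.zeros(N)
    state[rng.choice(engram, size=n_cue, replace=False)] = 1.0
    for _ in range(5):
        state = (W @ state > theta).astype(float)
    return state[engram].sum() == K

for theta in (2.0, 8.0):                 # generalizing vs specific regime
    smallest = min(n for n in range(1, K + 1) if recalled(n, theta))
    print(f"threshold {theta}: smallest cue that recalls the memory = {smallest}")
```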
- Pay people to be deprogrammed?
- Trumpists very strong? More like wet noodles.
- Top Republicans, compared.
- We have not exactly defunded the police.
- More on covid infectiousness.
- More details on the job guarantee in MMT.
- Graph of the week... our emissions keep going up, not down.