We’ve all been there: cramming for a college final. Whether it’s flashcards, notes scribbled frantically on a legal pad, or a late-night study session with a classmate in a deserted cafe, staying up late to fill our brains with every last date, equation, and theory remains a universal constant.
What most college students understand less well, though, is the science behind the brain’s ability to recall facts through repetition: a property of the brain’s neural connections that scientists call “synaptic plasticity.” This process, the foundation of the brain’s ability to learn, relies on synapses, the junctions between neurons, which can strengthen or weaken over time.
And yet scientists have long struggled to measure how strong individual synapses are and how much that strength changes. Now Salk researchers have announced a new technique for measuring synaptic strength, the precision of plasticity, and the amount of information synapses can store. It’s a breakthrough that could shine new light on learning, memory, aging, and even disease.
Methodology
The research, published in the MIT Press journal Neural Computation, could mark a turning point in neuroscience.
“We’re getting better at identifying exactly where and how individual neurons are connected … but we still have a lot to learn about the dynamics of those connections,” Professor Terrence Sejnowski, senior author of the study and holder of the Francis Crick Chair at Salk, explained. “We have now created a technique for studying the strength of synapses, the precision with which neurons modulate that strength, and the amount of information synapses are capable of storing — leading us to find that our brain can store 10 times more information than we previously thought.”
The Salk research team leveraged information theory, a mathematical framework for quantifying how information is stored and transmitted, to study pairs of synapses in the hippocampus of rats. Because the synapses in each pair shared an identical activation history, plasticity should have adjusted them in the same way; if the two had indeed changed in strength by the same amount, the researchers could conclude that the modulation was precise.
“Using information theory, we divided synapses by strength into 24 categories and compared special synapse pairs to determine the precision of their strength modulation,” Mohammad Samavat, the study’s first author and a postdoctoral researcher in Sejnowski’s lab, added. “We were excited to find that the pairs had very similar dendritic spine sizes and synaptic strengths, meaning the brain is highly precise when it makes synapses weaker or stronger over time.”
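To make the pair-comparison idea concrete, here is a minimal sketch in Python. The spine-size numbers are entirely synthetic, and the precision measure (within-pair difference versus population-wide spread) is a simplified stand-in for the information-theoretic analysis described in the paper, but it captures the logic: synapses with the same activation history should end up nearly the same size if plasticity is precise.

```python
import numpy as np

# Synthetic stand-in data: spine sizes (a proxy for synaptic strength) for pairs
# of synapses that share the same axon and dendrite, and therefore the same
# activation history. Real values would come from 3D reconstructions of tissue.
rng = np.random.default_rng(seed=0)
n_pairs = 200
target = rng.lognormal(mean=-2.0, sigma=0.8, size=n_pairs)  # one "intended" strength per pair
jitter = 0.05                                               # small assumed within-pair discrepancy
spine_a = target * rng.lognormal(0.0, jitter, size=n_pairs)
spine_b = target * rng.lognormal(0.0, jitter, size=n_pairs)

# Precision check: how different are the two members of a pair, compared with
# how widely strengths vary across the whole population?
within_pair = np.median(np.abs(spine_a - spine_b) / ((spine_a + spine_b) / 2))
all_spines = np.concatenate([spine_a, spine_b])
across_population = np.std(all_spines) / np.mean(all_spines)

print(f"median within-pair difference: {within_pair:.1%} of the pair mean")
print(f"population-wide variability:   {across_population:.1%} (coefficient of variation)")
# A within-pair difference far smaller than the population spread means plasticity
# set both synapses to nearly the same strength, i.e., the modulation is precise.
```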
Rethinking What We Thought We Knew
Using this approach, the Salk researchers discovered that synapses hold more information than expected. Specifically, synaptic strengths fell into 24 distinguishable categories, and each synapse was found to store between 4.1 and 4.6 bits of information, redefining what we thought we knew about the brain’s storage capacity.
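One way to see where a figure in that range comes from: 24 reliably distinguishable strength levels correspond to at most log2(24) ≈ 4.58 bits per synapse, and the Shannon entropy of the actual distribution across those levels can only be lower. The short Python sketch below illustrates the arithmetic with made-up counts rather than the study’s data.

```python
import numpy as np

# If synaptic strength can take one of 24 reliably distinguishable levels, and
# every level is used equally often, a synapse carries log2(24) bits at most.
print(f"Upper bound: log2(24) = {np.log2(24):.2f} bits")  # about 4.58 bits

# In practice the levels are not used equally often. The Shannon entropy of the
# observed distribution over levels gives the information actually stored, which
# is how a figure somewhat below the upper bound can arise. These counts are
# invented for illustration; the study derived its numbers from measured synapses.
counts = np.array([3, 5, 9, 14, 20, 27, 33, 38, 41, 42, 41, 38,
                   33, 27, 20, 14, 9, 5, 3, 2, 1, 1, 1, 1], dtype=float)
p = counts / counts.sum()
entropy_bits = -np.sum(p * np.log2(p))
print(f"Entropy of this hypothetical distribution: {entropy_bits:.2f} bits")
```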
Compared with traditional techniques, the researchers say, the new method is more thorough and more scalable, allowing larger and more diverse datasets to be examined.
“This technique is going to be a tremendous help for neuroscientists,” Kristen Harris, a University of Texas professor and co-author, said. “Having this detailed look into synaptic strength and plasticity could really propel research on learning and memory, and we can use it to explore these processes in all different parts of human brains, animal brains, young brains, and old brains.”
These findings could also have implications beyond basic neuroscience. They could help inform large-scale research efforts such as the National Institutes of Health’s BRAIN Initiative, which published a human brain cell atlas last October, and they could help future researchers better understand neurodegenerative disorders such as Alzheimer’s.
As other researchers adopt the method, it could open a window into how the brain learns and stores information, and how those abilities change as we age.