New Research Applies Theoretical Computer Science to the Brain
How does the human brain work? It's a question that has stymied scientists for centuries. For the past few years, College of Computing Professor Santosh Vempala has been exploring the question using theoretical computer science.
“This is one of the most fundamental questions in science,” Vempala said.
His starting point is the observation that the brain is able to do many things computers still cannot — and more reliably and robustly. For example, humans are great at pattern recognition and generalization, from identifying letters to walking on new surfaces, after just a few “training” examples.
A particularly striking example is language. After hearing relatively few sentences, a neurotypical 2-year-old can understand and generate virtually infinite correct sentences.
“How is the brain able to do this?” Vempala asks. “How does the mind emerge from the brain? From neurons and synapses, how do we get to perception, language, and stories?”
Computers, however, need large amounts of data and computational power, and work only in limited settings, at least for now. That gap is why CAPTCHAs became a popular way to keep applications secure: they pose tasks that are easy for people but still hard for machines.
While machine learning draws inspiration from the brain, it approaches the problem mostly from a data perspective. Vempala, however, wants to develop an algorithmic theory of brain function. He plans to work at an intermediate scale: above individual neurons and synapses, but below the brain as a whole.
Vempala and Columbia Professor Christos Papadimitriou hypothesize that assemblies, large sets of densely interconnected neurons, are the engine of brain computation. Assemblies make a strong basis for a computational system because they can carry out higher-level operations while also being compiled down to tangible units such as neurons and synapses.
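To make the idea concrete, here is a minimal simulation sketch, in Python, of one assembly-style operation in the spirit of this model: neurons in a brain area receive input, only the most strongly activated ones fire (a winner-take-all "cap"), and the synapses that contributed are strengthened (Hebbian plasticity). The parameter names and values (n, k, p, beta) are illustrative choices for this sketch, not the researchers' actual code or settings.

```python
import numpy as np

rng = np.random.default_rng(0)

n, k = 1_000, 50        # neurons in the area, assembly size (the "cap")
p, beta = 0.05, 0.10    # connection probability, Hebbian plasticity increment

# Sparse random recurrent synapses within one brain area (0/1 weights to start).
weights = (rng.random((n, n)) < p).astype(float)

def project(firing, weights, rounds=10):
    """Fire the current winners into the area, round after round."""
    for _ in range(rounds):
        inputs = weights[:, firing].sum(axis=1)       # total input to every neuron
        winners = np.argsort(inputs)[-k:]             # cap: only the top-k neurons fire
        weights[np.ix_(winners, firing)] *= 1 + beta  # Hebbian: strengthen the synapses that fired
        firing = winners
    return firing

stimulus = rng.choice(n, size=k, replace=False)       # an external stimulus fires k neurons
assembly = project(stimulus, weights)
print("assembly formed:", len(assembly), "neurons")
```

Run round after round, the same small set of neurons tends to keep winning; that stable set is the assembly, a unit of computation that sits above single neurons but well below the whole brain.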
This has been an ongoing project: in spring 2018, the two co-organized a Simons Institute research semester, The Brain and Computation, and in recent work they have shown that assemblies can explain several experimental findings from neuroscience.
Their research, currently funded by a $500,000 National Science Foundation grant, aims to develop a theory of how the brain functions from a computational perspective. It is a collaborative effort with computer scientists at Columbia and cognitive scientists at the City University of New York, who will also contribute experimental studies.
The research addresses the assembly hypothesis through five goals:
- Expanding the modeling and mathematical analysis techniques used to study assembly computation
- Developing more accurate and efficient simulation methodology
- Exploring assemblies’ computational power through new modes outside formal computation, such as pattern completion, learning, and prediction (a small pattern-completion sketch follows this list)
- Modeling and algorithmically investigating how the dynamics and biases of synaptic connectivity affect the various modes of brain computation
- Creating and analyzing functional magnetic resonance imaging experiments and electrocorticography data through new algorithmic and machine learning techniques
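Pattern completion, mentioned in the third goal, can be illustrated with a sketch built on the same simplified setup as the earlier one: once plasticity has strengthened the synapses inside an assembly, firing only part of it tends to re-activate most of the rest. Again, all parameters are arbitrary demo values, not the project's own.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, p, beta = 1_000, 50, 0.10, 0.10              # illustrative values only

weights = (rng.random((n, n)) < p).astype(float)   # random recurrent synapses

def top_k_firing(firing):
    """One firing round: the k neurons receiving the most input fire next."""
    return np.argsort(weights[:, firing].sum(axis=1))[-k:]

# Form an assembly: repeated firing plus Hebbian strengthening of used synapses.
firing = rng.choice(n, size=k, replace=False)
for _ in range(20):
    winners = top_k_firing(firing)
    weights[np.ix_(winners, firing)] *= 1 + beta
    firing = winners
assembly = set(firing.tolist())

# Pattern completion: cue with only half of the assembly and fire once.
cue = rng.choice(sorted(assembly), size=k // 2, replace=False)
recovered = set(top_k_firing(cue).tolist())
print(f"half cue re-activated {len(recovered & assembly)} of {k} assembly neurons")
```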