Researcher Looks to Future of Computing through Human Visual Cortex
Computers are often compared to the human brain. While computers can operate faster than the brain, the brain is vastly more energy-efficient. That efficiency gap is a key reason the brain remains a source of inspiration for scientists.
One of these scientists is Nabil Imam, an assistant professor in Georgia Tech’s School of Computational Science and Engineering (CSE). Imam not only studies the possibilities of brain-inspired computing, he also believes the key to closing this efficiency gap lies in the same parts of the brain that let us see our world.
Imam posits that by modeling the human visual cortex in computer hardware and software, computers can become both more efficient and more powerful. If achieved, this idea could transform the future of computer manufacturing and programming.
Imam presented his research observations Feb. 3 at a summit hosted by Georgia Tech’s Center for Research in Novel Compute Hierarchies (CRNCH). Imam’s presentation included an overview of the neural circuits within the brain, simulations that have modeled the visual cortex, and designs for silicon chips that emulate brain architecture.
The highlight of Imam’s seminar was his presentation of a microchip architecture he and other researchers have designed to function like the brain’s visual cortex. The system Imam presented has simulated hundreds of millions of neurons and tens of billions of synapses, a step toward making brain-inspired computing a reality.
“Multi-chip systems have been built using these chips to simulate 100 million neurons and 25 billion synapses in real time,” Imam said. “These chips are very efficient platforms for simulating biological neural networks.”
To open his seminar, Imam showed circuit connectivity and neural response properties of the primary visual cortex, an area of the brain involved in seeing the world.
The primary visual cortex is one of the most extensively studied areas of the brain. In primates, it consists of six layers of brain cells totaling about 300 million neurons and 300 billion synapses. Even so, the sheer scale and complexity involved still make it difficult to understand.
Imam then discussed computer simulations of the visual system during his presentation. The example Imam used simulates a visual cortex segment of 230,000 neurons, a small sliver of the structure but one that shows promise.
“This is a very small simulation, but it is a very detailed one. The models are based on extensive data curated from years of measurements,” said Imam. “With the right kind of computing platform, we can scale this up and simulate larger portions of the circuit and its interactions with other areas of the brain.”
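Simulations like the one Imam describes typically advance a network of spiking neurons in small time steps. The sketch below is a minimal, illustrative leaky integrate-and-fire network in that spirit; the neuron count, connectivity, and all parameter values are assumptions for demonstration, not values from Imam’s model.

```python
# Illustrative spiking-network simulation sketch (not Imam's actual model).
# All parameters below are assumed values chosen for demonstration.
import random

random.seed(0)

N = 1000            # neurons here; the simulation Imam cites used 230,000
DT = 0.001          # 1 ms time step
TAU = 0.02          # membrane time constant (s), assumed
V_THRESH = 1.0      # spike threshold
V_RESET = 0.0       # potential after a spike

# Sparse random connectivity: each neuron projects to 10 random targets.
weights = {i: [(random.randrange(N), 0.1) for _ in range(10)] for i in range(N)}

v = [0.0] * N        # membrane potentials
spike_counts = [0] * N

for step in range(100):                      # simulate 100 ms
    # Detect and reset neurons that crossed threshold on the last step.
    spiked = [i for i in range(N) if v[i] >= V_THRESH]
    for i in spiked:
        v[i] = V_RESET
        spike_counts[i] += 1
    # Deliver synaptic input from the neurons that just spiked.
    input_current = [0.0] * N
    for i in spiked:
        for target, w in weights[i]:
            input_current[target] += w
    # Leaky integration toward a constant suprathreshold drive plus input.
    for i in range(N):
        drive = 1.2 + input_current[i]
        v[i] += DT / TAU * (drive - v[i])

total_spikes = sum(spike_counts)
print("total spikes in 100 ms:", total_spikes)
```

Even this toy version makes the scaling problem concrete: every added neuron multiplies the per-step synaptic bookkeeping, which is why detailed models of even small cortical slivers demand substantial computing platforms.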
Because of the sheer computing power that brain simulations require, scaling is a significant obstacle that researchers like Imam are working to overcome.
Simulating one second of a small neural circuit requires hours of computing time. With current computer architectures, it would require speeds measured in exaflops and memory spanning petabytes to achieve a simulation of the human brain.
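A back-of-envelope calculation shows where those exaflop and petabyte figures come from. The neuron and synapse counts below are commonly cited estimates for the human brain, and the per-update costs are assumptions for illustration; none of these numbers come from Imam’s talk.

```python
# Rough estimate of what a real-time whole-brain simulation might demand.
# All figures are assumed or commonly cited estimates, for illustration only.
SYNAPSES = 1e14                 # ~100 trillion synapses (common estimate)
UPDATES_PER_SEC = 1000          # assume 1 ms time steps, run in real time
FLOPS_PER_SYNAPSE_UPDATE = 10   # assumed arithmetic cost per synaptic update

flops = SYNAPSES * UPDATES_PER_SEC * FLOPS_PER_SYNAPSE_UPDATE
print(f"compute: ~{flops / 1e18:.0f} exaflops")

BYTES_PER_SYNAPSE = 8           # assume one 8-byte weight per synapse
memory = SYNAPSES * BYTES_PER_SYNAPSE
print(f"memory: ~{memory / 1e15:.1f} petabytes")
```

Under these assumptions the arithmetic alone lands at roughly an exaflop of sustained compute and the better part of a petabyte of memory, which matches the order of magnitude described above.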
At present, the peak speed of the world’s fastest supercomputer, Frontier, is 1.102 exaflops. It is the first, and currently the only, computer to reach exascale speeds. To do this, Frontier requires 7,300 square feet of space, consumes 21 megawatts of power, and pumps 6,000 gallons of water a minute to keep itself cool. This shows how far computers still must go before they can simulate the brain.
However, progress is being made. The chip Imam discussed at CRNCH Summit 2023 uses specialized integrated circuits to model neurons and their networks found in the brain and visual cortex.
To address the scaling challenge, the computer chips were scaled up via multi-chip platforms to simulate hundreds of millions of neurons and tens of billions of synapses, approaching scales of complex cortical circuitry.
Simulations on this multi-chip platform are orders of magnitude larger than the 230,000-neuron example Imam presented earlier.
These chips operate in real time, so one second of brain activity equals one second of computing. These chips are also smaller than a square inch and consume less than one watt of power. The chips also can be integrated with sensors and actuators to interact with the environment.
Much of Imam’s research remains theoretical, and the work is ongoing. But as Imam showed in his CRNCH Summit 2023 seminar, the intersection of computing and neuroscience is growing rapidly, and the future of this technology appears bright.
“The goal is to develop computational and analytical methods that will help us understand the behavior of these models,” Imam said. “These insights can then be used to develop new classes of computer systems and new models of computation.”