A new brain-chip will transform computers and usher in the coming age of AI. A tiny super-chip developed by IBM and Cornell Tech, and funded by DARPA, the U.S. Defense Advanced Research Projects Agency, is claimed by its designers to mimic the human brain in its ability to process information. Called a neurosynaptic chip, it is expected to be a key player in the age of cognitive computing.
The engineers who designed the chip named it “TrueNorth,” a nod to its role in navigating towards the future of computers. The neurosynaptic processor packs one million “neurons” and 256 million “synapses” onto a single chip the size of a postage stamp. The chip was created using Samsung’s 28-nanometer process technology. In addition to IBM and Cornell Tech, computer engineers from around the world collaborated on the chip. Its makers describe it as something entirely new in the computing world. Dharmendra Modha of IBM said, “We have taken inspiration from the cerebral cortex to design this chip.”
The cerebral cortex is the outermost layer of tissue in the brain, commonly known as gray matter. It plays a crucial role in memory, awareness and thought. The brain is divided into two hemispheres by a deep groove. The left hemisphere is more logical and analytical; it is the language and math center of the brain. The right hemisphere is more intuitive and creative; it is responsible for art and music.
In the past, computers have functioned more like the left side of the brain – crunching bits and bytes of binary code into strings of information. The researchers who developed TrueNorth want it to mimic right-brain functions and patterns. The neurosynaptic chip is capable of sensory processing and can learn from its environment. The chip can take in sights, sounds and smells, along with other information, and process them much the way human brains process environmental data to make decisions. This chip may be the first real step toward building an artificial intelligence. This new “brain-chip” will transform computers and usher in an age of AI unlike anything the world has seen yet.
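To make the “neurons” and “synapses” concrete, here is a minimal sketch of a leaky integrate-and-fire neuron, the kind of simplified spiking model that neuromorphic chips like TrueNorth emulate in silicon. This is an illustrative toy, not TrueNorth’s actual circuit; the weight, leak and threshold values are arbitrary choices for demonstration.

```python
# Illustrative leaky integrate-and-fire neuron. A "synapse" weights each
# incoming spike; the neuron's potential integrates inputs, leaks over time,
# and emits an output spike when it crosses a threshold. All parameters
# are hypothetical values chosen for demonstration only.

def simulate_neuron(input_spikes, weight=0.3, leak=0.05, threshold=1.0):
    """Return the output spike train for a train of binary input spikes."""
    potential = 0.0
    output = []
    for spike in input_spikes:
        potential += weight * spike                 # synapse: weighted input
        potential = max(0.0, potential - leak)      # leak: potential decays
        if potential >= threshold:                  # fire and reset
            output.append(1)
            potential = 0.0
        else:
            output.append(0)
    return output

# A steady burst of input spikes eventually drives the neuron to fire.
print(simulate_neuron([1, 1, 1, 1, 0, 1, 1, 1, 1]))
# → [0, 0, 0, 1, 0, 0, 0, 0, 1]
```

Unlike a conventional processor stepping through instructions, such neurons sit idle until spikes arrive – one reason event-driven chips can run on so little power.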
The chip may allow cars to drive themselves, or smartphones to be even smarter – possibly even smarter than their users. It could create robots that take in enough sensory information to function fully without human aid. The chip is a breakthrough in computing technology because it represents a completely new way of designing a processor. Under the DARPA program, IBM set out to engineer a computer chip that can recognize spatial and temporal patterns and react to changing environments.
One main advantage of the chip is its energy requirements – or lack thereof. The chip runs on roughly the power of a hearing aid battery. It processes between 46 billion and 400 billion “synaptic” operations per second per watt of energy. By comparison, the most energy-efficient conventional computers to date process about 4.5 billion calculations per second per watt. Shawn Han of Samsung Electronics states, “It is an astonishing achievement to leverage a process traditionally used for commercially available, low-power mobile devices to deliver a chip that emulates the human brain by processing extreme amounts of sensory information with very little power.”
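A quick back-of-envelope calculation, using only the figures quoted above, shows how large the claimed efficiency gap is:

```python
# Efficiency ratio implied by the article's figures (operations per watt).
truenorth_low = 46e9    # low end: "synaptic" operations per second per watt
truenorth_high = 400e9  # high end of the quoted range
conventional = 4.5e9    # best prior system, calculations per second per watt

print(f"Low-end advantage:  {truenorth_low / conventional:.1f}x")
print(f"High-end advantage: {truenorth_high / conventional:.1f}x")
```

By these numbers, TrueNorth would be roughly 10 to 90 times more energy-efficient than the best conventional hardware – though “synaptic operations” and conventional “calculations” are not strictly comparable units, so the ratio is only a rough guide.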
The other main advantage is that it eliminates the need for a network connection. The tiny chip performs supercomputer-like functions in real time without being connected to the internet or the cloud. It is also autonomous in its ability to process information. According to DARPA, current computers are limited by their reliance on human-derived algorithms to describe and analyze data. DARPA is interested in developing a computer with biological-like systems that can assess and prioritize environmental information on its own and react appropriately. Such a chip could give unmanned aircraft, vehicles and robots a more refined perception of their environments.
Although the chip may be a giant leap forward, it is still in the developmental stage and not ready for commercial applications. This tiny chip may be the future of computers, phones and cars, as well as military technology, because it works in a way that mimics the human brain. The new “brain-chip” will transform computers and usher in AI, which may be a whole lot closer than people realize.
By: Rebecca Savastio