Computers have come so far in terms of their power and potential, rivaling and even eclipsing human brains in their ability to store and crunch data, make predictions and communicate. But there is one domain where human brains continue to dominate: energy efficiency.
“The most efficient computers are still approximately four orders of magnitude — that’s 10,000 times — higher in energy requirements compared to the human brain for specific tasks such as image processing and recognition, although they outperform the brain in tasks like mathematical calculations,” said UC Santa Barbara electrical and computer engineering Professor Kaustav Banerjee, a world expert in the realm of nanoelectronics. “Making computers more energy efficient is crucial because the worldwide energy consumption by on-chip electronics stands at #4 in the global rankings of nation-wise energy consumption, and it is increasing exponentially each year, fueled by applications such as artificial intelligence.” Additionally, he said, the problem of energy-inefficient computing is particularly pressing in the context of global warming, “highlighting the urgent need to develop more energy-efficient computing technologies.”
Neuromorphic (NM) computing has emerged as a promising way to bridge the energy efficiency gap. By mimicking the structure and operations of the human brain, where processing occurs in parallel across an array of low-power neurons, it may be possible to approach brain-like energy efficiency. In a paper published in the journal Nature Communications, Banerjee and co-workers Arnab Pal, Zichun Chai, Junkai Jiang and Wei Cao, in collaboration with researchers Vivek De and Mike Davies from Intel Labs, propose such an ultra-energy-efficient platform, using 2D transition metal dichalcogenide (TMD)-based tunnel-field-effect transistors (TFETs). Their platform, the researchers say, can bring the energy requirements to within about two orders of magnitude (roughly 100 times) of those of the human brain.
Leakage currents and subthreshold swing
The concept of neuromorphic computing has been around for decades, though the research around it has intensified only relatively recently. Advances in circuitry that enable smaller, denser arrays of transistors, and therefore more processing and functionality for less power consumption, are just scratching the surface of what can be done to enable brain-inspired computing. Add to that the appetite generated by its many potential applications, such as AI and the Internet of Things, and it is clear that the range of hardware platforms for neuromorphic computing must be expanded in order to move forward.
Enter the team’s 2D tunnel transistors. Emerging out of Banerjee’s longstanding research efforts to develop high-performance, low-power transistors to meet the growing hunger for processing without a matching increase in power requirements, these atomically thin, nanoscale transistors are responsive at low voltages and, as the foundation of the researchers’ NM platform, can mimic the highly energy-efficient operations of the human brain. In addition to lower off-state currents, the 2D TFETs also have a low subthreshold swing (SS), a parameter that describes how effectively a transistor switches from off to on. According to Banerjee, a lower SS means a lower operating voltage and faster, more efficient switching.
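To make that relationship concrete, here is a back-of-the-envelope sketch (in Python, not drawn from the paper) of what subthreshold swing means in practice: SS is the gate-voltage change needed to shift the drain current by one decade, conventional MOSFETs are bounded by the thermionic limit of roughly 60 mV per decade at room temperature, and a sub-thermionic device covers the same on/off current ratio with a smaller voltage swing. The 300 mV swing and the 30 mV-per-decade TFET value used below are illustrative assumptions, not figures from the study.

```python
# Back-of-the-envelope illustration of subthreshold swing (SS).
# SS = dV_GS / d(log10 I_D): the gate-voltage change needed to move the
# drain current by one decade. Conventional MOSFETs cannot beat the
# thermionic ("Boltzmann") limit of (kT/q)*ln(10) ~ 60 mV/decade at room
# temperature; tunnel FETs can go below it. The SS and voltage values
# below are illustrative assumptions, not numbers from the paper.

import math

K_B = 1.380649e-23     # Boltzmann constant, J/K
Q_E = 1.602176634e-19  # elementary charge, C

def thermionic_limit_mv_per_dec(temperature_k: float = 300.0) -> float:
    """Minimum SS of a conventional (thermionic) MOSFET at a given temperature."""
    return (K_B * temperature_k / Q_E) * math.log(10) * 1e3  # mV/decade

def decades_of_suppression(voltage_swing_mv: float, ss_mv_per_dec: float) -> float:
    """How many decades the current drops over a given gate-voltage swing."""
    return voltage_swing_mv / ss_mv_per_dec

if __name__ == "__main__":
    limit = thermionic_limit_mv_per_dec(300.0)
    print(f"Thermionic SS limit at 300 K: {limit:.1f} mV/decade")

    swing_mv = 300.0  # hypothetical gate-voltage swing from on to off
    for ss in (limit, 30.0):  # 30 mV/dec is an illustrative sub-thermionic TFET value
        print(f"SS = {ss:5.1f} mV/dec -> {decades_of_suppression(swing_mv, ss):.1f} "
              f"decades of current suppression over a {swing_mv:.0f} mV swing")
```

The same on/off ratio can thus be reached with a smaller voltage swing when SS is lower, which is why a steep-slope device can operate at a lower supply voltage.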
“Neuromorphic computing architectures are designed to operate with very sparse firing circuits,” said lead author Arnab Pal, “meaning they mimic how neurons in the brain fire only when necessary.” In contrast to the more conventional von Neumann architecture of today’s computers, in which data is processed sequentially, memory and processing components are kept separate, and power is drawn continuously throughout operation, an event-driven system such as an NM computer fires up only when there is input to process, and memory and processing are distributed across an array of transistors. Companies like Intel and IBM have developed brain-inspired platforms, deploying billions of interconnected transistors and generating significant energy savings.
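As a software analogy for that event-driven behavior, the following minimal sketch implements a leaky integrate-and-fire neuron that performs computation only when an input spike arrives and fires only once enough input has accumulated. It illustrates the sparse-firing principle only; it is not code from Intel’s, IBM’s, or the authors’ platforms, and the spike times, weights, and time constant are hypothetical.

```python
# Minimal sketch of the event-driven idea behind neuromorphic hardware:
# a leaky integrate-and-fire (LIF) neuron that only does work when an
# input spike arrives, instead of updating on every clock tick. This is
# a software illustration of the principle, not the circuit in the paper.

import math
from typing import List, Tuple

class EventDrivenLIFNeuron:
    def __init__(self, tau: float = 20.0, threshold: float = 1.0, v_reset: float = 0.0):
        self.tau = tau              # membrane time constant (ms)
        self.threshold = threshold  # firing threshold (arbitrary units)
        self.v_reset = v_reset      # membrane potential after a spike
        self.v = v_reset            # current membrane potential
        self.last_event_time = 0.0  # time of the last processed event (ms)

    def receive_spike(self, t: float, weight: float) -> bool:
        """Process one input event at time t; return True if the neuron fires."""
        # Decay the membrane potential analytically over the silent interval,
        # so no computation happens between events.
        dt = t - self.last_event_time
        self.v *= math.exp(-dt / self.tau)
        self.last_event_time = t

        # Integrate the incoming spike and check the threshold.
        self.v += weight
        if self.v >= self.threshold:
            self.v = self.v_reset
            return True
        return False

if __name__ == "__main__":
    neuron = EventDrivenLIFNeuron()
    # Hypothetical sparse input: (time in ms, synaptic weight) pairs.
    events: List[Tuple[float, float]] = [(1.0, 0.6), (3.0, 0.3), (40.0, 0.6), (41.0, 0.6)]
    for t, w in events:
        if neuron.receive_spike(t, w):
            print(f"output spike at t = {t} ms")
```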
However, there’s still room for energy efficiency improvement, according to the researchers.
“In these systems, most of the energy is lost through leakage currents when the transistors are off, rather than during their active state,” Banerjee explained. A ubiquitous phenomenon in the world of electronics, leakage currents are small amounts of electricity that flow through a circuit even when it is in the off state (but still connected to power). According to the paper, current NM chips use traditional metal-oxide-semiconductor field-effect transistors (MOSFETs), which have a high on-state current but also high off-state leakage. “Since the power efficiency of these chips is constrained by the off-state leakage, our approach — using tunneling transistors with much lower off-state current — can greatly improve power efficiency,” Banerjee said.
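A rough, hypothetical calculation shows why off-state leakage dominates in a sparsely firing chip: leakage power is paid continuously by every device that is merely connected to the supply, while switching energy is paid only on the rare firing events. None of the numbers below come from the paper or any specific product; they are placeholder values chosen to illustrate the scaling.

```python
# Back-of-the-envelope sketch of why off-state leakage dominates in a
# sparsely firing chip. All device counts, currents, voltages and
# activity rates are hypothetical illustrations.

N_TRANSISTORS = 1e9   # devices on the chip
V_DD = 0.5            # supply voltage (V)
SPIKE_RATE = 100.0    # average switching events per device per second (sparse, brain-like)
E_SWITCH = 1e-15      # energy per switching event (J), illustrative
I_OFF_MOSFET = 1e-9   # off-state leakage per device (A), illustrative MOSFET-like value
I_OFF_TFET = 1e-12    # off-state leakage per device (A), illustrative TFET-like value

def static_power(i_off: float) -> float:
    """Leakage power burned continuously by all off-state devices (W)."""
    return N_TRANSISTORS * i_off * V_DD

# Dynamic power is paid only when events occur (J/s = W).
dynamic_power = N_TRANSISTORS * SPIKE_RATE * E_SWITCH

for name, i_off in (("MOSFET-like", I_OFF_MOSFET), ("TFET-like", I_OFF_TFET)):
    p_static = static_power(i_off)
    print(f"{name}: static {p_static:.3g} W vs dynamic {dynamic_power:.3g} W "
          f"({p_static / dynamic_power:.0f}x)")
```

With these placeholder values, leakage outweighs switching power by orders of magnitude for the MOSFET-like case, while the lower off-state current brings the two into the same range, which is the qualitative point Banerjee makes above.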
When integrated into a neuromorphic circuit, which emulates the firing and reset of neurons, the TFETs proved themselves more energy efficient than state-of-the-art MOSFETs, particularly FinFETs (a MOSFET design that incorporates vertical “fins” to provide better control of switching and leakage). TFETs are still in the experimental stage; however, the performance and energy efficiency of neuromorphic circuits based on them make them a promising candidate for the next generation of brain-inspired computing.
According to co-authors Vivek De (Intel Fellow) and Mike Davies (Director of Intel’s Neuromorphic Computing Lab), “Once realized, this platform can bring the energy consumption in chips to within two orders of magnitude with respect to the human brain — not accounting for the interface circuitry and memory storage elements. This represents a significant improvement from what is achievable today.”
Eventually, three-dimensional versions of these 2D-TFET-based neuromorphic circuits could be realized to provide even closer emulation of the human brain, added Banerjee, who is widely recognized as one of the key visionaries behind the 3D integrated circuits that are now seeing wide-scale commercial proliferation.