Neuromorphic Computing Breakthrough May Disrupt AI

Source: geralt/pixabay

The human brain is a remarkably complex, yet energy-efficient cognitive system. Scientists and researchers look to the brain’s architecture as a source of inspiration for artificial intelligence (AI), machine learning, and deep learning. The concept of an artificial neural network (ANN) is somewhat analogous to the brain, with artificial nodes in place of neurons. Neuromorphic computing is an interdisciplinary endeavor that draws upon physics, mathematics, electronic engineering, biology, computer science, and neuroscience to create artificial neural systems that resemble the architecture of the brain. A team of scientists from Linköping University in Sweden recently made a breakthrough in neuromorphic computing by engineering a new learning transistor, and published their findings yesterday in Advanced Science.

Machine learning today is performed on prefabricated circuitry. The brain, in contrast, is able to form new connections where none existed before. The research team of Simone Fabiano, Jennifer Y. Gerasimov, Roger Gabrielsson, Robert Forchheimer, Eleni Stavrinidou, Daniel T. Simon, and Magnus Berggren created an organic electrochemical transistor (OECT) that can learn, form new connections between an input and output, and exhibit both short-term and long-term memory.

An organic electrochemical transistor can amplify or switch electronic signals and power through the injection of ions from an electrically conducting solution (an electrolyte) into a semiconductor channel. Current organic electrochemical transistors typically use a conducting polymer called PEDOT. The team’s transistor instead uses a monomer called ETE-S, developed by Roger Gabrielsson, a member of the research team at the Laboratory of Organic Electronics at Linköping University.
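The switching principle described above can be sketched as a toy model: gate voltage drives ions from the electrolyte into the polymer channel, changing its doping state and therefore its conductance. The linear doping model and all parameter values below are illustrative assumptions for demonstration, not the Linköping team's device physics.

```python
# Toy model of an organic electrochemical transistor (OECT).
# Assumption: a depletion-mode device where raising the gate voltage
# injects ions that de-dope the channel and reduce its conductance.

def channel_conductance(gate_voltage, g_max=1e-3, v_threshold=0.5):
    """Channel conductance (siemens). Ion injection, driven by the
    gate voltage, linearly de-dopes the polymer until pinch-off."""
    doping_fraction = max(0.0, min(1.0, 1.0 - gate_voltage / v_threshold))
    return g_max * doping_fraction

def drain_current(gate_voltage, drain_voltage=0.1):
    """Ohmic drain current for a small drain bias."""
    return channel_conductance(gate_voltage) * drain_voltage

# Sweeping the gate switches the channel between on and off states.
for v_g in (0.0, 0.25, 0.5):
    print(f"V_g = {v_g:.2f} V -> I_d = {drain_current(v_g):.2e} A")
```

The point of the sketch is only that a small gate signal modulates a much larger channel current, which is what lets an OECT act as both an amplifier and a switch.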


When input signals are manipulated, “the strength of the transistor response to a given stimulus can be modulated within a range that spans several orders of magnitude,” wrote the researchers. The team’s organic electrochemical transistor can thus behave in a manner similar to the brain’s short-term and long-term neuroplasticity. Neuroplasticity is the brain’s ability to reorganize itself by forming new neural connections.
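The distinction between short-term and long-term plasticity can be illustrated with a simple two-timescale weight model: one component of the connection strength decays quickly, while a fraction of each stimulus consolidates into a slowly decaying component. This is a hypothetical sketch for intuition only; the decay rates and consolidation fraction are made-up values, not the model in the paper.

```python
# Minimal sketch of short- vs long-term plasticity, assuming a
# two-timescale synaptic weight (hypothetical, illustrative values).

class Synapse:
    def __init__(self):
        self.short_term = 0.0   # decays quickly between stimuli
        self.long_term = 0.0    # persists; grows with repetition

    def stimulate(self, amplitude=1.0):
        self.short_term += amplitude
        # A fraction of each stimulus consolidates into long-term memory.
        self.long_term += 0.1 * amplitude

    def decay(self, steps=1):
        for _ in range(steps):
            self.short_term *= 0.5    # fast decay
            self.long_term *= 0.99    # slow decay

    @property
    def strength(self):
        return self.short_term + self.long_term

syn = Synapse()
for _ in range(10):          # repeated stimulation strengthens the link
    syn.stimulate()
    syn.decay()
print(f"after training: {syn.strength:.2f}")

syn.decay(steps=20)          # a long rest: the short-term part fades
print(f"after rest:     {syn.strength:.2f}")
```

After the rest period, the fast component has all but vanished while the consolidated component remains, which is the behavior the researchers report in their device: a transient response to isolated stimuli, and a lasting change after repeated ones.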

According to Simone Fabiano, principal investigator in organic nanoelectronics at the Laboratory of Organic Electronics, Campus Norrköping, “It is the first time that real time formation of new electronic components is shown in neuromorphic devices.”

Fabiano states that the research team’s new organic electrochemical transistor can “carry out the work of thousands of normal transistors with an energy consumption that approaches the energy consumed when a human brain transmits signals between two cells.”

This innovative technology can be useful for deep learning, a subset of AI machine learning that uses an artificial neural network with more than two layers. Deep learning is resource-intensive because it contains many layers of neural processing, with each layer consisting of many nodes (artificial neurons), requiring massive resources for both computation and memory. This explains why the rise of GPUs (graphics processing units) for general computing, with their massively parallel processing capabilities (versus serial processing), has accelerated the rise of deep learning. With greater processing capabilities came advances in the pattern-recognition capabilities of deep learning. Advances in deep learning are the foundation of the AI renaissance.


The worldwide neuromorphic computing market is projected to reach USD 6.48 billion by 2024, according to Grand View Research’s April 2018 report. Neuromorphic chips are used in consumer electronics, robotics, cars, and other products. Will this new transistor herald a future where AI machine learning is based on evolvable organic electronics?