Scientific Frontline: Living Brain Cells Enable Machine Learning Computations

Friday, April 3, 2026

Living Brain Cells Enable Machine Learning Computations

(a) Conventional neuron models used in reservoir computing. Artificial neural networks (ANNs) consist of neuron models that sum weighted inputs, pass the value through an activation function, and generate a continuous-valued output. Spiking neural networks (SNNs) consist of neuron models that receive spiking inputs and emit spikes when their membrane potential exceeds a threshold. (b) Biological neurons used for reservoir computing in this work. Rat cortical neurons are cultured in microfluidic devices attached to a microelectrode array.
Image Credit: ©Yuki Sono et al.

Scientific Frontline: Extended "At a Glance" Summary: Living Brain Cells Enable Machine Learning Computations

The Core Concept: Biological neural networks (BNNs) grown from cultured neurons can be integrated into a machine learning framework to perform supervised temporal pattern learning. This demonstrates that living cellular systems can generate complex, time-series computations previously restricted to artificial systems.

Key Distinction/Mechanism: Unlike traditional artificial neural networks (ANNs) or spiking neural networks (SNNs) that rely on digital models of neurons, this system utilizes living rat cortical neurons cultured on microelectrode arrays within microfluidic devices. By applying the First-Order Reduced and Controlled Error (FORCE) learning algorithm to this "physical reservoir," researchers optimized the readout layer to correct errors in real time, enabling the living network to generate structured temporal signals such as sine waves and chaotic trajectories.

Major Frameworks/Components:

  • Reservoir Computing: A computational framework that processes time-dependent data by leveraging the dynamic properties of complex, recurrently connected networks.
  • FORCE Learning: A real-time adaptation technique used to train the system by continuously adjusting output signals in response to feedback errors.
  • Microfluidic Network Architecture: Specialized devices used to guide biological neuronal growth and control connectivity, promoting the high-dimensional dynamics required for computation by minimizing excessive neural synchronization.
  • Biological Neural Networks (BNNs): The living substrate of cultured rat cortical neurons that functions as the core processing reservoir.

Branch of Science: Neuroscience, Bio-inspired Computing, Machine Learning, and Bioengineering.

Future Application: This technology paves the way for advanced, biologically driven computational hardware (biocomputing). Additionally, the platform can be adapted into microphysiological systems to accurately model complex neurological disorders, test pharmaceutical drug responses, and study neural network dynamics in real time.

Why It Matters: This research validates that living neural systems can serve as functional computational resources, bridging the gap between biological organisms and machine learning. It provides a viable biological alternative to artificial neuromorphic hardware while offering unprecedented methodologies for studying the intrinsic computational mechanics of the brain.

Example of time-series learning using physical reservoir computing with cultured neurons. A 30-second-period sine wave was provided as the target signal. In the absence of input, the BNN exhibits high-dimensional, complex activity; with FORCE learning and feedback, the activity becomes structured and reproduces the target waveform. Under suitable conditions, the network continues to autonomously generate trained signals even after learning has stopped.
Image Credit: ©Yuki Sono et al.

A research team at Tohoku University and Future University Hakodate has demonstrated that living biological neurons can be trained to perform a supervised temporal pattern learning task previously carried out by artificial systems. By integrating cultured neuronal networks into a machine learning framework, the team showed that these biological systems can generate complex time-series signals, marking a significant step forward in both neuroscience and bio-inspired computing. 

The study was published online in Proceedings of the National Academy of Sciences, highlighting a novel intersection between living neural systems and computational technology. The findings suggest that biological neural networks (BNNs) may serve as viable alternatives or complements to existing machine learning models. 

Artificial neural networks (ANNs) and spiking neural networks (SNNs) have long been used in machine learning and neuromorphic hardware. A framework known as reservoir computing has emerged as an efficient approach for processing time-dependent data by leveraging the dynamic properties of recurrently connected ANNs and SNNs. 
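The core idea of reservoir computing is that only a simple readout is trained, while the recurrent network itself stays fixed. A minimal sketch of this idea (an echo state network with an artificial rate reservoir, not the paper's biological system; all sizes and gains are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy reservoir: N recurrently connected rate units with
# fixed random weights, scaled so the dynamics neither die out nor blow up.
N = 200
W = rng.normal(0, 1.0 / np.sqrt(N), (N, N))        # fixed recurrent weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))           # spectral radius below 1
x = np.zeros(N)                                     # reservoir state

def step(u, x, dt=0.1):
    """One Euler step of leaky rate dynamics driven by scalar input u."""
    return x + dt * (-x + np.tanh(W @ x + u))

# Drive the reservoir with a sine input and collect its states.
T = 1000
states = np.empty((T, N))
for t in range(T):
    x = step(np.sin(2 * np.pi * t / 100), x)
    states[t] = x

# Only the linear readout is trained -- here by ridge regression, mapping
# reservoir states to a time-dependent target (the input delayed 10 steps).
target = np.sin(2 * np.pi * (np.arange(T) - 10) / 100)
w_out = np.linalg.solve(states.T @ states + 1e-3 * np.eye(N),
                        states.T @ target)
pred = states @ w_out
```

Because the reservoir's rich dynamics hold a fading memory of recent inputs, the fixed network does the heavy lifting and training reduces to a cheap linear fit, which is what makes the framework attractive for physical substrates such as cultured neurons.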

In conventional ANN-based reservoir computing, methods such as First-Order Reduced and Controlled Error (FORCE) learning enable real-time adaptation by continuously adjusting output signals in response to errors. These techniques allow artificial systems to generate a wide range of temporal patterns, including periodic and chaotic signals. However, whether similar approaches could be applied to biological neural networks has remained an open question. 
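FORCE learning drives this adaptation with a recursive-least-squares (RLS) update that shrinks the output error at every time step while the output is fed back into the network. A minimal sketch in the style of Sussillo and Abbott's original ANN formulation, to illustrate the mechanism only; the network size, gain, and time constants are assumed values, not those used with the biological networks in this study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical FORCE setup on an artificial rate reservoir.
N, dt = 300, 0.1
W = rng.normal(0, 1.5 / np.sqrt(N), (N, N))    # strong recurrent weights (g = 1.5)
w_fb = rng.uniform(-1, 1, N)                   # fixed output-feedback weights
w_out = np.zeros(N)                            # readout weights (the trained part)
P = np.eye(N)                                  # RLS inverse-correlation estimate
x = rng.normal(0, 0.5, N)                      # reservoir state

T = 3000
target = np.sin(2 * np.pi * np.arange(T) * dt / 6.0)   # 6-second-period sine
errs = np.empty(T)
for t in range(T):
    r = np.tanh(x)                       # unit firing rates
    z = w_out @ r                        # current readout output
    x += dt * (-x + W @ r + w_fb * z)    # dynamics driven by output feedback
    errs[t] = abs(z - target[t])
    # RLS step: nudge w_out so the output error shrinks in real time.
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)
    P -= np.outer(k, Pr)
    w_out -= (z - target[t]) * k
```

The key design choice is that the error is corrected continuously while the network runs, so the feedback loop keeps the output close to the target throughout training rather than only after an offline fit.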

To address this gap, the researchers constructed biological neural networks using cultured rat cortical neurons and incorporated them into a reservoir computing framework. By applying FORCE learning to optimize the system's readout layer, the team successfully trained the biological networks to produce complex temporal signals comparable to those involved in motor control. 

A key innovation in the study was the use of microfluidic devices to precisely guide neuronal growth and control network connectivity. This approach enabled the researchers to create modular network architectures that minimized excessive synchronization, thereby promoting the rich, high-dimensional dynamics required for effective reservoir computing. 

Using this system, the BNN-based framework was able to generate a variety of time-series patterns, including sine waves, triangular waves, square waves, and even chaotic trajectories such as the Lorenz attractor. Notably, the network demonstrated flexibility by learning and stably reproducing sine waves with periods ranging from 4 to 30 seconds within the same system. 
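Chaotic targets such as the Lorenz attractor are typically produced by numerically integrating the Lorenz equations. A short sketch of how such a target trajectory can be generated (using the classic parameter values sigma = 10, rho = 28, beta = 8/3 and a simple Euler integrator; the paper's actual target-generation settings are not specified here):

```python
import numpy as np

def lorenz(T=5000, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Euler-integrated Lorenz trajectory with the classic chaotic parameters."""
    xyz = np.array([1.0, 1.0, 1.0])      # arbitrary initial condition
    out = np.empty((T, 3))
    for t in range(T):
        x, y, z = xyz
        xyz = xyz + dt * np.array([sigma * (y - x),      # dx/dt
                                   x * (rho - z) - y,    # dy/dt
                                   x * y - beta * z])    # dz/dt
        out[t] = xyz
    return out

traj = lorenz()   # (T, 3) array usable as a three-dimensional target signal
```

Each coordinate of the trajectory can then serve as a target time series for the trained readout, analogous to the periodic waveforms described above.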

"This work shows that living neuronal networks are not only biologically meaningful systems but may also serve as novel computational resources," said Hideaki Yamamoto, a professor at Tohoku University. "By bridging neuroscience and machine learning, we are opening a pathway toward new forms of computing that leverage the intrinsic dynamics of biological systems." 

Looking ahead, the research team aims to improve the stability of signal generation after training has concluded. Future efforts will focus on reducing feedback delays and refining the FORCE learning algorithm. In parallel, the platform may be expanded into a microphysiological system for studying drug responses and modeling neurological disorders, further extending its impact across both scientific and medical fields. 

Published in journal: Proceedings of the National Academy of Sciences

Title: Online supervised learning of temporal patterns in biological neural networks under feedback control

Authors: Yuki Sono, Hideaki Yamamoto, Yusei Nishi, Takuma Sumi, Yuya Sato, Ayumi Hirano-Iwata, Yuichi Katori, and Shigeo Sato

Source/Credit: Tohoku University

Reference Number: ns040326_01
