
Researchers Train Living Rat Neurons for Machine Learning Tasks


Living Biological Neurons Master Complex Temporal Pattern Learning

Tohoku University and Future University Hakodate researchers have successfully trained living biological neurons to perform complex supervised temporal pattern learning tasks. This groundbreaking study integrates cultured neuronal networks into a machine learning framework, demonstrating that biological systems can generate intricate time-series signals and potentially serve as computational resources.

The findings were published in the Proceedings of the National Academy of Sciences (PNAS) on March 12, 2026.

Research Methodology

The research team constructed biological neural networks (BNNs) using cultured rat cortical neurons. These BNNs were then integrated into a "reservoir computing" framework.

This approach exploits the intrinsic dynamics of a neural network to process time-dependent data: only the "readout" layer, which interprets the network's activity, is trained, while the network itself is left untouched rather than adjusting every individual neuron.
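The idea of training only a readout can be illustrated with a conventional artificial reservoir. The following is a minimal sketch, not the study's actual setup: the reservoir size, weight scaling, teacher-forcing scheme, and ridge penalty are all illustrative assumptions, and the "reservoir" here is a random recurrent network standing in for the cultured neurons.

```python
import numpy as np

rng = np.random.default_rng(0)
N, steps = 200, 2000

# Fixed random recurrent weights, rescaled so the spectral radius is < 1
W = rng.standard_normal((N, N)) / np.sqrt(N)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_fb = rng.uniform(-1, 1, N)              # feedback weights (assumed)

t = np.arange(steps) * 0.01
target = np.sin(2 * np.pi * t / 4.0)      # a 4-second-period sine target

# Drive the reservoir with the teacher signal and record its states;
# the reservoir weights are never modified.
x = np.zeros(N)
states = np.empty((steps, N))
for k in range(steps):
    fb = target[k - 1] if k > 0 else 0.0
    x = np.tanh(W @ x + w_fb * fb)
    states[k] = x

# Train ONLY the linear readout, here by ridge regression
ridge = 1e-6
w_out = np.linalg.solve(states.T @ states + ridge * np.eye(N),
                        states.T @ target)
pred = states @ w_out
mse = np.mean((pred - target) ** 2)
print(f"readout training MSE: {mse:.2e}")
```

The design point is that all learning is concentrated in one linear map, which is what makes a physical medium with fixed, unmodifiable internal connections usable as a computer.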

A key aspect of the methodology was the application of First-Order Reduced and Controlled Error (FORCE) learning. This method, typically used in artificial neural networks (ANNs) to adjust output signals in real time based on error feedback, was applied to BNNs for the first time to generate time-series data.
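FORCE learning in its original ANN form updates the readout weights online with recursive least squares while the network's own output is fed back into it, so the output error stays small from the very first steps. A minimal sketch of that classic rate-network version follows; the network size, gain, time step, and target are assumptions for illustration, not the parameters used with the biological networks.

```python
import numpy as np

rng = np.random.default_rng(1)
N, steps, dt = 300, 3000, 0.01
g = 1.5                                       # gain putting the network in a rich regime (assumed)
W = g * rng.standard_normal((N, N)) / np.sqrt(N)
w_fb = rng.uniform(-1, 1, N)                  # output-to-network feedback weights
w = np.zeros(N)                               # readout, trained online
P = np.eye(N)                                 # running inverse-correlation estimate (RLS)

t = np.arange(steps) * dt
f = np.sin(2 * np.pi * t / 4.0)               # target time series

x = 0.5 * rng.standard_normal(N)
r = np.tanh(x)
z = 0.0
errs = []
for k in range(steps):
    # Leaky rate dynamics with the output fed back into the network
    x += dt * (-x + W @ r + w_fb * z)
    r = np.tanh(x)
    z = w @ r
    errs.append(abs(z - f[k]))                # tracking error before the update
    # Recursive-least-squares (FORCE) update of the readout only
    Pr = P @ r
    P -= np.outer(Pr, Pr) / (1.0 + r @ Pr)
    w -= (z - f[k]) * (P @ r)
    z = w @ r                                 # feed back the post-update output
print(f"mean error over last period: {np.mean(errs[-400:]):.3f}")
```

Because the error is suppressed at every step, the feedback signal the network receives is always close to the desired trajectory, which is what stabilizes learning.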

To facilitate effective reservoir computing, microfluidic devices were employed. These devices precisely guided neuronal growth and controlled network connectivity, enabling the creation of modular network architectures. This design minimized excessive synchronization within the network, promoting the high-dimensional dynamics necessary for processing complex information and effective reservoir computing.

Key Findings and Capabilities

The BNN-based framework successfully generated a variety of time-series patterns, demonstrating that living biological circuits can reproduce intricate mathematical patterns. These included:

  • Sine waves
  • Triangular waves
  • Square waves
  • Chaotic trajectories, such as the Lorenz attractor
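The Lorenz attractor, the chaotic target in the list above, is itself easy to generate as a reference trajectory. A minimal sketch using the standard parameters (sigma = 10, rho = 28, beta = 8/3) and simple Euler integration; the step size and initial condition are illustrative choices, not taken from the study:

```python
import numpy as np

def lorenz(steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system with forward-Euler steps."""
    xyz = np.empty((steps, 3))
    x, y, z = 1.0, 1.0, 1.0
    for k in range(steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        xyz[k] = (x, y, z)
    return xyz

traj = lorenz(5000)
print(traj.shape)
```

Reproducing such a trajectory is a demanding benchmark because chaotic systems amplify small errors exponentially, which is why it is a standard test for trained pattern generators.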

The networks also exhibited adaptability and versatility by learning and stably reproducing sine waves with periods ranging from 4 to 30 seconds within the same biological system.

These results indicate that living neuronal networks can function as real-time computational resources.

Implications

This research suggests that BNNs could serve as alternatives or complements to existing machine learning models, offering new forms of computing that leverage the intrinsic dynamics of biological systems. Biological networks have shown potential to process large amounts of time-dependent data with high energy efficiency.

These findings advance the concept of "Wetware Computing," which utilizes biological systems for computation.

Furthermore, the platform could be expanded for use as a microphysiological system. This would enable the study of drug responses and the modeling of neurological disorders in a laboratory setting, potentially offering an alternative to traditional animal testing.

Future Directions

Future research will focus on enhancing the stability of signal generation after training, reducing feedback delays, and refining the FORCE learning algorithm to further optimize the system's performance.