
MIT Neuroscientists Develop AI Tools for Tracking Brain Cell Activity


Neuroscientists at MIT have developed three artificial intelligence (AI) tools designed to precisely track and identify individual brain cells in live, behaving animals.

This advancement addresses a significant challenge in neuroscience: monitoring the position and identity of neurons as animals move.

Steven Flavell, a lead author and associate professor at The Picower Institute for Learning and Memory, stated that these tools let researchers track neurons over time and identify most of them, which is crucial for linking brain activity to behavior.

The Developed AI Tools

The research introduces three distinct AI tools:

  • BrainAlignNet: This tool tracks cells across long series of images, such as video recordings. It operates 600 times faster than previous methods with 99.6% accuracy.
  • AutoCellLabeler: This tool identifies specific cell types in images. After initial training on human-annotated data, it achieves 98% accuracy on images labeled with NeuroPAL, a four-color barcoding system, and continues to perform effectively even when fewer labeling colors are available.
  • CellDiscoveryNet: This tool identifies and clusters fluorescently labeled cell types across different animals without requiring any prior training or supervision. Its performance aligns with that of trained human annotators.
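At its core, the tracking problem BrainAlignNet solves is matching each neuron in one frame to the same neuron in a later frame, despite the animal's movement. As a minimal illustration of that matching step (not the actual BrainAlignNet architecture, which is a trained neural network), the sketch below aligns two sets of neuron centroids by compensating for a whole-body translation and then assigning nearest neighbors; the function name and the rigid-shift assumption are illustrative only.

```python
import numpy as np

def track_neurons(frame_a, frame_b):
    """Match each neuron centroid in frame_a (N x 2 array) to its nearest
    counterpart in frame_b, after compensating for a global translation.
    Returns, for each neuron in frame_a, an index into frame_b."""
    # Estimate the whole-body shift from the difference of the centers of mass.
    shift = frame_b.mean(axis=0) - frame_a.mean(axis=0)
    aligned = frame_a + shift
    # Pairwise distances between aligned frame_a points and frame_b points.
    dists = np.linalg.norm(aligned[:, None, :] - frame_b[None, :, :], axis=2)
    # Nearest-neighbor assignment in the aligned coordinates.
    return dists.argmin(axis=1)

# Simulate a later frame: the same three neurons, reordered and shifted.
frame_a = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])
frame_b = frame_a[[2, 0, 1]] + np.array([1.0, 2.0])
matches = track_neurons(frame_a, frame_b)  # -> array([1, 2, 0])
```

Real recordings involve non-rigid body deformation, which is why a learned registration network is needed rather than the single global shift assumed here.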

Impact and Application

The introduction of these tools reduces the long-standing trade-off between speed and accuracy in cell labeling.

The methodology may serve as a model for other laboratories working with extensive image data from various organisms or human tissues.

Brady Weissbourd, a co-author and assistant professor, applied BrainAlignNet to C. hemisphaerica jellyfish in his laboratory. The tool facilitated the extraction of neural activity data from videos of the jellyfish, which exhibit complex, arbitrary movements.

Previously, Flavell's lab faced significant bottlenecks in annotating neuron identities in C. elegans, with manual annotation taking up to five hours per video even with a comprehensive barcoding system. The estimated cost for outsourcing this task was substantial. The development of AutoCellLabeler originated directly from this challenge.

The tools build on existing neural network architectures that the team optimized for alignment and annotation. Rather than being given explicit criteria, the networks learn from the data which features matter for each task.
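The idea of learning task-relevant features from annotated examples, rather than hand-coding rules, can be sketched with a toy supervised classifier. This is a generic illustration of that principle, not the networks the MIT team trained; the two-dimensional features and labels below are synthetic stand-ins for image-derived cell descriptors and human annotations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "annotated" dataset: 2-D features standing in for image-derived
# cell descriptors; labels standing in for human cell-type annotations.
X = rng.normal(size=(200, 2)) + np.repeat([[0.0, 0.0], [3.0, 3.0]], 100, axis=0)
y = np.repeat([0, 1], 100)

# Logistic regression trained by gradient descent: no rule is written down
# for what separates the classes; the weights are inferred from the labels.
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))       # predicted probabilities
    w -= 0.5 * (X.T @ (p - y)) / len(y)       # gradient step on weights
    b -= 0.5 * (p - y).mean()                 # gradient step on bias

accuracy = (((1 / (1 + np.exp(-(X @ w + b)))) > 0.5) == y).mean()
```

The same principle scales up: given enough annotated images, a deep network can discover labeling criteria that would be impractical to specify by hand.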

Future Directions

Flavell and Weissbourd describe the work as ongoing. Weissbourd plans to label all cell types in the jellyfish and to develop microscopy capable of imaging jellyfish while they swim freely.

The research received funding from multiple sources, including the National Institutes of Health, the National Science Foundation, and The Howard Hughes Medical Institute.