DNA-based neural network learns from examples to solve problems

Stephanie Baum, scientific editor
Robert Egan, associate editor

Neural networks are computing systems designed to mimic both the structure and function of the human brain. Caltech researchers have been developing a neural network made from strands of DNA instead of electronic parts, one that carries out computation through chemical reactions rather than digital signals.
An important property of any neural network is the ability to learn by taking in information and retaining it for future decisions. Now, researchers in the laboratory of Lulu Qian, professor of bioengineering, have created a DNA-based neural network that can learn. The work represents a first step toward demonstrating more complex learning behaviors in chemical systems.
A paper describing the research was published in the journal Nature on September 3. Kevin Cherry, Ph.D., is the study's first author.
The ability to learn is found on many scales: Our brains rewire themselves to integrate new information, our immune systems chemically encode information about encounters with pathogens for the future, and even single-celled bacteria learn simple information about chemical gradients and use it to navigate toward food. Learning is a key component of intelligence, whether natural or artificial; for example, "smart" devices can learn your preferences and offer customized recommendations.
"Our goal was to build a molecular system from scratch that could take in examples, find the underlying patterns, and then act on new information it had never seen before," Qian says. "Think of a future artificial cell with a biological cell as its teacher. It observes how the teacher reacts to different molecular cues, stores those experiences, and—over the course of many lessons—figures out how to respond on its own to similar but not identical cues."
In 2018, Cherry and Qian created a DNA-based neural network that could recognize handwritten numbers encoded in DNA as chemical patterns. Because it can be difficult even for humans to read others' sloppy handwriting, identifying handwritten numbers is a common test for programming intelligence into classical electronic artificial neural networks. These networks must account for variations in handwriting, compare an unknown pattern against their so-called memories, and decide what number the image represents.
In Cherry and Qian's system, rather than digital pixels making up a number, each molecular "image" was made up of 20 unique DNA strands chosen from 100 molecules, each assigned to represent an individual pixel in a 10-by-10 pattern. In designing the DNA system, a classical computer determined how much of each molecular ingredient was needed to represent the memories.
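To make that representation concrete, here is a minimal Python sketch, written as an analogy rather than the authors' actual chemistry or code: a molecular "image" is a set of 20 "on" pixels out of 100, each standing in for one DNA strand, and the precomputed "memories" act like per-digit concentration profiles read out in winner-take-all fashion, as in the 2018 work. The values and function names are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def molecular_image(on_pixels, n_pixels=100):
    """A 10-by-10 'image': a binary vector marking which of the 100
    pixel strands are present in the droplet (typically 20 of them)."""
    image = np.zeros(n_pixels)
    image[list(on_pixels)] = 1.0
    return image

# "Memories": one weight per (digit, pixel) pair. In the DNA system this role
# is played by concentrations of weight molecules worked out on a classical
# computer; the random values below are placeholders, not real measurements.
memories = {digit: rng.random(100) for digit in (0, 1)}

def classify(image, memories):
    """Winner-take-all style readout: the digit whose memory overlaps the
    unknown pattern most strongly wins."""
    scores = {digit: float(w @ image) for digit, w in memories.items()}
    return max(scores, key=scores.get)

test = molecular_image(rng.choice(100, size=20, replace=False))
print(classify(test, memories))
```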
The new study builds on that research to engineer a system that can "develop" its own memories, encoded in chemical signals called molecular wires. These wires can be chemically flipped on to store information. When the system encounters a molecular example of a handwritten number, it turns on a set of wires that each represent a connection between a number and its physical identifying features.
Over time, the system builds up a physical record of what it has learned, which is stored in the concentrations of specific DNA molecules. The concept is similar to how human brains learn. (There is an adage in neuroscience that "cells that fire together, wire together." In this case, the wiring is molecular, and the memories live in the chemistry itself.)
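In software terms, the learning step described above resembles accumulating weights as labeled examples arrive: each example switches on the "wires" linking its label to the features it contains, and the running totals play the role of stored concentrations. The Python sketch below is only an analogy under that reading of the paragraph, not the paper's actual molecular learning rule.

```python
import numpy as np

N_PIXELS, N_CLASSES = 100, 2

# Stored "memories" start empty; each entry stands in for the level of one
# molecular wire connecting a digit label to one pixel-level feature.
weights = np.zeros((N_CLASSES, N_PIXELS))

def learn(example, label, weights, dose=1.0):
    """A labeled example 'turns on' the wires linking its label to the
    pixels it contains, raising their stored level."""
    weights[label] += dose * example
    return weights

def classify(image, weights):
    """After training, report the label whose accumulated memory overlaps
    an unseen pattern most strongly."""
    return int(np.argmax(weights @ image))

# Example: teach digit 1 with one pattern, then classify a similar pattern.
seen = np.zeros(N_PIXELS)
seen[:20] = 1.0
weights = learn(seen, label=1, weights=weights)
print(classify(seen, weights))
```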
Each neural network performs its computation to identify numbers in a tiny droplet containing billions of DNA strands of more than a thousand types. Every strand type was designed from scratch to react only with specific intended partners under certain conditions. When the cascade of chemical reactions finishes, the system reports an output as a fluorescent signal; for example, if a molecular image is recognized as a handwritten "0," the droplet produces the color assigned to that digit, such as red for 0 and blue for 1.
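The selectivity and readout described here can be pictured with a toy abstraction like the one below. It is not real chemistry: the strand names are hypothetical, and the dictionaries simply capture the two ideas in the paragraph, that each strand type only reacts with its designed partners, and that the winning digit is reported through a fluorescent color.

```python
# Toy abstraction of designed selectivity: each strand type lists the only
# partners it is meant to react with, and the cascade ends by releasing the
# reporter tied to the winning digit. All strand names are hypothetical.
PARTNERS = {
    "weight_0_pixel_7": {"pixel_7"},
    "weight_1_pixel_7": {"pixel_7"},
}
REPORTERS = {0: "red fluorescence", 1: "blue fluorescence"}

def react(strand_a, strand_b):
    """Two strands interact only if one is a designed partner of the other."""
    return strand_b in PARTNERS.get(strand_a, set())

def readout(winning_digit):
    """Map the digit the cascade settles on to its fluorescent reporter."""
    return REPORTERS.get(winning_digit, "no signal")

print(react("weight_0_pixel_7", "pixel_7"))  # True: an intended reaction
print(readout(0))                            # red fluorescence
```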
"Our journey to a DNA neural network that learns took seven years—and the path was anything but straight," Cherry says. "In a complex molecular system, fixing one issue was like patching a leak in a dam only to have another leak spring open somewhere else. Instead of picking off challenges one by one, we needed to step back and see the whole picture, then design solutions that addressed all the challenges at once.
"It was a bold move, because it meant starting from scratch. With a new, holistic design, we finally achieved what we'd been chasing: a molecular system that can learn. Looking back, the science taught us something bigger: that the hardest problems demand both a wide view and the courage to start over when the stakes are highest."
The work lays the foundation for one day developing "smart" medicines that can adapt in real time to pathogenic threats, or "smart" materials that can learn and adapt to external conditions (like a bandage that learns from your own skin's signals and responds to promote faster wound healing).
More information: Kevin M. Cherry et al., Supervised learning in DNA neural networks, Nature (2025).
Journal information: Nature
Provided by California Institute of Technology