AI system detects oysters faster than humans but with lower accuracy in reef monitoring study

With global oyster populations having plummeted by more than 85% from historical levels, researchers feel a sense of urgency to restore and monitor these critical marine ecosystems. But traditional methods of oyster reef monitoring often involve destructive sampling and extensive manual labor. A new study published in Frontiers in Robotics and AI explores whether artificial intelligence could serve as a safer and more effective tool.
The deep learning model, ODYSSEE, was trained to identify live oysters from underwater imagery. Researchers used real and synthetically generated images and pitted the model against both expert and non-expert human annotators to evaluate its accuracy and speed.
The findings were mixed. While ODYSSEE vastly outpaced humans in processing time (averaging just 39.6 seconds to annotate 150 images, compared to 2.3 hours for experts and 4.5 hours for non-experts), it lagged behind in accuracy. The AI model correctly identified live oysters 63% of the time, compared to 74% for experts and 75% for non-experts.
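The study's actual evaluation protocol is not detailed here; the snippet below is only a minimal sketch of the kind of comparison described, scoring hypothetical per-oyster labels from the model and a human annotator against reference annotations. All file-free labels and the accuracy function are illustrative assumptions, not the paper's method.

```python
# Toy sketch (not the study's evaluation code): compare live-oyster labels
# from the model and an expert against reference annotations.

def accuracy(predictions, ground_truth):
    """Fraction of annotations that match the reference labels."""
    correct = sum(p == g for p, g in zip(predictions, ground_truth))
    return correct / len(ground_truth)

# Hypothetical labels: True = annotated as a live oyster
reference    = [True, True, False, True, False, True]
model_labels = [True, False, False, True, True, True]
expert_labels = [True, True, False, True, False, False]

print(f"model accuracy:  {accuracy(model_labels, reference):.0%}")
print(f"expert accuracy: {accuracy(expert_labels, reference):.0%}")
```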
Researchers found that better image quality increased human accuracy but reduced the model's accuracy.
One of the project's researchers, Professor Art Trembanis of the University of Delaware's College of Earth, Ocean and Environment, played a pivotal role in both image acquisition and advancing the robotic systems used in the study. Trembanis and the team used handheld and remotely operated vehicles to capture footage from oyster reefs in Lewes, Delaware. This work provided a critical testbed for evaluating how AI can complement, and perhaps one day streamline, traditional monitoring.
"This is not about replacing human expertise," said Trembanis. "It's about scaling our ability to monitor reef health, particularly in sensitive areas where dredging simply isn't an option. ODYSSEE shows promise, but also highlights how nuanced oyster identification can be."
The research is a collaboration between the University of Delaware, the University of Maryland and the University of Cincinnati, and is part of a broader initiative to integrate autonomous robotic systems into aquaculture. The team trained the ODYSSEE model using a blend of real-world oyster images and synthetic data generated with Stable Diffusion, a technique that renders hyper-realistic images from 3D scans to help bridge the gap between virtual and natural environments.
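The article does not describe the team's exact generation pipeline. The sketch below shows one plausible way to produce such synthetic training imagery with the open-source diffusers library, using image-to-image Stable Diffusion to dress a render of a 3D-scanned oyster in realistic underwater texture; the checkpoint name, prompt, file names, and parameters are all assumptions for illustration.

```python
# Illustrative sketch only: turning a render of a 3D-scanned oyster into a
# photo-realistic synthetic training image with Stable Diffusion img2img.
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # assumed checkpoint, not the paper's
    torch_dtype=torch.float16,
).to("cuda")

# A rendered view of a 3D-scanned oyster (hypothetical file name).
render = Image.open("oyster_render.png").convert("RGB").resize((512, 512))

# img2img keeps the rendered geometry while adding realistic underwater detail.
synthetic = pipe(
    prompt="underwater photograph of live oysters on a reef, turbid green water",
    image=render,
    strength=0.5,        # how far the output may depart from the render
    guidance_scale=7.5,
).images[0]

synthetic.save("synthetic_oyster_0001.png")
```

In a setup like this, lower strength values preserve more of the scanned geometry, while higher values trade fidelity to the render for more photographic variety in the training set.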
While the current version of ODYSSEE has its limitations, the researchers are optimistic that the model's accuracy can eventually match or exceed that of human annotators.
"This research offers a path forward for non-invasive, scalable monitoring of marine ecosystems," the authors conclude. "As AI models improve, they can serve as powerful allies in restoration efforts—especially where human time and access are limited."
More information: Brendan Campbell et al, Is AI currently capable of identifying wild oysters? A comparison of human annotators against the AI model, ODYSSEE, Frontiers in Robotics and AI (2025).
Journal information: Frontiers in Robotics and AI
Provided by University of Delaware