VR experiments train AI to identify ancient finger-fluting artists

Lisa Lock
scientific editor

Robert Egan
associate editor

Griffith researchers built and tested a digital archaeology framework to learn more about the ancient humans who created one of the oldest forms of rock art, finger fluting.
Finger flutings are marks drawn by fingers through a soft mineral film called moonmilk on cave walls.
Experiments were conducted—both with adult participants in a tactile setup and using VR headsets in a custom-built program—to explore whether image-recognition methods could learn enough from finger-fluting images made by modern people to identify the sex of the person who created them.
The study, "Using digital archaeology and machine learning to determine sex in finger flutings," was published in Scientific Reports.
Finger flutings appear in pitch-dark caves across Europe and Australia. The oldest known examples, found in France, have been attributed to Neanderthals more than 57,000 years ago.
Dr. Andrea Jalandoni, a digital archaeologist from the Griffith Center for Social and Cultural Research, who led the study, said one of the key questions around finger flutings was who made them.
"Whether the marks were made by men or women can have real world implications," she said. "This information has been used to decide who can access certain sites for cultural reasons."
Past attempts to identify who made cave marks often relied on finger measurements and ratios, or hand size measurements.
Those methods turned out to be inconsistent or vulnerable to error; finger pressure varied, surfaces weren't uniform, pigments distorted outlines, and the same measurements could overlap heavily between males and females.
"The goal of this research was to avoid those assumptions and use digital archaeology instead," Dr. Jalandoni said.

Two controlled experiments were conducted with 96 adult participants, each person creating nine flutings twice: once on a moonmilk clay substitute developed to mimic the look and feel of cave surfaces, and once in virtual reality (VR) using a Meta Quest 3 headset.
Images of all the flutings were captured and curated, and two common image-recognition models were trained on them.
The team evaluated performance using standard metrics and, crucially, looked for signs that models were simply memorizing the training data (overfitting), rather than learning patterns that generalized.
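The overfitting check described above can be illustrated in general terms: compare performance on the training images with performance on held-out images, and treat a large gap as a warning that the model memorized rather than generalized. The sketch below is a generic illustration using a tiny logistic-regression classifier on synthetic stand-in data, not the study's actual models, features, or results.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for curated fluting images: 96 "participants"
# as flattened feature vectors with binary labels (synthetic data only).
X = rng.normal(size=(96, 20))
w_true = rng.normal(size=20)
y = (X @ w_true + rng.normal(scale=2.0, size=96) > 0).astype(float)

# Hold out a validation split so overfitting becomes detectable.
idx = rng.permutation(96)
train, val = idx[:72], idx[72:]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny logistic-regression classifier trained by gradient descent.
w = np.zeros(20)
for _ in range(500):
    p = sigmoid(X[train] @ w)
    w -= 0.1 * X[train].T @ (p - y[train]) / len(train)

def accuracy(split):
    return float(np.mean((sigmoid(X[split] @ w) > 0.5) == y[split]))

# A large train-validation gap suggests the model learned quirks of
# the training set rather than patterns that hold on unseen images.
gap = accuracy(train) - accuracy(val)
print(f"train={accuracy(train):.2f} val={accuracy(val):.2f} gap={gap:.2f}")
```

The same logic applies whatever the underlying model is; only the gap between training and held-out performance matters for the check.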
Team member Dr. Gervase Tuxworth from the School of Information and Communication Technology said the results were mixed but revealed some promising insights.
The VR images did not yield reliable sex classification; even when accuracy looked acceptable in places, overall discrimination and balance were weak.
But the tactile images performed much better.
"Under one training condition, models reached about 84% accuracy, and one model achieved a relatively strong discrimination score," Dr. Tuxworth said.
However, the models also learned patterns specific to the dataset, such as subtle artifacts of the experimental setup, rather than robust features of fluting that would hold elsewhere, which meant there was more work to be done.
The study showed that a computational pipeline, from a realistic tactile representation and a VR capture environment to an open machine-learning workflow, could be built, replicated, and improved by others for a more rigorous scientific approach.
"We've released the code and materials so others can replicate the experiment, critique it, and scale it," said Dr. Robert Haubt, co-author and Information Scientist from the Australian Research Center for Human Evolution (ARCHE).
"That's how a proof of concept becomes a reliable tool."
The team said this research paved the way for interdisciplinary applications across archaeology, forensics, psychology, and human-computer interaction, while contributing new insights into the cultural and cognitive practices of early humans.
More information: Andrea Jalandoni et al, Using digital archaeology and machine learning to determine sex in finger flutings, Scientific Reports (2025).
Journal information: Scientific Reports
Provided by Griffith University