Study finds blind spot for some auditors who use tech-based fraud tests

A new study finds that auditors of financial statements are less likely to follow up on "red flags" identified by data analytics if the auditors did not play a role in developing the relevant analytical tests. The researchers also identified a low-cost intervention that significantly improves auditor performance when using auditing tools they did not help develop.
The paper, "," is published in the Journal of Accounting Research.
At issue are audit data analytic tests (ADAs), a catch-all term for tests that use technological tools, such as data visualization and analytic or statistical programs, to evaluate the accuracy of financial statements.
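For readers unfamiliar with such tests, here is a minimal, hypothetical sketch in Python of the kind of comparison an ADA might automate: it builds an expectation for each sales region from an assumed industry growth rate and flags regions whose reported figures deviate sharply. The region names, figures, growth rate, and 20% threshold are all illustrative assumptions, not details from the study.

```python
# Hypothetical sketch of a simple ADA: compare reported sales against an
# expectation derived from prior-year sales and an assumed industry trend,
# and flag large deviations as potential red flags.

INDUSTRY_GROWTH = 0.04       # assumed year-over-year industry growth rate
DEVIATION_THRESHOLD = 0.20   # flag deviations beyond 20% of expectation

# (region, prior-year sales, current-year reported sales), in $ thousands
sales = [
    ("North", 1_200, 1_260),
    ("South", 950, 980),
    ("East", 1_100, 1_150),
    ("West", 800, 1_210),    # deliberately inconsistent with the trend
]

for region, prior, reported in sales:
    expected = prior * (1 + INDUSTRY_GROWTH)
    deviation = (reported - expected) / expected
    if abs(deviation) > DEVIATION_THRESHOLD:
        print(f"RED FLAG: {region} reported {reported}, "
              f"expected ~{expected:.0f} ({deviation:+.0%})")
```

Flagging a deviation like the one above is only the first step; as the study examines, what matters is whether the auditor then follows up on it.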
"This work is important, in part, because ADAs are becoming increasingly common," says Joe Brazel, co-author of a paper on the work. "And many auditors develop their own ADAs, which are customized to address the specific characteristics of the companies and financial statements they are responsible for auditing. However, when an auditor moves on to a new position, the tools they developed are frequently handed down to their successors.
"We wanted to know if there was any difference in performance between auditors who are using ADAs they developed versus auditors who are using ADAs they inherited," says Brazel, who is the Jenkins Distinguished Professor of Accounting in North Carolina State University's Poole College of Management. "As it turns out, the difference is fairly dramatic."
For the study, the researchers enlisted 173 audit practitioners. Study participants were given one of three scenarios. In the first scenario, participants were told they were using an ADA they had developed themselves; the ADA showed that a company's sales data was largely consistent with the company's financial data and industry trends, except for one section of inconsistent data: a red flag that raised the possibility of fraud.
The second scenario was identical to the first, but participants were told they were using an ADA developed by someone else. In the third scenario, participants used someone else's ADA, but were also given a detailed memo outlining how the ADA was developed.
After reading the scenario, participants were asked to develop an expectation for the company's sales, in order to determine whether the sales figures in the company's financial statement were reasonable. Once this step was completed, study participants were asked whether more testing was needed.
"This study design allows us to assess multiple things," Brazel says. "Did the auditor make use of the red flag data when developing the expectation? And if the auditor determined that more work was needed, did they go to their supervisor? Did they ask company management for an explanation or additional information?"
The researchers found that 54% of study participants who developed the ADA said more work was needed and went to their boss or to company management. But only 26% of study participants who had inherited the ADA said more work was needed.
"We see a substantial reduction in professional skepticism when an ADA was developed by somebody else," Brazel says. "That's not good. Looking into a fraud red flag is the first step toward detecting fraud, and almost 75% of the auditors using an inherited ADA didn't even look."
However, the researchers also found what they believe is a constructive path forward.
"It is unavoidable that, at some point, auditors will use ADA tools developed by someone else," Brazel says. "The good news here is that we found that 34% of auditors who used inherited ADAs that were accompanied by memos describing their development reported that more work was needed—they looked into the red flag. That's not as good as developing the ADAs themselves, but it is substantially better than when people are asked to use ADAs with little explanation of precisely how those ADAs were developed."
"A key takeaway here is that there are practical, low-cost steps auditing firms can take to improve professional skepticism," Brazel says. "When developing new ADAs, make time to articulate how those ADAs were developed—and share that information with any auditors who will be using those tools."
More information: Xiaoxing Li et al., Inheriting Versus Developing Data Analytic Tests and Auditors' Professional Skepticism, Journal of Accounting Research (2025).
Provided by North Carolina State University