When will the "Big Data" and "Analytics" be included in the audit?

Oct 8, 2017


The massive volume of data available inside and outside companies, and the place now occupied by new data-analysis technologies, are fundamentally changing the audit mission.  This article explores the possibilities and explains the main issues facing tomorrow’s auditors.

Historically, data was something a company owned, and it was generally structured and human-generated.  However, technological trends over the past decade have broadened the definition to include unstructured and machine-generated data, as well as data that lies outside the company’s boundaries.

“Big Data” is the term used to describe this vast and growing data portfolio.  The general opinion is that “Big Data” will have a dramatic impact on productivity, profits, and risk management.  These data are of limited value, however, unless they are analyzed.

“Analytics” is the process of analyzing data to draw meaningful conclusions.  Companies and large organizations have recognized the possibilities that Big Data and analytics offer, and many are making significant investments to better leverage them and improve their businesses.

Audit firms, the field I am part of, are concerned as well: we see significant potential for transforming the audit.

'This is a real migration from traditional audit approaches to ones that fully and consistently integrate big data and analytics.'

 

Transform the audit

As we continue to operate in one of the most challenging and uncertain economic climates of modern times, the role of auditors in financial markets is more crucial than ever.  Audit firms must deliver audits that serve the public interest by continuously improving quality and providing greater intelligibility, relevance, and reliability to financial statement users.  Professional skepticism and an ongoing focus on the quality and strength of evidence are required in an audit engagement.  Meanwhile, companies expect improved dialogue with their auditors and more relevant information.

While the profession has long recognized the potential of data analysis to improve audit quality and relevance, the routine use of these techniques has been hampered by a lack of effective technological solutions, data capture issues, and privacy concerns.  However, recent technological advances in big data and analytics provide an opportunity to rethink how an audit is executed.

The transformed audit will extend beyond sampling-based testing to include analysis of entire populations of audit-relevant data, using artificial intelligence to surface the most plausible conclusions and more relevant financial information.  “Big Data” and “Analytics” enable auditors to better understand financial statement, fraud, and operational risks and to adapt their approaches accordingly.

While we are making significant progress and starting to see the benefits of “Big Data” and “Analytics” in the audit, we recognize that this is a journey.  A good way to describe where we stand as a profession is to draw a parallel with Netflix, the movie subscription service.  When the company started in 1997, it adopted a DVD-by-mail model, sending movies to its customers, who returned them after an evening or a week of entertainment.  Netflix always knew the future was online streaming, but the technology wasn’t ready at the time, nor was high-speed broadband widespread.

Today we are engaged in the audit equivalent of DVD-by-mail, transferring our clients’ data to our auditors.  What we want is smart audit devices that reside on companies’ data servers and stream the results of their analysis to audit teams.  But the technology needed for this vision is still in its infancy, and in the meantime we conduct financial statement analyses (ex post) through analytical review techniques that are numerous and varied: mainly reasonableness reviews, absolute data comparisons, relative data comparisons (ratios), and trend analyses.

That said, our current approaches are based on outputs, not inputs. This critical examination allows the auditor to explain obvious anomalies, but it is by no means sufficient on its own to prove that an account or an accounting document is free of misstatement. Indeed, the absence of a visible anomaly does not mean there are no hidden ones.

“The transition to this future will not happen overnight.  It is a giant leap from traditional audit approaches to ones that fully and transparently integrate big data and analytics.”


Obstacles to integration

There are several barriers to the successful integration of “Big Data” and “Analytics” into the audit.

The first is data capture: if auditors cannot obtain the company’s data effectively and efficiently, they cannot analyze it within an audit approach.  Today, companies invest heavily in protecting their data, with multi-layered approval processes.  As a result, granting auditors access to that data is not straightforward, and when access is granted, it can take a long time.  Indeed, companies often refuse or are reluctant to provide data, citing security concerns.

In addition, auditors encounter hundreds of different accounting systems and, in many cases, multiple systems within the same company.  Data extraction has not always been a core competency within the audit, and companies do not necessarily have that competency either.  The result is multiple attempts and a great deal of back and forth between the company and the auditor over data collection.

Today, data extraction focuses primarily on general ledger data.  However, the data needed to transform the audit come from sub-ledgers for key business processes, such as revenue data or procurement cycle data.  This increases both the complexity of data extraction and the volumes of data to be processed.
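One concrete reason sub-ledger extraction matters is reconciliation: the general ledger control account must tie out to the underlying detail. The sketch below is a hypothetical illustration (account code, customers, and amounts are invented) of that cross-check, which only becomes possible once extraction goes beyond the general ledger.

```python
# Hypothetical GL control account balance (in thousands); invented for illustration.
general_ledger = {"411-receivables": 1_980}

# Hypothetical customer-level sub-ledger detail behind that control account.
sub_ledger = [
    {"customer": "C001", "balance": 820},
    {"customer": "C002", "balance": 640},
    {"customer": "C003", "balance": 500},
]

def reconcile(gl_balance, detail):
    """Return the unexplained difference between the GL balance and the
    sum of the sub-ledger detail."""
    return gl_balance - sum(row["balance"] for row in detail)

diff = reconcile(general_ledger["411-receivables"], sub_ledger)
print(diff)  # 20 -> a residual the auditor must investigate
```

A non-zero residual like this is invisible when only general ledger totals are extracted, which is why sub-ledger data multiplies both the value and the volume of the extraction.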

While it is reasonably easy to use descriptive analytics to understand the business and identify potential risk areas, it is much harder to use “Analytics” in the audit to produce evidence-based conclusions in response to those risks.  Another problem with relying on big data analytics for audit evidence is the opacity or complexity of how the analysis operates, with algorithms or rules transforming raw data into visualizations or reports.  At this stage, the auditor must strike the appropriate balance between exercising professional judgment and relying on the results of these analyses.

Indeed, the balance to be struck lies between processing the data received from the client and altering it.

Remember that the audit raises fundamental questions precisely because its process is neither fully knowable nor fully determinable. The audit depends on the ability to judge, to relate facts, norms, and concepts, and on moral and human values. To enhance the exercise of this judgment, we propose to strengthen the process through the integration of “Big Data” and “Analytics”.

This integration will add value only if it substantially changes the nature, timing, and extent of audit procedures.  It will require auditors to develop new skills: knowing how to make the numbers speak, and how to use the outputs of the analyses performed as audit evidence.


Analytical dilemmas

Another question is how audit standards and regulations can be aligned with the use of “Big Data” and “Analytics” in the audit.  Generally speaking, the auditing profession is governed by standards that were developed years ago and did not explicitly contemplate the exploitation of Big Data.  Here are four areas that will require further consideration.

1. Substantive analytical procedures: these examine the correlations between specific elements of the financial statements, or the reasonableness of a financial indicator, through comparison to an expectation.

One of the main differences with big data analysis techniques is that the procedures are used to identify unusual or suspect transactions based on data analysis, often without any expectation set in advance by the auditor: it is the “Analytics” that determine the expectation.  Big data and these analysis techniques did not exist when the standard was designed, so the standard does not explicitly provide for their use as a source of evidence.
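The distinction described above can be made concrete with a small sketch. This is an invented illustration, not an audit tool: instead of testing amounts against an expectation the auditor fixes in advance, the analytics derive the expectation (the population's mean and spread) from the data itself and flag whatever deviates from it.

```python
from statistics import mean, stdev

# Hypothetical journal-entry amounts; invented for illustration, with one outlier.
entries = [102, 98, 105, 97, 101, 99, 103, 100, 96, 104, 480]

def analytics_driven_outliers(amounts, z_threshold=2.5):
    """Flag entries whose z-score exceeds the threshold. The 'expectation'
    (mean and standard deviation) comes from the population itself,
    not from the auditor."""
    m, s = mean(amounts), stdev(amounts)
    return [a for a in amounts if abs(a - m) / s > z_threshold]

print(analytics_driven_outliers(entries))  # [480]
```

The auditor never stated an expected value for these entries; the model built one from the full population, which is precisely the situation the existing standard did not anticipate.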

2. Validation of the data used for the analysis: an auditor’s skepticism leads him to examine the occurrence, accuracy, and completeness of the information received from the client.  This applies to physical documents (such as contracts) as well as electronic data.

However, “Analytics” in the audit does not rely on reports generated by the system. Instead, it is based on the underlying data, extracted directly, without any manipulation by the client, through continuous access to the data server. The auditor can then be confident that his analyses rest on complete and exhaustive information, even though the standards provide no instructions regarding the type and volume of data he extracts.

3. Definition of evidence: standards provide a hierarchy of the strength of the evidence collected, ranging from strong external evidence down to weaker internal evidence.  However, these standards do not indicate what type of evidence “Analytics” provides.

Thus, in the absence of such guidance, a diligent auditor, mindful of the binding normative framework, will hesitate to treat analytics as evidence.

4. Accuracy and confidence: the audit is designed to detect material misstatement, and users of the financial statements expect them to be free of it.  So what level of accuracy do auditors need from their “Big Data Analytics”?  Standards should provide more guidance in this area.

In the end, tomorrow’s audit could look very different from today’s.  Auditors will be able to use Big Data and Analytics to better understand the company, identify the main areas of risk, and know exactly what relevant information to extract from the analytics.  To achieve this transformation, however, the profession will need to work closely with key stakeholders, from business management to regulators and standard-setters.


What we recommend to audit committees:

We have identified three key areas for the audit committee and corporate leadership to consider now concerning “Big Data” and “Analytics”:

1. External audit: develop a better understanding of how data analysis is used in the audit today.  This means examining the entire process, from data capture (a major obstacle) to the scope of the data collected and the measures taken by the company’s IT function to streamline its delivery.

2. Compliance and risk management: understand how the internal audit and ethics functions may already be using “Big Data” and “Analytics”, and what plans exist to automate oversight processes.

3. Skills development: the success of investments in “Big Data” and “Analytics” will be determined by the human capital behind them.  The focus should not be limited to developing technological skills; it should extend to building an analytical mindset across the finance, risk management, and audit functions so that “Big Data Analytics” can be consumed effectively.


Have you heard of RoboCop?

In 2013, the US Securities and Exchange Commission (SEC) announced new initiatives to better detect financial reporting fraud through the use of “Big Data” and “Analytics”. One of them is the Accounting Quality Model (AQM), often called "RoboCop". AQM is a fully automated artificial intelligence system that analyzes a company’s filing within 24 hours. The system is designed to identify high-risk activity by comparing the current filing with those of companies in the same sector around the world. The SEC then expanded the model’s capabilities to include analysis of management’s discussion reports.  SEC analysts developed lists of words and phrasings common among fraudsters; these lists were turned into risk factors and incorporated into the AQM review process. AQM assigns each filing a risk score assessing the likelihood that fraudulent activity has occurred, and SEC inspection staff use that score to help shape the scope and direction of audit plans.
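The SEC's actual model is proprietary, but the idea of turning word lists into risk factors can be sketched simply. The word list, weights, and sample text below are invented for illustration; a real model would weight many linguistic features, not a handful of keywords.

```python
# Hypothetical risk words and weights; invented for illustration only.
RISK_WORDS = {"restatement": 3, "aggressive": 2, "one-time": 1, "adjustment": 1}

def risk_score(filing_text):
    """Sum the weights of risk-listed words appearing in the filing text."""
    words = filing_text.lower().split()
    return sum(RISK_WORDS.get(w, 0) for w in words)

filing = "management made an aggressive one-time adjustment before the restatement"
print(risk_score(filing))  # 7
```

A higher score does not prove fraud; as in the AQM process, it only flags the filing for closer human review.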

© LucaPacioli - 2024 - All rights reserved

Luca Pacioli is a multidisciplinary, local firm that imagines and develops comprehensive and integrated solutions to support business leaders in their daily activities and throughout the life of their company, from inception to transfer. Traditional and digital accounting expertise, legal and social formalities, training, auditing, advice in business law, strategy, or wealth management, the diversity of our expertise allows us to support our clients in their daily management and future projects.
