Adding value with digital analysis – auditing
Mark J. Nigrini
Internal auditing’s quick and revealing data analysis proves invaluable in the acquisition process.
The general auditor of Global Technologies was called to a management meeting, where he learned that his firm had decided to purchase HomeTech, a home-security business, from Republic Technologies. With the end of the fiscal year looming, Republic wanted to close the deal quickly, which meant that Global had only five days to review the records and headquarters data.
A team of six auditors, led by Paul Lee, was assigned the task. Responsibilities were divided into three areas: (1) high-level business review, (2) revenue review, and (3) expenditure review. Paul, an ACL user, was responsible for data analysis.
Normally, Global’s computerized analytical procedures were used to find plausible relationships among financial and nonfinancial data, but in this project not all the data was accessible. The new objective was, therefore, to uncover audit findings that would support the negotiation process and to project the scenario that might occur should HomeTech be acquired.
Paul elected to utilize digital analysis, a new audit technology that looks for the abnormal duplication of single digits, digit combinations, specific numbers, and round numbers. Paul believed these abnormal duplications could signal data anomalies relevant to the task at hand.
The expected digit distribution for tabulated data is usually based on Benford’s Law, which is quite counterintuitive. The expected digit frequencies are heavily skewed toward low digits. The expected frequencies for 1 through 9 as a first digit are 30.1 percent, 17.6 percent, 12.5 percent, 9.7 percent, 7.9 percent, 6.7 percent, 5.8 percent, 5.1 percent, and 4.6 percent, respectively. The first digit in a number is the leftmost digit. For example, “6” is the first digit in “63.40.”
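These expected frequencies follow from the logarithmic formula behind Benford’s Law, P(d) = log10(1 + 1/d). A minimal Python sketch (the variable names are illustrative) reproduces the table:

```python
import math

# Benford's Law: the probability that a number's first digit is d
# (for d = 1..9) is log10(1 + 1/d).
benford_first = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

for d, p in benford_first.items():
    print(f"First digit {d}: {p:.1%}")
```

Running the loop prints 30.1% for the digit 1 down to 4.6% for the digit 9, matching the percentages quoted above.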
Benford’s Law usually applies to tabulated data where numbers describe the sizes of similar phenomena. The list of numbers should not contain a built-in maximum or minimum value. Also, the data must not consist of assigned numbers that, in essence, are symbols, such as social security and bank account numbers and zip codes.
Paul realized that his knowledge of the business and the data was rudimentary, and that Benford’s Law might not apply. His task was to find abnormal duplications and anomalies even though the level of normal duplication was unclear.
Paul requested the customer billing files, separated by month into 11 files, from January to November. Since auditing an incomplete or erroneous file would provide little value, Paul totaled the dollar amounts and attempted to reconcile them to the financial records. The sums in the files represented 97 percent of the revenues shown in the financial records, and Paul felt it worthwhile to proceed, noting that the reconciliation was an outstanding matter.
As a reasonableness check, Paul stratified the records for each of the 11 months to show the number of records and total dollars in five dollar-amount strata: $50.00 and higher, $0.01 to $49.99, $0.00, -$0.01 to -$49.99, and -$50.00 and lower. The counts as a percentage of the total number of records remained about equal for each month. A junior person then checked the authenticity of supporting documentation for all monthly bills above $5,000.
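The five-stratum data profile can be sketched as follows. This is a hypothetical reconstruction in Python (ACL’s stratify command was the actual tool; the function name and the handling of the exact $50.00 boundary are assumptions):

```python
def stratify(amounts):
    """Hypothetical five-stratum profile of billing records.
    Returns {stratum label: (record count, dollar total)}."""
    strata = {
        "$50.00 and higher": lambda a: a >= 50.0,
        "$0.01 to $49.99": lambda a: 0.0 < a < 50.0,
        "$0.00": lambda a: a == 0.0,
        "-$0.01 to -$49.99": lambda a: -50.0 < a < 0.0,
        "-$50.00 and lower": lambda a: a <= -50.0,
    }
    profile = {}
    for label, test in strata.items():
        hits = [a for a in amounts if test(a)]
        profile[label] = (len(hits), round(sum(hits), 2))
    return profile
```

Comparing each stratum’s record count as a percentage of the monthly total, as Paul did, quickly exposes a month whose mix of positive, zero, and negative amounts differs from its neighbors.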
Using ACL, Paul graphed the first-two digit frequencies for each of the 11 months (the first-two digits of 63.40 are 63). The graphs showed the proportion of occurrences for each of the 90 possible (10 to 99) first-two digit combinations. Microsoft Excel was used to prepare neat graphs from the ACL output. Paul also prepared a table of the most frequently occurring individual amounts billed for each month.
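Extracting the first-two digits of an amount and tallying their proportions can be sketched in Python (the helper names are illustrative; the actual analysis was done in ACL):

```python
from collections import Counter

def first_two_digits(amount):
    """First-two digits (10-99) of a nonzero amount, e.g. 63.40 -> 63."""
    a = abs(amount)
    if a == 0:
        return None
    while a < 10:   # scale small amounts up (0.28 -> 28)
        a *= 10
    while a >= 100:  # scale large amounts down (6340 -> 63.4)
        a /= 10
    return int(a)

def first_two_proportions(amounts):
    """Proportion of records for each possible first-two digit combination."""
    digits = [d for d in map(first_two_digits, amounts) if d is not None]
    counts = Counter(digits)
    return {d: counts[d] / len(digits) for d in range(10, 100)}
```

Graphing the 90 proportions month by month, as Paul did, makes a drift such as 28 giving way to 24 visible at a glance.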
The graphs showed a changing pattern over the 11-month period. In the early months, the most frequent first-two digit combination was 28. Over time, the frequency of 28 decreased, whereas the frequency of 24 increased. By November, 24 was the most frequently occurring first-two digit combination.
The number frequency tables told the same story. In January, $28.80 was the most frequent amount billed, but by November $24.20 appeared most often. Paul presented this data as an audit finding and suggested exploring reasons for the change.
The list of checks written for the year to date was also analyzed. The data profile was prepared using ACL’s stratify command. The primary finding disclosed that about 18 percent of all checks written were for $50 or less. A second analysis, completed by using the classify function, showed that each of two regional telephone companies was a payee on more than 100 checks per month. A separate check was being processed for each business location.
A test for round number checks produced some startling findings. About one-quarter of all checks written were for numbers that were round multiples of 100. Furthermore, after sorting by date, it was evident that the round dollar checks were written in groups on specific dates.
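A round-number test of this kind can be sketched in Python. Working in integer cents avoids floating-point surprises in the divisibility check (the function name is an assumption):

```python
def round_multiples(check_amounts, multiple=100):
    """Count checks whose amounts are exact round multiples of
    `multiple` dollars, testing divisibility in integer cents."""
    hits = [a for a in check_amounts
            if a > 0 and round(a * 100) % (multiple * 100) == 0]
    return len(hits), len(hits) / len(check_amounts)
```

For example, `round_multiples([500.00, 1234.56, 2500.00, 100.00])` flags three of the four checks; sorting the flagged checks by date, as Paul did, then reveals whether they cluster on specific days.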
Paul requested the perpetual inventory files on a month-by-month basis. For each month the files for each of about 10,000 inventory items had four fields: the inventory item number, inventory description, most recent cost, and closing quantity. A graph of the first-two digits of the inventory cost prices showed large changes on a month-by-month basis – a circumstance that was somewhat difficult to interpret.
For each inventory item, Paul then used ACL’s maximum and minimum functions to calculate a ratio that showed the highest cost price for the item divided by the lowest cost. For an inventory item with a small, inflationary price movement, the expected ratio was about 1.02. Many items, however, had high ratios – above 100 – and some even had a ratio of infinity when the lowest, or minimum, price was zero.
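The maximum/minimum cost ratio for one inventory item can be sketched as follows (ACL’s maximum and minimum functions were the actual tools; the function name here is illustrative):

```python
def cost_ratio(monthly_costs):
    """Highest cost price divided by lowest cost price for one
    inventory item; returns infinity when the minimum price is zero."""
    hi, lo = max(monthly_costs), min(monthly_costs)
    return float("inf") if lo == 0 else hi / lo
```

A stable item with mild inflation yields a ratio near 1.02; ratios above 100, or infinity from a zero minimum, mark items whose cost history deserves investigation.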
Paul also analyzed the year-to-date disbursements, or accounts payable, file. Using the stratify capability of ACL, he prepared a data profile showing that 48 percent of all invoices were for $50 or less. The number of credit memos was about six percent of the number of invoices. The under-$50 statistic suggested processing inefficiencies, and the high proportion of credit memos showed frequent corrections to invoices.
The digit frequencies of the invoice dollar amounts were tested against the frequencies of Benford’s Law. The data followed the Law quite closely, suggesting that it passed a reasonableness test.
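One simple way to score conformity to Benford’s Law is the mean absolute deviation (MAD) between actual and expected first-digit proportions; a lower score means closer conformity. The sketch below is an illustration, not the specific test the team ran (the function name is an assumption):

```python
import math

def benford_mad(first_digit_counts):
    """Mean absolute deviation between observed first-digit proportions
    and Benford's Law expectations, averaged over digits 1-9."""
    n = sum(first_digit_counts.values())
    deviations = []
    for d in range(1, 10):
        expected = math.log10(1 + 1 / d)
        actual = first_digit_counts.get(d, 0) / n
        deviations.append(abs(actual - expected))
    return sum(deviations) / 9
```

Data that follows the Law closely, as the invoice amounts did, scores near zero; a flat (uniform) digit distribution scores far higher.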
Paul subsequently ran a Relative Size Factor (RSF) test on vendors. The RSF test calculates the ratio of the largest invoice divided by the second largest invoice for each vendor. The objective is to find a large invoice incorrectly entered for a vendor – an amount that is abnormally high for that vendor – or a decimal point error in entering the dollar amount of an invoice. A ratio of up to 10 is not usually a problem. The test identified about 10 vendors with RSFs of more than 10,000 and about 100 vendors with RSFs of about 100, indicating possible decimal point errors.
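The RSF calculation can be sketched as follows. This is a hypothetical reconstruction; the input shape – a mapping from vendor number to that vendor’s invoice amounts – is an assumption:

```python
def relative_size_factors(invoices_by_vendor):
    """RSF = largest invoice / second-largest invoice, per vendor.
    Vendors with fewer than two invoices are skipped."""
    rsfs = {}
    for vendor, amounts in invoices_by_vendor.items():
        if len(amounts) < 2:
            continue
        largest, second = sorted(amounts, reverse=True)[:2]
        if second > 0:
            rsfs[vendor] = largest / second
    return rsfs
```

An RSF near 100 is the classic signature of a misplaced decimal point: an invoice keyed as $12,500.00 instead of $125.00 yields exactly that ratio against the vendor’s normal amounts.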
He finally used ACL to extract all cases with duplicate invoice numbers and amounts, but with different vendor numbers. The list showed about 200 matches, with an invoice value of $104,000 at the top of the list. The test showed that the system allowed for incorrect payments and that significant recoveries were possible.
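Extracting same-invoice-number, same-amount, different-vendor matches can be sketched in Python (the tuple layout of a record is an assumption; ACL’s duplicate-extraction was the actual tool):

```python
from collections import defaultdict

def same_invoice_different_vendor(records):
    """records: iterable of (invoice_number, amount, vendor_number).
    Returns the (invoice_number, amount) pairs paid to more than one
    vendor, with the set of vendor numbers involved."""
    vendors_by_key = defaultdict(set)
    for invoice_no, amount, vendor in records:
        vendors_by_key[(invoice_no, amount)].add(vendor)
    return {key: vendors for key, vendors in vendors_by_key.items()
            if len(vendors) > 1}
```

Each match is a candidate double payment: the same invoice, for the same amount, routed to two different vendor numbers.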
The controller admitted that HomeTech had a cash flow problem when Global’s general auditor discussed the round number checks with him. He indicated that, wherever possible, they paid off creditors in installments of round dollar amounts. These amounts were just enough to ensure the continuation of goods and services.
The revenue audit eventually led to disclosure that, recently, the company had changed its pricing schedule and the commission rates it paid agents for new business. The change in digit and number patterns Paul found had occurred because agents had been emphasizing a new product configuration that cost the customer less and attracted a higher commission. The firm was, therefore, losing revenue and paying more in commissions.
Investigation of the inventory cost prices ratios revealed that the company had shifted to a Y2K system and then converted back to the old one because of a high rate of errors. The old system had to be restarted with new data. Inexperienced, part-time employees were hired to assist with data entry, possibly contributing to the number of data errors.
Management used the audit findings to gain a negotiating edge. The cash shortage was raised with Republic, and Global used the revenue shift to argue that revenue growth had been, and would continue to be, adversely affected by the new pricing policy. Global noted that any optimistic projections by Republic would need to be tempered with reality. Global also made it quite clear that the inventory records were unreliable and that the inventory value would need to be deeply discounted.
The processing inefficiencies in accounts payable and the potential for errors were not shared with Republic. However, Global realized that significant savings could be obtained by improving these systems.
Global’s management was impressed by the swift and incisive results of the computerized analytical procedures. The general auditor stressed that new technology, appropriately managed by internal auditors, could be invaluable in acquisitions, as well as in an expanding range of business challenges.
MARK J. NIGRINI, PHD, is Assistant Professor of Accounting at Southern Methodist University in Dallas. You may reach him via e-mail at Mark_Nigrini@msn.com.
COPYRIGHT 1999 Institute of Internal Auditors, Inc.