Activity-based costing and simulation modeling

Swain, Monte R

Business processes in the banking sector are becoming increasingly complex as they support multiple markets, products, and customer demands. Increased competition makes the margin for error in managing processes and pricing outputs that much tighter. Hence, it is demonstrated time and again that organizations that understand their cost structures and manage their business processes to satisfy customer demands succeed in the competitive marketplace.

Activity-based costing (ABC) and activity-based management (ABM) have become established tools for understanding costs and managing business processes. Increasingly, practitioners are applying the ABC model to better understand and manage bank service business processes. The purpose of this article is to strengthen the application of ABC/ABM with another modern management tool: simulation modeling.

ABC requires much more detail than traditional costing methods and requires that the detail used be based on actual business processes in the organization. The value of ABC is that the resulting cost model is much more realistic-it “looks” more like the actual business process it is costing. Combining ABC with ABM provides valuable insight into the interactive effects of costs and performance measures. This combination supports efforts to intelligently manage both costs and quality.

There are, however, two difficulties in applying ABC to the financial service setting. The first difficulty is the level of investment and activity detail required as the cost system begins effectively mirroring the realities of business processes. Though ABC-type cost models have been promoted for the last 30 years, only the recent proliferation of inexpensive computing power and the development of advanced database software have provided the ability for organizations to cost-effectively track the required information. Consequently, this first difficulty can be managed.

The second difficulty is somewhat more subtle. Those involved in interviewing employees and observing business processes to create the new costing model quickly discover how difficult it can be to obtain and use precise estimates of resource and activity flows. In other words, the ABC/ABM implementation team will often find that employees are uncomfortable estimating exact values when describing how cost resources attach to particular activities. It is also difficult to specifically describe an exact relationship between activity cost pools and products or other cost objects. There is, frankly, much that can be rather vague in specifying complex business processes.

This article uses a telephone support center at a regional mortgage bank to demonstrate the integration of ABC and simulation analysis.1 Consider the information used in Exhibit 1 to attribute costs to mortgage products. How accurate is this information, and how much can management depend on it in making future forecasts? Assuming that the telephone center spent $600,000 in both fixed and variable costs last year, will much variance in this cost figure occur in the future? The same questions apply to total minutes on line and the average minutes per telephone call. How confident is the ABC/ABM implementation team that it takes five minutes to answer an inquiry of rates and terms on adjustable rate mortgages and only three minutes to answer the same inquiry on fixed rate mortgages? Isn't it also important to know the range of these estimates? Efforts to get a department manager's estimate of resource drivers and activity drivers are often frustrated by these questions.

Others have written on interviewing techniques to help “commit” the manager to single point estimates. However, this strategy has serious shortcomings. If possible, the accountant should avoid forcing the manager or the time tracking system to say that, “It takes, on average, 15 minutes to solve a problem regarding adjustable rate mortgages.” In contrast, consider these alternative responses to the inquiry on this business process:

* "It usually takes about 15 minutes to handle a problem call on an adjustable rate mortgage, but that time can easily range from five to 45 minutes."

* “Time to handle adjustable rate mortgage problem calls ranges between 10 and 20 minutes, with an average time of about 15 minutes.”

Each of these alternative statements from the department manager implies a cost function that is fundamentally different from the other. Critical information is lost in an analysis that simply targets 15 minutes for this activity, eventually resulting in potentially serious limitations on the future ABC-based information system. Furthermore, the issue of developing "realistic" information on cost and activity flows is important to the cost model (i.e., ABC). More difficult (and more important) are the questions regarding the effects of cost and process management on questions of performance quality (i.e., ABM). These issues include questions such as:

* “How does the number of calls received or the way I staff the terminals affect the activity model?”

* "What aspects of this model are most important to expected levels of employee idle time and customer hold time?"

Hence, in addition to allowing uncertainty and range estimation in establishing costs, simulation analysis can also provide inexpensive insight into the interaction of cost structure and process performance. These advantages of simulation analysis are particularly important in the context of a dynamic service process. Without a software tool that automates much of the complexity typical of service processes, the level of detail and the interdependency of events can quickly overwhelm the ABC/ABM analysis effort.

A customer telephone service center in a mortgage department at a regional bank is used in this article to demonstrate ABC/ABM analysis using simulation modeling (graphically depicted in Exhibit 4). Essential ABC and volume-based cost data related to this example of a service-type business process is provided in Exhibit 1. Essentially, the telephone service center has 10 telephone/terminal sets linked to the bank's mortgage computer. Telephone calls coming into the telephone center fall into one of four categories:

1. Inquiry into current rates and terms

2. Inquiry into the status of a loan application

3. Problem solving and servicing of current mortgages

4. Other simple inquiries, such as questions about operating hours

The telephone center supports both fixed-rate mortgage (FRM) products and adjustable-rate mortgage (ARM) products. Before using ABC, the telephone center followed a traditional volume-based costing approach, calculating cost per call by simply dividing annual department costs by the total number of calls received for the year ($3.75 per call answered).


The ABC/ABM model shown in Exhibit 2 has been well established previously by others.2 In this article, ABC is depicted as the horizontal view of this model where resources (including costs) are used in the support of activities (or business processes) in the organization. Activities are required to support the ultimate cost objects (for example, products or customers). Determining the logical relationship between each resource and all associated activities makes it possible to assign resource costs accurately to activities using a resource driver. Summing all assigned resource costs results in activity cost pools. Continuing the horizontal view, activities are then related to cost objects by defining how each cost object uses or consumes each particular activity. A single activity driver is used to assign costs of an activity pool to related cost objects.

For example, specifics of the mortgage department’s customer telephone support center included in Exhibit 1 suggest that the telephone center spent $600,000 in total costs and used 783,500 on-line annual minutes as resources in this department. These resources are used in the telephone center to support the four basic activities described previously. The cost objects are the FRM product line, the ARM product line, and a pool of other activities that cannot be strictly related to either product line. As a first step in ABC, the cost manager should determine the activity cost pools for rates and terms inquiries, application status inquiries, problem solving, and other calls. The second ABC step assigns the costs of each activity cost pool to FRMs, to ARMs, or to common activities.


ABM is depicted as the vertical view of the ABC/ABM model in Exhibit 2. ABC shows how to attach costs of business processes to products or customers. ABM supports the management of both cost and performance of business processes. To understand why costs of a particular activity cost pool are rising or falling, the ABM system can provide several useful cost indicators (or cost drivers). Based on these cost indicators, management works to improve the costs of the activity pool. Using the telephone center data, the number of calls received and the configuration of staffing assignments are indicators that help explain cost fluctuations in this department. Finally, ABM also provides performance measures to aid in the management of the quality of the business process. For the telephone center, average customer time on hold, the number of employee errors, and missed customer calls provide measures of customer satisfaction. Tracking the amount of employee idle time is a measure of management efficiency in staffing the telephone center.

It is important that those involved in implementing the ABC/ABM model understand the essential difference between the horizontal and vertical views of Exhibit 2. In summary, the horizontal view (ABC) focuses on activities as a tool for assigning costs to mortgage products (the cost objects). In contrast, the vertical view (ABM) focuses on managing both cost and quality of the four core business activities or processes within the telephone service center.


In the telephone center example, the costing of ARM and FRM products was originally accomplished by simply dividing the $600,000 in department costs by 160,000 total annual calls, resulting in $3.75 cost per call (Exhibit 1). The effect of this volume-based costing approach is to allocate telephone center costs of $301,875 to the FRMs, $275,625 to the ARMs, and $22,500 to the common cost pool. A subsequent conventional ABC model used a two-step approach to change this cost structure. Working directly with management of the telephone center, the cost analysts established that there were 783,500 on-line minutes with customers in the telephone center, resulting in a cost per minute of $0.766 (Exhibit 1). This resource driver is used to trace resource costs to activities in the telephone center. Additional work with management identified the types of calls taking place in the telephone center and the average number of minutes per type of call, resulting in a cost per activity. These activity drivers are used to trace costs of activity pools to the cost objects. The effect of this ABC approach is to allocate telephone center costs of $210,957 to the FRMs, $375,247 to the ARMs, and $13,782 to the common cost pool (Exhibit 1). When compared against the traditional volume-based costing approach, this ABC analysis demonstrates a classic cross-subsidization pattern. The FRMs have been overcosted and the ARMs undercosted.
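The two allocations above can be reproduced with simple arithmetic. In the sketch below, the dollar totals come from the article; the call counts per product are implied by dividing each product's volume-based allocation by the $3.75-per-call rate, while the on-line minutes per product are illustrative assumptions chosen to sum to the 783,500-minute department total (Exhibit 1 is not reproduced here).

```python
# Volume-based vs. activity-based allocation for the telephone center.
TOTAL_COST = 600_000      # annual department cost, per the article
TOTAL_CALLS = 160_000     # annual calls, per the article
TOTAL_MINUTES = 783_500   # annual on-line minutes, per the article

cost_per_call = TOTAL_COST / TOTAL_CALLS      # $3.75 per call
cost_per_minute = TOTAL_COST / TOTAL_MINUTES  # ~$0.766 per minute

# Call counts implied by the stated volume-based allocations;
# minutes per product are illustrative assumptions.
calls = {"FRM": 80_500, "ARM": 73_500, "Common": 6_000}
minutes = {"FRM": 275_500, "ARM": 490_000, "Common": 18_000}

volume_based = {p: n * cost_per_call for p, n in calls.items()}
abc_based = {p: m * cost_per_minute for p, m in minutes.items()}

for product in calls:
    shift = abc_based[product] - volume_based[product]
    print(f"{product:6s} volume ${volume_based[product]:>9,.0f}   "
          f"ABC ${abc_based[product]:>9,.0f}   shift ${shift:>+9,.0f}")
```

The sign of the shift column makes the cross-subsidy visible at a glance: ABC moves cost away from the FRMs and onto the minute-hungry ARMs.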

The ABC analysis seen in Exhibit 1 is straightforward and provides valuable information. However, as stated previously, the typical ABC analysis does not deal well with the realities of randomness, uncertainty, and the interdependencies of resources and activities. An ABC system that is designed and managed with spreadsheet or database software using point estimates of average cost and performance measures (similar to Exhibit 1) does not usually allow dynamic or contextual analysis. Much of the reason is that static, algebraic calculations as used in the example above ignore effects of variability and queuing. However, ABC analyses that are able to incorporate randomness, uncertainty, and the interdependencies will better reflect real business processes. Such an analytic approach can actually simplify efforts to capture the typically complex business process by allowing the imprecision of the process to be explicitly included in the ABC fact-finding effort. The rewards of this approach will be greater accuracy in capturing cost relationships and greater confidence in applying the results of the analysis to reengineer business processes. The effective use of computer power can be applied to realistic cost analysis using simulation modeling software.


Over the past few years, several new software tools have been developed specifically for modeling business processes and workflows, particularly in the ABC domain. Most of these tools define business processes using graphical symbols or objects, with individual process activities depicted as a series of boxes and arrows. Specific characteristics of each process or activity may then be attached as attributes of the process. Many of these tools also allow for some type of analysis, depending on the sophistication of the underlying methodology of the tool. In general, analysis and modeling tools can be broken into three categories: flow diagramming tools, process capture tools, and simulation modeling tools.

Flow diagramming tools. At the most basic level are flow diagramming and drawing tools that help define processes and work flows by linking text descriptions of processes to symbols. Typically, these flowchart models provide little, if any, analysis capability.

Process capture tools. Process capture tools are developed specifically for documenting the critical elements of a business process. These tools provide a conceptual framework for modeling hierarchies and process definitions. They are typically built on relational databases and include functions that provide static (time independent) and deterministic (no variation) analysis capability. Typical calculations include process cycle time, cost analysis, and resource utilization.
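The static, deterministic analysis a process capture tool performs amounts to summing point estimates. The sketch below shows this kind of calculation; the step names and durations are hypothetical, and the $0.766-per-minute rate is taken from the telephone-center example.

```python
# Static (time-independent), deterministic (no-variation) analysis of the
# kind a process capture tool performs: cycle time and cost are simple
# sums of point estimates. Step names and durations are hypothetical.
steps = [
    ("answer call", 0.5),       # minutes
    ("look up account", 2.0),
    ("resolve problem", 12.5),
]
COST_PER_MINUTE = 0.766  # resource-driver rate from the telephone-center example

cycle_time = sum(minutes for _, minutes in steps)  # 15.0 minutes
call_cost = cycle_time * COST_PER_MINUTE
print(f"cycle time {cycle_time} min, cost ${call_cost:.2f}")
```

Note that nothing in this calculation can vary from run to run, which is precisely the limitation the simulation tools below address.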

Simulation modeling tools. Simulation tools provide continuous or discrete-event, dynamic (time dependent), and stochastic (random variation) analysis capability. Simulation tools typically provide animation capability that allows the process designer to see how customers and work objects flow through the system. The ability to display time dependency and randomness through on-screen animation allows a manager to "watch" a business process and judge whether an accurate representation has been achieved. The ability to effectively represent the manager's perspective is critical to accurate analysis and successful communication of ABC/ABM results. The extended capabilities found in a few of the ABC/ABM process capture tools listed above do provide some "what-if" analyses. However, the "what-if" capabilities of these tools are static and deterministic. The consequent effort required to specify artificial point estimates of resource usage and production flows that in reality exhibit random, time dependent qualities adds unnecessary complications to the data gathering and modeling process. Alternatively, generic simulation modeling tools are becoming available that allow development teams to easily incorporate business process variation into a dynamic ABC/ABM model.

There are several essential differences in the deliverables offered by these three types of tools. First, flow diagramming tools provide help in understanding the overall nature of an existing business process (similar to the function of Exhibit 2). Nonetheless, these tools lack the ability to accurately predict the outcome of proposed changes to that process. Unlike diagramming tools, process capture tools do suggest some interpretation of output data. However, these tools have limited capability for modeling variability and uncertainty and, therefore, provide limited insight into the potential cost and performance effects of changes to the business process. Simulation modeling tools, on the other hand, are time-based decision-support tools that take into account fluctuations in a business process using mean, variance, maximum, and minimum measures to simulate current values of resources, activities, cost objects, cost drivers, and performance measures. Thus, the computer is able to replace the human effort to resolve complicated business processes that are inherently imprecise.

Simulation analysis can be particularly valuable when developing ABC/ABM models of bank service processes like the telephone support center described earlier. Compared to traditional manufacturing processes, service processes are much more variable because people are so directly involved in the process flow. Accordingly, to assume that a bank service process will achieve some kind of a “steady state” can be hazardous to an ABC/ABM analysis. Queuing factors can cause people to change their minds, resulting in changes to their current status or future decisions. For example, customers may attempt to enter a queue, see that it is full, and thus move to another queue or leave the process entirely. This type of jockeying can also take place any time while the customer is in the queue and requires on-the-spot decisions by managers to reallocate resources to serve changing backlogs of customers. Compared to manufacturing processes, this factor alone greatly complicates the ABC/ABM model of service processes.

Furthermore, customer arrival times are usually erratic, though somewhat predictable. Each arrival event may consist of single or multiple customers. Usually, some general arrival patterns can be established based on aspects like time of day or day of week. However, there is often so much variance that arrivals should be represented by distributions rather than by point estimates. For example, rather than requiring precise arrival times and rates, it may be much easier and more valid for the telephone support center manager to simply estimate that the center receives between 250 and 300 calls a day with about half the calls coming between 11:00am and 2:00pm. Currently, only simulation modeling tools allow these kinds of realistic approximations.

Finally, bank service processing tends to be highly variable and dependent on the agent providing the service. A change in the state of a queue (e.g., number of customers on hold) may cause an agent (e.g., telephone operator) to work faster, thereby reducing processing time. A change in the state of the customer (e.g., becoming angry) may cause the agent to work faster to complete the service. Similarly, a change in the state of the agent (e.g., fatigue) may also change the service time. The implications of these dynamic processes can effectively nullify the usefulness of common analysis tools like spreadsheets or flow charts.


Once a suitable application or project has been identified as a candidate for simulation, decisions must be made about how to conduct the study. There are no strict rules on how to perform a simulation study; however, the following steps are generally recommended as a guideline:

1. Plan the study. This stage involves defining the objectives of the study, identifying system constraints, preparing simulation specifications, and developing a budget and schedule.

2. Define the system. This stage requires the development of a conceptual model on which the simulation model will be based. It includes making assumptions, gathering data, and converting the data to a useful form.

3. Build the model. The goal of model building is to create a valid representation of the defined system operation. A model is neither true nor false, but rather useful or not useful. The process of building a model consists of several subtasks including:

* actual “coding” of the model using an appropriate software tool

* verifying that the model works as intended

* validating that the model reflects the behavior of the actual system or reasonably predicts the behavior of a proposed system.

4. Run experiments. The fourth step is to conduct simulation experiments with the model. The goal of experimentation is not just to find out how well a particular system operates, but to gain enough insight to improve the system.

5. Analyze the output. Output analysis deals with drawing inferences about the actual system based on the simulation output. When conducting simulation experiments, caution must be used in interpreting the results. Because the results of a simulation experiment are random (given the probabilistic nature of the inputs), an accurate measurement of the statistical significance of the output is often necessary.

6. Report results. The last step in the simulation procedure is to make recommendations for improvement in the actual system based on the results of the simulated model. These recommendations should be supported and clearly presented so that an informed decision can be made. Documentation of the data used, the model(s) developed, and the experiments performed should all be included as part of a final simulation report.
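The statistical caution in step 5 can be sketched as a confidence interval across replications. The replication values below are synthetic stand-ins generated for the sketch; a real study would use the simulation tool's own output for each run.

```python
import random
import statistics

random.seed(7)
# Synthetic stand-ins for 30 replications of a simulated annual cost.
replications = [random.gauss(600_000, 15_000) for _ in range(30)]

mean = statistics.mean(replications)
sd = statistics.stdev(replications)
# 95% confidence half-width: t(0.975, df=29) is roughly 2.045
half_width = 2.045 * sd / len(replications) ** 0.5
print(f"annual cost ${mean:,.0f} +/- ${half_width:,.0f} (95% CI)")
```

Reporting the interval rather than a single mean tells the decision maker how much the simulated cost figure itself can be trusted.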


The following section demonstrates the steps followed in applying a simulation study to the mortgage service center described earlier.

Planning the study. Following the guidelines listed above, the first step in building a representative simulation model is to plan the study. In the case of the mortgage telephone service center, the objectives of the model are to determine the effect of variability and uncertainty on the performance of the system, and to see how these factors affect the associated costs of servicing each product. The boundaries of the system include the mortgage service center operations only, with each full-time and part-time employee assumed to be dedicated to answering service center calls.

Defining the system. The next step is to define the system inputs, outputs, and processing characteristics. The data required to build a model of the telephone service center include, for example, staffing assignments, number of calls of each type, service time per call, and cost per labor hour.

In defining the staffing assignments, Exhibit 3 shows two different staffing arrangements. Scenario 1 considers four full-time operators on duty from 8:00AM to 5:00PM, with 15-minute morning and afternoon breaks and a 30-minute lunch break. In addition, six part-time staff members provide support hours during the middle part of the work day. This staffing arrangement is designed to be heaviest during the middle of the day when the level of incoming telephone calls tends to be highest. However, given the rather random nature of such a process (e.g., number of telephone calls, length of time to answer calls, etc.), this arrangement may also result in unnecessarily idle resources or resources strained beyond capacity, issues that have important product cost (i.e., ABC) and process management (i.e., ABM) implications.

Alternatively, Scenario 2 considers a combination of three full-time operators and seven part-time operators. The goal of both staffing arrangements is to minimize average customer hold times and the number of missed calls due to unavailability of an operator, while simultaneously minimizing the number of labor hours required. Determining which scenario is the better staffing arrangement would be difficult using a process modeling tool that cannot easily handle time-dependent events that exhibit significant random variation.
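Daily labor hours for the two scenarios can be tallied directly. The full-time and part-time head counts come from the article; the four-hour part-time shift length is an assumption, since Exhibit 3 is not reproduced here.

```python
FT_HOURS = 8.0  # 8:00 AM-5:00 PM minus two 15-minute breaks and a 30-minute lunch
PT_HOURS = 4.0  # assumed part-time shift length (Exhibit 3 not reproduced here)

# (full-time operators, part-time operators), per the article
scenarios = {"Scenario 1": (4, 6), "Scenario 2": (3, 7)}

daily_hours = {name: ft * FT_HOURS + pt * PT_HOURS
               for name, (ft, pt) in scenarios.items()}
for name, hours in daily_hours.items():
    print(f"{name}: {hours:.0f} labor hours per day")
```

Under these assumptions Scenario 2 uses fewer labor hours per day; whether the saving survives the hold-time and missed-call constraints is exactly what the simulation must establish.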

Focusing on the expected telephone call demand as a beginning point, there are several approaches that could be used to develop a realistic view of this rather random process. Perhaps the easiest approach would be to have the telephone center manager provide estimates of the typical range and average levels of calls of each type that come in daily, similar to those shown in the table below.

On the other hand, if the manager is not confident in making this assessment, or if the telephone call volume measure is particularly important as an input to the simulation model, a more rigorous approach may be required. For example, perhaps actual measures of telephone call volume can be tracked (or are already available) over a period of time. Assume, for example, that records for the past 100 days show the following number of calls regarding fixed-rate mortgages (FRM):

Obviously, the actual daily quantities of each call type are a major source of randomness in the mortgage telephone service center. If large numbers of data like these are available, a statistical procedure known as curve fitting can be performed to determine appropriate statistical distributions. These data can be easily read into statistical curve-fitting software (available within most modern spreadsheet packages) and shown to have been produced by a Poisson distribution with a mean equal to 268 calls per day. Furthermore, suppose that calls are distributed throughout each day according to a distribution similar to the one shown in the table below. In this case, 9% of the FRM calls arrive in the first hour of each day, 11% arrive in the second hour, and so on. Most simulation modeling software allows distributions of this sort (rather than a simple point estimate of 268 calls per day) to be built directly into the cost and performance analysis. This combination of random arrival quantities and distributed arrival patterns produces a rich view of customer calls into the system that is significantly more realistic than an estimate of 268 average FRM calls per day.
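The arrival pattern just described can be sampled in a few lines. The Poisson mean of 268 and the first two hourly percentages (9% and 11%) come from the text; the remaining hourly fractions are assumptions filled in for the sketch.

```python
import math
import random

random.seed(42)
MEAN_DAILY_CALLS = 268  # Poisson mean fitted from 100 days of FRM call counts

# Hourly arrival fractions over a nine-hour day; the first two values
# (9% and 11%) come from the text, the remainder are assumptions.
hourly_pct = [0.09, 0.11, 0.12, 0.14, 0.15, 0.13, 0.11, 0.09, 0.06]

# For a mean this large, a Poisson draw is well approximated by a normal
# with matching mean and variance.
daily_calls = max(0, round(random.gauss(MEAN_DAILY_CALLS,
                                        math.sqrt(MEAN_DAILY_CALLS))))
by_hour = [round(daily_calls * pct) for pct in hourly_pct]
print(daily_calls, by_hour)
```

Each simulated day thus gets both a random total call volume and a realistic within-day arrival shape, rather than a flat 268-call point estimate.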

The amount of time it takes to realistically serve a call in the mortgage service center depends on the type of call being served. Similar to call demand, service time should also be captured as a distribution. In this case, instead of using simple average values, triangular distributions are used to represent service times for each type of call. These distributions specify the minimum, most likely, and maximum times that a particular type of call will require to serve. Actual parameters for each distribution are usually obtained from employee interviews. For the simulation analysis in this article, the following service time distributions are assumed:

It is always best to use historical data whenever possible. If an automated call tracking system records the type and length of each call, then accurate statistical distributions can be generated from that data. Nonetheless, this information is often not available. Consequently, interviewing telephone operators and asking for their best estimates of telephone call durations is an acceptable alternative. Triangular distributions based on interviews provide a good first-cut estimate of the actual time typically required to serve a given type of call.
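A triangular service-time assumption of this kind is easy to sample. In the sketch below, the ARM problem-call parameters (5, 15, 45 minutes) come from the manager's quote earlier in the article, and the five-minute ARM and three-minute FRM rates-and-terms figures are used as the most-likely values; the remaining minimums and maximums are illustrative assumptions.

```python
import random

random.seed(1)
# Triangular (min, most likely, max) service times in minutes.
SERVICE_TIME = {
    "rates_terms_FRM": (2, 3, 6),    # 3-minute mode from the article
    "rates_terms_ARM": (3, 5, 10),   # 5-minute mode from the article
    "loan_status": (2, 4, 8),        # illustrative assumption
    "problem_ARM": (5, 15, 45),      # manager's quote earlier in the article
}

def sample_service(call_type):
    lo, mode, hi = SERVICE_TIME[call_type]
    # random.triangular takes (low, high, mode) -- note the argument order
    return random.triangular(lo, hi, mode)

samples = [sample_service("problem_ARM") for _ in range(5)]
print([round(s, 1) for s in samples])
```

Because the triangular distribution needs only a minimum, a most-likely value, and a maximum, it maps directly onto the kind of range estimates operators can comfortably give in interviews.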

Building the model. Building the model is a relatively straightforward process. The model shown in Exhibit 4 was developed using ServiceModel(TM).3 The model is viewed as a group of identical work stations where any type of call may be served. If an operator is available when a call arrives, it is served immediately. Otherwise, the call is placed on hold for the first available operator, or until the customer hangs up due to impatience. Once a call has been served, it exits the system and statistics are updated. On-screen statistics keep track of the number of calls served of each type and the number of minutes used in servicing each product. In addition, the number of calls in the system at any time is shown, along with the average customer holding time and resource utilization in each one-hour period.
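The flow just described can be sketched as a tiny discrete-event loop. As a simplification of the full Exhibit 4 model, a call that arrives when all operators are busy is counted as missed rather than placed on hold, and the interarrival and service-time parameters are assumptions for the sketch.

```python
import heapq
import random

random.seed(0)
NUM_OPERATORS = 10       # telephone/terminal sets, per the article
MEAN_INTERARRIVAL = 2.0  # minutes between arrivals (assumed)
SHIFT = 9 * 60           # one 8:00-to-5:00 day, in minutes

def run_day():
    """Simplified loss model: a call finding every operator busy is
    counted as missed instead of being placed on hold."""
    busy = []  # heap of the times at which busy operators free up
    t, served, missed, on_line = 0.0, 0, 0, 0.0
    while True:
        t += random.expovariate(1 / MEAN_INTERARRIVAL)  # next arrival
        if t >= SHIFT:
            break
        while busy and busy[0] <= t:  # release any finished operators
            heapq.heappop(busy)
        if len(busy) < NUM_OPERATORS:
            duration = random.triangular(3, 45, 15)  # low, high, mode
            heapq.heappush(busy, t + duration)
            served += 1
            on_line += duration
        else:
            missed += 1
    return served, missed, on_line

served, missed, on_line = run_day()
print(f"served {served}, missed {missed}, {on_line:,.0f} on-line minutes")
```

Even this stripped-down version yields the raw ingredients of the ABC analysis (on-line minutes by which to drive costs) alongside an ABM performance measure (missed calls) from the same run.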

Running the experiment. Exhibit 5 summarizes the simulated cost of operating the service center for a period of one year, replicated 30 times. Under traditional volume-based costing, the FRM product is cross-subsidizing the ARM product due to the greater amount of time required to serve ARM calls.

Volume-based and ABC cost figures in Exhibit 5 are based on actual resource utilization of 54 percent of capacity. Though the individual cost outputs are not exactly the same, the overall cost structures for each of the two models follow a pattern similar to the cost structures originally presented in Exhibit 1. The variation from the exact numbers in Exhibit 1 is the result of the important difference between an analysis using a simulation tool and the same analysis using a process capture tool, such as a spreadsheet software program. As described above, the primary activities and cost assumptions built into the simulation analysis are not point estimates or simple averages. Rather, the input information allows uncertainty and interdependency. Therefore, each time the simulation is run, new values of time, cost, and volume are drawn from the distributions built into the model. Multiple replications of the model provide, in a sense, a long-term perspective of the business process. General trends in costs and performance can be observed. Thus, business processes modeled in the simulation analysis will exhibit a much closer connection with the reality of the actual processes. Further, the simulation analysis will result in more effective analysis of cost and other performance measures by explicitly incorporating realistic random variation and interdependencies among agents, customers, and processes. Accordingly, management will be more confident of the results and will be able to engage in much more effective "what-if" analysis.

Analyzing the output. In addition to the cost information provided in Exhibit 5 (the horizontal view of the ABC model), the simulation analysis also provides a valuable look at the effect of the business process structure on important performance measures. To the extent the analysis output suggests undesirable performance outputs, the actual business process can be designed to avoid similar performance.

Exhibit 6 graphically shows the effects of the two staffing scenarios seen in Exhibit 3 on hourly utilization of employees and on hourly customer hold times. The effort to use part-time help to better balance resources with demand in the middle hours of the day seems to pay off in terms of resource utilization. Using all full-time employees through the day (Scenario 1) results in roughly 40 percent utilization in early morning and late afternoon, and nearly 100 percent utilization at the noon hour. The Scenario 2 staffing schedule generally maintains employee idle time at roughly 10 percent, though this staffing plan is most likely understaffed during the 2:00 afternoon hour. Analysis of the ABC costs (not shown) confirms the expected reduction in costs for Scenario 2 using part-time employees.

Before using part-time employees to help staff the support center, management will also want to evaluate projected customer hold times based on the simulation analysis. Scenario 2 is a first cut at alternative staffing configurations which results in more efficient utilization and lower product costs; however, as seen in Exhibit 6, customer hold times in the early morning and afternoon are in excess of three minutes. If these are unsatisfactory hold times, management should experiment further with other staffing configurations and rerun the analysis, perhaps replacing one of the 10:00 to 2:00 part-time customer service representatives (CSR) with a full-time CSR. It is expected that the level of resource utilization in this new configuration will slightly decrease with a commensurate increase in costs. Using the simulation analysis approach, management should easily determine an acceptable final staffing configuration. While this final configuration may not be perfect and optimal (if such a configuration even exists), it will reflect a view of a realistic business process, full of estimation uncertainty and complex cost/performance interactions. As a result, management will be more accurate in designing its cost system and more confident in implementing changes to the business process based on the new cost system.


As demonstrated by the analysis of the mortgage service center example, computerized simulation models can provide a wealth of detail and information for a complex bank service process. In terms of cost analysis, simulation modeling can provide more robust measures of costs while taking into account extreme events and their likelihood of occurrence. Hence, cost estimates are more accurate and realistic, and decision makers are provided with a range of cost estimates useful for risk assessment. In addition to more accurate tracking of service cost flows via ABC, simulation empowers the full ABC/ABM model for planning and control purposes by providing management insight into relationships between cost and performance factors. The realities of actual business processes include variability, uncertainty, and interdependencies of resources and activities. In these settings, computerized simulation modeling is a valuable management tool.

Over the past few years, many financial institutions have invested substantial time and resources creating Activity-Based Costing (ABC) systems to support the institution’s product, organization, and customer profitability measurement processes. The road toward developing such an ABC system can be a long and bumpy one. At the very least, a great deal of attention and focus is required to carve up an institution’s traditional expense base and reassemble it by activity. Many times the effort required to develop the system can be rather all-consuming. With their attention so keenly focused on this development effort, profitability measurement teams can lose sight of why the system is being built in the first place: to help managers make better decisions. When asked by one of the bank’s managers how to use ABC data, profitability team members may tend to explain where the numbers came from rather than how the manager can use the information to make better decisions. This article will outline three basic analyses which a) incorporate ABC data, b) can help the institution make better decisions, and c) can provide profitability team members with some answers to the question, “How do I use this stuff?”


Before diving into the three analyses mentioned above, some basic assumptions regarding the development of activity-based costs are required. In the financial services industry, activity-based costing has been defined as the process of pooling costs associated with specific processes, also known as activities. These processes (activities) can be shaped and defined by the institution’s interaction with its customers, the way it supports and delivers products, or the way its business units are organized and interact with each other. In each of these cases, the underlying theme is that there is an activity which takes place that causes the institution to consume resources and, as a result, incur cost. The primary assumption in an activity-based costing model is that the performance of an activity drives the institution to incur cost.

Although there are a variety of ways to develop activity unit costs, this article assumes that the following method is applied: First, a financial institution’s operating expenses (e.g., compensation & benefits, occupancy & equipment, travel, marketing, and various professional fees) are booked on a general ledger system in a variety of cost centers. These cost centers are reviewed to determine what types of tasks (functions) are performed in each center. The expenses booked in each cost center are then mapped to their corresponding function to generate a function cost pool. The third step in the process is mapping the newly defined function costs to an appropriate set of activities. As mentioned earlier, the activities can be defined from a number of perspectives; however, the institution must be able to establish a relationship between the functions performed by various cost centers and the activities which are driving the organization to incur cost. This is the essence of activity-based costing. Once the institution’s expenses are pooled by activity, an activity unit cost can be easily calculated by dividing the pooled cost by the underlying volume which is driving the institution to incur the cost.

Diagram I outlines a simple example of activity-based costing logic which will be referenced throughout this article. In this example, center #1 incurs traditional expenses of $870,000. These expenses are mapped to the two functions which center #1 performs: $610,000 to function #1 and $260,000 to function #2. Next, the function expenses are mapped to the activities each supports. In this example, function #1 supports activity #1 and activity #2, contributing $600,000 and $10,000, respectively. Finally, the total activity expenses are divided by the activity base period volume to determine the base period activity unit cost. In the case of activity #1, $600,000 of expense is divided by a base period volume of 12,000,000 units to yield an activity unit cost of $0.05/unit.
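The Diagram I mapping can be expressed as a short calculation. The figures below come from the article's example; the dictionary structure and names are purely illustrative, not a prescribed schema.

```python
# Step 1: center expenses as booked on the general ledger
center_expenses = {"center_1": 870_000}

# Step 2: map center expenses to the functions the center performs
function_costs = {"function_1": 610_000, "function_2": 260_000}
assert sum(function_costs.values()) == center_expenses["center_1"]

# Step 3: map function costs to the activities they support
# (function #1 contributes $600,000 to activity #1 and $10,000 to activity #2)
activity_costs = {"activity_1": 600_000, "activity_2": 10_000}

# Step 4: divide the pooled activity cost by the driver volume
base_period_volume = {"activity_1": 12_000_000}

unit_cost = activity_costs["activity_1"] / base_period_volume["activity_1"]
print(f"Activity #1 unit cost: ${unit_cost:.2f}/unit")  # $0.05/unit
```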

A financial institution which has implemented an ABC system will have numerous cost mappings similar to those outlined in Diagram I. In fact, almost every department within the institution will probably have been evaluated in such a manner. This cost mapping logic, along with the resulting activity unit costs, provides a wealth of information which department managers can use in the decision-making process.


The first way in which a bank’s managers can use ABC data is in the analysis of a capital purchase. There can be many instances when a manager is trying to cost justify the purchase of a new piece of capital equipment, but is having trouble quantifying the potential savings associated with the new purchase. ABC can help quantify the “value added” associated with the capital investment. For example, assume that an institution has implemented an ABC system with logic similar to that outlined in Diagram I and has identified Activity #1 as being related to its encoding operation. In addition, assume that the institution’s Items Processing Department (IP) is considering the purchase of a new piece of encoding equipment which costs $400,000. This piece of equipment will allow encoding operators to work faster and will require less technical maintenance. As noted in Diagram II, the ABC data suggest that it costs the institution approximately $0.05 to process/encode an item; on average, the institution processes 12 million items each year. If the items processing department projects that with the new equipment it can reduce its per unit processing cost from $0.05 to $0.04 (a combination of reduced encoding time and reduced machine maintenance expense), then this department should be able to realize a $120,000 annual reduction in operating expenses by purchasing the new equipment. The $120,000 annual figure is the result of multiplying the change in unit cost ($0.05-$0.04) by the annual volume processed (12 million). This figure can then be compared to the initial capital investment of $400,000 to determine the project’s financial impact. In this case it appears the equipment would pay for itself in roughly 3.3 years ($400,000 initial investment/$120,000 savings per year). In addition, continued monitoring of the activity unit cost can provide subsequent review of the investment to ensure that projected expense reductions are being realized.
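The payback arithmetic above is simple enough to capture in a small helper. The function name and signature are illustrative; the inputs are the encoding-equipment figures from Diagram II.

```python
def payback_years(unit_cost_now, unit_cost_new, annual_volume, investment):
    """Annual savings from a unit-cost reduction, and the simple payback
    period for the investment. Names here are illustrative only."""
    annual_savings = (unit_cost_now - unit_cost_new) * annual_volume
    return annual_savings, investment / annual_savings

savings, payback = payback_years(0.05, 0.04, 12_000_000, 400_000)
print(f"Annual savings: ${savings:,.0f}")   # $120,000
print(f"Payback: {payback:.1f} years")      # roughly 3.3 years
```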


A second way that financial institutions can use ABC data is in spending variance analysis. A traditional definition of spending variance, illustrated in Diagram III, is the difference between actual expenses incurred and expected expenses. Actual expenses normally represent the actual volume of a particular item produced multiplied by the actual unit cost to produce it. This figure is comparable to actual expenses booked in a general ledger system. Expected expenses normally represent the actual volume produced multiplied by the standard unit cost to produce it. This standard cost will tend to be static and will most likely have been developed based on some prescribed level of performance. The standard unit cost noted in Diagram III is comparable to the base period activity unit cost mentioned previously.

In the example outlined in Diagram I, valuable information can be gained by analyzing the mapping of expenditures from centers through functions to activities. If an institution works backwards through its cost development logic, it can determine an individual center’s impact on an activity unit cost. In Activity #4 for example (reference Diagram IV), the unit cost of $0.30 can be broken down into its center components by following the total activity expenses back to their original centers. In the diagram it is apparent that $10,000 of activity expense comes from Center #1, while $50,000 worth of expense comes from Center #2. If each of these values is divided by the activity base period volume, the center contribution to activity unit cost can be calculated.

Once the center contribution to activity unit cost is calculated, it can be used to perform a spending variance analysis by center. If in a given period Center #2 actually incurs $70,000 of expense and the institution actually processes 250,000 units of Activity #4, a spending variance can be computed for Center #2 (reference Diagram V). First, the expected center expenses are calculated by multiplying the center contribution to activity unit cost ($0.25) by the actual activity volume (250,000). This calculation yields an expected expense figure of $62,500. This figure is then compared to the actual center expenses of $70,000 to yield a $7,500 unfavorable spending variance ($70,000 – $62,500) for Center #2. The variance is considered “unfavorable” because the center spent more money than expected to produce the 250,000 units of Activity #4. This spending variance methodology can be applied to most centers in an institution, and a database of spending variance figures can easily be developed. Once a database is developed, profitability managers can quickly sort the data to bring centers with large spending variances to the top of the list. These are the centers which should be analyzed in greater detail to determine why their actual expenses exceed the expected expenses as calculated based on volume. This analytical process provides management with a valuable tool for both identifying opportunities for cost reduction and highlighting problem areas where costs may be compromising the institution’s performance.
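The Center #2 variance described above can be computed directly. The function and variable names are illustrative; the figures are from Diagrams IV and V, and the 200,000-unit base period volume is implied by $60,000 of pooled Activity #4 expense at a $0.30 unit cost.

```python
def spending_variance(actual_center_expense, center_activity_cost,
                      base_volume, actual_volume):
    """Center spending variance against an activity-based standard.
    A positive variance means the center spent more than expected."""
    # Center contribution to activity unit cost (Center #2: $50,000 / 200,000)
    contribution = center_activity_cost / base_volume
    expected = contribution * actual_volume      # $0.25 * 250,000 = $62,500
    variance = actual_center_expense - expected  # positive = unfavorable
    return contribution, expected, variance

contrib, expected, var = spending_variance(70_000, 50_000, 200_000, 250_000)
print(f"Contribution: ${contrib:.2f}/unit; expected ${expected:,.0f}; "
      f"variance ${var:,.0f} unfavorable")
```

Run across every center, this calculation yields the spending-variance database described above, ready to be sorted so the largest unfavorable variances surface first.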


A third way in which ABC information can be used is in a traditional break-even analysis. ABC data can be combined with other profitability information to develop break-even analyses which can help managers structure and position products in the marketplace. Assume a financial institution is planning a $1 million promotion for its fixed-rate residential mortgage product. How many new loans does the institution need to generate in order to have the promotion break even in the first year? The institution’s mortgage department is comfortable with the assumption that an average mortgage has a $100,000 balance and will carry a 150 basis point spread (spread represents the difference between interest earned on the loan and the cost to fund the loan). From the ABC system, the institution will know that it costs $800 to originate a new loan and it will cost an additional $200 per loan for maintenance and servicing once the loan has been originated. These two cost components combine to generate a first year total cost for each new loan of $1,000 ($800 origination + $200 servicing). When this costing information is merged with the spread assumptions mentioned above, it becomes apparent that each new loan will provide a $500 contribution (reference Diagram VI).

Once a contribution figure has been established, it can be compared to the planned promotion expenditure to determine a first year break-even point. In this example, the $1 million promotion expense is divided by the $500 contribution per loan to determine that this promotion needs to generate 2,000 loans with an average balance of $100,000 per loan and an average spread of 150 basis points to break even in the first year. This type of information can be extremely valuable to a product manager who is trying to structure such a promotion. Also, once the promotion is launched, the break-even analysis will serve as a benchmark to determine the relative success of the marketing effort. In addition, the analysis can be modified to factor in the financial impact of these new loans over time. In general, a longer time horizon will tend to reduce the number of new loans needed to achieve a targeted break-even point (a one-year horizon was used in this example for simplification).
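The break-even calculation folds the spread income and the ABC cost components into one expression. The helper below is an illustrative sketch using the mortgage-promotion figures from the example (Diagram VI); the function name and parameters are assumptions.

```python
def breakeven_loans(promo_cost, avg_balance, spread_bp,
                    origination_cost, servicing_cost):
    """First-year break-even loan count for a promotion.
    Spread income per loan = balance * basis points / 10,000."""
    spread_income = avg_balance * spread_bp / 10_000   # $100,000 at 150bp = $1,500
    contribution = spread_income - (origination_cost + servicing_cost)  # $500
    return promo_cost / contribution

loans = breakeven_loans(1_000_000, 100_000, 150, 800, 200)
print(f"Loans needed to break even in year one: {loans:,.0f}")  # 2,000
```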


The three analyses outlined in this article are all rather basic; however, their impact is quite profound. By embracing these concepts, a profitability measurement team can actually start to move away from the ABC system development process and begin to utilize the valuable information which has been created. Profitability team members can then take pride in the fact that they are using the information which they have worked so hard to develop and are, in fact, beginning to help their institution make better financial decisions. Finally, team members will also have at least some answers to the question of how to use this new activity-based costing data.

The example data for the customer telephone service center in a mortgage department used in this article is based on G. Y. Yang and R. C. Wu, “Strategic Costing & ABC,” Management Accounting (September 1993): pp. 33-37.

* Information Systems, Brigham Young University, Provo, UT 84602, Telephone: (801) 378-3174, Fax: (801) 378-5933, Internet,

** PROMODEL Corporation

2 See R. Cooper and R. S. Kaplan, “Activity-Based Systems: Measuring the Costs of Resource Usage,” Accounting Horizons (September 1992): pp. 1-13. Also P. B. B. Turney, “What an Activity-Based Cost Model Looks Like,” Journal of Cost Management (Winter 1992): pp. 54-60.

3 ServiceModel(TM) is a product of PROMODEL Corporation (Orem, UT). Other examples of simulation analysis tools include iThink(TM) from High Performance Systems, Inc. and SimProcess(TM) from CACI.

Note: The financial impact of loan loss provisions, taxes, and other items has been excluded from this example to simplify the illustration.



by Monte R. Swain, Ph.D., CPA, CMA* and Bruce Gladwin**

Copyright National Association for Bank Cost & Management Accounting 1998
