Advice on performance measurement questions.

What information should be included in internal management reports?

We believe that what data to include varies with the organisation and the situation. Our approach is to consider the following:

  1. The yardstick: the current year's budget, the current year's target, or last year's actuals, at the user's choice, as the basis for comparing current year data;
  2. Current year actual data: actual financial data is taken from monthly trial balances for the current year; actual operations data is selected by the user, with suggestions supplied for different stakeholders;
  3. Current year forecast data: forecasts of financial and operations data are generated automatically, updating for each new set of actual data input by users;
  4. Next year forecast data: forecasts of financial and operations data are generated automatically.
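
One way such an automatically updating forecast in items 3 and 4 might behave is sketched below. The linear-trend projection and the figures are illustrative assumptions only, not the MAF forecasting method:

```python
# Illustrative sketch: a naive rolling full-year forecast that is re-estimated
# each time a new month of actual data is entered. The linear-trend projection
# is an assumption made for illustration only.

def full_year_forecast(actuals_to_date, months_in_year=12):
    """Combine actual months to date with a simple trend projection for the
    remaining months to give a revised full-year figure."""
    n = len(actuals_to_date)
    if n == 0:
        raise ValueError("at least one month of actual data is required")
    # average month-on-month change observed so far (zero if only one month)
    trend = (actuals_to_date[-1] - actuals_to_date[0]) / (n - 1) if n > 1 else 0.0
    projected = [actuals_to_date[-1] + trend * k for k in range(1, months_in_year - n + 1)]
    return sum(actuals_to_date) + sum(projected)

# Each new month of actual data automatically updates the full-year forecast.
print(full_year_forecast([100, 104]))        # after two months of actuals
print(full_year_forecast([100, 104, 109]))   # revised after the third month
```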

Why use spreadsheets, rather than a tested package, to develop reports?

Answers to the following criticisms of spreadsheets are set out below:

  1. Criticism 1: "Spreadsheets may incur mechanical errors, particularly in data entry, with severe consequential errors"; Countermeasures: Data is routinely and automatically checked on entry, both for arithmetical accuracy and for reasonableness in magnitude, and the tolerance of the checks may be set by the user (a simplified sketch of such checks appears after this list). A significant advantage of these MAF spreadsheets over packaged programs is that they are transparent: worked data are shown, and input may be traced through to the final reports. It is probably impossible to check in advance for every situation that may arise in different organisations across a huge range of data; but this software does highlight detected errors, which can then be corrected by the user or referred to the supplier.
  2. Criticism 2: "Spreadsheets include logic errors due to inappropriate algorithms being used"; Countermeasures: The spreadsheets follow a template structure using standard formulae tested in trials before release. The purpose of each sheet is summarised at the top of the sheet, together with instructions, where appropriate, on how to input data. Assumptions for the entire spreadsheet system are held on a separate sheet, with data acceptance limits for sensitivity analysis work.
  3. Criticism 3: "Spreadsheets are subject to omission errors, with some components of a model omitted completely"; Countermeasures: The strong internal controls and checks inherent in the structure of these spreadsheets guard against omission of data. Data on every sheet is subject to both vertical and horizontal checks. If errors are detected in any sheet, users will be unable to read subsequent sheets, and subsequent reports will not print.
  4. Criticism 4: "Spreadsheets often contain macros, requiring programmers to interpret and amend them, and exposing the organisation to viruses"; Countermeasures: No macros are used in these spreadsheets, which rely instead on a systematic structure, templates and testing.
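
One way checks of the kinds described under Criticisms 1 and 3 might look, in a highly simplified form (the tolerance values, figures and function names are illustrative assumptions, not the MAF implementation):

```python
# Illustrative sketch of an entry "reasonableness" check and a vertical/
# horizontal cross-foot check, of the kind described above.

def check_reasonableness(value, prior_value, max_ratio=3.0):
    """Flag an entry that differs from the prior period by more than a
    user-set multiple (a simple magnitude test on data entry)."""
    if prior_value and abs(value) > max_ratio * abs(prior_value):
        return f"WARNING: {value} is more than {max_ratio} times the prior value {prior_value}"
    return "OK"

def cross_foot(rows, tolerance=0.01):
    """Vertical/horizontal check: the sum of row totals must agree with the
    sum of column totals within a small rounding tolerance."""
    row_totals = [sum(row) for row in rows]
    col_totals = [sum(col) for col in zip(*rows)]
    if abs(sum(row_totals) - sum(col_totals)) > tolerance:
        raise ValueError("cross-foot check failed: rows and columns do not reconcile")
    return sum(row_totals)

print(check_reasonableness(value=950_000, prior_value=90_000))  # flagged for review
print(cross_foot([[100, 200], [300, 400]]))                     # 1000
```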

A range of criticisms could equally be levelled at packaged software. Packaged software is not magic, and it is subject to the same kinds of errors as spreadsheets, with the important difference that the errors are not obvious to the user. Wikipedia cites a 2002 study, commissioned by the US Department of Commerce's National Institute of Standards and Technology, which concluded that software bugs, or errors, are so prevalent and so detrimental that they cost the US economy an estimated $59 billion annually, or about 0.6 per cent of gross domestic product. How much packaged software is marketed with a label indicating that an independent firm is willing to take responsibility for auditing and testing it? Packaged software may be inadequate without specialised programming support, and inflexible for the end user to change to allow for unique organisation features, or for changes in requirements from one year to the next. Packaged software reports may be output from a black-box program, with limited means of checking the source and processing of data. Packaged software reports may omit treatment of non-financial data, given the extraordinary variety of data that might be included for different organisations. Who wants to keep running back to a programmer every time a change in reports is required? If packaged software were so good, spreadsheets would not be so popular.

What is the best response to executive pressure for ambitious forecasts in connection with a specific decision?

Our view is that the person providing management reports has a professional responsibility to present a balanced view to organisation executives, and to present alternative scenarios on occasion.

An example is a manufacturing operation, where an executive asks for approval to purchase new equipment that he believes will result in greatly increased sales. The executive is very confident that use of the new equipment will result in sales "really taking off". He relies on his knowledge and experience of the customers and competitors, and has no objective evidence. The executive believes that papers to go to the Board of Directors, to support the decision to buy the new equipment, should include forecasts based on purchase of the new equipment, and a high growth rate in sales.

We believe that the Board needs to get a balanced view, and to have alternatives, including a "do nothing" option. It is also important that whatever forecasts are put to the Board have the key assumptions clearly stated.

A preferable approach in this situation in our view is to provide executives with a comparison of three scenarios, each of which is possible, and clearly label the key assumptions for the forecasts in each scenario:

  1. Expansion Scenario.
  2. Modified Steady State Scenario.
  3. Stagnation Scenario.

What policy on ownership of data should be followed in routine internal management reports?

Our advice is that the person supplying internal management reports has a professional responsibility to question the data in the reports, and to be comfortable that the reports are realistic and complete, as far as can reasonably be assessed. This entails asking questions about the data supplied, checking its consistency with other data available within and outside the organisation, and commenting in writing to recipients on the sources of the data, on significant assumptions, and on weaknesses in internal control. The data in internal management reports cannot be guaranteed to be accurate, but it can and should be questioned by the supplier of the reports before it is passed on to the executives who will rely on it, and time needs to be allowed for this. As for forecasts included in routine management reports, it is important that the supplier of the reports is comfortable that the forecasts are realistic, understands how they are made if they are not prepared by the supplier, and includes the key assumptions of the forecasts as a note on the reports.

How do you appraise the accuracy of forecasting?

Statisticians often use R-squared as a measure of goodness of fit. Geoff suggests instead a measure called the Absolute Percentage Error (APE), defined as follows:

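In terms of the actual value $A_i$ and the corresponding yardstick value $F_i$ (budget, target or prior forecast) for month $i$, one way of writing this measure that is consistent with the description below is:

$$\text{APE} = \frac{\sum_i \lvert A_i - F_i \rvert}{\sum_i \lvert F_i \rvert} \times 100\%$$
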
This gives a result that is more meaningful than R-squared for the average person in business. For example, most people in business will readily comprehend a statement that, on average for the year to 30 June 2002, actual monthly profit was within 10% of monthly budgets. In this case the denominator in the fraction above would be the sum of the absolute values of monthly budget profits.
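
A minimal calculation along these lines, with invented monthly figures purely for illustration:

```python
# Illustrative sketch: APE of actual monthly profit against monthly budget
# profit. The monthly figures are invented for the example.

def absolute_percentage_error(actuals, budgets):
    """Sum of absolute monthly differences, expressed as a percentage of the
    sum of the absolute monthly budget values."""
    total_error = sum(abs(a - b) for a, b in zip(actuals, budgets))
    total_budget = sum(abs(b) for b in budgets)
    return 100.0 * total_error / total_budget

budget_profit = [50, 60, 55, 70]   # four months of budgeted profit
actual_profit = [48, 66, 50, 72]   # the corresponding actual profit
print(f"APE = {absolute_percentage_error(actual_profit, budget_profit):.1f}%")  # APE = 6.4%
```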

How can the MAF approach improve forecasting accuracy?

One important way in which MAF may improve forecasting accuracy is through disaggregation of data. Spreadsheets are ideal in providing the flexibility required for disaggregation. An obvious example is forecasting cash at bank: forecasting accuracy will usually be improved if separate trends in sales collections and overhead expenses are analysed and then forecast, rather than trying to predict cash at bank directly from past cash-at-bank figures.
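
As a rough illustration of the point about disaggregation (the figures and the simple average-growth projection are assumptions made up for the example):

```python
# Illustrative sketch: forecast cash at bank by projecting its drivers
# (sales collections and overhead payments) separately, rather than
# extrapolating the cash balance itself.

def project_next(series):
    """Project the next value of a series from its average past growth."""
    growth = sum(b - a for a, b in zip(series, series[1:])) / (len(series) - 1)
    return series[-1] + growth

collections  = [120, 124, 130, 133]   # monthly sales collections to date
overheads    = [80, 81, 83, 84]       # monthly overhead payments to date
opening_cash = 45

# Disaggregated forecast: project each driver separately, then combine.
next_cash = opening_cash + project_next(collections) - project_next(overheads)
print(f"forecast cash at bank next month: {next_cash:.1f}")
```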

Another example is forecasting profit: forecasts based on both operational and financial data are more accurate than forecasts that simply extrapolate past profit figures. Look at the numbers of items sold, the numbers of people employed, and the year-end financial journal entries in the closing and opening months of each year. A further example is forecasting sales: forecasting accuracy may be improved if sales are related to general economic trends, company pricing data, market growth data, and customer returns, rather than trying to forecast sales as a simple regression on a single series.
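
A driver-based sales forecast of the kind described might, in a very simplified form, look like the following (the driver series, figures and least-squares fit are illustrative assumptions only):

```python
# Illustrative sketch: relate sales to several drivers (an economic index,
# relative price and market growth) instead of regressing sales on a single
# past series. All figures are invented for the example.
import numpy as np

# Six quarters of history: [economic index, relative price, market growth %]
drivers = np.array([
    [101.0, 1.00, 2.0],
    [102.5, 0.99, 2.2],
    [104.0, 0.97, 2.5],
    [105.5, 0.96, 2.8],
    [107.0, 0.95, 3.0],
    [108.5, 0.94, 3.2],
])
sales = np.array([200.0, 206.0, 213.0, 221.0, 228.0, 236.0])

# Fit sales ~ intercept + drivers by ordinary least squares.
X = np.column_stack([np.ones(len(drivers)), drivers])
coefficients, *_ = np.linalg.lstsq(X, sales, rcond=None)

# Forecast the next quarter from assumed driver values.
next_quarter = np.array([1.0, 110.0, 0.93, 3.4])
print(f"driver-based sales forecast: {next_quarter @ coefficients:.1f}")
```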

What accuracy in forecasting may be expected?

Measuring forecasting error as the absolute difference between the year-end result and the forecast result, the following may be expected:

  1. Accuracy will vary with volatility of the series being studied. It is more important to steadily reduce forecasting error, and to learn from one's mistakes, than to judge forecasting success against arbitrary standards.
  2. Forecast annual results will, hopefully, converge to the final actual result as the year progresses. For example, an 11% error in a forecast of sales for the full year, made at the beginning of the year, reduced to a 3% error when the forecast was made half way through the year.
  3. One cannot expect to predict System Shocks, associated with unexpected shake-ups to the economic or political system.

Who can best set up and run internal management reporting for an organisation?

Kaplan and Norton noted in their book "The Balanced Scorecard" 1996 on page 290, in discussing ownership of the strategic management system: "Most organizations today have a leadership void for this system. No executive in a traditional organization has the responsibility or perspective to manage a strategic management process, and it is unclear who should assume this responsibility."

The best answer seems to be "horses for courses": each organisation is best placed to assess who should (a) set up the internal management reporting system, and (b) operate the system and supply reports to directors and executive management. (a) and (b) need not be the same person. Candidates to consider would be the CEO, the CFO, the CIO, the company secretary, the corporate planner or the economist.



Glossary:

Budget: An operating and financial plan showing what resources an organization has decided to use, where it plans to get them, where and how it plans to use them, and what it expects to accomplish during this specific period. Typically, it is the first year of the long-range plan. (Shillinglaw: "Managerial Cost Accounting" 1982 p5.).

Target: A goal for an organization, often to drive and inspire change in the organization. It requires total commitment to be achieved. (Kaplan & Norton: "The Strategy-Focused Organization" p335)

Forecast: A prediction of values of a variable, based on known past values of that variable or other related variables, or on expert judgements based in turn on historical data and experience. (Makridakis, Wheelwright, McGee: "Forecasting: Methods and Applications" 1983 p899)

The emphasis in the MAF reports is on expectation and plan: the budget is not seen as a target or a pressure device. It is also assumed that the budget/plan was approved at or prior to the beginning of the year, so that the data can be used for control comparisons. Report comments concentrate on differences between actual and budget data for the current month, rather than between actual and budget data for the year to date.

Monthly budgets or targets are assumed, although reports may also be adapted to quarterly reporting. MAF reports typically update forecasts every month, or other time period as appropriate, based on all the latest available actual data. (In contrast, the budget usually remains fixed for the accounting year.)