
A Lean Balanced Scorecard Using the Delphi Process: Enhancements for Decision Making

Written By

Chuo-Hsuan Lee, Edward J. Lusk and Michael Halperin

Submitted: 19 October 2010 Published: 06 September 2011

DOI: 10.5772/17208


1. Introduction

Kaplan & Norton’s Balanced Scorecard (BSC) first appeared in the Harvard Business Review in 1992 (Kaplan and Norton, 1992). It described in general terms a method by which management may improve the organization’s competitive advantage by broadening the scope of evaluation from the usual Financial dimension to include: the organization’s Customer base, the constitution and functioning of the firm’s Internal Business processes, and the necessity of Innovation and Learning as a condition for growth. Over the years, these four constituent elements of the BSC have remained largely unchanged, with the exception of a modification in 1996 when Innovation and Learning was changed to Learning and Growth (Kaplan and Norton, 1996).

As the BSC is now in its second decade of use, a number of articles have suggested that the BSC is in need of refocusing (Lusk, Halperin & Zhang, 2006; Van der Woerd & Van den Brink, 2004). This refocusing suggests two changes in the Financial Dimension of the BSC. First, the Financial Dimension of the BSC needs to be broadened from measures that only address internal financial performance to those that are more market-oriented. For example, according to the Hackett Group (2004: 67), the majority of Balanced Scorecards are “out of balance because they are overweight with internal financial measures”. Second, there seems to be a tendency to add evaluation variables with little regard for their relationship to existing variables. This “piecemeal approach” results in a lack of coherence and often causes Information Overload. This tendency is underscored by Robert Paladino (2005), former vice president and global leader of the Telecommunications and Utility Practice for Norton’s company, the Balanced Scorecard Collaborative (http://www.bscol.com/), who suggests that a major failure of the BSC is that many organizations adopt a piecemeal approach that generates an information set consisting of too many highly associated variables without independent associational linkages to the firm’s evaluation system. This observation is consistent with one of the classic issues first addressed by Shannon (1948), often labelled the Shannon-Weaver Communication Theory, which in turn led to the concept of Information Overload.

Information Overload is particularly germane because, from the beginning of the Internet Era, management has succumbed to the irresistible temptation to collect too many financial performance variables, given the plethora of data sources offering simple downloads of hundreds of financial performance variables. For example, Standard & Poor’s™ and Bloomberg™, to name just two, offer data on thousands of organisations for hundreds of variables.

In this study, we answer the call for refocusing the Financial Dimension of the BSC. The purpose of this study is twofold. First, as suggested by Lusk, Halperin & Zhang (2006), we add two market-oriented variables, the standard CAPM Market Beta and Tobin’s q, to broaden the Financial Dimension of the BSC. Second, we suggest, in detail, a simple modelling procedure to avoid the “Information Overload and Variable Redundancy” so often found in the Financial Dimension of the BSC. Although this study is focused only on the Financial Dimension of the BSC, the modelling process that we suggest may also be applied to the other dimensions of the BSC.

This study is organized as follows:

  1. First, we suggest a simple procedure for generating a “lean”, i.e., parsimonious, variable characterisation of the firm’s financial profile and then use that lean or consolidated variable set as an input to a standard Delphi process.

  2. Second, we present an empirical study to illustrate the lean BSC modelling system.

  3. Lastly, we offer some concluding remarks as to the use of our refined information set.


2. Our simple procedures leading to parsimony in the BSC

As an overview: as discussed above, the BSC has been criticized as being “too endowed” with financial performance variables due to the convenience of downloading financial variables from the standard databases. Simply put, many of these financial variables merely express the same firm process characterization in a different variable voice. To deal with this “redundancy” issue, we follow the suggestion of Jones (2006), who recommends factor analysis, essentially based upon the standard Harmon (1960) Factor model [SHFM], as the best technique to develop non-overlapping categorizations of impact variables for risk assessment. His model forms the basis of the Information Risk Management module of the Certified Information Security Manager manual (2009, Ch. 2).

The essential idea is that factor analysis reduces the “over-endowed” variable space to its factor equivalent; this will achieve parsimony in the variable space and thus enable firm decision-makers to better understand their processes as characterisations of independent factors. This factor reduction thus addresses the issues of Variable Redundancy and Information Overload discussed above. Consider now how factor analysis and benchmarking can be used to develop the Lean BSC.

2.1. Three variable contexts of factor analysis and the Delphi Process: our lean modelling system

In the process of using factor analysis to develop lean variable set characterizations, we suggest performing factor analyses in three variable contexts: (1) the Industry, (2) a Benchmarked Comparison Organisation and (3) the Particular Firm. These three variable contexts, taken together, help management refocus on variables that may be productively used in planning and executing the navigation of the firm. Specifically, the feedback from these three variable contexts will be organized as a Delphi Process as the information processing logic of the BSC. Consider now the details of the information processing logic for the BSC as organized through the Delphi Process.

2.2. The two-stage Delphi Process in our lean modelling system

Because the principal information link in the Delphi Process is generated by factor analysis, the decision makers [DM] must have a practical understanding of the output of the factor model—i.e., a working familiarity with the output of the factor model is essential to the successful creation of a Lean BSC. Accordingly, we suggest that the Delphi Process be carried out in the following two stages.

2.2.1. Stage 1: the Unfreezing stage – familiarize the decision makers with “factors”

The goal of the Unfreezing stage is to have the DM feel comfortable with the logic of factor analysis as a statistical technique that groups BSC variables into factors. To effect the Unfreezing stage, we recommend using an intuitive example dealing with the simple question: What is a Computer? It is based upon an example used by Paul Green and Donald Tull (1975) in their classic text on marketing research to illustrate the logic of factor analysis; the data are presented in Table 1.

Computers (n = 15)
No. | Basic Processing | Advanced Processing | Minimum Storage | Maximum Non-Swap Storage | Add-Ons Non-Buffer | Cycle Time
1 | -0.28 | -0.36 | -0.49 | -0.52 | -0.48 | -0.27
2 | 3.51 | 3.61 | -0.55 | -0.60 | -0.87 | 3.74
3 | -0.39 | -0.34 | -0.55 | -0.53 | -0.59 | -0.27
4 | -0.06 | -0.28 | -0.55 | -1.07 | -0.83 | -0.26
5 | 0.38 | -0.27 | -0.46 | -0.50 | -0.88 | -0.27
6 | -0.43 | -0.38 | -0.55 | -0.52 | -0.48 | -0.27
7 | -0.26 | 0.37 | -0.55 | -0.52 | -0.59 | -0.27
8 | 0.70 | 0.68 | -0.60 | -0.61 | -0.92 | -0.27
9 | -0.47 | -0.39 | -0.37 | -0.52 | -0.48 | -0.27
10 | -0.28 | -0.23 | -0.02 | -0.14 | -0.77 | -0.27
11 | -0.49 | -0.39 | -0.13 | 0.16 | 1.71 | -0.27
12 | -0.50 | -0.39 | 1.32 | 2.47 | 1.08 | -0.27
13 | -0.51 | -0.39 | 0.36 | 0.16 | 1.70 | -0.27
14 | -0.51 | -0.39 | 3.26 | 2.47 | 1.08 | -0.27
15 | -0.52 | -0.12 | -0.13 | -0.23 | 1.33 | -0.27

Table 1.

The Green and Tull Computer Factor Dataset

To start the Unfreezing process, we suggest presenting to the DM the following information processing task:

Assume that you do not know anything about Computers, and you wish to better understand the essential functioning of Computers. Unfortunately, as you are not gifted “technically,” it is not possible for you to reverse-engineer a computer and tinker with it so as to understand its essential features. So, you are left with an “empirical” or analytic option. Therefore, with the aid of your tech-friends, you collect six performance variables on 15 different models of computers: (1) Basic Processing Speed, (2) Advanced Processing Speed, (3) Minimum Storage, (4) Maximum Non-Swap Storage, (5) Add-Ons Non-Buffer Capacity, and (6) Cycle Time. So now you have a dataset with 15 computers measured on six performance variables—i.e., a matrix of 15 rows and 6 columns, as presented in Table 1.

As we continue with the Unfreezing stage, we distribute to all the DM a copy of the dataset in Table 1 to anchor their processing of the variable reduction. We, the group conducting the Unfreezing, enter the dataset into the Standard Harmon Factor Model [SHFM] statistical program; the results produced are presented in Table 2. We recommend that Table 2 be displayed either (1) on the computers of the DM, (2) on a projection screen for all to see or (3) distributed to the DM as printed copies. They need to be able to see and discuss the information in Table 1 and the factor results as presented in Table 2.

Variables Factor A Factor B
Basic Processing 0.96 -0.22
Advanced Processing 0.98 -0.16
Minimum Storage -0.07 0.94
Maximum Non-Swap Storage -0.07 0.96
Add-on Non Buffer -0.25 0.75
Cycle Time 0.98 -0.06

Table 2.

Two Factor Rotation of the Six Variables of the Computer Dataset

Using the above information and the usual Factor Definition Rule (any variable loading greater than 0.71 is an important variable descriptor of that factor), we discuss with the DM that the six variables really describe just two Factors. Factor A is described predominantly by the variables Basic Processing, Advanced Processing and Cycle Time, all of which have loadings greater than 0.71. These are all measures of Speed. Factor B, on the other hand, is described predominantly by Minimum Storage, Maximum Non-Swap Storage and Add-On Non-Buffer, which are all measures of Storage. The results in Table 2 provide simple and intuitive information which speaks in a straightforward way to the question which started the Unfreezing analysis: What is a Computer? The DM now recognize, and of course probably knew at the outset, that Computers are devices that have Speed of Processing and Storage Capacity as their essential profiling characteristics. Also, they understand, via the comparison of the data in Table 1 and the results in Table 2, that there were not really six variables but rather two: Speed and Storage, with each of them measured in three different ways.

In summary, to wind down the Unfreezing stage, we emphasize that the initial set of six variables was by definition “over-weight” or redundant, and so the six variables were consolidated by the SHFM to form only two dimensions, Speed and Storage, in the lean-version characterization of the dataset.

This simple computer example is critical to understanding the use of factor analysis to reduce variable redundancy and so deal with “over-weight” variable characterizations. We find that this simple and intuitive example unfreezes the DM in the sense that they knew that computers were fast storage computation devices, and this is exactly what the factor analysis shows; this reinforces their belief that factor analysis can be “trusted” to replicate the reality of the associated variable space with fewer variables.
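For DM groups that wish to reproduce the Unfreezing exercise themselves, the following minimal sketch fits a two-factor varimax solution to the Table 1 data. The tooling is our assumption for illustration (Python with numpy and scikit-learn) rather than the SHFM program used to produce Table 2, so the loadings may differ slightly in sign and magnitude; the Speed/Storage grouping, however, should be recognizable.

```python
# A minimal sketch (our illustration; not the SHFM program used for Table 2):
# a two-factor varimax solution for the Green & Tull computer data of Table 1.
import numpy as np
from sklearn.decomposition import FactorAnalysis  # rotation needs sklearn >= 0.24

variables = ["Basic Processing", "Advanced Processing", "Minimum Storage",
             "Maximum Non-Swap Storage", "Add-Ons Non-Buffer", "Cycle Time"]

# The 15 x 6 dataset exactly as presented in Table 1.
X = np.array([
    [-0.28, -0.36, -0.49, -0.52, -0.48, -0.27],
    [ 3.51,  3.61, -0.55, -0.60, -0.87,  3.74],
    [-0.39, -0.34, -0.55, -0.53, -0.59, -0.27],
    [-0.06, -0.28, -0.55, -1.07, -0.83, -0.26],
    [ 0.38, -0.27, -0.46, -0.50, -0.88, -0.27],
    [-0.43, -0.38, -0.55, -0.52, -0.48, -0.27],
    [-0.26,  0.37, -0.55, -0.52, -0.59, -0.27],
    [ 0.70,  0.68, -0.60, -0.61, -0.92, -0.27],
    [-0.47, -0.39, -0.37, -0.52, -0.48, -0.27],
    [-0.28, -0.23, -0.02, -0.14, -0.77, -0.27],
    [-0.49, -0.39, -0.13,  0.16,  1.71, -0.27],
    [-0.50, -0.39,  1.32,  2.47,  1.08, -0.27],
    [-0.51, -0.39,  0.36,  0.16,  1.70, -0.27],
    [-0.51, -0.39,  3.26,  2.47,  1.08, -0.27],
    [-0.52, -0.12, -0.13, -0.23,  1.33, -0.27],
])

# Standardize so the analysis works on the correlation metric.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

fa = FactorAnalysis(n_components=2, rotation="varimax").fit(Z)
loadings = fa.components_.T  # rows = variables, columns = the two factors

# Apply the simple 0.71 rule to flag each factor's meaningful descriptors.
for name, row in zip(variables, loadings):
    hits = [f"Factor {'AB'[i]}" for i, v in enumerate(row) if abs(v) >= 0.71]
    print(f"{name:26s} {row[0]:+5.2f} {row[1]:+5.2f}  {', '.join(hits) or '-'}")
```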

2.2.2. Stage 2: Factor analysis in three contexts and brainstorming

Following the first or Unfreezing stage, the next stage is where we engage the Delphi process. We recommend that the Delphi process be used in its EDI-mode (see Jung-Erceg, Pandza, Armbruster & Dreher, 2007), where the DM discuss, in a Chat-Loop-Context, the various information sets until they are satisfied with their insights and then propose the Action Plan for the firm derived from the Lean-BSC.

Specifically, we recommend that the firm, given its understanding of the Mission, Goals and Objectives, engage the Delphi Process to generate the Lean-BSC using the following steps:

  1. The firm will identify the set of DM who intend to navigate their firm using the BSC.

  2. This group of DM will select the longitudinal panel consisting of (i) a sample of firms from their industry, possibly all, and (ii) a particular firm that could be a positive or negative benchmark—meaning that the DM judge the benchmark to be a firm that they wish to emulate or a firm from which they wish to distance themselves.

  3. Then the DM will select a Comprehensive Variable Set [CVS] consisting of the firm performance variables that they believe best profile their firm. In our experience, the best profiling variables may include those financial performance variables that (i) are simple to measure, (ii) have operational measures that are sensitive indicators of change, and (iii) are themselves considered direct measures of effects relative to the Mission, Goals and Objectives of the firm. This will be important in constructing the necessary reward linkages, which is one of the principal reasons to use the BSC. The CVS, itself, is very likely to be a variable set that the firm has been using in the past and so may be characterized as the “over-weight” variable set. This is not a problem, as the intention at this stage is to incorporate all the variables that the DM feel to be important. We expect that the CVS will be formidably large.

  4. The “over-weight” CVS will be input to the SHFM so as to consolidate the variable set to the factors—i.e., the lean variable set. This process will be repeated for the Industry and the Firm-benchmark respectively (i.e., the other two contexts of factor analysis); see the sketch following this list.

  5. And finally, these three lean variable sets, one each for the Firm, the Industry and the Firm-benchmark, will be sent to the Chat-Loop-Context-Delphi-Space, and the DM will begin the convergent process of deciding the Financial Action Plan that will be integrated into the full BSC evaluation process. This process will lead to the final action plan of the firm considering all four of the BSC dimensions.
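As a sketch of steps 4 and 5 under stated assumptions, the consolidation can be organized as a single routine applied to each of the three contexts. The DataFrames below are hypothetical stand-ins for the three CVS panels (here simulated), and the SHFM is approximated with a varimax-rotated factor analysis using the settings described later in Section 3.3: Pearson correlations, the eigenvalue > 1.0 rule for the number of factors, and √0.5 as the meaningful-loading threshold.

```python
# Sketch of steps 4-5 (our construction): consolidate each context's
# "over-weight" CVS to its lean factor equivalent and collect the three
# lean tables for the Delphi Chat-Loop. Assumptions: pandas DataFrames with
# one column per CVS variable; the SHFM is approximated by scikit-learn's
# varimax-rotated FactorAnalysis.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

def lean_factors(df: pd.DataFrame) -> pd.DataFrame:
    corr = df.corr()                                        # Pearson correlations
    k = int((np.linalg.eigvalsh(corr.values) > 1.0).sum())  # eigenvalue > 1.0 rule
    z = (df - df.mean()) / df.std()                         # correlation metric
    fa = FactorAnalysis(n_components=k, rotation="varimax").fit(z.values)
    lo = pd.DataFrame(fa.components_.T, index=df.columns,
                      columns=[f"Factor {i + 1}" for i in range(k)])
    return lo.where(lo.abs() >= np.sqrt(0.5))               # blank non-meaningful loadings

# Simulated stand-ins for the three hypothetical CVS panels; in practice these
# would hold the Firm's, the Industry's and the Benchmark's variable data.
rng = np.random.default_rng(0)
def demo_panel(rows=80, cols=8):
    return pd.DataFrame(rng.normal(size=(rows, cols)),
                        columns=[f"V{i + 1}" for i in range(cols)])

contexts = {"Firm": demo_panel(), "Industry": demo_panel(), "Benchmark": demo_panel()}
lean_tables = {name: lean_factors(df) for name, df in contexts.items()}  # to the Chat-Loop
```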

To enrich understanding of how these five steps will be used for a particular firm, we will now present a detailed and comprehensive example of all the steps that are needed to create the Lean BSC.

Study Firm: A.D.A.M. Inc.
(ADAM). [http://www.adam.com] The principal activity of A.D.A.M., Inc. is to provide health information services and technology solutions to healthcare organizations, group insurance brokers, employers, consumers, and educational institutions. The products of the Group are used for learning about health, wellness, disease, treatments, alternative medicine, anatomy, nutrition and general medical reference in both the healthcare and education markets. The products contain physician-reviewed text, in-house developed medical graphics and multimedia to create health information that offers a visual learning experience. The Group provides information on annual licensing agreements to healthcare organizations, Internet websites and educational institutions.
Benchmark: Amdocs
(DOX). [http://www.amdocs.com/Site/AmdocsCom.htm] Amdocs, Inc. is a leading provider of customer care, billing and order management systems for communications and Internet services. Amdocs has an unparalleled success record in project delivery of its mission-critical products. With human resources of over 5,900 information systems professionals, Amdocs has an installed base of successful projects with more than 75 service providers throughout the world. In April 2000, Amdocs completed the acquisition of Solect Technology Group Inc., a leading provider of customer care and billing systems for IP providers.

Table 3.

Brief Profiles of the Study Firm and its Benchmark


3. Illustrative example – pre-packaged software: SIC 7372

We next present an illustrative example of the BSC Delphi Factor Procedure [Delphi BSC] using an actual firm, A.D.A.M. Inc., in SIC 7372. It is not our intention to suggest that this selected dataset speaks to actual recommendations drawn from the BSC analysis, as this is clearly the domain of the DM of the firm using the Delphi BSC. We offer this example as a detailed illustration of guidance through the process. To this end, we, the authors, have assumed the role of the DM for A.D.A.M. Inc. and will (1) discuss our reaction to the information that we have generated using the Delphi BSC and (2) show how this may be used to develop the BSC navigation information for the sample firm. Our assuming the roles of decision makers is only to illustrate the possible functioning of the Delphi BSC—i.e., our Lean-BSC navigation recommendations for the actual firm that we have selected are not normative in nature.

3.1. Selection of three lean variable contexts

We assume the role of DM for A.D.A.M. Inc. (Ticker: ADAM; NASDAQ), which is in the software and related devices industry, SIC 7372. The principal activity of our company, A.D.A.M. Inc., is to provide health information services and technology solutions to healthcare organizations, group insurance brokers, employers, consumers, and educational institutions. Our positive benchmark is Amdocs (Ticker: DOX; NYSE), a leading provider of customer care, billing and order management systems for communications and Internet services (see Table 3 above for a brief description of these firms and their URL-links). Our Industry Benchmark includes all SIC 7372 firms that had data reported in COMPUSTAT from 2003 to and including 2006. We selected this time period as it was after Sarbanes-Oxley (2002) and before the lead-up to the sub-prime debacle in 2008, which we view as event partitions of reported firm data. This accrual yielded 411 firms and netted a variable-panel of 1,161 observations.

3.2. Selection of a comprehensive variable set [CVS]

We, the DM for A.D.A.M. Inc., began the Delphi BSC process by selecting from the extensive COMPUSTAT™ menu 15 variables which we believe are useful in portraying the financial profile of market-traded organizations. Further, we have used these variables in characterizing our operating profile in that, over time, they have been instrumental change variables for us and have been used in the evaluation of our firm. These DM-selected variables are presented in Table 4. The seven (7) variables that were not downloaded from COMPUSTAT™, but rather calculated from COMPUSTAT™ variables, are marked with an asterisk (*); the computation of these variables is detailed in Table 5. The other definitions are available from COMPUSTAT™. Finally, we downloaded Beta from CRSP™. Therefore, in total, we have 23 variables as our financial profile, two of which are Tobin’s q and the CAPM β as suggested by Lusk, Halperin & Zhang (2006).

Cash & Short-Term Investments | Net Sales | Diluted EPS before Extraordinary Items | Current Ratio*
Receivables Total | Market Value: Fiscal Year End | Net Income (Loss) | Quick Ratio*
Cash | Number of Common Shares Outstanding | Cash from Operations | Accounts Receivable Turnover*
Current Assets | Net PPE | Depreciation & Amortization | Tobin’s Q*
Current Liabilities | Beta | Gross Margin* | EPS Growth*
Total Assets | Cost of Goods Sold | ROA*

Table 4.

The 23 Judgmental Variables Selected by the Decision-makers of A.D.A.M. Inc.

Effectively, we are saying that these 23 variables would be important in profiling our organization as we play the role of DM for A.D.A.M. Inc.; other DM, as well as other firms, may, and probably will, select other variables. This is a positive feature of the Delphi BSC in that it provides the needed idiosyncratic flexibility in selecting the variables that will be input into the factor analysis, and so constitutes the judgmental factor set critical in the analysis.

3.3. The results of the factor study

All of the results of the factor study reported in Tables 6, 7 and 8 were created using the SHFM with the standard Varimax rotation on the Pearson correlation matrix, as programmed in JMP of the SAS Institute, version 6.0 (see Sall, Lehman & Creighton, 2005). The number of factors selected was the number of factors in the un-rotated factor space—i.e., the correlation matrix—for which the eigenvalue was greater than 1.0. Finally, variable loadings greater than √0.5 were used in the description of the factors. For ease of reading, the variable loadings are presented only to the second decimal. For the industry factor analysis, we excluded A.D.A.M. Inc. and Amdocs, as they are the study firm and the benchmark firm respectively.

Tobin’s q = (A25 × A199 + A130 + A9) / A6
Current Ratio = A4 / A5
Quick Ratio = (A1 + A2) / A5
ROA = (A172 - A19) / A6
EPS Growth = (A57(t) - A57(t-1)) / |A57(t-1)|
Gross Margin = A12 - A41
Accounts Receivable Turnover = A12 / ((A2(t) + A2(t-1)) / 2)
Where:
A1 = Cash and short-term investments
A2 = Receivables
A4 = Current assets
A5 = Current liabilities
A6 = Total assets
A9 = Total long-term liabilities
A12 = Net sales
A19 = Preferred dividends
A25 = Common shares outstanding
A41 = Cost of goods sold
A57 = Diluted earnings per share excluding extraordinary items
A130 = Preferred stock, carrying value
A172 = Net income (loss)
A199 = Common stock price, fiscal year end

Table 5.

Computation of Ratios Based on COMPUSTAT Data
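As a sketch of how the Table 5 ratios might be computed in practice, assume (our assumption, purely for illustration) a pandas DataFrame with one row per fiscal year, in year order, and columns named after the Annual-data items above; the year-subscripted items then become one-period shifts.

```python
# Sketch: computing the Table 5 ratios from a COMPUSTAT-style annual panel.
# Assumption (ours): a pandas DataFrame `d`, one row per fiscal year in order,
# with columns named after the Annual-data items listed in Table 5.
import pandas as pd

def table5_ratios(d: pd.DataFrame) -> pd.DataFrame:
    r = pd.DataFrame(index=d.index)
    r["Tobin's q"] = (d["A25"] * d["A199"] + d["A130"] + d["A9"]) / d["A6"]
    r["Current Ratio"] = d["A4"] / d["A5"]
    r["Quick Ratio"] = (d["A1"] + d["A2"]) / d["A5"]
    r["ROA"] = (d["A172"] - d["A19"]) / d["A6"]
    # Year-over-year items use the prior year's value via shift(1); the first
    # year is therefore NaN.
    r["EPS Growth"] = (d["A57"] - d["A57"].shift(1)) / d["A57"].shift(1).abs()
    r["Gross Margin"] = d["A12"] - d["A41"]
    r["A/R Turnover"] = d["A12"] / ((d["A2"] + d["A2"].shift(1)) / 2)
    return r
```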

3.3.1. The Delphi Process judgmental interpretation of the representative variables for the various factors

We, in the role of DM for A.D.A.M. Inc., were guided by the factor loading results and also by those variables that did not load across the final factors. These latter variables are interesting in that they are independent, in the strong sense, as they did not achieve an association in the rotated space greater than √0.5. We note these non-associative variables in Tables 6, 7 and 8 using Bold-Italics. Therefore, we have two groups of variables that exhaust the Factor/Variable Space: those variables that have loaded on a factor such that the rotated loading is greater than √0.5, and those that did not exhibit such an association. Both groups have guided our interpretation of the Delphi BSC.
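Mechanically, this two-group partition can be read off any loadings table; a small sketch (ours) with two illustrative rows taken from Table 7:

```python
# Sketch (ours): partition a loadings table (variables x factors, as in
# Tables 6-8) into the variables that load on some factor above sqrt(0.5)
# and the independent, non-associative variables that never reach it.
import numpy as np
import pandas as pd

def partition_variables(loadings: pd.DataFrame, cut: float = np.sqrt(0.5)):
    loaded = loadings.abs().ge(cut).any(axis=1)
    return list(loadings.index[loaded]), list(loadings.index[~loaded])

# Two illustrative rows from Table 7 (A.D.A.M. Inc.):
L7 = pd.DataFrame({"Factor 1": [0.98, -0.08], "Factor 2": [0.22, 0.99],
                   "Factor 3": [0.00, -0.09]},
                  index=["Gross Margin", "Net Income (Loss)"])
loaded, independent = partition_variables(L7)
print(loaded, independent)  # here both variables load on some factor
```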

3.3.2. The industry factor profile and its relation to the BSC

The industry factor analysis is presented in Table 6. We remark that both Beta and Tobin’s q do not align in association with any of the COMPUSTAT™ financial performance profile variables. This suggests that relative market volatility and stockholder preference are independent measures for the Pre-Packaged Software industry—in and of itself an interesting result. One strong implication of Table 6 for us as DM for A.D.A.M. is that, insofar as the BSC analysis is concerned, the industry is a mixed portfolio with both disparate hedge and market sub-groupings. See two recent articles that treat these topical relationships in the hedge fund context: Vincent (2008) and Grene (2008).

Variables Factor 1 Factor 2 Factor 3 Factor 4 Factor 5 Factor 6 Factor 7 Factor 8 Factor 9
Beta 0.02 0.00 0.01 0.00 -0.10 0.00 0.03 0.98 0.00
Tobin's Q 0.02 0.11 0.10 0.03 0.00 0.04 0.99 0.03 0.03
Current Ratio 0.00 0.99 0.01 0.00 0.03 0.07 0.06 0.00 0.01
Quick Ratio 0.00 0.99 0.00 0.00 0.04 0.06 0.06 0.00 0.00
ROA 0.11 0.08 0.23 0.20 0.92 0.04 0.00 -0.10 0.15
Gross Margin 0.99 0.00 0.08 0.00 0.04 0.02 0.03 0.00 0.04
A/R Turnover 0.02 0.12 0.05 -0.10 0.03 0.99 0.04 0.00 0.00
EPS Growth 0.01 0.00 0.07 0.97 0.17 -0.10 0.03 0.00 0.12
Cash 0.94 0.00 0.05 0.00 0.04 0.00 0.01 0.02 0.03
Cash & Short-term Investment 0.89 0.05 0.17 0.03 0.02 0.07 0.00 0.04 0.03
Cash from Operations 0.98 0.00 0.05 0.00 0.04 0.02 0.05 0.00 0.06
Receivables Total 0.96 0.00 0.00 0.01 0.04 -0.10 0.00 0.01 0.04
Current Assets 0.96 0.02 0.14 0.02 0.03 0.04 0.00 0.02 0.03
Current Liabilities 0.98 -0.10 0.00 0.00 0.03 0.00 0.00 0.01 0.02
Total Assets 0.98 0.00 0.06 0.01 0.03 0.00 0.00 0.03 0.00
PPE Net 0.94 0.00 0.10 0.01 0.03 0.00 0.00 0.01 -0.10
Net Sales 0.98 0.00 0.09 0.00 0.04 0.04 0.01 0.00 0.04
Depreciation & Amortization 0.90 -0.10 0.20 0.07 0.03 0.02 -0.10 0.02 -0.20
Common Shares Outstanding 0.94 0.00 -0.20 0.00 0.02 0.00 0.04 0.01 0.06
COGS 0.82 0.00 0.09 0.00 0.03 0.08 0.00 0.01 0.05
Net Income (Loss) 0.92 0.00 0.01 0.01 0.05 0.03 0.08 0.00 0.18
Diluted EPS 0.10 0.00 0.48 0.29 0.34 0.00 0.06 0.00 0.73
Market Value: Fiscal Year End 0.22 0.01 0.92 0.06 0.20 0.06 0.11 0.02 0.16

Table 6.

Industry Factor Analysis: Mid-Range Year Randomly Selected 2004

Confirmatory information is also provided by the fact that EPS Growth and Market Value are independent variables with respect to the other factor-defined variables. For this reason, we note: it will be important to understand that A.D.A.M. Inc., our firm, does not have to compete on a market-relative basis—i.e., against the industry as a portfolio. One possible reaction, which the decision-makers may elect, is that it would be useful in a strategic planning context to partition the industry into various profile groupings and then re-start the Delphi BSC using these industry sub-groupings as additional benchmarks. We have opted for the other approach, which is to use the Amdocs benchmark and continue with the Delphi BSC.

3.3.3. The study firm—A.D.A.M., Inc. and the selected benchmark: Amdocs

We will now concentrate on the relative analysis of the study firm and Amdocs, our positive benchmark. One of the underlying assumptions of this comparative analysis is the stability of the factors in the panel. As we have an auto-correlated panel for these two firms, there is no statistical test of stability for the particular factor arrangement that was produced. If one seeks a demonstration of this stability, then a simple bootstrapping test will give the required information; see Tamhane and Dunlop (2000, p. 600). For our data, we conducted a relational test and found the factor arrangement to be stable over the two following partitions: the years 2005 and 2006 compared to 2003 through 2006, which argues for factor stability.
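A sketch of such a bootstrap check, under our assumptions rather than as the exact test of Tamhane and Dunlop: resample the panel rows with replacement, refit the varimax model, align each bootstrap factor to its closest reference factor (factor order and sign are arbitrary across fits), and inspect the dispersion of the loadings.

```python
# Sketch (our construction): a bootstrap check of factor stability for a panel.
import numpy as np
from sklearn.decomposition import FactorAnalysis

def fit_loadings(X: np.ndarray, k: int) -> np.ndarray:
    z = (X - X.mean(axis=0)) / X.std(axis=0)       # correlation metric
    fa = FactorAnalysis(n_components=k, rotation="varimax").fit(z)
    return fa.components_.T                        # variables x factors

def bootstrap_loadings(X: np.ndarray, k: int, B: int = 200, seed: int = 0):
    rng = np.random.default_rng(seed)
    ref = fit_loadings(X, k)
    draws = []
    for _ in range(B):
        Lb = fit_loadings(X[rng.integers(0, len(X), size=len(X))], k)
        aligned = np.empty_like(ref)
        free = list(range(k))                      # greedy match on |cosine|
        for j in range(k):
            cos = [ref[:, j] @ Lb[:, m]
                   / (np.linalg.norm(ref[:, j]) * np.linalg.norm(Lb[:, m]) + 1e-12)
                   for m in free]
            best = int(np.argmax(np.abs(cos)))
            aligned[:, j] = np.sign(cos[best]) * Lb[:, free.pop(best)]
        draws.append(aligned)
    return ref, np.std(draws, axis=0)              # small std => stable loadings

# Demo on synthetic data with a genuine two-factor structure (it stands in for
# the 2003-2006 two-firm panel, which we cannot reproduce here):
rng = np.random.default_rng(1)
F = rng.normal(size=(40, 2))
X = F @ rng.normal(size=(2, 5)) + 0.3 * rng.normal(size=(40, 5))
ref, spread = bootstrap_loadings(X, k=2, B=50)
print(np.round(spread, 2))
```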

Variables Factor 1 Factor 2 Factor 3
Beta -0.81 0.55 0.20
Tobin's Q -0.59 0.75 0.30
Current Ratio -0.82 0.52 -0.24
Quick Ratio -0.84 0.49 -0.23
ROA -0.42 0.91 -0.03
Gross Margin 0.98 0.22 0.00
A/R Turnover 0.88 -0.19 -0.43
EPS Growth -0.88 0.47 0.00
Cash 0.88 -0.40 -0.26
Cash & Short-term Investment 0.42 0.91 -0.02
Cash from Operations 0.11 0.86 0.50
Receivables Total 0.92 0.11 0.38
Current Assets 0.65 0.75 0.06
Current Liabilities 0.98 0.16 0.12
Total Assets 0.98 0.20 -0.01
PPE Net 0.99 0.10 -0.05
Net Sales 0.98 0.22 -0.01
Depreciation & Amortization -0.54 -0.84 -0.10
Common Shares Outstanding 0.92 0.40 0.02
COGS 0.97 0.22 -0.03
Net Income(Loss) -0.08 0.99 -0.09
Diluted EPS -0.13 0.99 -0.07
Market Value-Fiscal Year End 0.28 0.96 0.09

Table 7.

Factor Analysis: A.D.A.M., Inc. [ADAM]

As part of the Delphi process, as it relates to the interchange of information among the DM needed to create the BSC navigation, we, the authors, exchanged our ideas as to the interpretation of the information generated by the factor analyses presented in Tables 6, 7 and 8, by email and also in face-to-face meetings. This was done over a two-week period; after that time we felt that we had closure and developed the following two Observations: one using the information in Table 7 above for A.D.A.M. and one using the information in Table 8 following for Amdocs. Finally, from these two Observations we developed the Navigation information for A.D.A.M. Inc. Consider these Observations and the related Navigation Imperatives next.

Observation I: Relative to A.D.A.M. Inc.

For our organization, A.D.A.M. Inc., in the first column of Table 7 (i.e., Factor 1) we note that Beta is inversely associated with the direct—i.e., not computed—balance sheet variables representing the resource configuration of the firm, such as Total Assets, Net PPE, and Total Receivables. This suggests that more resources are associated with lower market volatility. Further, we observe a positive association between Gross Margin (i.e., Net Sales less COGS) and certain balance sheet variables such as Total Assets and Liabilities, but we do not observe any relationship between Reported Net Income [RNI] and these balance sheet variables, implying that although our resource configuration effort is influencing the gross margin, it is not aligned with our bottom-line RNI improvement. It is also interesting to note that Factor 1 and Beta are not associated with Factor 2, which seems to be best characterized as the growth dimension of A.D.A.M. Inc. For the second factor, our RNI movement is consistent with our cash management (Cash & Short-term Investments) and asset management (ROA) as well as our potential to grow as measured by Tobin’s q.

Variables Factor 1 Factor 2 Factor 3
Beta -0.94 -0.22 0.25
Tobin's Q 0.99 -0.07 0.16
Current Ratio -0.05 -0.99 -0.09
Quick Ratio -0.20 -0.98 -0.10
ROA 0.84 -0.50 -0.24
Gross Margin 0.98 0.05 0.17
A/R Turnover 0.30 -0.95 0.09
EPS Growth -0.73 0.67 -0.09
Cash -0.48 0.69 -0.54
Cash & Short-term Investment -0.96 -0.02 -0.28
Cash from Operations 0.71 0.70 0.07
Receivables Total 0.97 0.08 0.24
Current Assets 0.34 0.88 -0.32
Current Liabilities 0.03 0.99 0.09
Total Assets 0.91 0.34 0.23
PPE Net 0.37 0.81 0.45
Net Sales 0.99 0.01 0.16
Depreciation & Amortization 0.62 0.30 0.73
Common Shares Outstanding -0.48 0.86 0.17
COGS 0.99 -0.01 0.16
Net Income(Loss) 0.98 -0.20 -0.01
Diluted EPS 0.97 -0.23 -0.01
Market Value-Fiscal Year End 0.96 0.19 0.21

Table 8.

Factor Analysis: Amdocs, Inc. [DOX]

Observation II: Relative to Amdocs, Inc.

In comparison, Table 8 presents a very different profile of Amdocs, Inc., our positive benchmark. The first factor of Table 8 has many of the same resource configuration aspects, except that both Beta and Tobin’s q are featured in Factor 1, as is RNI! This strongly indicates that the resource configuration is aligned linearly with income-producing potential. In this way, Amdocs, our positive benchmark, seems to have worked out the asset employment relation to RNI that is not found for our firm, A.D.A.M. Inc. Further, Beta is inversely associated with the “income machine” as reflected by Tobin’s q, RNI and Net Sales for Amdocs, meaning that the more assets there are, the more they are converted linearly to Reported Net Income, and this has a dampening effect on market volatility. This certainly is a positive/desirable profile and is probably why Tobin’s q is positively associated with it; we do not see this for A.D.A.M. Inc., as Tobin’s q is only associated with growth and is invariant to Beta and the resources employed, as discussed above.

3.3.4. BSC implications: input to the navigation plan

Navigation Imperative I

The benchmarking with Amdocs, Inc. suggests that if we, A.D.A.M. Inc., value the performance configuration of Amdocs Inc. as profiled in Table 8, then we should endeavour to align our resource employment with Reported Net Income (RNI) generation. Based on the benchmarking results, we seem to be successful in utilizing resources to improve Gross Margin but fail to make our resource configuration deliver with respect to the bottom line, Reported Net Income [RNI], as we observed for Amdocs Inc., our positive benchmark.

To align our resource employment with Net Income, we need to examine our policies as they relate to project acceptance and management to make sure that we effectively manage our resources to achieve a higher net income, leading to a higher return on assets (ROA). Accordingly, the principal Financial Profiling action we need to take is to: Attend to ROA. This objective can be accomplished in a variety of ways—either by generating abnormally high RNI while only moderately increasing the asset-generating base, or by achieving average or below-average RNI with a highly efficient asset configuration—i.e., a reduction in the asset base.

For example, from the A.D.A.M. Inc. website (http://www.adam.com/) we learn that we have developed:

“The A.D.A.M. Symptom Navigator for the iPhone is an innovative web application designed and developed specifically for the iPhone, and runs using Apple’s Safari browser. This unique tool allows consumers to quickly and easily find important information about health symptoms, all with just a few taps on their iPhone. With a home screen icon branded specifically to your organization, the application also allows your facility to be seen as a leading provider of health information and services among today’s growing market of mobile users.”

Let us assume that there were two projects with equal expected profitability—i.e., RNI. However, the iPhone Symptom Navigator has double the ROA compared to, let us assume, a PDA-Telemetry Download System—i.e., Project B. In this case, given the Delphi BSC navigation information generated, the DM would prefer the iPhone project on the basis of ROA.
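With purely hypothetical figures (ours, for illustration only), the comparison reduces to simple arithmetic:

```python
# Purely hypothetical figures (ours): two projects with equal expected RNI
# but different asset bases, compared on ROA = RNI / assets employed.
projects = {
    "iPhone Symptom Navigator (Project A)": {"rni": 1.0e6, "assets": 4.0e6},
    "PDA-Telemetry Download (Project B)":   {"rni": 1.0e6, "assets": 8.0e6},
}
for name, p in projects.items():
    print(f"{name:38s} ROA = {p['rni'] / p['assets']:.1%}")
# Equal RNI on half the asset base doubles the ROA: 25.0% vs 12.5%,
# so the Delphi BSC information favours Project A.
```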

Navigation Imperative II

Our Tobin’s q, which acts as a proxy for the growth potential perceived by market participants, is aligned with ROA and RNI but not with our resource configuration variables such as Total Assets, Receivables and Net Sales. In other words, if we cannot use our resources in a way that boosts our RNI and ROA, we will also fail to improve our growth profile in the eyes of the market participants. Accordingly, this missing connection between our resource configuration variables (e.g., our assets) and our Tobin’s q (i.e., our growth potential) indicates that the principal Market Profiling action we need to take is to: Manage our Growth Profile While Attending to ROA. This action requires us to (1) consider new investment projects that will not only improve short-term financial performance, such as ROA, but also offer strategic profiling information for informing the market, and (2) pay attention to the impact of the new investment projects on our recorded book value as we consider their returns.

To continue with the simple two-project example: if two projects are the same in terms of profitability (e.g., ROA), we should consider the project that will offer the richer strategic opportunities. Also, if these two projects permit various asset contracting possibilities, such as Operating Leases, Capital Leasing, or Purchasing, and there are relatively wide ranges for Useful Economic Life, Resale Market Valuation, as well as the methods of depreciating the asset, then we should consider the project that will have the more favorable impact on our recorded book value, given the same performance in profitability.


4. Conclusion

To “close the loop”, we wish to note that the Financial Dimension information developed from the Delphi BSC or Lean Modeling approach will be included in the BSC along with information on the other three dimensions of the BSC: Customers, Internal Processes, and Learning and Growth. These BSC dimensional encodings are the input to the firm’s Decision Support System [DSS] needed to develop priority information for the various projects that the firm may be considering—i.e., project prioritization is the fundamental reason that firms use the BSC or budgeting models. This is to say that the BSC information is “intermediate” information, in that it will be used to characterize the projects that are the planned future of the firm. The final task in the Delphi BSC process, then, is selecting from the BSC-encoded projects the actual projects to be funded by the firm.

In this regard, to conclude our research report, we wish to note a simple way that one may “prioritize” the navigation information from all four dimensions of the BSC as they are encoded in the projects. In our simple illustrative example, where we focused on the Financial Dimension, we proposed that there were two projects, and the iPhone Project dominated on both of the criteria variables: ROA and Growth Profile Management. It is more likely that there will be multiple criteria resulting from the Delphi BSC, and that the preference weights considering all four of the BSC dimensions will be distributed over the various projects so that there is no clear dominance. Determining the actual project-action plan from the output of the Lean-version of the Delphi BSC is indeed a daunting task, even with a relatively small number of performance criteria. In this regard, it is necessary for the firm to select a method to determine the final set of projects in which the firm will invest so as to satisfy the overall BSC navigational imperatives for the firm.

There are many such preference or priority processing methods. Based upon our consulting work, we prefer the Expert Choice Model™ [http://www.expertchoice.com/] developed by Saaty (1980, 1990). We find this Alternative/Project Ranking methodology to be the simplest, the easiest to communicate to the DM, and the most researched Ranking Model according to the average number of annual citations in the literature. As a disclosure note: we have no financial interest in Expert Choice Inc., the firm that developed the application software for the Expert Choice Model™.
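For readers unfamiliar with the method, the Expert Choice Model™ rests on Saaty’s Analytic Hierarchy Process, which derives priority weights as the normalized principal eigenvector of a pairwise-comparison matrix. A minimal sketch with hypothetical judgments (ours, not drawn from the chapter’s example):

```python
# Minimal AHP sketch (hypothetical judgments, ours): derive priority weights
# for three BSC criteria from a Saaty pairwise-comparison matrix as the
# normalized principal eigenvector.
import numpy as np

criteria = ["ROA", "Growth Profile", "Customer Dimension"]
# A[i, j] = how much criterion i is preferred to j on Saaty's 1-9 scale;
# reciprocal by construction.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
v = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = v / v.sum()                      # normalized principal eigenvector

# Consistency ratio: CR < 0.10 is the usual acceptability rule
# (CI = (lambda_max - n) / (n - 1); RI = 0.58 is Saaty's random index, n = 3).
lam = float(np.max(np.real(eigvals)))
CR = ((lam - 3) / 2) / 0.58
for c, w in zip(criteria, weights):
    print(f"{c:20s} weight = {w:.3f}")
print(f"consistency ratio = {CR:.3f}")
```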


Acknowledgments

We wish to thank the following individuals for their detailed and constructive comments on previous versions of the paper: the participants in Technical Session 9 of the 2010 Global Accounting and Organizational Change conference at Babson College, Babson Park, MA, USA, in particular Professor Ruzita Jusoh, University of Malaya, Malaysia; the participants at the 2009 National Meeting of the Decision Sciences Institute, New Orleans, LA, USA, in particular Professor Sorensen, University of Denver; and additionally Professor Neuhauser of the Department of Finance, Lamar University, Beaumont, Texas, USA, and Professor Razvan Pascalau of the Department of Finance and Economics, SUNY Plattsburgh, Plattsburgh, NY, USA.

References

  1. Certified Information Security Manager (2009). CISM Review Manual, ISACA, Rolling Meadows, IL.
  2. Green, P. E. & Tull, D. S. (1975). Research for Marketing Decisions, Prentice Hall, Upper Saddlewood, NJ.
  3. Grene, S. (December 2008). Clarity to Top Agenda. London: Financial Times, 01.12.2010.
  4. Hackett Company Report (2004). Balanced Scorecards: Are their 15 Minutes of Fame Over? The Hackett Group Press, Report AGF/213.
  5. Harmon, H. H. (1960). Modern Factor Analysis, University of Chicago Press, Chicago, IL.
  6. Jung-Erceg, P., Pandza, K., Armbruster, A. & Dreher, C. (2007). Absorptive Capacity in European Manufacturing: A Delphi Study. Industrial Management + Data Systems, (1), 37-51.
  7. Jones, J. A. (2006). Introduction to Factor Analysis of Information Risk. Risk Management Insight [online]. Available from: http://www.riskmanagementinsight.com/media/documents/FAIR_Introduction.pdf
  8. Kaplan, R. & Norton, D. (1992). The Balanced Scorecard: Measures that Drive Performance. Harvard Business Review, 83(7), 71-79.
  9. Kaplan, R. & Norton, D. (1996). Linking the Balanced Scorecard to Strategy. California Management Review, 39(1), 53-79.
  10. Lusk, E. J., Halperin, M. & Zhang, B.-D. (2006). The Balanced Scorecard: Suggestions for Rebalancing. Problems and Perspectives in Management, 4(3), 100-114.
  11. Paladino, R. (2005). Balanced Forecasts Drive Value. Strategic Finance, 86(7), 36-42.
  12. Saaty, T. L. (1980). The Analytic Hierarchical Process, McGraw-Hill, New York, NY.
  13. Saaty, T. L. (1990). How to Make a Decision: The Analytic Hierarchy Process. European Journal of Operational Research, 48(1), 9-26.
  14. Sall, J., Lehman, A. & Creighton, L. (2005). JMP™ Start Statistics, Version 5, 2nd Ed., Duxbury Press, Belmont, CA.
  15. Shannon, C. (1948). A Mathematical Theory of Communication. Bell System Technical Journal, 27 (July and October), 379-423. Available from: http://plan9.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf
  16. Tamhane, A. C. & Dunlop, D. D. (2000). Statistics and Data Analysis, Prentice Hall, Upper Saddle River, NJ.
  17. Van der Woerd, F. & Van den Brink, T. (2004). Feasibility of a Responsive Scorecard: A Pilot Study. Journal of Business Ethics, (2), 173-187.
  18. Vincent, M. (2008). Ivy League Asset Allocation Excites Wall Street. London: Financial Times, 8 April, p. 9.

Notes

  • The above lean modelling framework and benchmarking procedure was used by one of the authors as a member of the Busch Center of the Wharton School of the University of Pennsylvania, the consulting arm of the Department of Social Systems Science, as a way to focus decision-makers’ attention on differences between their organisation and (1) the related industry as well as (2) a selected firm in the industry used as a positive benchmark—that is, a best practices case that the study firm would like to emulate. Sometimes, a negative benchmark was also used—in this case an organisation from which the study firm wished to distance itself.
  • We do not discuss with the DM the √0.5 loading rule because this concept requires a relatively sophisticated understanding of the mathematical statistics of factor models. Rather, we use the simple loading rule that if a variable has a weight of 0.71 or greater, then that variable is a meaningful descriptor of that factor. In our experience, this level of detail is sufficient for the DM to use the results of the Factor Model. This does assume, of course, that there will be someone in the firm with sufficient training in statistical modeling to actually use the factor software that is to be used in the Delphi process. This is usually the case.
