Open access peer-reviewed chapter

Situational Incompetence: An Investigation into the Causes of Failure of a Large-Scale IT Project

Written By

Darryl Carlton

Submitted: 26 February 2018 Reviewed: 27 March 2018 Published: 05 November 2018

DOI: 10.5772/intechopen.76791

From the Edited Volume

Dark Sides of Organizational Behavior and Leadership

Edited by Maria Fors Brandebo and Aida Alvinius


Abstract

Information technology (IT) projects in the government (public) sector experience significant challenges. Despite decades of research, the adoption of formal methods, and the use of external suppliers and packaged software, these remediation attempts do not appear to have reduced or mitigated the problems faced when the public sector undertakes large IT projects. Previous studies have examined the causes of IT project failure, focusing in particular on factor analysis. A relatively limited number of studies have investigated the contribution of IT competence, and even fewer have considered the role and contribution of non-IT executives in IT project outcomes. This study sought a deeper understanding of what drives the behaviour of large-scale IT projects, and it identifies a lack of technical competence and narcissistic leadership as drivers of poor project outcomes.

Keywords

  • IT project failure
  • public sector waste
  • failed projects
  • governance
  • project management
  • critical success factors
  • situational incompetence

1. Introduction

‘There are many ways to make large software systems fail. There are only a few ways of making them succeed.’

Capers Jones (2004)

The primary question of this research is why. Why, despite all of the accumulated experience, the research, the training, the attention of consultants and software companies, and the billions of dollars expended, do IT projects continue to fail? Despite a significant body of research into the contributory factors (reasons) for these failures, little consensus exists [1] as to either the actual rate of failure or even how to measure failure.

Given the immense cost of these high levels of failure [2, 3], it is puzzling that greater progress has not been made in ensuring that IT projects are more consistently delivered to specification and to customer satisfaction.

One explanation for this high rate of failure is the assumption that IT project failure is due to shortcomings in generic project management capability, rather than to attributes particular to IT projects. For example, ‘most of the improvement efforts have focused on advancing variations of the traditional project management paradigm, such as (that which) is embodied by the Project Management Body of Knowledge’ [4].

Two questions arise regarding IT project failure research. First, why is the success rate of IT projects so poor? And second, why, despite the efforts of many, does the situation fail to improve? This problem is known as ‘Cobb’s Paradox’ [5], which states: ‘We know why projects fail; we know how to prevent their failure—so why do they still fail?’. Cobb made the observation in 1995, while working at the Secretariat of the Treasury Board of Canada, during a presentation by the Standish Group (authors of the Chaos series of reports). Cobb’s claim that ‘we know why projects fail’ should not be taken in a literal, black-and-white sense; rather, it should be read as a reference to the collective body of expert commentary, opinion, research and practitioner experience that has offered solutions. Despite the successful implementation of some major IT projects, repeatable success continues to be elusive [6].

Cobb was not alone in observing that a great deal has been studied and written about project failure, and that consulting firms propose methodologies and remedies, yet little actual progress appears to have been made. The International Federation for Information Processing (IFIP) Working Group 8.6 ran a conference to address this specific issue, asking ‘why our scholarship has not been more effective. Is the fault one of theory and inadequate understanding? Or is the problem one of knowledge transfer, the failure to embed research knowledge in the working practices of managers and policy-makers’ [7].

2. What is project failure?

For the purposes of consistency, this research has adopted the widely understood definition of project failure: projects that fail to be delivered on time, on budget, and with the required scope and functionality.
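To make this working definition concrete, the following minimal Python sketch classifies a project against the on-time, on-budget and in-scope criteria. It is purely illustrative: the field names are assumptions, and the three labels borrow the Standish Group terminology quoted in the Findings section rather than anything prescribed by the case data.

```python
from dataclasses import dataclass

@dataclass
class ProjectResult:
    """Illustrative record of a delivered project; field names are assumptions."""
    planned_cost: float     # approved budget, in A$ millions
    actual_cost: float
    planned_months: float   # planned schedule duration
    actual_months: float
    scope_delivered: bool   # required scope and functionality delivered?

def classify(p: ProjectResult) -> str:
    """Apply the on-time / on-budget / in-scope definition of failure used above."""
    on_budget = p.actual_cost <= p.planned_cost
    on_time = p.actual_months <= p.planned_months
    if on_budget and on_time and p.scope_delivered:
        return "successful"
    if p.scope_delivered:
        return "challenged"   # delivered, but late and/or over budget
    return "failed"           # required scope/functionality not delivered

# Rough figures for the Queensland Health payroll project from the timeline in Section 5:
# an A$6.194M fixed price over ~8 months (Dec 2007 to Jul 2008) versus an estimated
# A$181M and ~27 months to the 'catastrophic' go-live of March 2010; scope is treated
# here as not delivered, given the manual workarounds required after go-live.
qh = ProjectResult(planned_cost=6.194, actual_cost=181.0,
                   planned_months=8, actual_months=27, scope_delivered=False)
print(classify(qh))   # -> failed
```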

Previous research has identified high-level issues, in particular a lack of senior management involvement [8] or a lack of clearly identified deliverables. The ‘problem of poor requirements engineering and management has been repeatedly and widely discussed and documented for at least 10 years as a contributing cause of project failures’ [9], yet the continuing stream of research and new technologies on these topics ‘has not resulted in a practical solution to the problem’.

IT project failures ‘have been extensively documented and studied’, but the lack of actual progress makes ‘Cobb’s paradox as topical today as it was a decade ago’ [10].

It is clear that despite decades of industry experience and practice, and decades of research, consulting and advice, there exists little consensus as to why projects continue to run over budget, over time, and deliver less than what was required. Cobb argued that ‘we know why projects fail’, suggesting that there is a failure to transfer that knowledge into practice. The US military has questioned that premise and intimated that it is possible that no paradox exists at all; we may simply not yet have identified why IT projects continue to fail [11].

3. Methodology

The primary focus of this research was to address the lack of clinical studies in the literature on IT project failure, and to understand the failings that have occurred in a ‘sticky, practice-based problem’ [12].

The primary case study documents comprising the raw data collection were drawn from two sources:

  1. the published files of the Queensland Commission of Inquiry into the Queensland Health Payroll Project [13], and

  2. documents obtained under Freedom of Information (FOI) requests to the Department of Health Queensland, and to the Queensland Treasury Department.

The total number of pages of witness statements amounted to 3850. In addition, there was a collection of project documentation gathered through Freedom of Information requests that exceeded 5000 pages of emails, reports, project plans and other data.

The data and its collection were independent of the researcher and have been drawn directly from the project and from a Government-led inquiry into the project. Witness statements were taken under oath by representatives of a court.

The data collection was rigorous and extensive, with thousands of pages of material examined, thus supporting ‘triangulation and sampling’ [14]. The large amount of data collected allowed the researcher to minimise influences that might arise in a small data set. The large volume of both project data and witness testimony ensured that bias was removed from the source data (as far as practicable), and that subsequent observations could be compared and contrasted across the multiple statements and project records, allowing, as far as possible, a balanced perspective to emerge.

4. Findings

Information Technology projects fail, and the cost of these failures is staggering [4, 15, 16, 17, 18]. This concern has been highlighted and repeated for more than 40 years [19, 20, 21, 22, 23, 24, 25, 26].

The Standish Group [18] has found that for ‘development projects that exceed $100 million in labour costs, only 2% are successful, meaning on time and within budget. Another 51% are considered challenged or over budget, behind schedule or did not meet user expectations. The rest, 47%, are seen as outright failures’ [6].

The question that this research examined was not which factors were evidenced in the project studied, but why managers continue to make the same mistakes despite all the advice and training that is available. What this research found was that senior departmental leadership, which included the governance board and the Department Head, ignored all the evidence and advice that was presented to them. They conducted themselves in a manner that implied that the project was running well and that they did not require any input from their own team members. It appeared, in fact, that they distrusted their own staff, relying instead on external vendor input. The leadership team of Queensland Health exhibited strong indicators of organisational narcissism, resulting in situational incompetence.

Situational Incompetence exists where an otherwise experienced manager is placed in a position of authority over a domain of activity for which they are neither educated nor experienced. Their lack of knowledge leads them to overestimate their own abilities and to underestimate the challenges. Their lack of expertise results in an inability to identify competence in others, and an inability to intuit an appropriate response when the project experiences challenges.

5. Timeline of events

The Queensland Health Payroll Project had its foundations in another project by the Queensland State Government: the creation of a Shared Services Initiative (SSI). The SSI was a business unit of Queensland Treasury named CorpTech. The idea behind the SSI was that all of the administration and back-office services required by each Department could be more efficiently undertaken by a single agency.

With this as the foundation, it was the charter of the shared services to deliver a human resources and payroll capability to several government departments, including the Departments of Education and Health.

In about 2005, the SSI commenced work on implementing a universal payroll solution for all Queensland Government Departments and agencies, starting with the largest two, the Department of Education and the Department of Health.

‘After the whole-of-government decision around 2005 to implement (software from) SAP (corporation), Queensland Treasury decided that they were going to be the systems implementation lead’ [27]. Accenture, as an external party, were engaged on a time and materials basis to provide resources to this SSI project [27].

By mid-2007, there were multiple parties involved in providing resources to the whole-of-government project, including Accenture, IBM and Logica. By March of 2007, it had become apparent to senior Department officers that the SSI was facing significant challenges. The Service Delivery and Performance Commission had reported [28] that organisational change was necessary, as the project was behind schedule and over budget. The Under-Treasurer of the Department ‘commissioned a review to identify potential courses of action’ [28]. The report was delivered to the Department on the 18th of April 2007. What evolved from this was the idea of engaging a ‘Prime Contractor’ that would take responsibility for the ongoing project. Subsequently, a Request for Information (RFI) was issued on the 2nd of July 2007, with initial responses received by the 12th of July 2007. Of the ten companies invited to respond, only four did so: IBM, Logica, Accenture and SAP.

A more detailed Request for Proposal (RFP) was sent to these four companies on the 25th of July 2007. An Invitation to Offer (ITO) was issued on 12th of September 2007. Responses were received from IBM, Logica and Accenture. SAP had withdrawn from the procurement process.

IBM was the successful tenderer and a contract was entered into on the 5th of December 2007. The Queensland Health payroll project was seen as the priority, and the 5th of December contract between IBM and the State Government included a ‘fixed contract’ to be completed by 31st of July 2008 at a cost of A$6.194 million.

By October 2008 it was reported that ‘IBM had not achieved any of the contracted performance criteria’ [27]. By this stage IBM had been paid A$32 million of a revised A$98 million contract and was forecasting that completion would cost A$181 million [28]. The A$6.194 million contract that had been entered into less than a year previously had now grown to an estimated A$181 million.

On the 14th of March 2010, ‘after ten aborted attempts to deliver the new payroll system it went live’ [28]. The project, originally scheduled for completion on the 31st of July 2008, was now 2 years late.

The ‘go-live’ was ‘catastrophic’ [28], requiring an additional 1000 staff to manually enter pay adjustments. The project costs by this time had been estimated at $1.2 billion over the next 8 years of operation.

6. Chaos in the Queensland Government

The Queensland State Government did not appear to have a consistent plan for the solutions for HR, payroll, rostering and recruitment. Different technologies were being deployed across different Departments at the same time, utilising the services of multiple vendors. Some vendors were operating as part of a single project (on occasion), independently on other projects, and competing against each other for additional business. The overall environment appears to have been chaotic.

CorpTech initially went to market ‘to seek products which could be delivered across Government and meet government-wide needs for HR and Payroll’ [28]. IBM was awarded the contract after proposing a ‘consortium of products - SAP was used as the core, and included Workbrain for rostering arrangements, Recruit ASP for recruitment solutions and SABA for knowledge management’ [28].

Prior to the commencement of the Queensland Health payroll project there were what appear to be conflicting projects awarded to different vendors: one contract, to IBM, to implement four software products to provide a state-wide HR and payroll solution, and a second contract, awarded to Accenture, to implement HR and payroll for the Department of Housing.

The IBM proposal [28] included four solution components: SAP ECC5, Recruit ASP, Workbrain and SABA. From the witness statements it is apparent that contention arose as to the transparency and appropriateness of the selection process for these products. For example, Mr. Waite, the head of the government agency tasked with implementing these solutions, stated that ‘to the best of my recollection, no choice about Workbrain had been made by the State before the November 2005 contract’ [28]. In the memorandum [29] dated 28th May 2007, it was noted that Workbrain was going to be implemented in 2008 as the replacement rostering solution. It is therefore clear that the intended use of Workbrain predates the IBM proposal and ultimate contract in December 2007.

The choice of solutions architecture for the Queensland Health Payroll project does not appear to have been determined with consideration of the business or technical needs of the Department. According to KPMG [30], ‘as of 2005, the Whole-of-Government system for payroll had been identified as SAP ECC5 and Workbrain. As a result, it was decided that QH would replace the Lattice/ESP system with SAP ECC5/Workbrain as part of the Whole-of-Government Shared Services Initiative’ [40]. Other eyewitness accounts placed the decision to adopt a combination of SAP ECC5 and Workbrain at a much later date (during the 2007 proposals and presentations): ‘The presentation provided by IBM indicated that the Workbrain system would become the award interpreter (in lieu of SAP) …. the presentation was potentially a game changer’ [39]. Product selection would become contentious as the project progressed, and integration between SAP and Workbrain became a significant constraint on the project [38]. As these two accounts indicate, even on what should have been a clear and uncontroversial issue, who made the choice of products, and when that decision was made, is open to many interpretations, and one that does not seem to have been resolved by the end of the Commission of Inquiry.

Towards the end of 2008 the ‘IBM team, working in collaboration with the CorpTech Enterprise Architect, obtained and reviewed the documentation for relevance to clarifying the business drivers underpinning the SSI’ [39]. This document, created several years after the commencement of the project, appears to be the first and only document to address the business drivers and explicit requirements of the project.

At the point of issuing the invitation to offer, having already been to market with a request for information and a request for proposal, the Queensland Health/CorpTech team did not have an ‘Initial Statement of Work’. The Government sought [28, 30], and the vendors responded with, fixed price commitments to a project that was devoid of even the most basic of project components—a statement of requirements! In essence, IBM had agreed to undertake a project, at a fixed price, for which no statement of work existed and no detailed planning of any description had been undertaken.

While no explicit business case appears to exist for the project, and none could be sourced either from the witness statements or via Freedom of Information requests, various memoranda [31, 32, 33, 34, 35] collectively cite justifications that could retrospectively be viewed as business case-like rationales, such as the risks facing the existing LATTICE system and the need to replace it [28]. In May of 2007, the Manager of HR Operations wrote to the Executive Director of Queensland Health Shared Services [29] to outline these risks and make recommendations as to what actions should be pursued. The overriding reason stated in this communication for replacing the LATTICE system with the new SAP/Workbrain solution was the ‘prohibitive costs of maintaining the LATTICE system and its cessation of support in June 2008’ [27]. In essence, then, the business case for the new system was that the old system was about to lose its maintenance and support from the vendor. No evidence has been sighted to suggest that any greater understanding of costs and benefits was developed before the contract was awarded to IBM for what became a billion-dollar disaster.

The solutions design and architecture appear to have been set by default when the tender responses confirmed the solutions architecture. The timescale was set by virtue of a fixed-price quote for work to be completed by the 31st of July 2008, but the tasks and activities were unknown when the contract was signed. The winning tenderer had committed to meet the time and budget using the products preferred by the Queensland Government [28]. A representative of Accenture stated during the Commission of Inquiry that he ‘observed that price and scheduling were key drivers in the decision to award the tender to IBM’ [27]. Commenting further, the Accenture representative could not ‘determine what price IBM was suggesting in terms of the fixed price or the total expected price’ [27]. Accenture had proposed an initial scope of work and pricing much more in line with IBM’s amended quotation, made some months later, of A$180 million. In meetings with senior Department executives, Accenture made it clear that they thought IBM’s price would escalate dramatically once they (IBM) understood the scope of work required [27].

The externally engaged legal firm [36], in preparing their advice with respect to each of the proposals from Accenture, IBM and Logica, stated that ‘we believe on balance that IBM’s Offer gives rise to a greater number of material issues and less thought has gone into IBM’s Offer regarding contractual mechanisms that will assist the customer or enhance the working relationship between the parties’ [36]. This provides further evidence that the experts engaged by the Department were highlighting the risks of the IBM proposal, but that these concerns were being ignored.

At this stage of the Queensland Health Payroll project, the Queensland Government had accepted a contract to implement an IT solution to a business problem for which no business case existed and no technical solutions architecture had been provided. The evidence tabled at the Commission [28], and the analysis of documents, showed the IT project to be a solution intended to fulfil an unknown set of requirements for a fixed price and timescale, and, oddly, one already in government use on an existing challenged project. Furthermore, senior management was acting against the advice of their technical experts [37] and external legal advisors [36].

7. Governance and oversight

Why did senior management of the Department appear to simply ignore the findings of the report(s) that they had commissioned? Did they not believe the findings? Did senior management trust the promises of the vendor to produce an outcome despite what they were being told by the external review? It is not immediately obvious why this situation was allowed to unfold in the manner in which it did. The project appeared to comply with all the appropriate governance structures and reporting requirements, yet a historical or retrospective view would suggest that the project was never managed effectively. Indeed, the findings of the Commission of Inquiry [28] state that ‘Its (Queensland Health payroll) failure, attended by enormous cost, damage to government and impact on workforce, may be the most spectacular example of all the unsuccessful attempts to impose a uniform solution on a highly complicated and individualised agency’. The Commission’s conclusion was that there were two primary causes for the failure of the payroll project: (1) ‘unwarranted urgency’ and (2) a ‘lack of diligence on behalf of State officials’ [28]. The Commission’s report elaborated further on lack of diligence, describing it as ‘poor decisions made in scoping the Interim Solution, in their Governance of the project, and in failing to hold IBM to account’ [28]. The Commissioner further reported that ‘the problems are systemic to government and to the natural commercial self-interest of vendors’ [28], which supports the observation that normalisation of deviance was at play throughout the conduct of this project. However, these findings by the Commission do not explain what motivated senior management to ignore the lessons learned from immediately preceding projects, and to ignore the warnings and advice of their own personnel. It is unclear, from the Commission’s report, what specific steps a subsequent project might implement to ensure that it did not fall into the same traps.

8. The big question … WHY?

These are the clear and obvious failures of the project: project management failed, there was a lack of requirements definition, and management was in conflict. All of these are issues which appear in the literature on failed projects; nothing new or unexpected.

Of potential significance is that the evidence provided by witness statements, mapped to the project chronology, showed that issues related to the identified themes were raised by staff and consultants throughout the project phases, and yet they remained unresolved and unremediated at the time they were raised. The evidence is that management was made aware of these failures. So it was not a lack of awareness of the failure risks, and therefore highlighting these as the contributory factors of project failure lacks explanatory completeness.

As was evident from the analysis of the witness statements, management was regularly informed of what was going on with their project by both staff and external consultants [37]. Management knew that the project was facing problems (or at least should have known). The reports on the 2005 Whole-of-Government initiative [38], the KPMG report [30], the KJ Ross report on testing [39], the IBM and CorpTech report to ‘reconstruct’ the business requirements [31] and the 2009 Queensland Audit Office report [40] all provided clear statements identifying where the project was failing and what needed to be done to remedy the situation. Yet the problems persisted until the total project costs had blown out to beyond A$1 billion. Faced with clear and certain statements that the project was performing badly, and with specific statements of where it was failing, successive managements failed to act appropriately to stem the problems. The only conclusion that can be drawn from this failure to act is that the senior executives of the Department, the governance and steering committees, and the Executive Director did not know what specific actions were available to them, or what they specifically needed to do in order to be effective. The management and oversight of this project were at a complete loss as to how to effectively manage an information technology project.

To examine the case study from the perspective of a timeline of events, and of the data and advice that were available at the time to the participants, the researcher has reconstructed the project from the information sourced via FOI. This method of investigation is referred to as being ‘inside the tunnel’; ‘this is the point of view of people in the unfolding situation. To them, the outcome was not known (or they would have done something else). They contributed to the direction of the sequence of events on the basis of what they saw on the inside of the unfolding situation. To understand human error, you need to attain this perspective’ [41]. In examining this case, and in identifying the contributory factors to project failure, the researcher has set aside any preconceived notions or ideas as to why the project failed. The contributory factors explained in greater detail below are drawn from the perspective of what was occurring in the project at the time: what did the management of the project know, and why were they motivated to pursue the decisions that ultimately led this project to a disastrous outcome?

9. Project executives lacked domain expertise

‘Organisational artefacts such as mission statements, goals and objectives, strategic plans and the like function as tools to reduce choice, not to guide it’ [42]. In the same manner, the specification of requirements, the business case, the architecture and solution design of the project are all intended to constrain choice to deliver ‘order’. In the QH project ‘order’ should have been represented by a defined scope of work, a defined project plan which sets out not only what work will be done, but also what work will not be done, and by an agreed contract. None of these things existed on the QH payroll project, and any efforts to enforce them were resisted by the vendor with the support (tacit or otherwise) of Departmental executives.

The issue of transparent flows of information between parties, of experts being able to make informed decisions utilising tacit information compared to less experienced people needing to ‘follow the script’ [43], of actors controlling the release of information, and of stakeholders presenting different versions of themselves across multiple stages becomes critical when one considers both the makeup of the governance and management of the QH project and the individuals involved. ‘The involvement of non-IT stakeholders can actually work detrimentally and confound and confuse proceedings, even causing error’ [15]. Non-IT experienced management, placed in a position of authority ‘may be influenced by some suppliers or colleagues to whose IT knowledge they had access, and insist on a certain course of action’ [15] which may result in confusion, delay or inappropriate decision-making, and contribute to the risk of IT project failure.

An appropriate lens through which to view this performance construct is the Dunning-Kruger Effect [44]: the less competent an individual is with respect to a particular domain, the more likely they are to overstate their perceived knowledge and ability. This may be referred to as a ‘confidence/competence dissonance’. Individuals who lack competence in a particular domain (the incompetent), but are not self-aware of their lack of competence, generally perceive their performance to be not significantly inferior to that of those who possess significant competence, training and ability (the experts).

This phenomenon has also been described as the Unskilled and Unaware Problem (UUP) [45]. Essentially, UUP argues that individuals who are unskilled in a particular domain overestimate their own competence in both absolute and relative terms, while top performers underestimate their absolute and relative performance. Kruger and Dunning [44] found that an unskilled person was more likely to dramatically misstate their absolute and relative competence. Ehrlinger et al. [46] have argued that UUP is a persistent feature of decision-making. Furthermore, and potentially much more concerning for complex IT projects, Kruger and Dunning [44] determined that the skills necessary to do the job are the same skills necessary to identify competence in others. This facet of the UUP research is particularly important when an unskilled individual is placed in a position of decision-making authority, in this case with respect to an IT project. Where an unskilled individual possesses neither the skills necessary to do the job nor the skills necessary to identify competence in others, they are not in a position to make informed decisions on complex issues. The application of this principle to the Queensland Health Payroll project would suggest that the Executive Director, the Department Secretary and the governance boards lacked the skills needed to identify competence in others and to comprehend informed advice when it was provided, preferring instead to rely upon those with personality attributes similar to their own.
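The confidence/competence dissonance can be pictured with a small numerical sketch. The model below is a hypothetical illustration, not Kruger and Dunning's method or data: it simply assumes that perceived standing blends a person's true percentile with an optimistic anchor, which is enough to reproduce the UUP pattern of overestimation at the bottom and underestimation at the top.

```python
def perceived_percentile(actual: float, anchor: float = 65.0, insight: float = 0.35) -> float:
    """
    Toy model, assumed purely for illustration (not Kruger and Dunning's method):
    a person's perceived standing is a blend of their true percentile and an
    optimistic anchor, weighted by how much genuine self-insight they have.
    """
    return insight * actual + (1 - insight) * anchor

for actual in (10, 30, 50, 70, 90):
    print(f"actual percentile {actual:>2} -> perceived ~{perceived_percentile(actual):.0f}")

# Low performers come out well above their true standing (overestimation), while top
# performers land below theirs (underestimation): the UUP asymmetry described above.
```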

Engelbrecht et al. [15] aimed to ‘identify whether a causal relationship exists between the various components of business managers’ IT competence and IT success’. What they found was that ‘business managers’ IT competence can, and does, exert a substantial influence on project success’. They also reported a ‘surprising’ finding, in which a lack of knowledge or competence was likely to have a negative impact on project outcomes: ‘although one would have expected a positive relationship and a positive impact, it has been reported that the involvement of non-IT stakeholders can actually work detrimentally and confound and confuse proceedings, even causing errors’.

Engelbrecht et al. [15] also found that ‘business managers may be influenced by some suppliers or colleagues to whose IT knowledge they had access, and insist on a certain course of action. If that business manager is particularly influential in an organisation, then there could be similar confusions, delays, and even inappropriate decisions’. This finding is reflective of the behaviours referred to in the witness statements: the senior executives of Queensland Health deferred to the advice of the vendor rather than their own staff. The researcher in this instance has neither the data nor the training to consider the role of amoral actors in this project, and has elected instead to assume that the collective management must have been acting with the best intent for the Department (even if individual actors may have been compromised). This leads the researcher to conclude that it is a lack of knowledge of information technology projects, and the executives’ inability to parse the information being presented to them, that lays the foundation of a theory to explain how the Queensland Health payroll project became so dysfunctional and ended in failure.

Given the importance of information technologies to business success, and their presence in almost every endeavour, one would expect to see an increase in technically literate, skilled or experienced management providing effective oversight and governance. Coertze and von Solms [47] found that only 10% of organisations had Chief Information Officer (CIO) or equivalent representation at the board or executive level of organisational governing management. Only 15% of organisations had board members with any IT-related qualifications, and in their United Kingdom (UK) sample, no organisation exhibited board-level oversight of organisational IT through qualified representation directly as a board member. A focus on general business competence over specific IT competence continues at the CIO level, where less than 50% of CIOs in the United States of America (US) public sector had primary qualifications from technical or engineering backgrounds [48]. Management and leadership are frequently devoid of the skills needed to understand or lead complex information technology projects.

10. Narcissism and leadership competence

Narcissism, in modern terms, has been defined as ‘a person who possesses an extreme love of the self, a grandiose sense of self-importance, and a powerful sense of entitlement’ [49], and while generally applied to individuals, the concept of narcissistic personalities has also been applied to groups and organisations [50]. Of significance in this research is that ‘the narcissistic personality is characterised by the denial of a difference between the ideal and the actual self’ [50], which segues directly into the studies of competence versus confidence by Kruger and Dunning [44] and Ryvkin, Krajc and Ortmann [45]. The narcissistic leader holds ‘very inflated self-views and (is) preoccupied with having those self-views continuously reinforced’ [51]. This behaviour was evident on the Queensland Health payroll project: where the evidence suggested that the project was in trouble, it was discounted or ignored because it did not fit the project leader’s ‘self-image’ that everything was under control.

Narcissistic leaders in organisations are more likely to engage in behaviour which might lead to failing standards and reduced ethical and moral conduct [52], which could be seen as an antecedent for the ‘normalisation of deviance’. As standards fall, decision by decision, what is considered normal behaviour slowly erodes until a ‘new normal’ gradually and almost imperceptibly emerges.

Narcissism is becoming more prevalent, and we can expect to see an increase in organisational narcissism as a direct consequence. Twenge and Foster [53] found that ‘there has been a 30% tilt towards narcissistic attitudes in US students since 1979’, and ‘The Narcissism Epidemic’ [54] breeds ‘the idea that being highly self-confident is the key to success’. Twenge and Campbell [55] were at pains to point out that there is no correlation between confidence and successful outcomes. Kremer [54] reported that ‘over 15,000 journal articles have examined the links between high self-esteem and measurable outcomes in real life, such as educational achievement, job opportunities, popularity, health, happiness and adherence to laws and social codes’ and found no correlation or causation. Highly confident, narcissistic project leaders are likely to exhibit behaviours that put projects at risk. They overestimate their own abilities, and are incapable of recognising competence in others or of learning by observing others. Narcissistic project leaders will be ‘blind’ to evidence that does not support their distorted view of their own abilities and of the status of the project for which they are accountable.

‘Over the last 30 years confidence has replaced competence’ [54]; positive thinking has replaced knowledge. An increase in narcissism correlates with the unskilled and unaware problem (UUP), in that ‘individuals become so self-obsessed they cannot identify their own weaknesses or learn from others’ [44]. This narcissistic self-belief and confidence may go some way to explaining why an executive with little knowledge of information technology, and no formal training or experience in information technology, would agree to take on the responsibility of running ‘the largest organisational reform undertaken within the State Government’ [28]. In the case of the Queensland Health payroll project, it was stated very clearly by the Deputy-Secretary of the Department that the Executive Director was not skilled in information technology but was a very experienced people manager with more than 30 years in the public sector [56], mostly in Human Resources.

The potential risk that this lack of (information technology) domain expertise creates for information technology projects generally, and for the Queensland Health project as a specific example, is encapsulated by the Dunning-Kruger Effect: ‘incompetent individuals lack the metacognitive skills that enable them to tell how poorly they are performing, and as a result, they come to hold inflated views of their performance and ability’. They are therefore potentially prone to ignore mounting evidence of their contribution to project-related issues, to overestimate their own ability to diagnose and resolve issues, and to listen to and take advice from unreliable sources, all of which were evident in the witness statements.

Of even greater concern are the UUP findings [45] that not only do domain-illiterate individuals tend to overestimate their own ability relative to their actual performance, they are also at risk of being deficient in identifying relevant domain competence in others: ‘participants who scored in the bottom quartile were less able to gauge the competence of others than were their top-quartile counterparts’ [44]. Furthermore, Kruger and Dunning found that ‘incompetent individuals fail to gain insight into their own incompetence by observing the behaviour of other people. Despite seeing the superior performances of their peers, bottom-quartile participants continued to hold the mistaken impression that they had performed just fine’ [44], which also aligns with the observations of narcissism in leadership positions.

A possible explanation contributing to the Queensland Health Payroll project failure is that where managers are not technically competent, but perceive themselves as managerially capable, they are not only at risk of overestimating their own ability and underestimating the relative competence of the skilled workers on the project; they also lack the skills to discern the quality of advice being given to them. Essentially, the evidence suggests that they are at high risk of being unable to distinguish between the advice of a confident but incompetent colleague or vendor and that of a competent but less confident colleague.

These managerial perceptions about domain expertise, confidence and competence risk contributing significantly to poor project management decision-making and governance, with implications for overall project failure or success. The decision-making senior project manager with accountability, responsibility and authority needs to be able to assess the information provided to them in order to make well-informed decisions. It is contended, in the interpretation of the QH project data presented in this study, that placing domain-challenged persons in positions of project-critical authority is likely to lead to unsatisfactory outcomes where:

  • managers who lack domain expertise will act the part that they perceive they need to adopt;

  • these managers tend to be incapable of identifying the skilled and competent individuals that can be trusted for expert advice;

  • these managers will not have the cognitive or experiential tools to determine an appropriate course of action when faced with a project related crisis; and

  • these managers are likely to confuse confidence with competence and may be subject to undue influence by other incompetent actors.

In summary, the Queensland Health Payroll project was potentially placed at significant risk by the failure to appoint management, governance and oversight with domain expertise appropriately matched to the size, complexity and nature of the project.

11. Situational incompetence

The question of most concern to this researcher has been to uncover why, despite all of the preceding research, publications, education, training and certification that is available to individuals and organisations undertaking project management of an information technology system, a project could still display all of the mistakes, errors and failings that have been identified in the literature.

In order to understand what occurred on the Queensland Health payroll project, a case study analysis was undertaken following a multi-grounded theory approach. The purpose of conducting the research in this manner was to allow themes to emerge from the data, and to test theories against observable project-related behaviour.

The theme that was most consistent throughout the project was that senior management was repeatedly made aware of project risks and failings. Reports had been written about the whole-of-government project, prior to the creation of the Queensland Health project, that specifically enumerated the challenges and risks that needed to be kept front-of-mind by the QH project team [30, 57]. The literature provided no plausible explanation for the fact that the senior executives responsible for the direct execution of the project, and the departmental executives with governance and oversight accountability, apparently ignored all of the advice that they were presented with.

What emerged from the data was that the executives in charge of the project, those executives that operated above the hands-on technical level, were manifestly incompetent when it came to issues of information systems project management. The executives simply did not understand the information that was being presented to them, and interpreted professional concerns raised by Queensland Health team members as ‘personality conflicts’. These executives were presented with several formal reports outlining risks and issues, and acted in a manner that, under conventional wisdom, would defy rational explanation: the witness statements and project documents provide no evidence of any action being taken to address the issues raised. In fact, when the vendor complained that employees of Queensland Health (who were trying to hold the vendor to its contract) were interfering in the project, senior executives of the Department ordered their removal, at the specific request of the vendor. No credibility was assigned to the concerns of the departmental staff, and no investigation appears to have been undertaken by senior management as to why the vendor was unhappy.

Engelbrecht et al. [15] suggest that inexperienced managers will seek advice and guidance from inappropriate sources. Kruger and Dunning [44] offer the observation that the unskilled and unaware [45] are incapable of identifying their own failings, incapable of independently observing and learning from the competence of others, and incapable of identifying competence in others.

These findings have led this researcher to postulate a new theory: Situational Incompetence.

Situational Incompetence applies when an otherwise experienced executive is placed in a position of authority or accountability for which they lack experience, training or specific skills. In this new role they are effectively incompetent and incapable of providing reasoned advice, guidance or management.

Situational Incompetence has implications for how leaders are selected for complex tasks requiring specialist IT domain knowledge and technical competence; it may also apply to other disciplines requiring specific knowledge of the technology in that domain (e.g. accounting, medicine, engineering, science).

Kruger and Dunning point to potential approaches to remediate this failing. They experimented with providing simple mathematical training to unskilled test subjects, which resulted in marked improvements in their ability to recognise competence in others and to more accurately assess their anticipated performance on a comparison scale.

It is proposed that future research test this theory by applying specific training in information technology to senior executives and measuring the impact this has on the outcomes of projects for which those executives have governance, oversight and user-engagement accountability.

‘Someone implementing IT needs to know which levers to pull, in which context, and at what time’ [58]. Figuring out which levers to pull, in which context and at what time, requires competence and the intuition born of experience; without this we are left with Situational Incompetence.

References

  1. Jones C. Software project management practices: Failure versus success. The Journal of Defense Software Engineering. 2004:5-9
  2. Charette RN. Why software fails. IEEE Spectrum. 2005;42(9):42-49
  3. Hass KB. The blending of traditional and agile project management. PM World Today. 2007;9(5):1-8
  4. Hidding, Nicholas. A new way of thinking about IT project management practices: Early empirical results. Journal of Organisational Computing and Electronic Commerce. 2017;27(1):81-95
  5. Bourne L. Cobb's Paradox is Alive and Well. 2011. Available from: http://www.mosaicprojects.wordpress.com [Viewed: September 2, 2016]
  6. Thibodeau P. Pennsylvania Sues IBM over Troubled $110M IT Upgrade. 2017. Available from: http://www.computerworld.com/article/3180325/it-industry/pennsylvania-sues-ibm-
  7. Dwivedi YK, Wastell D, Henriksen HZ, De R. Guest editorial: Grand successes and failures in IT: Private and public sectors. Information Systems Frontiers. 2015a;17:11-14
  8. Fortune J, White D. Framing of project critical success factors by a systems model. International Journal of Project Management. 2006;24(1):53-65
  9. Kasser JE, Tran X-L, Matissons SP. Prototype Educational Tools for Systems and Software (PETS) Engineering. Australasian Association for Engineering Education; 2003
  10. Zhou L, Vasconcelos A, Nunes M. Supporting decision making in risk management through an evidence-based information systems project risk checklist. Information Management & Computer Security. 2008;16(2):166-186
  11. Carl JW, Freeman GR. Nonstationary Root Causes of Cobb's Paradox. Defense Acquisition University; July 2010
  12. Benbasat I, Goldstein DK, Mead M. The case research strategy in studies of information systems. MIS Quarterly. September 1987;11(3):369-386
  13. Queensland Commission of Inquiry published files. Available from: http://www.healthpayrollinquiry.qld.gov.au
  14. Bong SA. Debunking myths in qualitative data analysis. Qualitative Social Research. 2002;3(2)
  15. Engelbrecht J, Johnston KA, Hooper V. The influence of business managers' IT competence on IT project success. International Journal of Project Management. 2017;35:994-1005
  16. Hughes DL, Rana NP, Simintiras AC. The changing landscape of IS project failure: An examination of the key factors. Journal of Enterprise Information Management. 2017;30(1):142-165
  17. Hughes DL, Dwivedi YK, Rana NP, Simintiras AC. Information systems project failure – Analysis of causal links using interpretive structural modelling. Production Planning & Control. 2016a;27(16):1313-1333
  18. Standish Group International. Chaos technical reports. 1994, 1995, 1999, 2001, 2009, 2010, 2014, 2015. Available from: www.standishgroup.com
  19. Davis GB. Management Information Systems: Conceptual Foundations, Structure and Development. NY: McGraw-Hill Inc.; 1974
  20. Lucas HC. Implementation: The Key to Successful Information Systems. New York: Columbia University Press; 1981
  21. Maddison RN, Baker GJ, Bhabuta L, Fitzgerald G, Hindle K, Song JHT. Information System Methodologies. New York: Wiley Heyden on behalf of the British Computer Society; 1983
  22. Avison DE, Fitzgerald G. Information Systems Development: Methodologies, Techniques, and Tools. 2nd ed. New York: McGraw-Hill; 1995
  23. Hoffer JA, Valacich JS, George JF. Modern Systems Analysis and Design. 2nd ed. Reading, Mass: Addison-Wesley; 1998
  24. Laudon KC, Laudon JP. Information Systems and the Internet: A Problem-Solving Approach. 4th ed. Orlando: The Dryden Press; 1998
  25. Hawryszkiewycz IT. Introduction to Systems Analysis and Design. 5th ed. Frenchs Forest, NSW: Prentice Hall; 2001
  26. Nickerson RC. Business and Information Systems. 2nd ed. NJ: Prentice Hall; 2001
  27. WS032_20130305_SALOUK_witness-statement.pdf
  28. WS122_20130731_royal_commission_report.pdf
  29. WS024_20130228-Geoffrey-WAITE.pdf
  30. WS003_20120531_KPMG_Report_dated_31_May_2012.pdf
  31. PD063_20080911_Prog_42.pdf
  32. WS043_20130317_BLAKENEY,-Maree-signed-statement.pdf
  33. WS120_20130624_Submission-for-Contract-IBM-Australia.pdf
  34. WS118_20130617-POLLOCK-Brendan.pdf
  35. WS116_20130611_Kalimnios-Shea-and-Brown.pdf
  36. WS014_20130225a_SWINSON,-John-signed-statement.pdf
  37. WS013_20130225-Statement-of-Craig-Vayo.pdf
  38. WS085_20130507_Margaret-Berenyi.pdf
  39. PD103_20090422-KJ-Ross-draft-report.pdf
  40. PD108_20090501_Queensland Audit Office.pdf
  41. Dekker S. The Field Guide to Human Error Investigations. Cranfield University Press; 2014
  42. Manning PK. Goffman on organisations. Organisation Studies. 2008;29(5):677-699
  43. Vo-Tran HC. Information Management and Sharing Practices within a Construction Project Process. 2014. Available from: http://researchbank.rmit.edu.au/view/rmit:160710
  44. Kruger J, Dunning D. Unskilled and unaware of it: How difficulties in recognising one's own incompetence lead to inflated self-assessments. 2009
  45. Ryvkin D, Krajc M, Ortmann A. Are the unskilled doomed to remain unaware? Journal of Economic Psychology. 2012;33:1012-1031
  46. Ehrlinger J, Johnson K, Banner M, Dunning D, Kruger J. Why the unskilled are unaware: Further explorations of (absent) self-insight among the incompetent. Organizational Behavior and Human Decision Processes. 2008;105(1):98-121
  47. Coertze J, von Solms R. A model for information security governance in developing countries. In: Jonas K, Rai IA, Tchuente M, editors. e-Infrastructure and e-Services for Developing Countries. AFRICOMM 2012. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering. Vol. 119. Berlin, Heidelberg: Springer; 2013
  48. Ionescu V. The competencies of the CIO: A 2016 analysis of the United States of America federal CIO Council members' background. Journal of Defense Resources Management. 2017;8(1)
  49. Duchon and Drake (2008), cited in: Grant P, McGhee P. Organisational narcissism: A case of failed corporate governance? In: The Heart of the Good Institution. Netherlands: Springer; 2013. pp. 97-109
  50. Brown AD. Narcissism, identity, and legitimacy. Academy of Management Review. 1997;22(3):644-646
  51. Chatterjee A, Hambrick DC (citing Campbell, Goodie and Foster, 2004). It's all about me: Narcissistic chief executive officers and their effects on company strategy and performance. Administrative Science Quarterly. 2007;52(3):352
  52. Alvinius A, Johansson E, Larsson G. Negative organizations: Antecedents of negative leadership? In: Watola D, Woycheshin D, editors. Negative Leadership: International Perspectives. Kingston, Canada: Canadian Defence Academy Press; 2016. p. 58
  53. Twenge JM, Foster JD. Birth cohort increases in narcissistic personality traits among American college students, 1982-2009. Social Psychological and Personality Science. 2010;1(1):99-106
  54. Kremer W. Does confidence really breed success? BBC World Service. 2013. Available from: http://www.bbc.com/news/magazine-20756247 [Viewed: January 1, 2018]
  55. Twenge JM, Campbell WK. The Narcissism Epidemic: Living in the Age of Entitlement. New York: Free Press; 2010
  56. WS026_20130301a_BRADLEY_Gerard_Statement_signed.pdf
  57. WS004_20120531_KPMG_Report.pdf
  58. Dwivedi YK, Wastell D, Laumer S, Henriksen HZ, Myers MD, Bunker D, Elbanna A, Ravishankar MN, Srivastava SC. Research on information systems failures and successes: Status update and future directions. Information Systems Frontiers. 2015b;17:143-157
