
The Application of Simple Additive Bayesian Allocation Network Process in System Obsolescence

Written By

Oluwatomi Adetunji

Submitted: 10 May 2021 Reviewed: 24 May 2021 Published: 26 October 2021

DOI: 10.5772/intechopen.98530

From the Edited Volume

Advances in Decision Making

Edited by Fausto Pedro García Márquez


Abstract

In designing a system, multi-dimensional obsolescence design criteria such as Scheduling; Reliability, Availability, and Maintainability; Performance and Functionality; and Costs affect its overall lifespan. This work examines the impacts of these factors on systems during the design phase using a new application called the Simple Additive Bayesian Allocation Network Process (SABANP). The application combines a Multi-Criteria Decision Making (MCDM) methodology with a Bayesian Belief Network to address the impact of obsolescence on a system. Unlike most MCDM analyses, this application does not require criteria weights. Moreover, it accounts for functional dependencies among criteria, which is not possible with MCDM methodologies alone. A case study was conducted using military and civilian experts. Data were collected on systems' obsolescence criteria and analyzed using the application to make trade-off decisions. The results show that the application can address complex obsolescence decisions that are both quantitative and qualitative. Expert validation showed that SABANP successfully identified the best system for mitigating obsolescence.

Keywords

  • Obsolescence
  • Multi-Criteria Decision Making
  • Bayesian Belief Network
  • Simple Additive Bayesian Allocation Network Process
  • Diminishing Manufacturing Sources and Material Shortages (DMSMS)

1. Introduction

Obsolescence is an event bound to happen. It occurs when a component or system (hardware or software) cannot carry out required functions or continue to be useful. Reasons include the component not being available for purchase in its original form from the original manufacturer or producer; not being maintainable, affordable to repair, or reliable; technology evolution; and anything else that causes a component or system to no longer be viable [1, 2, 3]. Obsolescence also encompasses discontinuance. However, Pecht and Das [4] made a distinction between the concepts of "obsolescence" and "discontinuance": discontinuance takes place when the manufacturer stops production of a component, which occurs at a part-number or manufacturer-specific level, while obsolescence occurs at a technology level [4, 5]. Obsolescence can happen to products, systems, processes, software, policies, standards, and organizations.

Many solutions have been proposed for managing obsolescence. However, over the past decade, it is estimated that over $9 billion has been wasted on this problem [6]. The problem is often a result of the rate of technological advancement in systems rendering them obsolete. Managing obsolescence has traditionally been done with a reactive approach. This means that the action taken to resolve the issue occurs after the obsolescence is found. Today, with the rapid growth of digital technology, digital systems and software are reaching their end-of-life sooner rather than later.

Moreover, the contractual agreement between Original Equipment Manufacturer (OEM) and the government is often limited in scope and reactive in nature. This chapter proposes a new application model — a proactive approach that takes into account obsolescence factors that affect systems during the design phase.

The proposed model uses a combination of a Bayesian Belief Network and an MCDM method for identifying systems that are more susceptible to obsolescence, which can alert the system owners to take action before the obsolescence occurs. The combined application of a Bayesian Belief Network and MCDM to manage obsolescence in this work serves as an addition to the body of knowledge. This chapter refers to the extension of this methodology as the Simple Additive Bayesian Allocation Network Process (SABANP). The SABANP enables the analyst to define the complex model by connecting a particular Bayesian Belief Network process to the system components (i.e., the leaf nodes), whereby obsolescence characteristics are modeled as events. When modeling the dynamic characteristics as events, the following set of processes is developed to model the variety of obsolescence criteria:

  1. The time when the event occurs or the time to obsolescence,

  2. The order in which the events will likely occur, and

  3. The event occurrence dependence on time and costs.

The costs, which include procurement, are required because of their ramifications on the obsolescence problem. The end result of obsolescence in a system is the significant cost it incurs for the organization; the DoD estimated costs upward of $750M annually for managing obsolescence [7]. Obsolescence is time-dependent; therefore, the model assumed a 95% confidence that systems would be obsolete within two years.
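
As an illustration, the two-year statement can be translated into a hazard rate if one additionally assumes an exponential time-to-obsolescence; the Python sketch below makes that assumption explicit (the chapter fixes only the 95%-within-two-years constraint, not a distribution):

```python
import math

# Assumption: time-to-obsolescence T is exponentially distributed. The
# chapter fixes only the 95%-within-two-years constraint, not a
# distribution, so the exponential form is an illustrative choice.
confidence = 0.95
horizon_years = 2.0

# Solve P(T <= horizon) = 1 - exp(-rate * horizon) = confidence for rate.
rate = -math.log(1.0 - confidence) / horizon_years

print(f"implied obsolescence rate: {rate:.3f} per year")          # ~1.498
print(f"P(obsolete within 1 year): {1.0 - math.exp(-rate):.3f}")  # ~0.776
```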


2. Background

In managing system obsolescence, three approaches are used: reactive, proactive and strategic. The reactive approach addresses the obsolescence after the component or part is obsolete, while the proactive one addresses the obsolescence before it occurs [8]. The strategic approach often uses a combination of reactive and proactive approaches to manage the risk of obsolescence; however, the decision models that address obsolescence are underdeveloped [6]. The most agreed upon approach to managing obsolescence is the proactive strategy since it ensures that systems with long life spans are continuously and effectively maintained [9, 10].

Originally, the work began from the need to find better and more effective ways to deal with obsolescence in a proactive manner. To do so, the following obsolescence criteria were chosen from the literature [2, 11, 12]:

(1) Performance and Functionality (P&F); (2) Cost, which includes Acquisition, Licensing and Support; (3) Personnel Training (PT); (4) Reliability, Availability, and Maintainability (RAM); (5) Procurement (PR), which includes Vendor Assembly and Installation Support; (6) Configuration Management (CM); (7) Data Rights and Technical Documentation (DR&TD); and (8) Open Architecture and Standards (OA&S). We also added (9) Technology Readiness Level (TRL), which was adapted from the DoD Technology Readiness Assessment Guidance [13], and (10) Obsolescence Schedule Risk (O&SR).

Each criterion was assessed as either "higher is better" (HG), for a criterion that benefits the stakeholder, or "lower is better" (LW), for one that does not, in order to determine the criterion's weight factor. For example, Cost is defined as LW because high ownership and acquisition costs are a non-benefit to the stakeholder. The same applies to O&SR. This convention is accounted for within the model construct as shown in Table 1.

| Criteria (xj) | (1) P&F | (2) Cost | (3) PT | (4) RAM | (5) PR | (6) CM | (7) DR&TD | (8) OA&S | (9) TRL | (10) O&SR |
|---|---|---|---|---|---|---|---|---|---|---|
| Weight factors | HG | LW | HG | HG | HG | HG | HG | HG | HG | LW |

Table 1.

Criteria benefit and non-benefit weight factors.

| Criteria (xj) | (1) P&F | (2) Cost | (3) PT | (4) RAM | (5) PR | (6) CM | (7) DR&TD | (8) OA&S | (9) TRL | (10) O&SR |
|---|---|---|---|---|---|---|---|---|---|---|
| Weights (wj) | 0.19 | 0.12 | 0.06 | 0.15 | 0.11 | 0.07 | 0.09 | 0.07 | 0.06 | 0.08 |

Table 2.

Decision maker-assigned weight values.

Experts were asked to assess each system's P&F; costs; required level of PT; RAM; and ease of procurement (PR), including installation support and vendor assembly, in the design phase. These systems run on software programs, so experts were also asked via a survey to assess each system's CM, the availability of DR&TD, the OA&S, and the TRL by assigning grades on a scale ranging from 0 to 9. These criteria were agreed upon by the experts and are based on the literature review.

Experts rated the O&SR on a scale of 1–5, where 5 represents the highest score and 1 the lowest score. While the rest of the criteria scales are from 0–9, a scale of 1–5 was used for the O&SR because a risk matrix that goes from 1–5 is easily conceptualized by expert practitioners.

2.1 Criteria weights

Weights (wj) were required to sum to one, and all criteria weights met this requirement. The ratings represent the weights of the ten criteria as provided by the decision maker based on experience and expertise. The weight data served as inputs to the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) model; this decision-making method was used to validate the Simple Additive Bayesian Allocation Network Process. Table 2 shows the decision maker-assigned weight values for each criterion, which total 1 (100%).

2.2 The time-discrete Bayesian belief network modeling

Bayesian Networks are composed of nodes and arcs [14, 15]. They can perform diagnostic analysis because of their rich graphical and embedded mathematical capability for modeling and analyzing the dynamic behavior of systems [16]. Nodes, in this case, represent random variables, and arcs between nodes represent the dependencies between the random variables [14]. A Bayesian Network uniquely defines the joint probability distribution of the random variables; once the joint probability distribution is known, any random variable query can be solved. Furthermore, random variables can be either infinite (continuous) or finite (discrete) [14]. This chapter only considers discrete random variables, since the data are gathered from experts and the distribution is discrete. There are three types of nodes: root nodes, sequential root nodes, and leaf nodes. Root nodes are nodes without incoming arcs (without parents), and sequential root nodes are nodes that have both incoming and outgoing arcs (both parents and children). Leaf nodes are nodes without outgoing arcs (without children). Root nodes have marginal prior probability tables associated with them, while sequential root nodes and leaf nodes have conditional probability tables [14]. A conditional probability table provides the probability of each random variable state conditional on the values of its parent nodes. To determine the joint probability distribution, the Chain Rule is used, assuming the conditional independence encoded in the designed Bayesian Belief Network structure between the variables.

The joint probability distribution of the variable set $\{X_1, X_2, \ldots, X_m\}$ is given as follows [4]:

$$P(X_1, X_2, \ldots, X_m) = \prod_{i=1}^{m} P(X_i \mid \text{parents}(X_i)), \quad i = 1, 2, \ldots, m \qquad (E1)$$
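
To make Eq. (1) concrete, the following minimal Python sketch evaluates the chain rule on a three-node toy network; the node names and all probability values are illustrative only and are not taken from the study:

```python
from itertools import product

# Toy discrete network: Cost -> Obsolete <- Time. Node names and all
# probability values here are illustrative, not taken from the study.
p_cost = {True: 0.3, False: 0.7}        # P(Cost is high)
p_time = {True: 0.95, False: 0.05}      # P(time-to-obsolescence event)
p_obs = {                               # P(Obsolete | Cost, Time)
    (True, True): 0.9, (True, False): 0.5,
    (False, True): 0.6, (False, False): 0.1,
}

def joint(cost, time, obsolete):
    """Eq. (1): product of each node's probability given its parents."""
    p = p_obs[(cost, time)]
    return p_cost[cost] * p_time[time] * (p if obsolete else 1.0 - p)

# The joint distribution sums to one over all states, and any query,
# e.g. P(Obsolete), is a sum over the remaining variables.
states = list(product([True, False], repeat=3))
total = sum(joint(*s) for s in states)
p_obsolete = sum(joint(c, t, True) for c, t in product([True, False], repeat=2))
print(round(total, 6), round(p_obsolete, 4))   # 1.0 0.6665
```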

Recent research has produced better and more efficient algorithms for computing and inferring probabilities in Bayesian Networks. Inference algorithms now exploit the independence assumptions between variables, making queries quick to execute for the user. Bayesian Networks have been used extensively in areas such as medical diagnosis, troubleshooting systems, and manufacturing control. Nevertheless, they have not been used to mitigate or predict obsolescence in systems with Subject Matter Experts' (SMEs) input or qualitative analysis.

To develop the model, a discrete-time Bayesian Belief Network is formulated to model system obsolescence. The leaf node, or random variable, represents the system component. A system component herein is categorized as a component, sub-system, or system that interacts within a collection of components. The leaf nodes used in the experiment are the Integrated Bridge Systems found on naval vessels, as described in Section 2.4.

By using this model, one can analyze the interactions among the criteria and find the critical criteria that could have the most adverse effects on a system's operations. The model can also be used to develop simulation test scenarios, such as what happens if costs are not a significant factor in the trade-off analysis, or if configuration management has no effect on the system's lifecycle. This analysis can also reveal which obsolescence-related attributes should be prioritized with respect to design, development, testing, and maintenance.

2.3 Expert judgment

Expert judgments were used to gather the required input data for this experiment. The expert-assigned scores were documented and aggregated for each leaf node based on the agreed-upon obsolescence criteria, after normalization of the scores. Though the systems employed for the experiment were fielded on Navy ships, the experts were asked to assume a scenario in which system development was being planned for a new ship class, in order to determine the system best able to mitigate obsolescence.

The Integrated Bridge System was established as the system of reference after iterative discussions with experts. Research participants were recruited by e-mail, with nineteen obsolescence experts completing and returning responses. The minimum requirement for expert judgment participation was set at fifteen returned surveys, so this number of responses is acceptable for expert judgment. The demographic data also reflected the experts' diverse experiences. All participants had at least four years of experience in managing obsolescence and DMSMS, with some exceeding 35 years of experience. Approximately 10% had a Ph.D., 58% had a master's degree, and 32% had a bachelor's degree. The analysis also revealed that 70% were employed by organizations with 500 or more employees, with organization sizes ranging from 500 to 100,000 employees.

Participants were asked to complete a survey that consisted of approximately 90 data fields. The survey responses were documented, and the ranking of each alternative on each criterion across the experts was aggregated, normalized, and transposed into the model. An exception was made for the Cost criterion, where the actual cost range data were used. The criteria agreed upon by the experts were used to formulate the survey questions.

2.4 An integrated bridge system (IBS)

An IBS serves as the context of the survey instrument. The IBSs on the naval vessels USS Arleigh Burke (DDG-51), USS Ticonderoga (CG-47), and USS Nimitz (CVN-68) were examined in this work; the IBS serves as the system-under-test. An IBS is designed to assist the vessel navigator in selecting information relevant to the operational context by collecting, processing, and presenting relevant data without cluttering displays with information that may not be needed at that point. It takes a systems approach to the automated collection, processing, control, and display of ship-control and vital navigational sensor data to maximize bridge watch efficiency and safety. An IBS is based on human-machine interaction; it integrates all navigational functions and provides accurate navigational information to operators or users while minimizing human error. Its capabilities include multifunction workstations, multiple layers of redundancy, Commercial Off-the-Shelf (COTS) hardware, ease of maintenance, and open system design (i.e., intersystem links to other systems). These systems are mission essential: systems that can have an adverse impact on the mission if they are not operational due to obsolescence.

2.5 Research process

The following steps were used in the research process. First, the current deterministic MCDM methods applicable to nonlinear (multidimensional) decision analysis were determined. Then, currently available mission-ready systems that serve as points of reference for the study were identified. Finally, an analysis was conducted using the Simple Additive Bayesian Allocation Network Process model with expert judgments in the analysis of alternatives in order to select the best system against obsolescence.


3. Methodology

The Simple Additive Bayesian Allocation Network Process (SABANP) utilizes the components of the Simple Additive Weighting (SAW) method as the input variables to the Bayesian Belief Network. SAW is an MCDM method. To better understand SABANP, it is necessary to detail what SAW is and how it is used in the model. The SAW model, also known as the WSM or the "weighted average," is a common approach used for multicriteria analysis [17].

One must first calculate the normalized decision matrix for the benefit criteria (higher is better), where $n_{ij}$ is the normalized score of the $i$th alternative with respect to the $j$th criterion, and $r_{ij}$ are the values in the decision matrix provided by the experts [17, 18]:

$$n_{ij} = \frac{r_{ij}}{r_{ij}^{\max}}, \quad i = 1, 2, \ldots, m; \; j = 1, 2, \ldots, n \qquad (E2)$$

$r_{ij}^{\max}$ is the maximum value of the $i$th alternative with respect to each $j$th criterion in the decision matrix.

For the non-benefit criteria (lower is better), $r_{ij}^{\min}$ is the minimum value of the $i$th alternative with respect to each $j$th criterion in the decision matrix [6]:

$$n_{ij} = \frac{r_{ij}^{\min}}{r_{ij}}, \quad i = 1, 2, \ldots, m; \; j = 1, 2, \ldots, n \qquad (E3)$$

The normalized matrix for the IBS is found in Table 3.

The best alternative is the one that maximizes $A_i$ in Eq. (4) below. The weights $w_j$ are the weighted criteria values, and they sum to 1 as shown in Table 2.

$$A_i = \sum_{j=1}^{n} w_j n_{ij}, \quad i = 1, 2, \ldots, m \qquad (E4)$$

The SAW model is governed by additive utility theory [19]. As Eq. (4) shows, each alternative's aggregate value is the sum of its weighted normalized criterion scores. In a one-dimensional case in which the units are similar (for example, seconds, feet, or dollars), the WSM is easy to use [17, 20]. The approach becomes difficult when applied to decision-making problems that are multidimensional [21]. The weights were assessed during data collection using the direct weighting method, which allows the decision maker to rank the criteria and provide subjective values for the criteria weights based on the defined rank. However, the weights were not needed in the SABANP model to calculate the best alternative; Eqs. (2) and (3) were used since SABANP requires only the normalization of the experts' inputs. The normalized scores, ranging from 0 to 1 as shown in Table 3, were transposed into the SABANP model for the analysis.
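
A brief Python sketch of Eqs. (2)–(4) follows; the three-criterion subset and all scores are illustrative stand-ins, not the study's survey data:

```python
# SAW sketch for Eqs. (2)-(4). The three criteria and all scores are
# illustrative stand-ins for the survey data; "HG" criteria use Eq. (2)
# and "LW" criteria use Eq. (3), per the Table 1 convention.
alternatives = ["IBS 1", "IBS 2", "IBS 3"]
criteria = {                          # name: (direction, raw scores r_ij)
    "P&F":  ("HG", [9.0, 8.3, 9.0]),
    "Cost": ("LW", [620.0, 610.0, 750.0]),   # lower is better
    "RAM":  ("HG", [8.8, 7.8, 8.4]),
}
weights = {"P&F": 0.19, "Cost": 0.12, "RAM": 0.15}   # subset of Table 2

def normalize(direction, scores):
    if direction == "HG":             # Eq. (2): n_ij = r_ij / max_i r_ij
        top = max(scores)
        return [s / top for s in scores]
    low = min(scores)                 # Eq. (3): n_ij = min_i r_ij / r_ij
    return [low / s for s in scores]

n = {c: normalize(d, s) for c, (d, s) in criteria.items()}

# Eq. (4): weighted sum per alternative. SABANP itself skips this step
# and feeds the normalized n_ij directly into the Bayesian model.
for i, alt in enumerate(alternatives):
    A_i = sum(weights[c] * n[c][i] for c in criteria)
    print(alt, round(A_i, 4))
```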

3.1 Simple additive Bayesian allocation network process (SABANP)

The SABANP process begins by populating the survey's raw data into the decision matrix. As shown in Table 4, the score of criterion $C_j$ with regard to alternative $A_i$ is $r_{ij}$, and the weight of $C_j$ is $W_j$. The weights are not required for the analysis.

| Criteria | IBS DDG-51 Class | IBS CG-47 Class | IBS CVN-68 Class |
|---|---|---|---|
| (1) P&F | 1 | 0.920 | 1 |
| (2) Cost | 0.984 | 1 | 0.815 |
| (3) PT | 0.9717 | 0.937 | 1 |
| (4) RAM | 1 | 0.889 | 0.959 |
| (5) PR | 1 | 0.896 | 0.976 |
| (6) CM | 1 | 0.896 | 0.976 |
| (7) DR&TD | 0.947 | 0.895 | 1 |
| (8) OA&S | 0.987 | 1 | 0.939 |
| (9) TRL | 1 | 1 | 0.979 |
| (10) O&SR | 1 | 0.938 | 0.978 |

Table 3.

SAW normalized matrix.

The time for the system to reach obsolescence was modeled at a 95% confidence level, i.e., the software and hardware would be obsolete within two years.

|  | C1 | C2 | … | Cn |
|---|---|---|---|---|
| A1 | r11 | r12 | … | r1n |
| A2 | r21 | r22 | … | r2n |
| A3 | r31 | r32 | … | r3n |
| ⋮ | ⋮ | ⋮ |  | ⋮ |
| Am | rm1 | rm2 | … | rmn |
| Weights | W1 | W2 | … | Wn |

Table 4.

Raw data decision matrix.

The following steps are required to conduct SABANP analysis:

  1. The normalized decision matrix is first determined (Eqs. (2) and (3)). The normalized score $n_{ij}$ is computed to turn the dimensioned attribute values $r_{ij}$ into nondimensional attributes, allowing for comparison across the attributes.

  2. After the normalized decision matrix is determined, the true and false functions are utilized to establish the initial probabilities:

    The true function is given as $T(n_{ij})$, where $T(n_{ij}) = n_{ij}$; (E5)

    and the false function is given as $F(n_{ij})$, where $F(n_{ij}) = 1 - T(n_{ij}) = n_{ij}^{*}$.

  3. Initial probabilities $n_{ij}$ and $n_{ij}^{*}$ are assigned to each variable set $\{X_{ij}\}$ in the joint distribution function (Eq. (6)), as shown in Table 5.

Variables:

|  | C1 | C2 | … | Cn |
|---|---|---|---|---|
| A1 | X11 | X12 | … | X1n |
| A2 | X21 | X22 | … | X2n |
| A3 | X31 | X32 | … | X3n |
| Am | Xm1 | Xm2 | … | Xmn |

Assigned initial probabilities (true, false):

|  | C1 | … | Cn |
|---|---|---|---|
| A1 | n11, n11* | … | n1n, n1n* |
| A2 | n21, n21* | … | n2n, n2n* |
| A3 | n31, n31* | … | n3n, n3n* |
| Am | nm1, nm1* | … | nmn, nmn* |

Table 5.

Data decision matrix.

The joint probability distribution (JPD) of the variables set {X11, X21, …, Xmn} is given as follows:

$$P(X_{11}, X_{21}, \ldots, X_{mn}) = \prod_{i=1}^{m} \prod_{j=1}^{n} P(X_{ij} \mid \text{parents}(X_{ij})), \quad i = 1, 2, \ldots, m; \; j = 1, 2, \ldots, n \qquad (E6)$$
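
The following Python sketch illustrates Eqs. (5) and (6) using the DDG-51 column of Table 3; treating the criterion events as mutually independent is a simplifying assumption of this sketch, since the full conditional probability tables of the NETICA™ model are not reproduced here:

```python
# Sketch of Eqs. (5) and (6): each normalized score n_ij becomes the
# "true" probability of a binary criterion event, and 1 - n_ij the
# "false" probability. The values are the DDG-51 column of Table 3.
n_ddg51 = {
    "P&F": 1.0, "Cost": 0.984, "PT": 0.9717, "RAM": 1.0, "PR": 1.0,
    "CM": 1.0, "DR&TD": 0.947, "OA&S": 0.987, "TRL": 1.0, "O&SR": 1.0,
}

true_false = {c: (p, 1.0 - p) for c, p in n_ddg51.items()}   # Eq. (5)

# Eq. (6) under an independence assumption (the model's full conditional
# probability tables are not reproduced here): the probability of the
# "all criterion events true" assignment is the product of the
# per-event true probabilities.
p_all_true = 1.0
for p_true, _ in true_false.values():
    p_all_true *= p_true
print(round(p_all_true, 4))   # 0.8937
```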

In modeling the system, the IBS system data collected from experts with respect to the systems alternatives and criteria were analyzed using the SABANP. NETICA™ software was used to develop the model. NETICA™ is a powerful, easy-to-use, complete program for developing belief networks and influence diagrams. It provides an interface for drawing networks, and creating relationships between variables which can be probabilities, equations, or data files.

The systems were modeled using RAM, P&F, PR, CM, DR&TD availability, OA&S, TRL, PT, and O&SR as inherent factors that affect the system's design, with effects on the costs and time to obsolescence. Figure 1 displays the NETICA™ model after two simulation runs (N = 2), where N represents the number of times the model and simulation were run. A graphical representation for the IBS 1 on DDG-51 is shown in Figure 2. Figures 3 and 4 show the graphical representations of the SABANP models of the IBSs (2 and 3) on CG-47 and CVN-68, respectively.

Figure 1.

System design characteristics of the SABANP model (N = 2) with NETICA™ software.

Figure 2.

The IBS for DDG-51 and its equivalent SABANP model.

Figure 3.

The IBS for CG-47 and its equivalent SABANP model.

Figure 4.

The IBS for CVN-68 and its equivalent SABANP model.

The research question was centered on the following: Does using the newly derived methodology (SABANP) to evaluate multiple obsolescence characteristics in a system enable one to predict which system is less susceptible to obsolescence?

The null hypothesis is that SABANP cannot predict which system is less susceptible to obsolescence; the alternative hypothesis is that it can. Formal statistical analysis was not conducted. Rather, the model was run one hundred times (100×), and the results were aggregated for accepting or rejecting the null hypothesis. A sensitivity analysis was also performed on the results.
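
The aggregation step can be sketched as follows in Python; the five (worst, best) pairs echo the first rows of Table 14, and the remaining 95 pairs would come from the model runs:

```python
# Aggregation pattern of Tables 14-16: average the per-run (worst, best)
# probability pairs over the 100 runs. The five pairs below echo the
# first rows of Table 14; the remaining 95 pairs would come from the
# NETICA model runs and are omitted here.
runs = [(62.2, 37.8), (50.8, 49.2), (48.5, 51.5), (32.3, 67.7), (67.5, 32.5)]

worst = sum(w for w, _ in runs) / len(runs)
best = sum(b for _, b in runs) / len(runs)
print(f"aggregated worst: {worst:.2f}%, best: {best:.2f}%")
```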

Three questions were used within the survey to validate the SME inputs. These questions were used to cross-examine the survey data received from the experts. The data were analyzed to check for inconsistencies. Individual responses to system rankings and criteria weights were plotted to ensure that there were no outliers, and that the data are attributed to a credible sample of expert practitioners.

Each criterion was modeled on two functions, based on a true/false table. To construct the true table, an arbitrary phrase such as high, provided, available, supported, highly required, time to obsolescence, or best was used. For example, for RAM, the true table asks: what is the probability that the system has a high RAM? For P&F, the true table asks: what is the probability that the system's P&F is high? The same logic of the true table applies to the remaining criteria.

The same logic was applied to the false table. To construct the false table, an arbitrary phrase such as low, not provided, not available, unsupported, not required, no time, or worst was used. For example, for PT, the false table asks: what is the probability that Personnel Training is not required for the system? In addition, for time: what is the probability that the system will not be obsolete in two (2) years? Table 6 presents the true and false table probabilities. The true table probabilities are the normalized values of the expert judgments displayed in Table 3; the false table holds the complement of each event's probability, as described in Eq. (5). The criteria are modeled as events; for example, when an event's true table value is 0.9, its false table value is 0.1. The JPD of the model is given by Eq. (6) and modeled using NETICA™ for the IBS 1, IBS 2, and IBS 3 systems, respectively. Each system is modeled based on the probabilities of each criterion (RAM; P&F; PR; CM; DR&TD availability; OA&S; TRL; PT; and O&SR) given Cost and Time to obsolescence.

| Criteria | IBS DDG-51 TRUE | IBS DDG-51 FALSE | IBS CG-47 TRUE | IBS CG-47 FALSE | IBS CVN-68 TRUE | IBS CVN-68 FALSE |
|---|---|---|---|---|---|---|
| (1) P&F | 1 | 0 | 0.920 | 0.080 | 1 | 0 |
| (2) Cost | 0.984 | 0.016 | 1 | 0 | 0.815 | 0.185 |
| (3) PT | 0.9717 | 0.028 | 0.937 | 0.063 | 1 | 0 |
| (4) RAM | 1 | 0 | 0.889 | 0.111 | 0.959 | 0.041 |
| (5) PR | 1 | 0 | 0.896 | 0.104 | 0.976 | 0.024 |
| (6) CM | 1 | 0 | 0.896 | 0.104 | 0.976 | 0.024 |
| (7) DR&TD | 0.947 | 0.053 | 0.895 | 0.105 | 1 | 0 |
| (8) OA&S | 0.987 | 0.013 | 1 | 0 | 0.939 | 0.061 |
| (9) TRL | 1 | 0 | 1 | 0 | 0.979 | 0.021 |
| (10) O&SR | 1 | 0 | 0.938 | 0.062 | 0.978 | 0.022 |

Table 6.

True and false table.

3.2 Multi criteria decision making (MCDM)

What is MCDM? MCDM methods provide a way to combine qualitative data (such as expert opinions) and quantitative data in order to analyze various alternatives [17]. When nonlinear factors are present, MCDM techniques are beneficial for discriminating among alternatives. Nonlinear factors are cases where the units of measurement (e.g., feet, seconds, Fahrenheit, and miles/hr.) are not the same; in the case of linear factors, the units of measurement are the same across the attributes or criteria. MCDM techniques can be categorized as fuzzy, stochastic, or deterministic [20]. Additionally, a popular MCDM method, TOPSIS, was examined for validation of and/or comparison to the SABANP result. TOPSIS was chosen because of its popular usage in the literature and because it provides a deterministic data approach that accounts for expert judgment participation, the dimensional criteria space, the methodical representation, and the explanation.

Four steps are required in any decision-making approach that relies on numerical analyses of alternatives to assess system factors’ nonlinearity when selecting an MCDM methodology:

  1. establish the relevant alternatives and criteria,

  2. allocate numerical values to the criteria weights and the alternatives’ impacts on the criteria,

  3. assess the nonlinearity related to the system in consideration when choosing an MCDM method, and

  4. process numerical values to rank each alternative [17, 21].

There are several methods used to determine criteria weights, such as weight-assessment models [22, 23]. Certain authors have stated that no standard is available for defining which technique yields the most accurate criteria weights, because it is uncertain whether a given technique is biased [24]. Others have suggested pairwise assessment matrices to calculate the significance or weights of criteria [18, 25]. A weighting method can be categorized as subjective or objective, algebraic or statistical, decomposed or holistic, and indirect or direct [17, 26]. In this study, the direct weighting method was selected because it allows the decision maker to rank the criteria and provide subjective values for the criteria weights based on the defined rank. The direct weighting method was utilized in the TOPSIS analysis; SABANP, however, requires no weight inputs. Weights are often difficult to quantify when there are many experts, and requiring none both simplifies the decision matrix and removes a subjective step from the decision making.
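
As a sketch, the direct weighting computation reduces to normalizing the decision maker's raw importance scores so they sum to one; the raw scores below are hypothetical but chosen so the result reproduces the Table 2 weights:

```python
# Direct weighting sketch: the decision maker's raw importance scores
# are normalized to sum to one, as required of the w_j in Table 2. The
# raw scores below are hypothetical but chosen so that the normalized
# weights reproduce Table 2 exactly.
raw = {"P&F": 95, "Cost": 60, "PT": 30, "RAM": 75, "PR": 55,
       "CM": 35, "DR&TD": 45, "OA&S": 35, "TRL": 30, "O&SR": 40}

total = sum(raw.values())                      # 500
weights = {c: v / total for c, v in raw.items()}
assert abs(sum(weights.values()) - 1.0) < 1e-9
print({c: round(w, 2) for c, w in weights.items()})
```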

3.2.1 TOPSIS analysis

Established by Yoon [27] and Hwang & Yoon [28], the basic principle underlying TOPSIS is that the selected alternative should have the shortest distance from the positive ideal solution and the farthest distance from the negative ideal solution [29]. Figure 5, adapted from Adetunji's [6] graphic, depicts an example with two criteria, where $A^-$ is the negative ideal solution and $A^+$ is the positive ideal solution [17, 28]. The negative ideal solution is made up of the worst performance values of all the alternatives; the positive ideal solution is made up of the best performance values of all the alternatives.

Figure 5.

Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) graphical representation.

As the visual example in Figure 5 shows, justifying the selection of A1 is difficult [17, 29]; therefore, the proximity (relative closeness) to each of the performance poles $A^-$ and $A^+$ is measured in the Euclidean sense [17, 28, 30], i.e., as the square root of the sum of the squared distances along each axis in the 'attribute space' [30].

The following steps are required to conduct a TOPSIS analysis (a code sketch implementing them follows the list):

  1. The normalized decision matrix is first determined using Eq. (7). The normalized score $r_{ij}$ is calculated to transform the various attribute dimensions $X_{ij}$ from the raw data into non-dimensional attributes, thus allowing comparisons among the attributes [17, 28, 29]:

    $$r_{ij} = \frac{X_{ij}}{\sqrt{\sum_{i=1}^{m} X_{ij}^2}}, \quad i = 1, 2, \ldots, m; \; j = 1, 2, \ldots, n \qquad (E7)$$

    Table 7 shows the normalized decision matrix.

  2. Calculate the weighted normalized values ($v_{ij}$) by multiplying $r_{ij}$ by the criterion weights ($w_j$) [17, 28, 29] (see Table 8 and Eq. (8)):

    $$v_{ij} = w_j r_{ij}, \quad j = 1, 2, \ldots, n; \; i = 1, 2, \ldots, m \qquad (E8)$$

    The weighted normalized values are shown in the weight normalized matrix in Table 8.

  3. Evaluate the positive ($A^+$) and negative ($A^-$) ideal solutions using Eq. (9) [17, 28, 29]:

    $$A^+ = \{(\max_i v_{ij} \mid j \in J^+),\; (\min_i v_{ij} \mid j \in J^-)\} = \{v_1^+, v_2^+, \ldots, v_j^+, \ldots, v_n^+\} \qquad (E9)$$
    $$A^- = \{(\min_i v_{ij} \mid j \in J^+),\; (\max_i v_{ij} \mid j \in J^-)\} = \{v_1^-, v_2^-, \ldots, v_j^-, \ldots, v_n^-\}$$

    where $J^+ = \{j = 1, 2, \ldots, n \mid j \text{ relates to the benefit criteria}\}$ and $J^- = \{j = 1, 2, \ldots, n \mid j \text{ relates to the non-benefit criteria}\}$, with $i = 1, 2, \ldots, m$.

    The $A^+$ and $A^-$ solutions are shown in Tables 9 and 10, respectively.

  4. Calculate each alternative's separation from the positive ($S_i^+$) and negative ($S_i^-$) ideal solutions using the n-dimensional Euclidean distance, Eq. (10) [17, 28, 29]:

    $$S_i^+ = \sqrt{\sum_{j=1}^{n} (v_{ij} - v_j^+)^2}, \quad i = 1, 2, \ldots, m \qquad (E10)$$
    $$S_i^- = \sqrt{\sum_{j=1}^{n} (v_{ij} - v_j^-)^2}, \quad i = 1, 2, \ldots, m$$

    Tables 11 and 12 show the separation measures between the positive and negative ideal solutions.

  5. Calculate the relative closeness of each alternative to the ideal solution ($c_i^*$) [17, 28, 29] using Eq. (11):

    $$c_i^* = \frac{S_i^-}{S_i^+ + S_i^-}, \quad 0 < c_i^* < 1, \; i = 1, 2, \ldots, m \qquad (E11)$$

    $A_i = A^+$ if $c_i^* = 1$; $A_i = A^-$ if $c_i^* = 0$.

  6. Sort the alternatives by descending order of $c_i^*$: the nearer $c_i^*$ is to one, the higher the preference for the alternative [17, 28, 29].
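
A compact Python sketch of the full TOPSIS procedure (Eqs. (7)–(11)) is given below; the decision matrix and weight subset are illustrative rather than the study's data:

```python
import math

# TOPSIS sketch for Eqs. (7)-(11). Matrix rows are alternatives, columns
# criteria; `benefit` marks benefit (True) vs. non-benefit (False)
# criteria. Data below are illustrative, not the study's survey responses.
X = [
    [9.0, 620.0, 8.8],     # alternative 1: P&F, Cost, RAM
    [8.3, 610.0, 7.8],     # alternative 2
    [9.0, 750.0, 8.4],     # alternative 3
]
w = [0.19, 0.12, 0.15]     # subset of the Table 2 weights
benefit = [True, False, True]

m, n = len(X), len(X[0])

# Eq. (7): vector normalization per column.
norms = [math.sqrt(sum(X[i][j] ** 2 for i in range(m))) for j in range(n)]
R = [[X[i][j] / norms[j] for j in range(n)] for i in range(m)]

# Eq. (8): weighted normalized matrix.
V = [[w[j] * R[i][j] for j in range(n)] for i in range(m)]

# Eq. (9): positive/negative ideal solutions, respecting criterion direction.
col = lambda j: [V[i][j] for i in range(m)]
A_pos = [max(col(j)) if benefit[j] else min(col(j)) for j in range(n)]
A_neg = [min(col(j)) if benefit[j] else max(col(j)) for j in range(n)]

# Eq. (10): Euclidean separation from each ideal solution.
S_pos = [math.sqrt(sum((V[i][j] - A_pos[j]) ** 2 for j in range(n))) for i in range(m)]
S_neg = [math.sqrt(sum((V[i][j] - A_neg[j]) ** 2 for j in range(n))) for i in range(m)]

# Eq. (11): relative closeness; rank descending (closer to 1 is better).
C = [S_neg[i] / (S_pos[i] + S_neg[i]) for i in range(m)]
ranking = sorted(range(m), key=lambda i: -C[i])
print([round(c, 3) for c in C], "ranking:", [i + 1 for i in ranking])
```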

| Criteria | IBS DDG-51 Class | IBS CG-47 Class | IBS CVN-68 Class |
|---|---|---|---|
| (1) P&F | 0.386 | 0.355 | 0.386 |
| (2) Cost | 0.332 | 0.327 | 0.401 |
| (3) PT | 0.362 | 0.355 | 0.392 |
| (4) RAM | 0.377 | 0.364 | 0.388 |
| (5) PR | 0.399 | 0.354 | 0.382 |
| (6) CM | 0.397 | 0.356 | 0.387 |
| (7) DR&TD | 0.371 | 0.351 | 0.392 |
| (8) OA&S | 0.382 | 0.387 | 0.363 |
| (9) TRL | 0.373 | 0.373 | 0.366 |
| (10) O&SR | 0.374 | 0.398 | 0.382 |

Table 7.

TOPSIS normalized decision matrix for IBS.

| Criteria | IBS DDG-51 Class | IBS CG-47 Class | IBS CVN-68 Class |
|---|---|---|---|
| (1) P&F | 0.073 | 0.067 | 0.073 |
| (2) Cost | 0.040 | 0.039 | 0.048 |
| (3) PT | 0.022 | 0.021 | 0.024 |
| (4) RAM | 0.057 | 0.055 | 0.058 |
| (5) PR | 0.044 | 0.039 | 0.042 |
| (6) CM | 0.028 | 0.025 | 0.027 |
| (7) DR&TD | 0.033 | 0.032 | 0.035 |
| (8) OA&S | 0.027 | 0.027 | 0.025 |
| (9) TRL | 0.022 | 0.022 | 0.022 |
| (10) O&SR | 0.030 | 0.032 | 0.031 |

Table 8.

TOPSIS weighted normalized decision matrix for IBS.

| V1–V10 | Positive ideal |
|---|---|
| V1+ | 0.073 |
| V2+ | 0.039 |
| V3+ | 0.024 |
| V4+ | 0.058 |
| V5+ | 0.044 |
| V6+ | 0.028 |
| V7+ | 0.035 |
| V8+ | 0.027 |
| V9+ | 0.022 |
| V10+ | 0.030 |

Table 9.

TOPSIS positive ideal solutions for IBS.

| V1–V10 | Negative ideal |
|---|---|
| V1- | 0.067 |
| V2- | 0.048 |
| V3- | 0.021 |
| V4- | 0.055 |
| V5- | 0.039 |
| V6- | 0.025 |
| V7- | 0.032 |
| V8- | 0.025 |
| V9- | 0.022 |
| V10- | 0.032 |

Table 10.

TOPSIS negative ideal solutions for IBS.

| S1–S3 | Positive ideal |
|---|---|
| S1+ | 0.003 |
| S2+ | 0.010 |
| S3+ | 0.009 |

Table 11.

TOPSIS separation measures to positive ideal solutions for IBS.

| S1–S3 | Negative ideal |
|---|---|
| S1- | 0.012 |
| S2- | 0.009 |
| S3- | 0.009 |

Table 12.

TOPSIS separation measures to negative ideal solutions for IBS.

3.3 Delivery mechanism

Before administering the research survey, each participant was provided an information sheet, a consent form to complete, and instructions on how to complete the survey. Once retrieved, the survey files were password protected and saved. The experts' participation in the study was voluntary and anonymous. The breakdown of the experts' demographics is shown in Table 13.

| Position(s) | Organization(s) | Years of experience |
|---|---|---|
| (1) Principal Logistics Specialist | DoD (NSWC Philadelphia) | +25 |
| (2) Principal Logistics Specialist | DoD (NSWC Philadelphia) | +25 |
| (3) DMSMS/Obsolescence Engineer | DoD (NSWC Philadelphia) | 6 |
| (4) Logistics Manager | DoD Contractor (ICI Services) | +35 |
| (5) DMSMS/Obsolescence Analyst | NAVSEA SEA 21 | +7 |
| (6) ILS/Configuration Manager | DoD Contractor (ICI Services) | +35 |
| (7) DMSMS/Obsolescence Team Lead | DoD (NSWC Port Hueneme) | +10 |
| (8) ILS Program Manager | DoD (NSWC Port Hueneme) | +15 |
| (9) ILS/Systems Engineer | DoD Contractor (LCE Inc.) | +30 |
| (10) DMSMS/Obsolescence Manager | DoD (NSWC Philadelphia) | +20 |
| (11) Obsolescence Engineer | DoD Contractor (Alion Science) | +30 |
| (12) ILS Engineer | DoD Contractor (NDI Engineering) | +15 |
| (13) Operations Engineer | DoD (NSWC Philadelphia) | +10 |
| (14) ILS Program Manager | DoD (NSWC Philadelphia) | +15 |
| (15) DMSMS/ILS Engineer | DoD (NSWC Port Hueneme) | 4 |
| (16) DMSMS/Obsolescence Engineer | DoD (NSWC Philadelphia) | +5 |
| (17) DMSMS/Obsolescence Technical Rep. | DoD (NSWC Philadelphia) | +5 |
| (18) DMSMS/Obsolescence Analyst | DoD (NSWC Philadelphia) | +4 |
| (19) Systems Engineer | DoD Contractor (DDLOMNI Engineering) | +20 |

Table 13.

Experts’ demographics.


4. Results

Data collected from 100 runs of the SABANP model using the IBSs on DDG-51, CG-47, and CVN-68 were aggregated. The results revealed that the DDG-51 IBS has a 52.01% probability of being the best system among those examined in the trade-off analysis, as shown in Table 14. The CG-47 IBS is the second best with a 51.78% probability, as shown in Table 15, and the CVN-68 IBS is the third best with a 51.56% probability, as shown in Table 16. A sensitivity analysis conducted on the results over ten thousand runs shows that the DDG-51 IBS is less susceptible to obsolescence than its counterparts.

| Runs | Worst | Best |
|---|---|---|
| 1 | 62.2 | 37.8 |
| 2 | 50.8 | 49.2 |
| 3 | 48.5 | 51.5 |
| 4 | 32.3 | 67.7 |
| 5 | 67.5 | 32.5 |
| ⋮ | ⋮ | ⋮ |
| 99 | 72.9 | 27.1 |
| 100 | 49.6 | 50.4 |
| Aggregated values | 47.99 | 52.01 |

Table 14.

SABANP runs for the DDG-51 IBS.

| Runs | Worst | Best |
|---|---|---|
| 1 | 63.7 | 36.3 |
| 2 | 49.2 | 50.8 |
| 3 | 48.5 | 51.5 |
| 4 | 32.3 | 67.7 |
| 5 | 67.5 | 32.5 |
| ⋮ | ⋮ | ⋮ |
| 99 | 35.1 | 64.9 |
| 100 | 57.7 | 42.3 |
| Aggregated values | 48.22 | 51.78 |

Table 15.

SABANP runs for the CG-47 IBS.

| Runs | Worst | Best |
|---|---|---|
| 1 | 66.5 | 33.5 |
| 2 | 49.2 | 50.8 |
| 3 | 48.5 | 51.5 |
| 4 | 32.3 | 67.7 |
| 5 | 67.5 | 32.5 |
| ⋮ | ⋮ | ⋮ |
| 99 | 35.7 | 64.9 |
| 100 | 57.7 | 42.3 |
| Aggregated values | 48.44 | 51.56 |

Table 16.

SABANP runs for the CVN-68 IBS.

The results of the SABANP model also revealed that, in the 100 runs with respect to the best system, Procurement (PR), which includes vendor assembly and installation support, at 79.1% is the critical criterion path that could have the most adverse effect on a system's lifecycle operations, and it should be prioritized with respect to design, development, testing, and maintenance. This is followed by TRL at 76.1%, O&SR at 75%, and DR&TD at 72.1%. The least adverse effect is the availability of OA&S at 16.5%. Table 17 shows the breakdown of the likelihood of an impact for each obsolescence criterion across the selected systems.

| Criteria | Measure | IBS DDG-51 Class (Rank 1) | IBS CG-47 Class (Rank 2) | IBS CVN-68 Class (Rank 3) |
|---|---|---|---|---|
| (1) P&F | HIGH | 43.4% | 79.9% | 22.1% |
|  | LOW | 56.6% | 20.1% | 77.9% |
| (2) Cost | HIGH | 51.4% | 49.0% | 50.2% |
|  | LOW | 48.6% | 51.0% | 49.8% |
| (3) PT | NOT HIGHLY REQUIRED | 49.0% | 74.8% | 61.7% |
|  | HIGHLY REQUIRED | 51.0% | 25.2% | 38.3% |
| (4) RAM | HIGH | 28.6% | 17.8% | 92.4% |
|  | LOW | 71.4% | 82.2% | 7.6% |
| (5) PR | SUPPORTED | 79.1% | 72.8% | 34.4% |
|  | UNSUPPORTED | 20.9% | 27.2% | 65.6% |
| (6) CM | NOT AVAILABLE | 55.1% | 58.2% | 59.9% |
|  | AVAILABLE | 44.9% | 41.8% | 40.1% |
| (7) DR&TD | NOT PROVIDED | 72.1% | 61.3% | 62.4% |
|  | PROVIDED | 27.9% | 38.7% | 37.6% |
| (8) OA&S | NOT AVAILABLE | 16.5% | 42.2% | 75.4% |
|  | AVAILABLE | 83.5% | 57.8% | 24.6% |
| (9) TRL | HIGH | 76.1% | 67.4% | 37.2% |
|  | LOW | 23.9% | 32.6% | 62.8% |
| (10) O&SR | HIGH | 75.0% | 42.2% | 75.4% |
|  | LOW | 25.0% | 57.8% | 24.6% |
| Aggregated result | Worst-case probability | 47.99% | 48.22% | 48.44% |
|  | Best-case probability | 52.01% | 51.78% | 51.56% |

Table 17.

Likelihood of impact for each obsolescence criterion across the selected systems (N = 100).

| Systems | TOPSIS "S" score | TOPSIS "R" ranking |
|---|---|---|
| DDG-51 | 0.795 | 1 |
| CG-47 | 0.473 | 3 |
| CVN-68 | 0.494 | 2 |

Table 18.

IBS ranking of the TOPSIS solutions: "S" score and "R" ranking.

4.1 Validation

To validate the results, comparative analyses were done using TOPSIS, for which the weight data were required; for SABANP, normalization was the only required parameter. The results show that TOPSIS ranked the DDG-51 IBS as the best system, which matches the result provided by the SABANP model. The ranking results and comparisons to TOPSIS are found in Table 18.


5. Conclusions

The use of SABANP for obsolescence management, and as a method for selecting the system or technology that could potentially mitigate obsolescence early in the design stage of a system, was successfully demonstrated. The results also indicate that systems should be designed with a proactive obsolescence approach in mind. A deterministic MCDM model (TOPSIS) was also applied and produced similar results in identifying the best alternative for mitigating obsolescence. The proposed model is shown to successfully combine quantitative and qualitative expert judgment data, incorporating attributes such as risk and training criteria, in order to efficiently propagate the evidence of obsolescence.

The analysis conducted in this chapter can serve as a holistic analysis of obsolescence in systems. The results of the data analysis were presented to the experts, who concurred that the IBS on DDG-51 was the best system for mitigating obsolescence. Future research should focus on conducting systems engineering trade studies that use SABANP in decision making; this allows early program decisions to be documented and reduces long-term waste. Notwithstanding, it is recommended that all identified criteria in this work be prioritized equally in design, development, testing, and maintenance.

This chapter recommends the use of SABANP in obsolescence management in order to conduct trade studies for systems during the design stage. The chapter does not intend to be an authoritative decision mechanism nor provide a recommendation for the best MCDM to use for the normalization technique. Rather, it serves as a new tool, approach, or guidance in obsolescence decision management. Additionally, future research can evaluate the normalization of other MCDM techniques in comparison to the SAW that was utilized in the SABANP model.


Author statement

This chapter is an extension of the author's original published research article. The views conveyed in this article are those of the author and do not represent the official policy or position of NSWC Philadelphia, the DoD and its contractors, or the U.S. Government.


Acronyms

BBN: Bayesian Belief Network
CG-47: USS Ticonderoga
CM: Configuration Management
COTS: Commercial Off-the-Shelf
CVN-68: USS Nimitz
DDG-51: USS Arleigh Burke
DMSMS: Diminishing Manufacturing Sources and Material Shortages
DR&TD: Data Rights and Technical Documentation
IBS: Integrated Bridge System
MCDM: Multi-Criteria Decision Making
NETICA: NETICA™ software tool for modeling system criteria dependence
OA&S: Open Architecture and Standards
O&SR: Obsolescence Schedule Risk
P&F: Performance and Functionality
PR: Procurement
PT: Personnel Training
RAM: Reliability, Availability, and Maintainability
SABANP: Simple Additive Bayesian Allocation Network Process
SAW: Simple Additive Weighting
SME: Subject Matter Expert
TOPSIS: Technique for Order of Preference by Similarity to Ideal Solution
TRL: Technology Readiness Level

References

  1. Geng, F., Dubos, G. F., & Saleh, J. H. (2016). Spacecraft obsolescence: Modeling, value analysis, and implications for design and acquisition. 2016 IEEE Aerospace Conference. doi:10.1109/aero.2016.7500642
  2. Herald, T., Verma, D., Lubert, C., & Cloutier, R. (2009). An obsolescence management framework for system baseline evolution: Perspectives through the system life cycle. Systems Engineering, 12(1), 1–20. doi:10.1002/sys.20106
  3. Sandborn, P. (2007). Designing for technology obsolescence management. In IIE Annual Conference Proceedings (p. 1684). Institute of Industrial and Systems Engineers (IISE).
  4. Pecht, M. G., & Das, D. (2000). Electronic part life cycle. IEEE Transactions on Components and Packaging Technologies, 23(1), 190–192.
  5. Romero Rojo, F. J., Roy, R., & Shehab, E. (2009). Obsolescence management for long-life contracts: State of the art and future trends. The International Journal of Advanced Manufacturing Technology, 49(9–12), 1235–1250. doi:10.1007/s00170-009-2471-3
  6. Adetunji, O., Bischoff, J., & Willy, C. J. (2018). Managing system obsolescence via multicriteria decision making. Systems Engineering, 21(4), 307–321. doi:10.1002/sys.21436
  7. Adams, C. (2005). Getting a handle on COTS obsolescence. Avionics Magazine.
  8. Sandborn, P. (2008). Strategic management of DMSMS in systems. DSP Journal, 24–30.
  9. Roy, R., Stark, R., Tracht, K., Takata, S., & Mori, M. (2016). Continuous maintenance and the future: Foundations and technological challenges. CIRP Annals, 65(2), 667–688. doi:10.1016/j.cirp.2016.06.006
  10. Trabelsi, I., Zolghadri, M., Zeddini, B., Barkallah, M., & Haddar, M. (2020, July). FMECA-based risk assessment approach for proactive obsolescence management. In IFIP International Conference on Product Lifecycle Management (pp. 215–226). Springer, Cham.
  11. Verma, D., & Johannesen, L. H. (1999). Supportability engineering and logistics optimization/planning trends and challenges: A system integrator's perspective. In Proceedings of the International Logistics Congress, University of Exeter.
  12. Verma, D., Powers, T., Blanchard, B. S., Giffin, R. G., Webb, R., & VanBuskirk, D. (1996). COTS/NDI assessment and selection methodology. INCOSE International Symposium, 6(1), 123–128. doi:10.1002/j.2334-5837.1996.tb01993.x
  13. Department of Defense, United States (US DoD). (2011). Technology Readiness Assessment (TRA) guidance. Assistant Secretary of Defense for Research and Engineering. https://www.acq.osd.mil/chieftechnologist/publications/docs/tra2011.pdf
  14. Boudali, H., & Dugan, J. (2005). A discrete-time Bayesian network reliability modeling and analysis framework. Reliability Engineering & System Safety, 87(3), 337–349. doi:10.1016/j.ress.2004.06.004
  15. Geduk, S., & Ulusoy, İ. (2021). A practical analysis of sample complexity for structure learning of discrete dynamic Bayesian networks. Optimization. doi:10.1080/02331934.2021.1892105
  16. Kabir, S., & Papadopoulos, Y. (2019). Applications of Bayesian networks and Petri nets in safety, reliability, and risk assessments: A review. Safety Science, 115, 154–175. doi:10.1016/j.ssci.2019.02.009
  17. Georgiadis, D. R., Mazzuchi, T. A., & Sarkani, S. (2012). Using multi criteria decision making in analysis of alternatives for selection of enabling technology. Systems Engineering, 16(3), 287–303. doi:10.1002/sys.21233
  18. Afshari, A., Mojahed, M., & Yusuff, R. B. (2010). Simple additive weighting approach to personnel selection problem. International Journal of Innovation, Management and Technology, 1, 511–515.
  19. Fishburn, P. C. (1968). Utility theory. Management Science, 14(5), 335–378. doi:10.1287/mnsc.14.5.335
  20. Triantaphyllou, E., Shu, B., Sanchez, S. N., & Ray, T. (1998). Multi-criteria decision making: An operations research approach. Encyclopedia of Electrical and Electronics Engineering, 15, 175–186.
  21. Triantaphyllou, E., & Mann, S. H. (1989). An examination of the effectiveness of multi-dimensional decision-making methods: A decision-making paradox. Decision Support Systems, 5(3), 303–312. doi:10.1016/0167-9236(89)90037-7
  22. Heerkens, H. (2006). Assessing the importance of factors determining decision-making by actors involved in innovation processes. Creativity and Innovation Management, 15(4), 385–399.
  23. Olson, D. (2004). Comparison of weights in TOPSIS models. Mathematical and Computer Modelling, 40(7–8), 721–727. doi:10.1016/j.mcm.2004.10.003
  24. Weber, M., & Borcherding, K. (1993). Behavioral influences on weight judgments in multiattribute decision making. European Journal of Operational Research, 67(1), 1–12. doi:10.1016/0377-2217(93)90318-h
  25. Nydick, R. L., & Hill, R. P. (1992). Using the analytic hierarchy process to structure the supplier selection procedure. International Journal of Purchasing and Materials Management, 28(2), 31–36. doi:10.1111/j.1745-493x.1992.tb00561.x
  26. Ustinovichius, L., Zavadskas, E. K., & Podvezko, V. (2007). Application of a quantitative multiple criteria decision making (MCDM-1) approach to the analysis of investments in construction. Control and Cybernetics, 36(1), 251.
  27. Yoon, K. (1980). Systems selection by multiple attribute decision making [Ph.D. thesis]. Manhattan, KS: Kansas State University.
  28. Hwang, C., & Yoon, K. (1981). Methods for multiple attribute decision making. In Multiple Attribute Decision Making (pp. 58–191). doi:10.1007/978-3-642-48318-9_3
  29. Wang, R. (2001). Performance evaluation method: Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). Researcher.Nsc.Gov.tw/public/caroljoe/Data/02182133671.Ppt
  30. Kahraman, C. (2008). Multi-criteria decision making methods and fuzzy sets. In Springer Optimization and Its Applications (pp. 1–18). doi:10.1007/978-0-387-76813-7_1
