Comparison of different CSSO methods with direct SQP for an analytical optimization problem
1. Introduction
Concurrent Subspace Optimization (CSSO) is one of the main decomposition approaches in Multidisciplinary Design Optimization (MDO). It supports a collaborative, distributed multidisciplinary design optimization environment among different disciplinary groups. Sobieski first proposed the subspace optimization method (Sobieszczanski-Sobieski, 1988), and his blueprint was further developed by Bloebaum, who named it the concurrent subspace optimization method (Bloebaum, 1991). Renaud developed a second-order variant of the Global Sensitivity Equation (GSE) method and an alternative coordination procedure for the CSSO method (Renaud & Gabriele, 1993a, 1993b, 1994). Sellar proposed replacing the GSE with a neural-network-based response surface method (Sellar et al., 1996).
The CSSO method allows a complex coupled system to be decomposed into smaller, temporarily decoupled subsystems, each corresponding to a different discipline (subspace). Each subspace optimization minimizes the system objective function subject to its own constraints as well as constraints contributed by the other subspaces. Each subspace optimization uses its own high-fidelity analysis tools together with surrogate models or low-fidelity analysis tools provided by the other subspaces, so the subspace optimizations can be performed concurrently. The system-level coordination optimization is implemented entirely with approximate analysis tools. The subspace optimizations and the coordination optimization are performed alternately until convergence, with the final result decided by the coordination optimization. The CSSO method is therefore particularly suited to design organizations where tasks are distributed among different design groups.
The CSSO method was developed initially for single-objective MDO problems. However, most MDO problems are essentially multi-objective. In recent years much work (Aute & Azarm, 2006; Huang & Bloebaum, 2004; McAllister et al., 2000; McAllister et al., 2004; Orr & Hajela, 2005; Parashar & Bloebaum, 2006; Tappeta & Renaud, 1997; Zhang et al., 2008) has focused on extending existing MDO methods to handle such multi-objective MDO problems by integrating a multi-objective optimization method within the MDO framework. Methods of this kind are called multi-objective MDO methods.
Integrating a multi-objective optimization method within the CSSO framework is an effective way to develop multi-objective MDO methods. CSSO has been extended to solve multi-objective MDO problems in several forms, including the Multi-objective Pareto CSSO (MOPCSSO) method, the Multi-objective Range CSSO (MORCSSO) method, the Multi-objective Target CSSO (MOTCSSO) method, the Multi-objective Genetic Algorithm CSSO (MOGACSSO) method and the Adaptive Weighted Sum based CSSO (AWSCSSO) method. In MOPCSSO the constraint method is integrated within the CSSO framework (Huang & Bloebaum, 2007). In MORCSSO and MOTCSSO the concept of designer preference is introduced (Huang & Bloebaum, 2004). In MOGACSSO a Genetic Algorithm is combined with CSSO in the hope of improving computational efficiency (Parashar & Bloebaum, 2006). In AWSCSSO the Adaptive Weighted Sum method is introduced into CSSO (Zhang et al., 2008).
2. General description of MDO problem for aircraft system design
An aircraft is a complex engineering system involving multiple disciplines (such as aerodynamics, structures, thrust, noise, electronics, cost, etc.). These disciplines are not independent of each other. For example, deformation of a wing structure affects the aerodynamic lift distribution on the wing, which in turn causes a new deformation; this is the well-known aeroelastic problem. The designers in each discipline cannot work without considering the other disciplines, which makes aircraft system design complicated. Especially in aircraft preliminary design, specialists from different disciplines often have to work together and negotiate many design variables so that higher performance can be achieved. Aircraft system design is thus a typical multidisciplinary design problem.
In recent years, industry has paid more attention to improving efficiency in the design of complex systems, such as aircraft. MDO has emerged as an engineering discipline that focuses on the development of new design and optimization strategies for such complex systems. MDO researchers strive to reduce the time and cost associated with the coupling interaction among several disciplines. “Decomposition approaches provide many advantages for the solution of complex MDO problems, as they enable a partitioning of a large coupled problem into smaller, more manageable sub-problems. The resulting computational benefits, besides the obvious one associated with the solution of smaller problems, include creating a potential distributed processing environment. The primary benefit, however, pertains to the savings in personal hours, because groups are no longer required to wait around for other groups in the process to complete their design tasks.” (Huang & Bloebaum, 2007)
Mathematically for minimization problems the general form for MDO can be represented as follows:
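A representative statement of this general form (the notation here is assumed, chosen to match the two-subsystem examples used later in the chapter) is:

```latex
\begin{aligned}
\min_{\mathbf{x}} \quad & f(\mathbf{x}, \mathbf{y}) \\
\text{s.t.} \quad & g_j(\mathbf{x}, \mathbf{y}) \le 0, \qquad j = 1, \dots, m, \\
& \mathbf{y}_i = \mathbf{A}_i\!\left(\mathbf{x}, \mathbf{y}_{j \ne i}\right), \qquad i = 1, \dots, N,
\end{aligned}
```

in which x is the design variable vector, the g_j are the design constraints, and y_i are the coupled state variables computed by the contributing analysis A_i of subsystem i from the design variables and the states of the other subsystems.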
Where
3. Different frameworks for concurrent subspace optimization
Global Sensitivity Equation based CSSO and Response Surface based CSSO will be discussed in this chapter as the typical CSSO methods.
3.1. Global Sensitivity Equation based CSSO (GSECSSO)
GSECSSO is a bi-level optimization method. As an example, the GSECSSO method for a problem with subsystems 1 and 2 (Eq. (1): one objective and two coupled subsystems) is stated in the following paragraphs.
The mathematical models of subspace optimizations of GSECSSO can be written as
Where
By means of the Kreisselmeier-Steinhauser (KS) function (Kreisselmeier & Steinhauser, 1979), the cumulative constraint,
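As a numerical sketch, the KS aggregation of a set of inequality constraints g_j <= 0 can be implemented as below. The overflow-safe shifted form and the parameter value rho = 50 are assumptions for illustration, not values taken from the chapter:

```python
import numpy as np

def ks_aggregate(g, rho=50.0):
    """Kreisselmeier-Steinhauser envelope of constraints g_j(x) <= 0.

    KS(g) = g_max + (1/rho) * ln(sum_j exp(rho * (g_j - g_max)))
    The shift by g_max avoids overflow; KS >= max(g) and approaches it
    as rho grows, so enforcing KS(g) <= 0 conservatively enforces all g_j <= 0.
    """
    g = np.asarray(g, dtype=float)
    g_max = g.max()
    return g_max + np.log(np.sum(np.exp(rho * (g - g_max)))) / rho

# The envelope slightly overestimates the worst (largest) constraint value:
print(ks_aggregate([-0.5, -0.1, -0.2]))  # close to -0.1, but >= -0.1
```

A single cumulative constraint of this kind lets each subspace represent all of the other subspaces' constraints with one smooth scalar function.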
Where
The Global Sensitivity Equation (GSE) (Sobieszczanski-Sobieski, 1990) is a method for computing sensitivity derivatives of state (output) variables with respect to independent (input) variables for complex, internally coupled systems, while avoiding the cost and inaccuracy of finite differencing performed on the entire system analysis. The global sensitivities are calculated by means of the GSE expressed below:
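As a sketch of the idea, the snippet below assembles and solves the GSE for a hypothetical two-discipline linear system; the coupling coefficients are invented for illustration, and only the cheap local partial derivatives of each discipline are needed:

```python
import numpy as np

# Two coupled "disciplines" (linear so the answer is easy to check):
#   y1 = x + 0.5*y2      (discipline 1)
#   y2 = 2*x - 0.25*y1   (discipline 2)
# Local partial derivatives, obtainable within each discipline alone:
dy1_dy2, dy1_dx = 0.5, 1.0
dy2_dy1, dy2_dx = -0.25, 2.0

# GSE: (I - coupling Jacobian) @ dY/dx = local partials w.r.t. x
A = np.array([[1.0, -dy1_dy2],
              [-dy2_dy1, 1.0]])
b = np.array([dy1_dx, dy2_dx])
dY_dx = np.linalg.solve(A, b)
print(dY_dx)  # total derivatives [dy1/dx, dy2/dx] ~ [1.778, 1.556]
```

The linear solve replaces repeated full system analyses: no finite differencing of the coupled system is ever performed.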
The responsibility coefficients divide the responsibility of satisfying constraints among all participating subspaces.
The mathematical model of coordination optimization of GSECSSO can be written as
Where
The flowchart of GSECSSO is shown in Fig. 1. The original problem is decomposed and the mathematical models of subspace optimizations and coordination optimization are established as Eq. (2) and Eq. (5). Based on the initial design
In GSECSSO, the subspace optimizations are implemented concurrently with respect to disjoint subsets of the design variables, which substantially reduces the complexity of the optimization problem within each disciplinary group. The updated design vector is the simple combination of the local optimal design sub-vectors. This offers designers a significant potential saving in computational effort. However, the convergence of GSECSSO is sometimes oscillatory and premature. This is because the trade-off coefficients and the KS parameter
3.2. A variant of GSECSSO
The variant of GSECSSO (Huang & Bloebaum, 2004) is much more efficient than the original GSECSSO. It is adopted in multiple multi-objective CSSO methods (Huang & Bloebaum, 2004; Parashar & Bloebaum, 2006; Zhang et al., 2008).
In the variant of GSECSSO, several modifications are made: (1) the trade-off coefficients are abandoned, since all trade-offs occur directly through minimization of the objective functions within each subspace; (2) the KS parameter,
Taking the MDO problem in Eq. (1) (one objective, two coupled subsystems, i.e. N=2) as an example, the mathematical models of subspace optimizations for the modified CSSO can be written as
Where
The flowchart of the modified CSSO method is shown in Fig. 2. In the first stage, the system analysis is implemented based on the initial design variables. If the initial scenario does not satisfy the constraints, a minimization is required to reduce the initial infeasibility of the constraints as much as possible. In the second stage, all the sensitivities are computed using the GSE. Based on this sensitivity information, the impact of each design variable on each subspace is analyzed, and each design variable is allocated to the subspace on which it has the greatest impact. In the third stage, the subspace optimizations are performed concurrently. There is no system optimization in this method. The updated design vector is the simple combination of the local optimal design sub-vectors.
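The variable-allocation step can be sketched with a few lines; the sensitivity matrix below is hypothetical, standing in for the GSE results:

```python
import numpy as np

def allocate_variables(sensitivity):
    """sensitivity[i, j] = |d(subspace-i response)/d(x_j)| from the GSE.
    Each design variable is assigned to the subspace it influences most."""
    return np.argmax(np.abs(sensitivity), axis=0)

# 2 subspaces, 4 design variables (illustrative numbers):
S = np.array([[3.0, 0.1, 2.0, 0.0],
              [0.5, 1.2, 2.5, 0.8]])
print(allocate_variables(S))  # x0 goes to subspace 0, x1..x3 to subspace 1
```

The resulting disjoint partition is what lets the subspace optimizations run concurrently without contending for the same variables.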
The variant of GSECSSO converges better than the original GSECSSO. Nevertheless, how to set the starting value and the increment of the KS parameter remains unclear. These two values affect convergence to some extent and need further investigation. The constraint minimization brings extra computational cost. Furthermore, the linear approximation may sometimes cause oscillatory and premature convergence.
3.3. Response Surface based CSSO (RSCSSO)
The RSCSSO method performs optimization in a bi-level framework. Taking the MDO problem in Eq. (1) (one objective, two coupled subsystems, i.e. N=2) as an example, the mathematical models of the subspace optimizations for RSCSSO can be written as
Where
The mathematical model of the coordination optimization of RSCSSO can be written as
In the system-level optimization, all the state variables are approximately calculated and the design variables of all subspaces are optimized.
The flowchart of RSCSSO is shown in Fig. 3. Firstly, the original problem is decomposed into the subspace optimizations in Eq. (7) and the coordination optimization in Eq. (8). Then a set of sample points of the design variable vector is generated via a Design of Experiments (DOE) method, such as orthogonal experimental design. The system analysis is performed at these points and the results are stored in the database.
Subsequently these data are used to generate the response surface models for the state variables. After that the subspace optimizations are performed simultaneously; their results are analyzed and augmented into the database to update the response surface models, as flow (1) in Fig. 3. Then all of the design variables are optimized in the coordination optimization, whose result is also used to update the response surface models, as flow (2) in Fig. 3. The subspace optimizations and the coordination optimization are performed alternately until stable convergence is achieved.
In the RSCSSO method, the role of the subspace optimizations is to generate sufficient design information for the approximation. After the concurrent subspace optimizations, a global approximation problem is formulated about the current design vector using information stored in the design database. It is the coordination procedure over this global approximation that drives constraint satisfaction and the overall system optimization. The RSCSSO method exhibits fast and robust convergence. However, as the number of design variables grows, the number of sample points needed to create the response surface models increases greatly. Furthermore, augmenting the database with optimal points is a questionable way of improving the response surface models. Since the final result is decided entirely by the coordination optimization, the subspace optimizations have little impact on the results and could even be abandoned.
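The fit-and-augment cycle can be sketched minimally with a one-variable polynomial surrogate standing in for a full multivariate response surface; the test function and sample points are assumptions for illustration:

```python
import numpy as np

def fit_rs(X, y, degree=2):
    """Fit a polynomial response surface to the database by least squares
    (1-D sketch of the multivariate response surfaces used in RSCSSO)."""
    return np.polyfit(X, y, degree)

# Initial DOE samples of a "true" state variable y(x) = x**2 + x:
X = np.array([0.0, 1.0, 2.0, 3.0])
y = X**2 + X
coeffs = fit_rs(X, y)

# Augment the database with a new (e.g. subspace-optimal) point and refit,
# mirroring flows (1) and (2) in Fig. 3:
X = np.append(X, 1.5)
y = np.append(y, 1.5**2 + 1.5)
coeffs = fit_rs(X, y)
print(np.round(coeffs, 6))  # surrogate recovers x**2 + x: ~[1, 1, 0]
```

In the actual method the surrogate replaces the coupled system analysis inside both the subspace and coordination optimizations, which is where the savings in system analyses come from.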
3.4. Examples
3.4.1. Example 1: An analytical example with coupling relationship
The mathematical model for an analytical MDO example considering two coupling disciplines is of the form
Where
The comparison of the results obtained by the different methods is listed in Table 1. The Sequential Quadratic Programming (SQP) method is taken as a reference for this comparative study. From Table 1, several conclusions can be drawn: (1) the number of system-level analyses can be remarkably reduced with the CSSO methods, which shows that the CSSO methods are well suited to MDO problems; (2) the variant of GSECSSO outperforms the RSCSSO method, both in the optimization results and in the required number of system- and discipline-level analyses.
Variables | SQP | Variant of GSECSSO | RSCSSO
 | 8.002860 | 8.002943 | 8.031944
 | 3.028427 | 3.028442 | 2.993963
 | 0.000000 | 0.000000 | 0.192359
 | 0.000000 | 0.000000 | 0.000000
Number of system analyses | 122 | 7 | 38
Number of discipline 1 analyses | 0 | 72 | 179
Number of discipline 2 analyses | 0 | 64 | 144
3.4.2. Example 2: Gear reducer optimization
This example is taken from the reference (Azarm & Li, 1989). The objective is to minimize the overall weight, subject to constraints on the bending and torque stresses. The mathematical model for the optimization problem is as follows:
The optimization problem is classified into two disciplines: bearing discipline and shaft discipline, without any coupling relationship.
The optimization problem is solved by SQP and by the different CSSO methods. The results are given in Table 2 and the convergence histories are shown in Fig. 5. From this comparative study, some conclusions can be drawn: (1) both GSECSSO and the variant of GSECSSO reduce the number of system analyses, which in turn improves efficiency; (2) no reduction of the system analyses is achieved with RSCSSO; nevertheless, its optimization procedure and data-flow interface are much simpler, and it can potentially be improved by parallel system analysis of the sample points and parallel subspace optimization; (3) the variant of GSECSSO outperforms the original GSECSSO.
Variables | SQP | GSECSSO | Variant of GSECSSO | RSCSSO
 | 2994.341316 | 2995.606759 | 2995.607422 | 2994.193811
 | 3.5 | 3.500000 | 3.500000 | 3.500021
 | 0.7 | 0.7 | 0.7 | 0.700000
 | 17 | 17 | 17 | 17.000000
 | 7.3 | 7.300000 | 7.3 | 7.300000
 | 7.715320 | 7.733301 | 7.733330 | 7.715316
 | 3.350215 | 3.352131 | 3.352156 | 3.349631
 | 5.286654 | 5.287256 | 5.287246 | 5.286643
Number of system analyses | 40 | 15 | 11 | 51
Number of bearing-discipline analyses | 0 | 326 | 188 | 280
Number of shaft-discipline analyses | 0 | 2986 | 1818 | 288
4. Different frameworks for multi-objective concurrent subspace optimization
The MOPCSSO, MORCSSO and AWSCSSO methods will be discussed in this chapter as the typical multi-objective CSSO methods.
4.1. Multi-objective Pareto CSSO (MOPCSSO)
The constraint method is an effective multi-objective optimization method that optimizes the preferred objective while treating the others as constraints. MOPCSSO is developed by introducing the constraint method into the variant of GSECSSO.
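As a stand-alone illustration of the constraint method itself (outside any CSSO decomposition), the sketch below minimizes a preferred objective f1 under a budget on f2. The quadratic objectives and the brute-force grid solver are assumptions chosen for compactness; each CSSO subspace would use its own gradient-based optimizer instead:

```python
import numpy as np

# Two competing objectives over a 2-D design space:
f1 = lambda x0, x1: (x0 - 1.0)**2 + x1**2
f2 = lambda x0, x1: x0**2 + (x1 - 1.0)**2

def eps_constraint(eps, n=201):
    """Constraint method: minimize the preferred objective f1
    subject to f2 <= eps (brute-force grid for illustration)."""
    g = np.linspace(0.0, 1.0, n)
    X0, X1 = np.meshgrid(g, g)
    F1, F2 = f1(X0, X1), f2(X0, X1)
    F1 = np.where(F2 <= eps, F1, np.inf)  # mask infeasible points
    i = np.unravel_index(np.argmin(F1), F1.shape)
    return X0[i], X1[i], F1[i]

# Tightening eps trades f1 for f2, tracing points along the Pareto front:
for eps in (1.0, 0.5, 0.25):
    print(eps, eps_constraint(eps))
```

Sweeping the bound eps over a range of values is what generates the set of Pareto points.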
Taking the MDO problem in Eq. (1) (two objectives, two coupled subsystems, i.e. N=2) as an example, the mathematical models of subspace optimizations of MOPCSSO can be written as
The variant of GSECSSO described in Sub-section 3.2 is adopted for the single-objective optimization of each objective function. The flowchart of MOPCSSO is the same as that in Fig. 2. During the system optimization, either all objective functions are improved simultaneously or individual objective functions are improved without worsening the others. The optimization continues until no further improvement can be made, so that a Pareto optimum is finally obtained.
4.2. Multi-objective Range CSSO (MORCSSO)
The goal programming method is one of the most popular multi-objective optimization techniques that take the designer's preference into account. MORCSSO and the Multi-objective Target CSSO (MOTCSSO) method are both developed by combining the idea of goal programming with MOPCSSO.
The
If the design point does not meet the preference at beginning, the first-stage optimizations are implemented to obtain a design in the preferred range from the starting point.
The mathematical models of subspace optimizations of MORCSSO can be written as
In the second stage the design point is gradually optimized closer to the Pareto frontier within the preferred range.
After the optimization in the first stage, the design lies in the designer's preferred objective range. The mathematical models of the subspace optimizations of MORCSSO are then the same as those of MOPCSSO in Eq. (11).
In the course of the second-stage optimization, the design point may move out of the preferred objective range again. In such a case, the optimization below should be performed.
For the MORCSSO method, in the case of
Why does the optimization fail in this case? It can be analyzed from Eq. (15) in the objective space shown in Fig. 7. For Eq. (15), (1)
How to solve this problem? From Fig. 7, we only need to move the line
4.3. Adaptive Weighted Sum based CSSO (AWSCSSO)
The procedure for solving the Pareto front by AWSCSSO is similar to the Adaptive Weighted Sum (AWS) method (Kim & de Weck, 2004, 2005). As an example, the AWSCSSO method for a generic bi-objective problem with subsystems 1 and 2 is stated in the following paragraphs (Eq. (1): two objectives and two coupled subsystems). AWSCSSO can be applied in the same way to multi-objective problems with three or more subsystems.
In the first stage a rough profile of the Pareto front is determined.
The variant of GSECSSO described in Sub-section 3.2 is adopted for the single-objective optimization of each objective function, and each objective function is normalized as follows:
When
Where
Then, with a large step size of the weighting factor, the usual weighted sum method is used within the variant of GSECSSO to approximate the Pareto front quickly. The subspace optimization for AWSCSSO can be expressed as
Where the value with symbol ‘^’ above is a linearly approximated one,
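The first-stage normalization and coarse weighted-sum scan can be sketched as follows; the one-variable objectives are invented for illustration, and the grid search stands in for a real subspace optimization:

```python
import numpy as np

f1 = lambda x: x**2
f2 = lambda x: (x - 2.0)**2

# Normalize with utopia/pseudo-nadir values from the single-objective
# optima (x* = 0 for f1, x* = 2 for f2), as in the AWSCSSO first stage:
f1_u, f1_n = f1(0.0), f1(2.0)
f2_u, f2_n = f2(2.0), f2(0.0)

xs = np.linspace(0.0, 2.0, 401)
front = []
for w in np.linspace(0.0, 1.0, 6):   # large step size: rough profile only
    J = (w * (f1(xs) - f1_u) / (f1_n - f1_u)
         + (1 - w) * (f2(xs) - f2_u) / (f2_n - f2_u))
    x_star = xs[np.argmin(J)]
    front.append((f1(x_star), f2(x_star)))
print(front)  # 6 coarse Pareto points between (4, 0) and (0, 4)
```

The coarse front produced here is exactly the input that the patch-refinement stage described next operates on.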
By estimating the size of each Pareto patch, the regions to be refined in the objective space are determined. An example of mesh refinement in AWSCSSO is shown in Fig. 9, where hollow points represent the newly refined nodes PE (expected solutions) while solid points represent the initial four nodes that define the patch. A quadrilateral patch is taken as the example in Fig. 9. If the line segment connecting two neighboring nodes of a patch is too long, it is divided into two equal segments, and the central point becomes a new refined node. These refined nodes are connected to form a refined mesh. Then the sub-optimizations in Eq. (19) are performed with different additional constraints for the different refined nodes, and new Pareto points are obtained. Next, according to the prescribed density of Pareto points, any Pareto-front patch that is still too large is refined again in the same way. The refinement and sub-optimizations are repeated until the number of Pareto points no longer increases.
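The segment-splitting rule can be sketched as below for a bi-objective front, where a patch degenerates to a segment between neighboring Pareto points; the coarse front and length threshold are hypothetical:

```python
import numpy as np

def refine_front(points, max_len):
    """Insert midpoints into Pareto-front segments longer than max_len;
    each new node seeds one additional sub-optimization (cf. Eq. (19))."""
    points = [np.asarray(p, dtype=float) for p in points]
    refined = [points[0]]
    for a, b in zip(points, points[1:]):
        if np.linalg.norm(b - a) > max_len:
            refined.append((a + b) / 2.0)  # split into two equal segments
        refined.append(b)
    return refined

coarse = [(0.0, 4.0), (1.0, 1.0), (4.0, 0.0)]
print([tuple(p) for p in refine_front(coarse, max_len=2.0)])
```

Repeating this on the updated front until no segment exceeds the threshold yields the prescribed Pareto-point density.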
In the subsequent stage only these regions are specified as feasible domains for sub-optimization problem with additional constraints. Each Pareto front patch is then refined by imposing additional equality constraints that connect the pseudo nadir point (PN) and the expected Pareto optimal solutions (PE) on a piecewise planar surface in the objective space (as shown in Fig. 10).
Sub-optimizations are defined by imposing additional constraint
The additional inequality constraint is
Where L is the adaptive relax factor that is less than 1.
Fig. 11 shows the framework of AWSCSSO. In Fig. 11,
4.4. Examples
4.4.1. Example 1: Convex Pareto front
This problem is taken from a test problem (Huang, 2003) available in the NASA Langley Research Center MDO Test Suite. It has two objectives, F1 and F2, to be minimized, with ten inequality constraints, four coupled state variables and ten design variables in two coupled subsystems. The mathematical model is not listed here for brevity; we refer readers to test problem 1 in the corresponding references.
The comparison of the solutions obtained by MOPCSSO and AWSCSSO is shown in Fig. 12. For this problem with a convex Pareto front, a uniformly spaced, widely distributed and smooth Pareto front is obtained by the AWSCSSO method, whereas MOPCSSO does not capture the whole range of the front.
4.4.2. Example 2: Non-convex Pareto front
This problem consists of two objective functions, six design variables and six constraints. Two objectives,
The comparison of the Pareto fronts obtained by AWSCSSO and MOPCSSO is shown in Fig. 13. For this problem with a non-convex Pareto front, the more uniformly spaced, more widely distributed and smoother Pareto front is again obtained by the AWSCSSO method.
4.4.3. Example 3: Conceptual design of a subsonic passenger aircraft
The mathematical model of this problem is defined as
The objective functions in Eq. (23) are to maximize useful load fraction (
Design variable ∕ unit | Symbol | Lower limit | Upper limit
Wing area ∕ m² | S | 111.48 | 232.26
Aspect ratio | AR | 9.5 | 10.5
Design gross weight ∕ 10³ kg | | 63.504 | 113.400
Installed thrust ∕ 10³ kg | | 12.587 | 24.948
Two disciplines, aerodynamics and weight, are considered in this problem. The dataflow between and within the subsystems is analyzed in Fig. 14, where
The two disciplines are coupled. When the state variables in aerodynamics, such as the cruise velocity for longest range, lift coefficients, zero-lift drag coefficients, skin-friction drag coefficients and lift-to-drag ratio, are computed, some state variables in weight, such as
The Pareto front obtained using AWSCSSO is shown in Fig. 16. Each solution on the Pareto front is obtained using CSSO with iterative subspace optimizations. Taking one of the optimal designs as an example, the values of the design variables are:
5. Conclusion
The CSSO method is one of the main bi-level MDO methods. Several CSSO methods for single- and multi-objective MDO problems have been discussed in this chapter. It can be concluded that: (1) the number of system analyses can be greatly reduced by using the CSSO methods, which in turn improves efficiency; (2) the CSSO methods enable concurrent design and optimization by different design groups, which further improves efficiency; (3) the CSSO methods are effective and applicable to both single-objective and multi-objective MDO problems.
Among the CSSO methods, although the RSCSSO method is more robust, it effectively reduces to a single-level surrogate-model-based MDO method, since the subspace optimizations have little impact on the results. The GSECSSO method is therefore more promising as a bi-level method and worth further study; future work on it will focus on improving its robustness and efficiency. Among the multi-objective CSSO methods, the AWSCSSO method performs better at obtaining widely distributed Pareto points. Future work on the multi-objective CSSO methods will focus on improving solution quality and on testing them on more realistic engineering design problems.
References
1. Aute, V. & Azarm, S. (2006). "Genetic Algorithms Based Approach for Multidisciplinary Multiobjective Collaborative Optimization," AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference, Virginia, AIAA 2006-6953.
2. Azarm, S. & Li, W.C. (1989). "Multi-level Design Optimization Using Global Monotonicity Analysis," ASME Journal of Mechanisms, Transmissions, and Automation in Design, Vol. 111, No. 2, pp. 259-263.
3. Bloebaum, C.L. (1991). "Formal and Heuristic System Decomposition in Structural Optimization," NASA-CR-4413.
4. Huang, C.H. (2003). "Development of Multi-Objective Concurrent Subspace Optimization and Visualization Methods for Multidisciplinary Design," Ph.D. Dissertation, The State University of New York, New York.
5. Huang, C.H. & Bloebaum, C.L. (2004). "Incorporation of Preferences in Multi-Objective Concurrent Subspace Optimization for Multidisciplinary Design," 10th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference, New York, AIAA 2004-4548.
6. Huang, C.H. & Bloebaum, C.L. (2007). "Multi-Objective Pareto Concurrent Subspace Optimization for Multidisciplinary Design," AIAA Journal, Vol. 45, No. 8, pp. 1894-1906.
7. Kreisselmeier, G. & Steinhauser, R. (1979). "Systematic Control Design by Optimizing a Vector Performance Index," Zurich, Switzerland.
8. Kim, I.Y. & de Weck, O.L. (2004). "Adaptive Weighted Sum Method for Multiobjective Optimization," 10th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference, New York, AIAA 2004-4322.
9. Kim, I.Y. & de Weck, O.L. (2005). "Adaptive Weighted-sum Method for Bi-objective Optimization: Pareto Front Generation," Structural and Multidisciplinary Optimization, pp. 149-158.
10. Lewis, K. & Mistree, F. (1995). "Designing Top-level Aircraft Specifications: A Decision-based Approach to a Multiobjective, Highly Constrained Problem," Bellevue, WA, AIAA 1431.
11. Lewis, K. (1997). "An Algorithm for Integrated Subsystem Embodiment and System Synthesis," Ph.D. Dissertation, Georgia Institute of Technology, Atlanta, Georgia, August.
12. McAllister, C.D., Simpson, T.W. & Yukish, M. (2000). "Goal Programming Applications in Multidisciplinary Design Optimization," 8th Symposium on Multidisciplinary Analysis and Optimization, CA, AIAA 2000-4717.
13. McAllister, C.D., Simpson, T.W., Lewis, K. & Messac, A. (2004). "Robust Multiobjective Optimization through Collaborative Optimization and Linear Physical Programming," 10th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference, New York, AIAA 2004-4549.
14. Orr, S.A. & Hajela, P. (2005). "Genetic Algorithm Based Collaborative Optimization of a Tiltrotor Configuration," 46th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference, Texas, AIAA 2005-2285.
15. Parashar, S. & Bloebaum, C.L. (2006). "Multi-objective Genetic Algorithm Concurrent Subspace Optimization (MOGACSSO) for Multidisciplinary Design," 47th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference, Rhode Island, AIAA 2006-2047.
16. Renaud, J.E. & Gabriele, G.A. (1993a). "Second Order Based Multidisciplinary Design Optimization Algorithm Development," Advances in Design Automation, Vol. 65, No. 2, pp. 347-357.
17. Renaud, J.E. & Gabriele, G.A. (1993b). "Improved Coordination in Non-hierarchic System Optimization," AIAA Journal, Vol. 31, No. 12, pp. 2367-2373.
18. Renaud, J.E. & Gabriele, G.A. (1994). "Approximation in Non-hierarchic System Optimization," AIAA Journal, Vol. 32, No. 1, pp. 198-205.
19. Sellar, R.S., Batill, S.M. & Renaud, J.E. (1996). "Response Surface Based Concurrent Subspace Optimization for Multidisciplinary System Design," 34th AIAA Aerospace Sciences Meeting, AIAA 96-0714.
20. Sobieszczanski-Sobieski, J. (1988). "Optimization by Decomposition: A Step from Hierarchic to Non-hierarchic Systems," Recent Advances in Multidisciplinary Analysis and Optimization, NASA CP-3031, Hampton.
21. Sobieszczanski-Sobieski, J. (1990). "Sensitivity of Complex, Internally Coupled Systems," AIAA Journal, Vol. 28, No. 1, pp. 153-160.
22. Tappeta, R.V. & Renaud, J.E. (1997). "Multiobjective Collaborative Optimization," ASME Journal of Mechanical Design, pp. 403-411.
23. Zhang, K.S., Han, Z.H., Li, W.J. & Song, W.P. (2008). "Bilevel Adaptive Weighted Sum Method for Multidisciplinary Multi-Objective Optimization," AIAA Journal, Vol. 46, No. 10, pp. 2611-2622.