December 23, 2015

Optimization with a Fixed Budget

The development of modern products is hardly imaginable without advanced engineering simulation and optimization tools. To stay competitive, it is important to find, in a short time, a design that provides the best values of all efficiency criteria and meets all requirements and constraints.

At the design stage, an engineer works with a computational model that stands in for the product under development. The usual practice is to build this model in CAD/CAE systems (figure 1). A complex product requires a complex computational model, and the quality of the final optimal configuration depends on the quality of that model. Obviously, the more complex the computational model, the more resources each calculation requires. The engineer therefore always has to find a compromise between the quality of the developed product and the time required to find the optimal solution.

Figure 1. Scheme of computational model

For this reason, it is common practice to use metamodels, or surrogate models, in place of the resource-intensive computational model. Surrogate models are fast but approximate; they are built from data obtained by running the more accurate underlying computational model.
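To make the idea concrete, here is a minimal sketch of building a surrogate from a handful of runs of an "expensive" model. It uses plain Python with SciPy rather than pSeven's tools, and the function expensive_model together with all numerical settings are illustrative stand-ins, not part of the pSeven examples.

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    # Stand-in for a resource-intensive computational model with one output.
    # In practice this would be a CAD/CAE chain, not a closed-form function.
    def expensive_model(x):
        return np.sin(3.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2

    rng = np.random.default_rng(0)
    x_train = rng.uniform(-1.0, 1.0, size=(30, 2))   # 30 runs of the underlying model
    y_train = expensive_model(x_train)

    # Fast but approximate surrogate fitted to the training data.
    surrogate = RBFInterpolator(x_train, y_train, kernel="thin_plate_spline")

    x_new = rng.uniform(-1.0, 1.0, size=(5, 2))
    print(surrogate(x_new))          # cheap surrogate predictions
    print(expensive_model(x_new))    # reference values from the "expensive" model

Once fitted, such a surrogate can be evaluated many times at negligible cost compared to re-running the underlying model.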

The pSeven algorithmic core provides a wide range of methods and tools for building surrogate models: some are specialized implementations of well-known techniques (e.g., RSM, splines, linear regression, Kriging) [5], others are our proprietary techniques (e.g., Higher Dimensional Approximation, Sparse Gaussian Processes, Tensor Approximation and Incomplete Tensor Approximation, Tensor Gaussian Processes) [5]. However, the engineer need not worry about choosing the proper technique for a particular problem: our SmartSelection tool is smart enough to do it for him.

Once the surrogate model is built, the engineer can use it not only to conduct a detailed study of the designed product, but also to solve optimization problems. The most common approach, which we denote Approx+Optimizer, can be described as follows; a minimal code sketch of the whole workflow is given after figure 3.

1) Build a surrogate model for each output parameter of the computational model (figure 2). A central question is how to choose the training data for the approximation model. The pSeven algorithmic core provides various design of experiments (DOE) techniques for this purpose [6]; the most widely used is Latin Hypercube Sampling (LHS) [6]. The number of points is limited by the resources the engineer has at his disposal: it is customary to say that the engineer sets a budget, i.e. the number of calls of the computational model (N). At this stage, the training data are usually generated without taking functional constraints on design variables and output parameters into account.

Figure 2. Building of surrogate model

2) Solve the initial optimization problem with the surrogate model in the loop (figure 3).

3) Validate the obtained optimal solution using the initial computational model (figure 3).

Figure 3. Optimization based on surrogate model
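The following sketch walks through the three steps above under the same assumptions as before: plain Python with SciPy instead of pSeven, toy closed-form stand-ins for the expensive objective and constraint responses, and an arbitrary budget N. It generates an LHS plan, builds one surrogate per output, optimizes on the surrogates, and validates the result against the "true" model.

    import numpy as np
    from scipy.stats import qmc
    from scipy.interpolate import RBFInterpolator
    from scipy.optimize import minimize

    # Toy stand-ins for the outputs of the computational model: one objective
    # and one constraint response. In a real workflow both would be expensive.
    def mass(x):
        return (x[:, 0] - 0.3) ** 2 + (x[:, 1] + 0.2) ** 2

    def stress(x):
        return 1.0 - x[:, 0] - x[:, 1]          # treated as feasible when <= 0.5

    budget_n = 40                                # the budget N: calls of the expensive model
    sampler = qmc.LatinHypercube(d=2, seed=1)
    x_doe = qmc.scale(sampler.random(budget_n), [-1.0, -1.0], [1.0, 1.0])  # step 1: LHS plan

    mass_hat = RBFInterpolator(x_doe, mass(x_doe))        # one surrogate per output
    stress_hat = RBFInterpolator(x_doe, stress(x_doe))

    # Step 2: solve the optimization problem on the surrogates only.
    result = minimize(
        lambda x: mass_hat(x[None, :])[0],
        x0=np.zeros(2),
        bounds=[(-1.0, 1.0), (-1.0, 1.0)],
        constraints=[{"type": "ineq", "fun": lambda x: 0.5 - stress_hat(x[None, :])[0]}],
        method="SLSQP",
    )

    # Step 3: validate the optimum with the original model.
    x_opt = result.x[None, :]
    print("optimum found on surrogates:", result.x)
    print("validated constraint value:", stress(x_opt)[0], "(feasible if <= 0.5)")

Because the constraint is only approximated, the validated value may exceed the limit that was respected during the surrogate optimization; this is exactly the infeasibility risk discussed below.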

The advantage of this approach is its simplicity. The engineer can specify an acceptable budget, and if data from previous experiments are available, they can be used directly to train the surrogate model. However, the approach is not free of drawbacks: the optimum found may turn out to be infeasible after the validation stage, in other words, it may violate at least one constraint. In this situation it is unclear which solution the engineer should select as the best one. Moreover, it is difficult to give the engineer recommendations that guarantee a feasible optimal solution. As shown below, increasing the size of the training data set does not guarantee that the final solution will satisfy all imposed constraints.

An alternative approach is to use the global optimization method SBO (Surrogate-Based Optimization), developed by researchers at DATADVANCE. Details of this method can be found on our website [1-3]. Importantly, the final solution reached by the SBO method always satisfies all constraints.
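The SBO algorithm itself is described in [1-3]; the sketch below only illustrates the generic idea behind surrogate-based optimization, namely interleaving cheap surrogate refits with single runs of the expensive model, again using a toy function and plain SciPy rather than pSeven. A production method additionally balances exploration against exploitation and handles constraints, both of which this sketch omits.

    import numpy as np
    from scipy.interpolate import RBFInterpolator
    from scipy.optimize import minimize

    def expensive_objective(x):                  # stand-in for one expensive response
        return (x[0] - 0.3) ** 2 + np.sin(5.0 * x[1])

    rng = np.random.default_rng(2)
    x_data = rng.uniform(-1.0, 1.0, size=(8, 2))       # small initial sample
    y_data = np.array([expensive_objective(x) for x in x_data])

    budget_n = 20                                      # total calls of the expensive model
    while len(y_data) < budget_n:
        surrogate = RBFInterpolator(x_data, y_data)    # cheap model refit on all data so far

        # Next candidate: minimum of the current surrogate from a few random starts.
        # (A full SBO method would also use an exploration / infill criterion.)
        best = min(
            (minimize(lambda x: surrogate(x[None, :])[0],
                      x0=rng.uniform(-1.0, 1.0, 2),
                      bounds=[(-1.0, 1.0)] * 2) for _ in range(5)),
            key=lambda r: r.fun,
        )
        x_new = best.x
        if np.min(np.linalg.norm(x_data - x_new, axis=1)) < 1e-8:
            x_new = rng.uniform(-1.0, 1.0, 2)          # avoid duplicate points: explore instead

        x_data = np.vstack([x_data, x_new])            # one more run of the expensive model
        y_data = np.append(y_data, expensive_objective(x_new))

    best_idx = int(np.argmin(y_data))
    print("best design found:", x_data[best_idx], "objective:", y_data[best_idx])

The point is that the budget is spent one evaluation at a time, with the surrogate steering where each expensive run is placed, instead of being spent up front on a fixed DOE.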

The SmartSelection tool mentioned above can also select the best-suited method for solving the optimization problem. This selection is based on the problem statement and other information (so-called Hints) provided by the engineer. For example, if the criteria and/or constraints are marked as computationally expensive, the SBO method is selected automatically (figure 4).

Figure 4. GUI of Optimizer block

The implementation of the SBO method in pSeven can make use of previously computed configurations. These data should be sent to the designs port of the Optimizer block (figure 5) in the following column order: design variables, criteria, and constraints.

Figure 5. Ports of Optimizer block
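As an illustration of this layout (the values below are placeholders, not results of any simulation), previously computed configurations can be arranged as a table whose columns follow the order described above: design variables first, then criteria, then constraint responses.

    import numpy as np

    rng = np.random.default_rng(0)

    # Ten previously computed configurations (placeholder values only).
    x_vars = rng.uniform(0.0, 1.0, size=(10, 3))       # design variables
    criteria = rng.uniform(0.0, 1.0, size=(10, 1))     # objective values
    constraints = rng.uniform(0.0, 1.0, size=(10, 2))  # constraint responses

    # Column order for the designs port: variables, criteria, constraints.
    initial_designs = np.hstack([x_vars, criteria, constraints])
    print(initial_designs.shape)                       # (10, 6)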

It is important to note that the main idea of the SBO method is not to build a highly accurate surrogate model, but to build a model that strikes a compromise between approximation accuracy and the number of expensive computational model runs. The method can therefore estimate the required budget itself, based on the number of design variables, criteria, and constraints; at the same time, the user may impose a budget explicitly.

The algorithmic implementation of the SBO method (this is also true for all other methods implemented in pSeven) can take linear constraints on the input design variables into account during the iterative process; in other words, all generated points satisfy these constraints. This topic will be discussed in upcoming notes on our website, so stay tuned.

We employed both approaches to solve a single-objective optimization problem for a high-speed rotating disk, an essential part of a gas-turbine engine. There are two distinct models of this disk in the Examples section of the pSeven package: the first is based on analytical equations, the second on the SolidWorks and ANSYS packages (figure 6). You can find the details of this problem in Examples and on our website [4]. In short, the problem is to minimize the disk mass (mass, kg) by varying 6 geometric design parameters, subject to two constraints: a strength constraint on the maximum stress in the disk (smax ≤ 600 MPa) and an assembly constraint on the radial displacement (umax ≤ 0.3 mm). Note that this example is not intended as an extensive investigation of the efficiency of the two approaches; it is for demonstration purposes only.

Figure 6. Geometrical model of rotating disk in SolidWorks and stresses in ANSYS
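For reference, the acceptance rule applied after validation can be stated explicitly: a design is feasible only if both constraint responses computed by the original model stay within their limits. The helper below is a hypothetical illustration of that check, not part of the pSeven example.

    # Feasibility check for a validated disk design (hypothetical helper).
    def is_feasible(smax_mpa, umax_mm):
        return smax_mpa <= 600.0 and umax_mm <= 0.3

    print(is_feasible(585.0, 0.28))   # True: both constraints satisfied
    print(is_feasible(612.0, 0.25))   # False: strength constraint violated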

Table 1. Results of experiments

Figure 7. Results of experiments (red circle marker denotes infeasible solution)

Thus, "Approx+Optimizer" approach provides designs with a smaller mass than SBO method while solving this particular problem. However, after validation step, the most part of these designs (with N<230) are turned out to be infeasible.  These solutions are shown in red in Table 1 and Figure 7. These results are connected with the fact that imposed budget is not sufficient to build accurate models for objective function and constraints. For the same reason (small budget) SBO method achieves the best solution only for N=230. SBO method sets this value automatically. However, in all experiments the solutions obtained by SBO are feasible. Moreover, we see that SBO method has a tendency to improve the optimal solution with the increase in the budget.

In conclusion, for large N the SBO and "Approx+Optimizer" approaches give equally good results on this problem. The results are very promising and demonstrate the potential of running optimization on surrogate models. Once again, we emphasize that this example is not intended as an extensive investigation of the efficiency of the two approaches; it serves demonstration purposes only.

Resources:

  1. DATADVANCE pSeven Core Documentation - GTOpt - Surrogate-Based Optimization
  2. DATADVANCE Tech Tips: SBO algorithms for expensive functions optimization
  3. DATADVANCE Tech Tips: Notes on surrogate-based optimization
  4. DATADVANCE pSeven Core - Generic Tool for Sensitivity and Dependency Analysis (GT SDA)
  5. DATADVANCE pSeven Core - Generic Tool for Approximation (GT Approx)
  6. DATADVANCE pSeven Core - Generic Tool for Design of Experiments (GT DoE)
