July 27, 2016

New surrogate-based robust design optimization approach in pSeven

In robust optimization the goal is to find a good solution that is stable against certain types of uncertainty. We consider the following mathematical formulation of the robust optimization problem:

$$
\begin{aligned}
\min_{X}\quad & \mathbb{E}_{\varsigma\sim S}\big[F(X,\varsigma)\big]\\
\text{s.t.}\quad & \mathbb{E}_{\varsigma\sim S}\big[C_1^i(X,\varsigma)\big]\le C_i,\\
& \mathbb{P}_{\varsigma\sim S}\big\{C_2^i(X,\varsigma)\le L_i\big\}\ge \alpha,
\end{aligned}
$$

where X denotes the design variables bounded by box bounds, ς is the uncertainty represented by a probability distribution S, and F, C1 and C2 are functions that form statistical objectives (mean values) and constraints (mean values or probabilities), depending on deterministic (X) and stochastic (ς) parameters. Li, Ci and α are given bounds.

In pSeven, as a generic approach to solving such optimization problems, we provide a robust optimization technique based on the common random numbers (CRN) variance reduction technique. The idea of the approach is to reduce the original problem to a series of approximating optimization problems. Each approximating problem is constructed from the original one by replacing the statistical operations with finite-sample approximations, and is solved with gradient-based methods. CRN makes the approximating problems smoother. The solution of each approximating problem is used as the starting point for the next one. Gradually increasing the sample size, and hence the accuracy of the approximating problems, ensures convergence to a solution of the original problem and also avoids wasting evaluations far from the solution. This approach is local but efficient for smooth problems, provided that CRN can be generated and no strict restriction on the number of evaluations is imposed.
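A minimal sketch of this scheme, assuming a toy quadratic objective, a normal distribution for ς and an illustrative sample-size schedule (none of which come from pSeven itself), might look as follows. The essential point is that each approximating problem freezes its random sample (the common random numbers), so the sample-average objective is a smooth deterministic function of X that a gradient-based solver can minimize, and each solution warm-starts the next, larger sub-problem:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def F(x, s):
    # Hypothetical objective: a quadratic in x perturbed by the stochastic vector s.
    return (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2 + s @ x

def saa_objective(x, sample):
    # Sample-average approximation of E[F(x, s)] over a FIXED sample.
    # Because the sample (the common random numbers) does not change between
    # calls, this is a smooth deterministic function of x.
    return np.mean([F(x, s) for s in sample])

x = np.zeros(2)                        # starting point of the first sub-problem
for n in (16, 64, 256, 1024):          # gradually increase the sample size
    sample = rng.normal(size=(n, 2))   # frozen for the whole sub-problem
    res = minimize(saa_objective, x, args=(sample,), method="BFGS")
    x = res.x                          # warm-start the next, more accurate sub-problem

print("approximate robust optimum:", x)
```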

In the previous release (pSeven 6.7) we introduced a new robust optimization technique, RSBO, based on surrogate models (response surfaces). This new approach, which can also handle integer variables, does not rely on CRN and is globalized by default. Within this approach, surrogate models are built over sample-based approximations of the objectives and constraints of the investigated problem. The approach employs optimal computational budget allocation to adapt the sample size and the intensity of exploration in order to satisfy a strict limit on the number of evaluations.
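As a rough illustration of the general principle (not RSBO itself), the sketch below runs a generic surrogate-based optimization loop under a fixed evaluation budget; the test function, the per-point sample size and the simple lower-confidence-bound exploration rule are all assumptions, and a Gaussian process from scikit-learn stands in for the surrogate models:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

def f(x, s):
    # Hypothetical noisy objective of one design variable x.
    return np.sin(3 * x) + x ** 2 + 0.1 * s

def mean_estimate(x, n):
    # Sample-based approximation of E[f(x, s)] from n stochastic evaluations.
    return float(np.mean(f(x, rng.normal(size=n))))

budget, per_point = 3000, 30                 # total budget, evaluations per design
X = list(np.linspace(-1.0, 2.0, 8))          # initial space-filling designs
y = [mean_estimate(x, per_point) for x in X]
spent = per_point * len(X)

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel())
grid = np.linspace(-1.0, 2.0, 200)
while spent + per_point <= budget:
    gp.fit(np.array(X)[:, None], y)
    mu, sd = gp.predict(grid[:, None], return_std=True)
    x_next = grid[np.argmin(mu - sd)]        # go where the surrogate is low or uncertain
    X.append(float(x_next))
    y.append(mean_estimate(x_next, per_point))
    spent += per_point

print("best design found under the budget:", X[int(np.argmin(y))])
```

Most candidate designs are screened by the cheap surrogate rather than by the expensive stochastic evaluations; RSBO additionally adapts the sample size and the intensity of exploration through its budget allocation, which this sketch does not attempt to reproduce.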

These two approaches have different requirements and cover different types of applications. The CRN-based approach is suited to smooth analytical problems, while RSBO finds its application in engineering optimization problems with stochasticity. A quantitative comparison of the two is therefore hardly meaningful; below we provide a qualitative overview of the results obtained by both approaches on a test problem.

Let us consider a test bi-objective problem:

where

In this problem the goal is to identify the robust Pareto frontier while taking the probabilistic constraints into account.

   
[Figure: optimization results; left panel: mean estimation, right panel: error estimation.]

The pictures above show the optimization results. The CRN-based approach can identify the robust frontier accurately, but building such a frontier requires a large number of evaluations (about 300 000 in this case). The RSBO approach can work under a much more restrictive computational budget (3 000 evaluations). The frontier coverage is less precise, but together with the error estimation it gives a good overview for identifying promising design areas for more intensive investigation.

Several notes about the results

1. It is theoretically difficult to obtain a good estimate of probabilistic constraints. In contrast to the expectation, which converges quickly, the probability converges much more slowly: the standard error of a frequency estimate scales as sqrt(p(1 - p)/n), so small tail probabilities need very large samples to be estimated with useful relative accuracy. For common problems a hundred evaluations are usually enough to estimate a mean, but it may take thousands of evaluations to estimate a probability, and for a good approximation of the Pareto front hundreds of thousands of evaluations become almost unavoidable (see the numerical sketch after this list). The question we tried to resolve by introducing RSBO is how to find a better trade-off between the number of evaluations and accuracy.
2. Because RSBO is less accurate, some results slightly violate the constraints and go beyond the Pareto front.
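A small Monte Carlo experiment illustrates note 1; the setup (standard normal samples and a tail probability of about 0.01) is assumed purely for demonstration:

```python
import numpy as np

rng = np.random.default_rng(2)
reps = 200       # repetitions used to measure the scatter of each estimator
p_true = 0.0099  # P{N(0,1) > 2.33}, the tail probability being estimated

for n in (100, 1_000, 10_000):
    means = [rng.normal(size=n).mean() for _ in range(reps)]
    probs = [(rng.normal(size=n) > 2.33).mean() for _ in range(reps)]
    print(f"n={n:>6}: std of mean estimate = {np.std(means):.4f}, "
          f"relative std of probability estimate = {np.std(probs) / p_true:.2f}")
```

With a hundred samples the mean is already estimated to within about 10% of the unit noise scale, while the relative error of the 1% probability estimate is still close to 100%; only thousands of samples bring it down to a usable level.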

 

By Alexis Pospelov, Senior Researcher, DATADVANCE