14.2. GTOpt Tests

14.2.1. Methodology

GTOpt testing is based on solving various single- and multi-objective optimization problems with or without constraints. Problems that require a special approach (surrogate model assisted methods) due to expensive evaluations of objectives and/or constraints, as well as robust optimization problems, are not yet included in the test suite.

This section describes the general methodology and defines the performance measures. Test problem definitions are found in the Test Problems section.

14.2.1.1. Performance Measures

Since testing does not consider the internal details of the optimization methods used, the test criteria are selected to be as general as possible. We consider the following measures:

  • distance \(Q\) to the analytical or the best known solution,
  • total number of objective function evaluations \(N_f\), and
  • total number of constraint function evaluations \(N_c\).

Thus, for every test problem we obtain a triple \((Q, N_f, N_c)\) which describes GTOpt performance in this test. By definition, non-regression means that the current result is not allowed to be dominated in the Pareto sense by the previous result \((Q, N_f, N_c)_{prev}\) for the same test. In other words, the test requires that GTOpt performance does not degrade in the Pareto sense compared to previous versions:

\[(Q, N_f, N_c)_{now} \leq (Q, N_f, N_c)_{prev}.\]

14.2.1.1.1. Distance

The distance \(Q\) to the analytical or the best known solution may be regarded as the quality of the GTOpt solution. In general, the optimization result contains a number of optimal feasible points, so the proposed \(Q\) is the Hausdorff half-distance defined as

\[Q(\mathrm{sol}, \mathrm{sol}_{\mathrm{ref}}) = \max_{x \in \mathrm{sol}} \min_{y \in \mathrm{sol}_{\mathrm{ref}}} \rho(x, y),\]

where \(\mathrm{sol}\) stands for the obtained solution, \(\mathrm{sol}_{\mathrm{ref}}\) denotes the reference set, and \(\rho(x,y)\) is the objective-space Euclidean distance between two points. When no analytical solution is available, the reference set is assumed to be sufficiently dense (to contain an adequate number of optimal points).
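For illustration, the following is a minimal sketch of how the Hausdorff half-distance could be computed with NumPy; the array names `sol` and `sol_ref` (rows are points in objective space) are assumptions made for this example and do not refer to any GTOpt API.

```python
import numpy as np

def hausdorff_half_distance(sol, sol_ref):
    """Q(sol, sol_ref) = max over x in sol of min over y in sol_ref of rho(x, y).

    Both arguments are 2D arrays whose rows are points in objective space.
    """
    sol = np.asarray(sol, dtype=float)
    sol_ref = np.asarray(sol_ref, dtype=float)
    # Pairwise Euclidean distances: dist[i, j] = rho(sol[i], sol_ref[j]).
    dist = np.linalg.norm(sol[:, None, :] - sol_ref[None, :, :], axis=-1)
    # For each obtained point take the distance to the nearest reference point,
    # then take the worst (largest) of these values.
    return dist.min(axis=1).max()

# Example: a two-point "solution" compared against a denser reference set.
Q = hausdorff_half_distance([[0.0, 1.0], [1.0, 0.0]],
                            [[0.0, 1.0], [0.5, 0.5], [1.0, 0.0]])
print(Q)  # 0.0 -- every obtained point lies on the reference set
```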

14.2.1.1.2. Objective Evaluations

\(N_f\) is the total number of objective function evaluations required to finish the run. If the problem provides analytical derivatives for objectives, we add \(N \cdot N_{grad}\) to this number, where \(N\) is the number of problem variables and \(N_{grad}\) is the total number of objective gradient evaluations.

14.2.1.1.3. Constraint Evaluations

\(N_c\) is the total number of constraint function evaluations required to finish the run. If the problem provides analytical derivatives for constraints, we add \(N \cdot N_{cgrad}\) to this number, where \(N\) is the number of problem variables and \(N_{cgrad}\) is the total number of constraint gradient evaluations.
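The same accounting rule applies to both measures (one analytical gradient call is charged as \(N\) function evaluations), so a single counting wrapper suffices. Below is a hedged sketch of such a wrapper; the class name and attributes are illustrative only and are not part of GTOpt.

```python
class EvaluationCounter:
    """Counts effective evaluations as n_values + n_vars * n_gradients."""

    def __init__(self, func, grad, n_vars):
        self.func, self.grad, self.n_vars = func, grad, n_vars
        self.n_values = 0
        self.n_gradients = 0

    def value(self, x):
        self.n_values += 1
        return self.func(x)

    def gradient(self, x):
        self.n_gradients += 1
        return self.grad(x)

    @property
    def effective_evaluations(self):
        # One analytical gradient is charged as N function evaluations.
        return self.n_values + self.n_vars * self.n_gradients
```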

14.2.1.2. Performance Profile

Section Performance Measures defines common criteria which are not based on any properties of the underlying optimization algorithms and apply directly to any optimization software, so they may also be used to compare different solvers. However, these definitions are limited to one particular test problem, which is insufficient for representative non-regression testing. To this end, the conventional approach of “averaging” the results over multiple test cases is used to reveal GTOpt quality with respect to its previous revisions.

For GTOpt the performance profile \(P\) is a function showing the share of test cases for which the performance measure (\(Q\), \(N_f\) or \(N_c\)) is below the given value \(a\):

\[P(a) = \frac{\textrm{number of test cases TC with } \mathrm{Perf}(\mathrm{TC}) < a}{\textrm{total number of test cases}},\]

where \(\mathrm{Perf}(\mathrm{TC})\) is the performance measure. The performance profile function \(P\) monotonically increases from 0 to 1; better results (with regard to the considered performance measure \(\mathrm{Perf}\)) correspond to higher-lying performance profiles.
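As an illustration, a performance profile can be computed from a list of per-test measures as in the sketch below; the function and variable names are assumptions for this example only.

```python
import numpy as np

def performance_profile(measures, a_values):
    """Share of test cases whose performance measure is below each threshold a.

    measures : per-test values of one measure (Q, N_f or N_c)
    a_values : thresholds at which the profile P(a) is evaluated
    """
    measures = np.asarray(measures, dtype=float)
    a_values = np.asarray(a_values, dtype=float)
    # P(a) = |{TC : Perf(TC) < a}| / total number of test cases
    return (measures[None, :] < a_values[:, None]).mean(axis=1)

# Example: objective evaluation counts from five hypothetical test cases.
N_f = [120, 45, 300, 80, 150]
thresholds = [50, 100, 200, 400]
print(performance_profile(N_f, thresholds))  # [0.2 0.4 0.8 1. ]
```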

14.2.2. Test Problems

This section includes individual GTOpt test problems (actual problem definitions used in tests).

14.2.2.1. Single-Objective Problems

task001

Single-objective box-constrained problem in 2D.

\[\begin{split}\begin{array}{c} f(\overline{x}) = 100 (x_1 - x_0^2)^2 + (1 - x_0)^2 \\ x_0 \in [0, 2] \\ x_1 \in [0, 2] \end{array}\end{split}\]

Initial guess: \(x_0 = x_1 = 0\)

Analytical solution: \(x_0 = x_1 = 1, f = 0\)
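The GTOpt-specific test code is not reproduced here; however, the problem itself can be cross-checked with any general-purpose optimizer. The following is a minimal sketch using scipy.optimize.minimize (an assumption for illustration, not the actual test harness) to verify the analytical solution of task001.

```python
from scipy.optimize import minimize

def f(x):
    # task001 objective (Rosenbrock function restricted to [0, 2]^2)
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

result = minimize(f, x0=[0.0, 0.0], bounds=[(0.0, 2.0), (0.0, 2.0)],
                  method="L-BFGS-B")
print(result.x, result.fun)  # expected: approximately [1, 1] and 0
```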

task002

Single-objective constrained problem in 4D.

\[\begin{split}\begin{array}{c} f(\overline{x}) = -x_0 \\ c_0(\overline{x}) = x_1 - x_0^3 - x_2^2 \\ c_1(\overline{x}) = x_0^2 - x_1 - x_3^2 \\ c_2(\overline{x}) = x_1 - x_0^3 \\ c_3(\overline{x}) = x_0^2 - x_1 \\ x \in [-1, 20]^4 \\ c_0 = 0 \\ c_1 = 0 \\ c_2 \geq 0 \\ c_3 \geq 0 \\ \end{array}\end{split}\]

Initial guess: \(x_0 = x_1 = x_2 = x_3 = 10\)

Analytical solution: \(x_0 = x_1 = 1, x_2 = x_3 = 0, f = -1\)

task003

Single-objective constrained problem in 2D.

\[\begin{split}\begin{array}{c} f(\overline{x}) = x_0^2 + x_1^2 \\ c(\overline{x}) = x_0 + x_1 + 2 \\ x \in [-4, 4]^2 \\ c \leq 0 \end{array}\end{split}\]

Initial guess: \(x_0 = x_1 = -2\)

Analytical solution: \(x_0 = x_1 = -1, f = 2\)
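As above, the analytical solution of task003 can be cross-checked with a general-purpose constrained optimizer; the sketch below uses scipy.optimize.minimize with SLSQP, which is an assumption for illustration and not the actual test setup.

```python
from scipy.optimize import minimize

def f(x):
    # task003 objective
    return x[0] ** 2 + x[1] ** 2

# The problem requires c(x) = x0 + x1 + 2 <= 0; SLSQP expects
# inequality constraints in the form g(x) >= 0, hence the sign flip.
constraints = [{"type": "ineq", "fun": lambda x: -(x[0] + x[1] + 2.0)}]

result = minimize(f, x0=[-2.0, -2.0], bounds=[(-4.0, 4.0)] * 2,
                  constraints=constraints, method="SLSQP")
print(result.x, result.fun)  # expected: approximately [-1, -1] and 2
```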

task004

Single-objective constrained problem in 2D.

\[\begin{split}\begin{array}{c} f(\overline{x}) = 100 x_0^2 + 100 x_1^2 - x_0 - 100 \\ c(\overline{x}) = x_0^2 + x_1^2 \\ x \in [-\infty, \infty]^2 \\ c = 1 \end{array}\end{split}\]

Initial guess: \(x_0 = 0.08, x_1 = 0.06\)

Analytical solution: \(x_0 = 1, x_1 = 0, f = -1\)

task005

Single-objective constrained problem in 4D.

\[\begin{split}\begin{array}{c} f(\overline{x}) = x_0 - x_1 - x_2 - x_0 x_2 + x_0 x_3 + x_1 x_2 - x_1 x_3 \\ 8 - x_0 - 2 x_1 \geq 0 \\ 12 - 4 x_0 - x_1 \geq 0 \\ 12 - 3 x_0 - 4 x_1 \geq 0 \\ 8 - 2 x_2 - x_3 \geq 0 \\ 8 - x_2 - 2 x_3 \geq 0 \\ 5 - x_2 - x_3 \geq 0 \\ x \in [0, \infty]^4 \end{array}\end{split}\]

Initial guess: \(x_0 = x_1 = x_2 = x_3 = 0.1\)

Analytical solution: \(x_0 = 0, x_1 = 3, x_2 = 0, x_3 = 4, f = -15\)

task006

Single-objective unconstrained problem in 2D.

\[\begin{split}\begin{array}{c} f(\overline{x}) = (x_0 - 10^4)^2 + (x_1 - 2 \cdot 10^{-4})^2 + (x_0 x_1 - 2)^2 \\ x \in [-\infty, \infty]^2 \end{array}\end{split}\]

Initial guess: \(x_0 = x_1 = 1\)

Analytical solution: \(x_0 = 10^4, x_1 = 2 \cdot 10^{-4}, f = 0\)

task007

Single-objective unconstrained problem in 4D.

\[\begin{split}\begin{array}{c} f(\overline{x}) = (x_0-1)^2 + 100 ( (x_1 - x_0^2)^2 + (x_2 - x_1^2)^2 + (x_3 - x_2^2)^2 ) \\ x \in [-\infty, \infty]^4 \end{array}\end{split}\]

Initial guess: \(x_0 = -1, x_1 = x_2 = x_3 = 1\)

Analytical solution: \(x_0 = x_1 = x_2 = x_3 = 1, f = 0\)

14.2.2.2. Multi-Objective Problems

task008

Multi-objective box-constrained problem in 2D.

\[\begin{split}\begin{array}{c} f_{0}(\overline{x}) = 4 (x_0 - 1)^2 + (x_1 - 1)^2 \\ f_{1}(\overline{x}) = (x_0 + 1)^2 + 4 (x_1 + 1)^2 \\ x \in [-10, 10]^2 \end{array}\end{split}\]

Initial guess: \(x_0 = x_1 = 0\)

Analytical solution: \(x_1 = \frac{x_0 + 1 + 16 (x_0 - 1)}{x_0 + 1 - 16 (x_0 - 1)}\)
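For reference, the sketch below evaluates the analytical Pareto set relation above and maps it to the objective space; sampling \(x_0\) between the individual minima of \(f_1\) at \((-1,-1)\) and of \(f_0\) at \((1,1)\) is an assumption made for this illustration.

```python
import numpy as np

def f0(x0, x1):
    return 4.0 * (x0 - 1.0) ** 2 + (x1 - 1.0) ** 2

def f1(x0, x1):
    return (x0 + 1.0) ** 2 + 4.0 * (x1 + 1.0) ** 2

# Analytical Pareto set: x1 as a function of x0 (see the relation above).
x0 = np.linspace(-1.0, 1.0, 101)
x1 = (x0 + 1.0 + 16.0 * (x0 - 1.0)) / (x0 + 1.0 - 16.0 * (x0 - 1.0))

# Corresponding Pareto front in objective space.
front = np.column_stack([f0(x0, x1), f1(x0, x1)])
print(front[0], front[-1])  # endpoints: minimum of f1 and minimum of f0
```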

task009

Multi-objective box-constrained problem in 2D.

\[\begin{split}\begin{array}{c} f_{0}(\overline{x}) = x_0^4 + x_1^4 - x_0^2 + x_1^2 - 10 x_0 x_1 + 0.25 x_0 + 20 \\ f_{1}(\overline{x}) = (x_0 - 1)^2 + x_1^2 \\ x \in [-2, 2]^2 \end{array}\end{split}\]

Initial guess: \(x_0 = x_1 = -1\)

Analytical solution: not known (reference set: GTOpt_task009_ref_set.csv)
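Where no analytical solution is available, the tests compare against a stored reference set (here GTOpt_task009_ref_set.csv). The sketch below is not how that file was produced; it only illustrates one common way to approximate a front for such a problem, a weighted-sum scan with scipy.optimize.minimize, which recovers only the convex part of the front.

```python
import numpy as np
from scipy.optimize import minimize

def f0(x):
    # task009 first objective
    return (x[0] ** 4 + x[1] ** 4 - x[0] ** 2 + x[1] ** 2
            - 10.0 * x[0] * x[1] + 0.25 * x[0] + 20.0)

def f1(x):
    # task009 second objective
    return (x[0] - 1.0) ** 2 + x[1] ** 2

bounds = [(-2.0, 2.0)] * 2
front = []
for w in np.linspace(0.0, 1.0, 21):
    # Minimize a convex combination of the two objectives.
    res = minimize(lambda x: w * f0(x) + (1.0 - w) * f1(x),
                   x0=[-1.0, -1.0], bounds=bounds, method="L-BFGS-B")
    front.append([f0(res.x), f1(res.x)])

print(np.array(front))
```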

task010

Multi-objective constrained problem in 2D.

\[\begin{split}\begin{array}{c} f_{0}(\overline{x}) = x_0^4 + x_1^4 - x_0^2 + x_1^2 - 10 x_0 x_1 + 0.25 x_0 + 20 \\ f_{1}(\overline{x}) = (x_0 - 1)^2 + x_1^2 \\ c(\overline{x}) = (x_0 - 1)^2 + (x_1 - 1)^2 \\ x \in [-2, 2]^2 \\ c \geq 1 \end{array}\end{split}\]

Initial guess: \(x_0 = x_1 = -1\)

Analytical solution: not known (reference set: GTOpt_task010_ref_set.csv)

task011

Multi-objective constrained problem in 2D.

\[\begin{split}\begin{array}{c} f_{0}(\overline{x}) = x_0 \\ f_{1}(\overline{x}) = (1 + x_1) / x_0 \\ x_1 + 9 x_0 - 6 \geq 0 \\ -x_1 + 9 x_0 - 1 \geq 0 \\ x_0 \in [0.1, 1] \\ x_1 \in [0, 5] \end{array}\end{split}\]

Initial guess: \(x_0 = x_1 = 1\)

Analytical solution: not known (reference set: GTOpt_task011_ref_set.csv)

task012

Multi-objective box-constrained problem in 2D.

\[\begin{split}\begin{array}{c} f_{0}(\overline{x}) = 1 - \exp( - (x_0 - \sqrt{2})^2 - (x_1 - \sqrt{2})^2 ) \\ f_{1}(\overline{x}) = 1 - \exp( - (x_0 + \sqrt{2})^2 - (x_1 + \sqrt{2})^2 ) \\ x \in [-4, 4]^2 \end{array}\end{split}\]

Initial guess: \(x_0 = x_1 = 0\)

Analytical solution: not known (reference set: GTOpt_task012_ref_set.csv)