15.6. Changelog

Versions

15.6.1. pSeven Core v2024.04

15.6.1.1. Updates and Changes

  • 480 GTApprox: enabled support for output noise variance in the HDA technique, which improves HDA model quality if the noise variance data is supplied (see Data with Errorbars, outputNoiseVariance in build()).
  • 480 GTApprox: updated the smart training algorithm to take the output noise variance into account when selecting the final model.

15.6.1.2. Documentation

  • 480 Updated section Data with Errorbars and other descriptions related to the output noise variance support, which is now enabled for the HDA technique.
  • 476 Clarified the GTApprox/RSMMapping option description, fixing the incorrect explanation of the "MapStd" method.

15.6.1.3. Bugfixes

  • 469 GTApprox: fixed a regression issue from pSeven Core 6.52 where updating certain models with linearly dependent outputs raised an exception.
  • 477 GTApprox: fixed an issue in smart training where build_smart() raised an exception if the training sample has a tensor structure and contains duplicate or ambiguous points.
  • 480 GTApprox: fixed an issue with using point weights with the HDA technique, which could cause a segmentation fault or model quality degradation.
  • 480 GTApprox: fixed an issue with using output noise variance data in smart training, where passing outputNoiseVariance to build_smart() limited smart training to the RSM technique only.
  • 450 GTOpt, GTDoE: fixed a few issues where the result was incorrectly assigned the USER_TERMINATED or NANINF_PROBLEM status. The USER_TERMINATED status is now set only if the process was interrupted by a watcher or a keyboard interrupt.
  • 450 GTOpt, GTDoE: fixed an issue in tasks with categorical variables where some data could be missing from the result, if an error occurred while processing one of the combinations of categorical variables’ levels.
  • 473 GTDoE: fixed an issue in tasks with deferred constraint evaluations where, if no minimization or maximization objectives are defined, the potentially optimal subset in result appeared empty (in such tasks, all potentially feasible points are also potentially optimal by definition).
  • 483 GTOpt, GTDoE: fixed an irrelevant warning about inaccurate modeling of linear responses appearing in the log in some tasks.

15.6.2. pSeven Core v2024.03

Note

pSeven Core is now available on PyPI so you can install it with pip (python -m pip install psevencore).

The distutils installation method (setup.py) is no longer supported, and neither is the deprecated method of installing multiple pSeven Core versions side by side. Instructions related to that method have been removed from this manual. See section Version Compatibility Issues for details.

To install pSeven Core v2024.03 with pip, you have to manually remove the previous versions (v2024.02 and earlier), which were installed using distutils: pip cannot uninstall distutils packages, so if you get a related error while installing pSeven Core v2024.03, see section Installation and Uninstallation in known issues.

15.6.2.1. New Features

  • 457 pSeven Core is now compatible with Python 3.12, and you can install it from PyPI using pip. See the updated Installation section for details.

15.6.2.2. Updates and Changes

  • 462 GTDoE: added support for the internal modeling of linear responses in all space-filling DoE techniques and in Adaptive Design of Experiments. These techniques now automatically enable modeling for responses hinted as linear, except adaptation type objectives in Adaptive Design of Experiments. If no linear model can fit a response's data with enough accuracy, that response is deemed non-linear, and GTDoE continues to request its evaluations from the blackbox but issues a warning.
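The accuracy check described above can be sketched in plain Python. This is a conceptual illustration only, not GTDoE internals: the one-dimensional least-squares fit, the residual test, and the tolerance value are all assumptions.

```python
# Conceptual sketch: fit a response hinted as linear with a least-squares
# line; if the fit is not accurate enough, treat the response as non-linear
# (GTDoE would then keep requesting blackbox evaluations and warn).
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    a = my - b * mx
    return a, b  # intercept, slope

def is_effectively_linear(xs, ys, tol=1e-6):
    a, b = fit_line(xs, ys)
    return all(abs(a + b * x - y) <= tol for x, y in zip(xs, ys))

print(is_effectively_linear([0, 1, 2, 3], [1, 3, 5, 7]))    # True (y = 1 + 2x)
print(is_effectively_linear([0, 1, 2, 3], [1, 3, 5, 100]))  # False
```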

15.6.2.3. Documentation

  • 457 Updated section Installation with regard to installing pSeven Core from PyPI. Also added the description of the pSeven Core version numbering scheme (see Version Numbering).
  • 457 Updated sections Version Compatibility Issues and Known Issues (Installation and Uninstallation) to provide guidelines on removing previous versions of pSeven Core installed using distutils, and on the recommended ways to handle side-by-side installations, version upgrades, and downgrades, which replace the obsolete custom side-by-side installation feature.
  • 462 Updated the descriptions of GTDoE hints @GTOpt/LinearityType and @GT/EvaluationLimit due to the added support for linear response modeling in the space-filling DoE techniques and in Adaptive Design of Experiments.

15.6.2.4. Bugfixes

  • 462 GTDoE: fixed a regression issue from pSeven Core v2024.02 where build_doe() raised an exception, if you run a space-filling DoE and specify the design space bounds as a NumPy array (using the blackbox parameter).
  • 462 GTDoE: fixed an issue with linear response modeling in Adaptive Design, where it unexpectedly skipped modeling some of the linear responses — for example, did not model linear minimization or maximization objectives.
  • 456 GTDoE: fixed an issue with optimal points selection (in tasks with minimization or maximization type responses), where the optimal points with missing values of evaluation type responses were not included into the optimal point set.

15.6.3. pSeven Core v2024.02

Note

Since this release, pSeven Core uses a date-based version numbering scheme to make the numbering consistent with other pSeven products and to adhere better to the current Python rules for version specifiers. Release series are titled vyear.month (for example, v2024.02) with a leading zero in month. The full (package) version number is year.month.days without leading zeros, where days is the number of days since the beginning of the month when this release series started. Note that days can be greater than 31. For example, the v2024.04 series might begin with the initial release 2024.4.18 in April, followed by the next release of this series, 2024.4.40, in May, which would be an update or fix to the 2024.4.18 release.
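The scheme can be illustrated with a short sketch (the function name is hypothetical and not part of the pSeven Core API; the dates reproduce the example above):

```python
from datetime import date

def series_version(series_start: date, release_day: date) -> str:
    """Illustration of the date-based scheme: days counts from the first
    day of the month in which the release series started (1-based)."""
    days = (release_day - series_start.replace(day=1)).days + 1
    return f"{series_start.year}.{series_start.month}.{days}"

# v2024.04 series starting with a release on April 18:
print(series_version(date(2024, 4, 1), date(2024, 4, 18)))  # 2024.4.18
# a follow-up release of the same series on May 10:
print(series_version(date(2024, 4, 1), date(2024, 5, 10)))  # 2024.4.40
```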

15.6.3.1. New Features

  • 290 GTApprox enables parallel submodel training (“wide” parallelization) for GBRT models with the new fast parallel training mode, which improves performance though it cannot guarantee deterministic training. See Submodels and Parallel Training and GTApprox/SubmodelTraining for details.

15.6.3.2. Updates and Changes

  • 452 GTOpt, GTDoE: if you have set a watcher, the main process (solve(), build_doe()) additionally calls it once the final result is ready, making the final result also available through the watcher. See section Watchers for details.
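The new behavior can be sketched as follows. This is a toy stand-in in plain Python: the payload dict, the "final" key, and the function names are illustrative assumptions, not the p7core watcher API.

```python
# Hypothetical sketch: a watcher is a callable the solver invokes
# periodically; returning False interrupts the process.
def watcher(payload):
    # Since v2024.03, the main process also calls the watcher one extra
    # time when the final result is ready, so the result is visible here.
    if payload.get("final"):
        print("final result:", payload["result"])
    return True  # keep running

def run_process(watcher):
    """Toy stand-in for solve()/build_doe() calling a user watcher."""
    for step in range(3):
        if not watcher({"final": False, "step": step}):
            return None  # interrupted by the watcher
    result = "RESULT"
    watcher({"final": True, "result": result})  # the additional final call
    return result

run_process(watcher)
```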

15.6.3.3. Documentation

15.6.3.4. Bugfixes

  • 452 GTOpt, GTDoE: fixed an issue with watchers where, if you set a watcher, the watched process did not call it when evaluating certain kinds of responses, so it was impossible to get intermediate evaluation results or interrupt the process even if the response evaluations took a long time. For example, you could not get an intermediate result from a space-filling DoE technique, and could not interrupt GTOpt if it requested the values of the evaluation type responses while finalizing the result.
  • 452 GTOpt, GTDoE: fixed an issue with watchers where the main process continued to request response evaluations after being interrupted by a watcher.
  • 452 GTOpt, GTDoE: fixed several issues with the optimization and Adaptive Design results where the result contained incorrect values of linear and quadratic responses, if you defined maximization type responses, categorical variables, or constants.
  • 458 GTOpt, GTDoE: fixed an issue where certain values of linear and quadratic responses, which were actually calculated, were missing from Result.designs().

15.6.4. pSeven Core 6.52

15.6.4.1. Bugfixes

  • 449 GTOpt, GTDoE: fixed an exception that occurred if you used an initial sample in a task with maximization type responses.
  • 446 GTOpt, GTDoE: fixed an issue where you could unintentionally rewrite values of variables in the evaluation history while processing the input data (queryx) in evaluate().
  • 437 GTOpt, GTDoE: fixed an issue with values of quadratic responses missing from results in tasks with internal modeling of quadratic responses enabled.
  • 442 GTOpt, GTDoE: fixed a FutureWarning when you print the optimization or DoE result.
  • 376 GTApprox: fixed a regression from pSeven Core 6.24 where the model code exported to Octave had compatibility issues related to the names of functions generated by GTApprox.
  • 442 GTApprox: fixed a DeprecationWarning when training a model with linear dependencies between outputs.
  • 448 GT: fixed a SyntaxWarning when importing pSeven Core modules in Python 3.12.

15.6.5. pSeven Core 6.51

15.6.5.1. Updates and Changes

  • 381 416 GT: updated pSeven Core to support Python 3.12 and NumPy 1.21 and newer.
  • 415 416 GT: updated p7core.Result to support tasks with deferred response evaluations (the @GT/EvaluationLimit hint set to 0): added the "potentially feasible" and "potentially optimal" sample filters. They provide the designs to consider when evaluating the deferred responses — that is, designs that will resolve to feasible or infeasible (optimal or suboptimal) after such evaluations and whose status is undefined until then.
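The filter semantics can be illustrated in plain Python. This is a conceptual sketch, not the p7core API: here NaN stands in for a deferred, not-yet-evaluated constraint value, and the function name is hypothetical.

```python
import math

# A point is "potentially feasible" if no already-evaluated constraint value
# violates its bounds; deferred (NaN) values cannot prove infeasibility.
# In a task with no minimization or maximization objectives, every
# potentially feasible point is also potentially optimal by definition.
def potentially_feasible(constraint_values, lower, upper):
    for c, lo, hi in zip(constraint_values, lower, upper):
        if not math.isnan(c) and not (lo <= c <= hi):
            return False  # an evaluated value already violates its bounds
    return True

points = [[0.5], [float("nan")], [2.0]]  # one constraint bounded to [0, 1]
print([potentially_feasible(p, [0.0], [1.0]) for p in points])
# [True, True, False]
```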

15.6.5.2. Documentation

15.6.5.3. Bugfixes

  • 431 GTOpt, GTDoE: fixed an issue where the intermediate result was prepared even when you do not request it. The performance impact of this issue was negligible.
  • 426 GTApprox: fixed an issue where None was not supported as a valid value in model metainformation.

15.6.6. pSeven Core 6.50

15.6.6.1. Updates and Changes

  • 400 GT: updated the string representation of p7core.Result and p7core.gtopt.Result for better readability. Printing a result now also outputs the solutions and design points if there are fewer than 40 — which might be useful, for example, when you want to quickly check the solution of a single-objective optimization problem.

15.6.6.2. Bugfixes

  • 413 GTOpt: fixed an issue with maximization objectives where, if you supply an initial sample with such objectives, the result (solution) could be inconsistent with the initial sample — for example, certain optimum points from the initial sample were not included into the result solutions.
  • 415 GTOpt, GTDoE: fixed an issue in tasks with categorical variables, where the optimum point sets in results could contain suboptimal (Pareto dominated) points.
  • 415 GTOpt, GTDoE: fixed an issue where, if you define analytical gradients in your problem, pSeven Core could violate the response evaluation limits (budget).
  • 406 GTApprox: fixed an issue with selecting the train and test data subsets in smart training, which could lower the model quality.
  • 406 GTApprox: fixed an issue in smart training where setting the smooth model requirement (see @GTApprox/ModelFeatures) could cause an unexpected loss of model quality.
  • 414 GTDoE: fixed an issue where infeasible points could appear in the feasible point sets in results.
  • 389 GTDoE: fixed an issue with the GTDoE/Technique option in blackbox-based mode in generate() where the option was not ignored as intended and, if set, caused an exception during generation.
  • 417 GTSDA: fixed an issue with the CSTA method (Sobol indices) where it incorrectly considered an output to be constant if the output values in the analyzed sample were small enough (about \(10^{-5}\)).
  • 400 GT: fixed a minor issue with the string representation of the DoE and optimization results where the numbers of feasible and infeasible points displayed when you print a result were different from their actual numbers in the result.

15.6.7. pSeven Core 6.49

15.6.7.1. Bugfixes

  • 407 GTDoE: fixed issues with the initial sample pre-processing in Adaptive Design, where GTDoE raised an exception if the initial sample included any point where a discrete variable has a value that is not in its list of levels, or a point where an integer variable has a non-integer value. Now such points are preserved in Result.designs(), and may also be found in Result.solutions(), if the initial sample provides the required response values in those points. Note that if any response value for such a point is missing from the initial sample, GTDoE never requests its evaluation to avoid violating your task definition.
  • 371 GTDoE: fixed an issue with Adaptive Design results where the result did not include the response evaluations that were not requested by the Adaptive Design algorithm, even if you actually calculated and returned those values.
  • 392 GTApprox: fixed an issue with updating GP models where an error occurred if you use a GP model with categorical variables as the initial model in training.
  • 392 GTApprox: fixed an issue with model update where, if you use an initial model with categorical variables, you were required to specify the categorical variables (set the GTApprox/CategoricalVariables option) even though the types of variables are known from the initial model.
  • 371 GT: fixed an issue where sometimes you could not print a Result using Result.pprint().

15.6.8. pSeven Core 6.48

15.6.8.1. Updates and Changes

  • 405 GTApprox: updated the GTApprox/OutputTransformation option to resolve the related compatibility issue, which appears in pSeven Core 6.47. Added "" as a separate valid value (no longer a synonym of "none"); the "" value is now the default. See the option description for details.

15.6.8.2. Documentation

15.6.8.3. Bugfixes

  • 409 GTDoE, GTOpt: fixed a regression issue from pSeven Core 6.47 where Adaptive Design (build_doe()) or optimization (solve()) could raise an IndexError exception during the results postprocessing.
  • 405 GTApprox: fixed an issue with using an initial model when training a model with multidimensional output (two or more outputs), which in certain cases could negatively affect the resulting model quality or cause the training to stop with an error.
  • 402 GTApprox: fixed an issue where an exception from build() or build_smart() brought the da.p7core.gtapprox.Builder instance to an invalid state, rendering that instance unusable and requiring you to replace it with a new da.p7core.gtapprox.Builder instance. In particular, this issue was the cause of errors when using GTApprox in a Jupyter notebook, which occurred if you interrupt model training in a notebook.

15.6.9. pSeven Core 6.47

Note

Using this release is not recommended as it has several issues, which are resolved in pSeven Core 6.48. If you update from pSeven Core version 6.46 or below, it is recommended to skip version 6.47 and update directly to 6.48 or a more recent version.

Also note there is another 6.47 compatibility issue described in Version Compatibility Issues (related to model update and technique compatibility). That issue is not resolved by updating to 6.48 and may require you to update your code in certain cases.

15.6.9.1. New Features

  • 81 GTApprox now supports incremental training (model update) for Gaussian Processes (GP) models:
    • If you use smart training to update a GP model (specify that model as initial_model to build_smart()), the method is no longer limited to using the Mixture of Approximators (MoA) technique and can return an updated GP model. To ensure that, you can also manually set GTApprox/Technique to "GP" in options.
    • If you use manual training (build()) to update a GP model, you can obtain an updated GP model by setting the GTApprox/Technique option to "Auto" or "GP".
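A minimal sketch of the call pattern described above. The option name and the initial_model parameter are taken from this changelog; the build_smart() call itself is shown as a comment because running it requires pSeven Core and a training sample, so only the options fragment is executable here.

```python
# Request an updated GP model instead of the MoA fallback; "Auto" would
# let GTApprox decide the technique on its own.
options = {"GTApprox/Technique": "GP"}

# Hypothetical usage (requires pSeven Core and existing data/model):
# from da.p7core import gtapprox
# model = gtapprox.Builder().build_smart(x_new, y_new, options=options,
#                                        initial_model=old_gp_model)
print(options["GTApprox/Technique"])
```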

15.6.9.2. Updates and Changes

  • 359 GTDoE: updated the algorithms used to train internal response models in the Adaptive technique, thus improving the technique performance.
  • 359 GTApprox: the GTApprox/GPPower option now supports the "Auto" value, which is the new default. That value is primarily intended to explicitly “unlock” the option for automatic tuning, which is a part of smart training.
  • 396 GTApprox: changed the default value of the GTApprox/OutputTransformation option to "auto" to enable its automatic tuning in smart training by default. Previously the default was "none", which blocked the automatic tuning.

15.6.9.3. Documentation

15.6.9.4. Compatibility Issues

This release contains a few compatibility issues — see section Version Compatibility Issues for details and actions required.

  • Removes an inconsistency in the gtapprox.Builder.build() and gtapprox.Builder.build_smart() method behavior when you use them to update an initial model but select a technique that is not compatible with that model by setting the GTApprox/Technique option. Due to this update, GTApprox in pSeven Core 6.47 and above will raise an exception in certain cases that did not raise exceptions in 6.46 and below.
  • Changes the default GTApprox behavior due to the changed default value of the GTApprox/OutputTransformation option. This issue appears in 6.47 only and you can avoid it by skipping this version.

15.6.9.5. Bugfixes

  • 399 GTApprox: fixed an issue where an initial model was ignored in training, so it returned a new model trained on the new data only, instead of updating the initial model with new data.
  • 399 GTApprox: fixed an issue where explicitly setting the "Auto" (default) value for an option “locked” that option in smart training, which could negatively affect the trained model quality: the "Auto" effectively canceled the intelligent tuning of that option, and smart training always used the default option value.
  • 399 GTApprox: fixed an issue with the MoA technique where it could not train a model with a small sized initial sample, even if the sample satisfied the minimum size requirements.
  • 387 GTApprox: fixed an issue with initial models that have linear dependencies in outputs, where those dependencies were ignored when training an updated model, or caused an exception in training.
  • 396 GTApprox: fixed an issue where using an initial model trained with output data transformation applied caused an exception unless you set the GTApprox/OutputTransformation option to "auto".
  • 359 GTDoE: fixed an issue with the automatic technique selection where the Adaptive technique could be selected in a task that it does not support (for example, all categorical variables), resulting in an exception.
  • 359 GTDoE: fixed an issue with the Adaptive Design of Experiments technique where it incorrectly set an error status (such as NANINF_PROBLEM or UNSUPPORTED_PROBLEM) in the generation result even though the generation had finished without errors, in which case the correct status is SUCCESS.
  • 377 GTDoE: fixed an issue with results post-processing, which caused an extra delay before GTDoE returned the final result.

15.6.10. pSeven Core 6.46

15.6.10.1. Bugfixes

  • 384 GTApprox: fixed a regression issue from pSeven Core 6.44 where model export to FMU raised an error, if the model has categorical outputs.
  • 355 GTDoE: fixed an issue with the Adaptive Design technique in unconstrained tasks where the number of generated points was different from the count specified to build_doe().
  • 373 375 GTDoE: fixed issues with the Adaptive Design technique in unconstrained tasks without adaptation objectives where it effectively ignored the evaluation limits set for other types of responses (evaluation responses, minimization or maximization objectives).
  • 375 GTDoE: fixed an issue with the Adaptive Design technique where it could finish prematurely before reaching the response evaluation limit when run with an initial sample where some response values are missing.
  • 373 GTDoE: fixed an issue with sample filtering in results where, if you specify a list of filters to Result.designs(), only the first filter from that list was applied.
  • 379 GTDoE: fixed issues with automatic technique selection and default sample size selection in tasks with constant variables.
  • 368 GTDF: fixed issues with the low-fidelity sample bias compensation algorithm (the GTDF/UnbiasLowFidelityModel option) where in certain cases applying the bias compensation noticeably decreased the low-fidelity model quality.

15.6.11. pSeven Core 6.45

15.6.11.1. New Features

  • 220 GTDoE: the Adaptive Design of Experiments technique now supports mixed designs where some variables are continuous while others are discrete (previously it required either all continuous or all discrete variables).
  • 360 GTDoE: all DoE techniques now have a default sample size — if you do not set the number of points to generate, it is determined from the number of variables, their types and properties.

15.6.11.2. Updates and Changes

  • 220 GTDoE: the Adaptive Design of Experiments technique is now usable in the space-filling mode — that is, in tasks that do not define any adaptation objectives and do not provide an initial sample.
  • 360 GTDoE: improved automatic selection of the DoE technique in order to select the most suitable technique for the given task, regarding the types of variables and responses as well as certain other settings.
  • 220 GTDoE: the GTDoE/Adaptive/InitialDoeTechnique option of the Adaptive Design of Experiments technique is now deprecated and should no longer be used as it is going to be removed in future versions. This option is kept for version compatibility only and no longer has any effect — the Adaptive Design of Experiments technique now always generates its initial training set using the uniform sampling criterion (see Criteria), which provides a better DoE for training initial response models.

15.6.11.3. Documentation

15.6.11.4. Bugfixes

  • 220 GTDoE: fixed an issue with the Adaptive Design of Experiments technique where using it from build_doe() in the sample-based mode raised an unexpected UnsupportedProblemError exception.
  • 220 GTDoE: fixed an issue with the Adaptive Design of Experiments technique where setting the GTDoE/Adaptive/InitialCount option to a certain value could cause an InvalidOptionsError exception even though the option value is valid.
  • 360 GTDoE: fixed an issue with discrete and stepped variables in the Orthogonal Array technique where you could not specify a non-trivial number of levels to use for variables of those types — in the GTDoE/OrthogonalArray/LevelsNumber list, the only valid values for those types were 0 (use all levels from the variable definition) and 2 (use the minimum and maximum levels only).
  • 360 GTDoE: fixed an issue with the Orthogonal Array technique where in high-dimensional tasks it generated a DoE consisting of a single point.

15.6.12. pSeven Core 6.44

15.6.12.1. Updates and Changes

  • 328 GTOpt, GTDoE: reworked the implementation of the mixed-integer solver, which is used in tasks with linear responses and discrete or integer variables. In particular, the new solver provides improved performance and stability in large-scale linear optimization problems where analytical gradients are enabled (see enable_objectives_gradient(), enable_constraints_gradient()), as well as in certain other optimization problems with linear responses and integer variables.
  • 361 GTApprox: updated the implementation of output noise variance for the GP technique, thus improving the quality of GP models trained with that feature enabled (the outputNoiseVariance parameter in build(), build_smart()).

15.6.12.2. Bugfixes

  • 362 GTOpt, GTDoE: fixed an issue with optimization results and GTDoE results in tasks with minimization or maximization objectives where, if several Pareto-equal solutions were found, only one of those solutions was included into the optimal solution set.
  • 369 GTDoE: fixed issues with point filtering in results where certain points were classified incorrectly — for example, in Adaptive Design the feasible point sets could be empty if you prohibit evaluating some objective, even though evaluating objectives is not required to test points for feasibility.
  • 370 GTDoE: fixed an issue with results where certain response values were present in solutions() but missing from designs().
  • 369 GTDoE: fixed an issue with missing response values in initial samples where GTDoE evaluated responses for the initial sample points that violate the variable bounds or have invalid variable levels.
  • 366 GTApprox: fixed an issue where using the GTApprox/IVSubsetSize option in smart training (build_smart()) caused an InvalidOptionsError exception.
  • 367 GTApprox: fixed an issue with model export to FMU where, if the model has categorical inputs that accept integer values of levels, the export changed the set of valid levels for such inputs.
  • 361 GTApprox: fixed a regression issue from pSeven Core 6.11 where exporting an RSM model with categorical inputs to Octave generated code that caused errors when executed.
  • 361 GTDF: fixed an issue with the MFGP technique where it sometimes could not train a model, if the training sample contains a constant column in inputs.
  • 361 GTApprox, GTDF: fixed a rare issue with training samples stored in models, where reading the sample from a model could return incorrect data.
  • 369 GT: fixed an issue with StreamLogger, which caused an exception if the stream argument is an object that does not set its encoding attribute.

15.6.13. pSeven Core 6.43

15.6.13.1. Updates and Changes

  • 357 GTApprox: when training a model with input or output constraints, GTApprox now issues a warning to the training log, if your training sample contains points that violate the constraints set using the x_meta, y_meta arguments to build().

15.6.13.2. Documentation

15.6.13.3. Bugfixes

  • 244 GTOpt: fixed an issue with linear responses where optimization could unexpectedly switch to gradient-based methods, if all expensive responses in your problem are linear, and linear response modeling is required (GTOpt/RestoreAnalyticResponses is True).
  • 331 GTOpt, GTDoE: fixed issues with the @GT/EvaluationLimit response hint where explicitly setting it to any “auto” value (-1 or "Auto") caused unexpected behavior in certain tasks — for example, a response with that hint could be evaluated only once or not evaluated at all.
  • 351 GTOpt, GTDoE: fixed an issue with results (p7core.Result) that contain missing response values where designs() did not recognize some of the available fields listed in da.p7core.Result.designs_fields.
  • 331 GTOpt, GTDoE: fixed an issue in tasks with categorical variables where optimization or DoE generation stopped prematurely, if you set some response evaluation limit.
  • 354 GTDoE: fixed an issue with categorical variables in Adaptive Design where the technique raised an exception, if you do not specify the number of points to generate (the count argument to build_doe()).
  • 351 GTDoE: fixed an issue where using the Orthogonal Array technique caused an exception, if all design variables are frozen (constant) except one continuous variable.
  • 53 GTApprox: fixed a minor regression issue from pSeven Core 6.42 where calculating model internal validation statistics required more CPU time than in previous pSeven Core versions.
  • 345 GTApprox: fixed an issue with error metrics in models with input or output constraints where in certain cases the RRMS and \(R^2\) metrics were calculated incorrectly for such models.
  • 53 GTApprox: fixed an issue where a response gradient could be 0 for an input point where the response value is NaN — in this case, the gradient should also be NaN, since the response is undefined at that point.
  • 352 GTApprox: fixed an issue with the PLA technique in pSeven Core for Windows 32-bit (x86) where PLA trained an incorrect model, if the training sample contains duplicates.
  • 351 GT: fixed an issue with custom watchers (see Watchers) where the process connected with your watcher could call it even after the watcher has returned False to interrupt the process. If the watcher is stateless and can return True after it has once returned False, this issue made it impossible to interrupt the watched process.
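The NaN-gradient fix in this list can be illustrated with a toy model. This is plain Python, not GTApprox code: the square-root response and the finite-difference gradient are assumptions for illustration only.

```python
import math

# Where a response evaluates to NaN, its gradient must also be NaN,
# since the response is undefined at that point.
def response(x):
    return math.sqrt(x) if x >= 0 else float("nan")

def gradient(x, h=1e-6):
    y = response(x)
    if math.isnan(y):
        return float("nan")  # the bug could report 0 here instead
    return (response(x + h) - y) / h  # forward finite difference

print(round(gradient(4.0), 4))     # 0.25 (slope of sqrt at x = 4)
print(math.isnan(gradient(-1.0)))  # True
```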

15.6.14. pSeven Core 6.42

15.6.14.1. Updates and Changes

  • 336 GTDoE: adjusted the handling of generic constraints in Adaptive Design to generate more feasible design points in tasks where the budget is too low to obtain accurate enough constraint models — at the cost of a possible contraction of the design space area explored by the Adaptive Design algorithm.
  • 336 GTApprox: for models with categorical outputs, the raw internal validation data (the "Dataset" key in iv_info) now contains the new "Predicted Probabilities" sub-key. It holds statistical data for each categorical output: the predicted probability of every level defined for that output.
  • 49 GTSDA: you can now use GTSDA sensitivity analysis (rank()) in discrete tasks: the blackbox-based sensitivity analysis methods (the Screening, Sobol (FAST), Sobol (CSTA), and Taguchi techniques in the blackbox-based mode) are updated to support blackboxes with non-continuous variables (discrete, stepped, or categorical).
  • 49 GTSDA: improved performance of the Screening technique, in particular when sensitivity index variance estimates are required (the GTSDA/Ranker/VarianceEstimateRequired option).
  • 49 GTSDA: the Taguchi technique now assigns levels to variables automatically if you do not specify them — the GTSDA/Ranker/Taguchi/LevelsNumber option is no longer required.
  • 341 General: updated pSeven Core for compatibility with recent NumPy versions, including NumPy 1.24.

15.6.14.2. Documentation

15.6.14.3. Bugfixes

  • 339 GTDoE: fixed a regression issue with the Adaptive Design technique from pSeven Core 6.41 where the technique did not generate any new design points, if there is a response that is prohibited to evaluate (the @GT/EvaluationLimit response hint is set to 0).
  • 338 GTOpt, GTDoE: fixed an issue with linear response modeling under a limited budget: when a too restrictive response evaluation limit made it impossible to fit the response data to a linear model, no exception was raised as intended with the GTOpt/RestoreAnalyticResponses (GTDoE/AdaptiveDesign/RestoreAnalyticResponses) option set to True; instead, an incorrect linear model was created and used.
  • 200 GTOpt, GTDoE: fixed an unclear warning about the handling of linear responses that depend on a discrete variable with non-uniform distribution of its levels.
  • 338 GTOpt, GTDoE: fixed a compatibility issue with NumPy version 1.11 where the response evaluation limit hints were applied incorrectly if you use pSeven Core with NumPy 1.11.
  • 348 GTApprox: fixed an issue where some valid models caused an exception with an incorrect message about an incompatible or corrupt model when you load a model from file or deserialize it from a string.
  • 336 GTApprox: fixed issues with models that define input constraints:
    • Model information (details) of a constrained model contained incorrect values of error metrics because input constraints were disregarded in error calculation. Correct information could be obtained only by validating the model against the training sample, using validate().
    • Internal validation information (iv_info) of a constrained model contained incorrect values of error metrics because input constraints were disregarded during the internal validation.
  • 336 GTApprox: fixed an issue with models having categorical inputs or outputs where the raw internal validation data (the "Dataset" key in iv_info) did not contain the specific values of categories (input levels, output class labels) but stored their indexes used internally by GTApprox.
  • 49 GTSDA: fixed a regression issue from pSeven Core 6.16 where the Taguchi technique ignored the GTSDA/Ranker/Taguchi/LevelsNumber option.
  • 49 GTSDA: fixed an issue with the Sobol and Screening techniques where these techniques produced incorrect sensitivity scores or variance estimates, and rank() could raise an exception, if the data sample you use contains a constant input column, or there is a frozen variable (a constant) defined in your blackbox.
  • 341 General: fixed a compatibility issue with NumPy version 1.24 that caused an ImportError when using pSeven Core modules.
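
The constant input column issue in the Sobol and Screening fix above has a simple generic cause: variance-based and correlation-based sensitivity scores normalize by the spread of an input, which is zero for a constant column. A minimal pure-Python sketch (illustrative only, not GTSDA code; the function name is hypothetical) shows how such a score degenerates to NaN:

```python
import math

def correlation_score(x, y):
    """Pearson-style sensitivity score; returns NaN when a column is constant."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    if sx == 0.0 or sy == 0.0:
        return float("nan")  # constant column: the score is undefined
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

varying = [0.0, 1.0, 2.0, 3.0]
constant = [5.0, 5.0, 5.0, 5.0]
response = [0.1, 1.1, 2.0, 3.2]

print(correlation_score(varying, response))               # close to 1.0
print(math.isnan(correlation_score(constant, response)))  # True
```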

15.6.15. pSeven Core 6.41

15.6.15.1. Updates and Changes

  • 302 323 GTOpt, GTDoE: problem validation now runs faster thanks to the improved implementation of the gtopt.Solver.validate() and gtdoe.Generator.validate() methods.
  • 315 GTOpt, GTDoE: reworked handling of discrete variables in Surrogate-Based Optimization and Adaptive Design algorithms, improving result quality and algorithm stability in tasks with discrete variables.
  • 295 GTApprox: improved export of GP-based models with a high number of inputs to C source code formats (executable, DLL) to optimize the model’s memory usage and avoid issues caused by high memory consumption, which could lead to a stack overflow.
  • 302 GTApprox: if you use an initial model in training, information about that initial model is now printed to the training log.

15.6.15.2. Documentation

  • 42 Fixed missing arguments in the GTSDA rank() method description.

15.6.15.3. Bugfixes

  • 282 GTDoE: fixed an issue with Adaptive Design Result where some solution points could be improperly flagged as not evaluated, after which solution filtering could not work correctly — for example, feasible() did not return all feasible points because some of them had the “not evaluated” flag.
  • 326 GTOpt, GTDoE: fixed an issue where results could contain a number of points with missing response values, if you specify the "@GT/EvaluationLimit" hint for all responses but do not set options that limit the number of response evaluations (such as GTOpt/MaximumIterations or GTDoE/AdaptiveDesign/MaximumIterations) — the optimization and adaptive design algorithms did not stop as they should once all limits set by the "@GT/EvaluationLimit" hints are reached.
  • 302 GTOpt: fixed an issue with intermediate results in problems with categorical variables, where the intermediate result did not contain all currently known result points.
  • 283 GTOpt, GTDoE: fixed an issue with linear evaluation type objectives where, if linear response modeling is enabled, such objectives were also modeled, and model response values could appear in results.
  • 302 GTOpt, GTDoE: fixed an issue where hint values were not validated properly, and an invalid hint value was accepted but replaced by the hint’s default value without any warning.
  • 324 GTOpt: fixed an issue where a misleading warning about conflicting objective evaluation limits appeared in the optimization log.
  • 325 GTOpt, GTDoE, GTApprox: fixed an issue where several methods that accept an integer argument raised an invalid argument exception if that argument is of a certain type — for example, numpy.int64.
  • 323 GTOpt: fixed an issue where NaN was accepted as an initial guess value for a variable.

15.6.16. pSeven Core 6.40

15.6.16.1. Updates and Changes

15.6.16.2. Documentation

15.6.16.3. Bugfixes

  • 317 GTApprox: fixed an issue with the @GTApprox/TimeLimit smart training hint where specifying the 0 value (no limit) caused an InvalidOptionValueError exception in training.
  • 305 GTApprox: fixed minor issues in model code exported to C source formats, which caused compilation warnings in clang.

15.6.17. pSeven Core 6.39

15.6.17.1. Updates and Changes

  • 145 When you use concurrent response evaluations in optimization or Adaptive Design, the options and hints that limit the number of response evaluations are now interpreted with regard to concurrency \(p\) (set by the GTOpt/ResponsesScalability or the GTDoE/ResponsesScalability option, respectively): they specify the maximum allowed number of evaluated batches, where a batch contains up to \(p\) points (note that GTOpt and GTDoE do not guarantee that every evaluation batch contains exactly \(p\) points).

    This change makes GTOpt and GTDoE behavior consistent whether you disable or enable concurrent evaluations. In previous versions, a consistent configuration of the concurrent evaluations mode, given the maximum allowed number of (concurrent) response evaluation calls \(N\), required you to set evaluation limits equal to \(N \cdot p\) and to implement a watcher that would interrupt the process once the number of evaluated batches reaches \(N\). Without a watcher, the number of actually evaluated batches could exceed the limit \(N\), because some batches contain less than \(p\) points, and the evaluation limits specified by options and hints were checked considering only the total number of points evaluated, with no regard to concurrency.

    The changes described above affect the following:
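
The batch-counting change described above can be illustrated with a standalone sketch (not pSeven Core code; all names are hypothetical). With concurrency \(p\) and a limit of \(N\) batches, the old point-based check could let the number of evaluated batches exceed \(N\) whenever some batches were undersized:

```python
def batches_run(batch_sizes, limit, p, count_batches):
    """Count how many batches execute before the evaluation limit stops the run.

    count_batches=False: old behavior, the limit applies to total evaluated points
                         (configured as limit * p points).
    count_batches=True:  new behavior, the limit applies to the number of batches.
    """
    points = batches = 0
    for size in batch_sizes:
        if count_batches:
            if batches >= limit:
                break
        else:
            if points >= limit * p:  # old configuration: limit set to N*p points
                break
        batches += 1
        points += size
    return batches

# Concurrency p=4, allowed N=3 batch evaluations, but most batches are undersized.
sizes = [4, 2, 2, 2, 2, 2]
old = batches_run(sizes, limit=3, p=4, count_batches=False)
new = batches_run(sizes, limit=3, p=4, count_batches=True)
print(old, new)  # old runs 5 batches, exceeding 3; new stops at exactly 3
```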

15.6.17.2. Documentation

15.6.17.3. Bugfixes

15.6.18. pSeven Core 6.38

15.6.18.1. Updates and Changes

  • 152 GTDoE: the Adaptive Design of Experiments technique automatically selects the number of points generated per iteration by default, which often improves performance as compared to the previous versions. The GTDoE/Adaptive/OneStepCount option now has a default corresponding to that behavior.
  • The following features in GTDoE are now considered deprecated and may be removed in future pSeven Core versions:

15.6.18.2. Documentation

15.6.18.3. Bugfixes

  • 301 GTOpt: fixed an issue where result designs() could exclude some of the initial sample points even if you use designs() to get all problem data (call it without a data filter).
  • 297 GTOpt: fixed an issue where GTOpt returned a wrong solution of a maximization problem with a linear objective.
  • 301 GTOpt, GTDoE: fixed an issue with linear response modeling where a failure to fit the response data to a linear model (that is, detecting a non-linear dependency) did not raise an exception as intended with the GTOpt/RestoreAnalyticResponses (GTDoE/AdaptiveDesign/RestoreAnalyticResponses) option set to True.
  • 293 GTOpt: fixed an issue where gradient-based optimization could return a result with an incorrect status — for example, NANINF_PROBLEM for a successfully solved problem.
  • 299 GTOpt: fixed an issue with the GTOpt/ResponsesScalability option in surrogate-based optimization where the initial training set size could be not a multiple of the option value.
  • 300 GTDoE: fixed an issue where the Adaptive Design of Experiments technique ignored the GTDoE/ResponsesScalability option when evaluating the missing responses from an initial sample.

15.6.19. pSeven Core 6.37

15.6.19.1. Bugfixes

  • 288 GTOpt, GTDoE: fixed an issue with handling linear response data received from the blackbox in optimization and Adaptive Design, which caused point duplication in results, if you enable internal modeling of linear responses.
  • 288 GTOpt, GTDoE: fixed an issue with handling initial samples in optimization and Adaptive Design where certain initial points that are not valid to evaluate, according to the task definition, still could be evaluated when filling in the missing response values in the initial sample.
  • 289 GTApprox: fixed an issue where a large HDA model with high approximator complexity (GTApprox/HDAPMax) was successfully trained and saved to a file, but then failed to load from that saved file.
  • 286 GTApprox: fixed an issue with the GBRT training technique where it incorrectly selected subsamples for internal validation, which led to incorrect model quality estimates and caused quality degradation in smart training, where internal validation is used to determine quality metrics. The issue is identifiable in the model training log: the count of points used to calculate internal validation errors, printed to the log, is less than expected according to the internal validation sampling settings (GTApprox/IVSubsetCount, GTApprox/IVSubsetSize, and GTApprox/IVTrainingCount options).
  • 286 GTApprox: fixed an issue with the GBRT training technique where it ignored the GTApprox/IVTrainingCount option setting when used in smart training (build_smart()).
  • 109 GTApprox: fixed a rare issue where build_smart() failed to train a model with categorical variables, if you specify certain sets of techniques in @GTApprox/EnabledTechniques — for example, enable only HDA and GBRT.
  • 288 GTDoE: fixed an issue with handling initial samples where the initial sample points that are not valid to evaluate were excluded from the “all designs” sample type in DoE results, even though that sample should contain all known design points.
  • 224 GTSDA: fixed an issue where sensitivity analysis (rank()) produced NaN score values if you use a GTApprox model with input constraints (for example, input minimum and maximum bounds) as the analysis blackbox.
  • 285 GTSDA: fixed an issue with the CSTA Sobol indices estimation method, which caused an exception when using that method with an up-to-date version of NumPy.
  • 285 GTSDA: fixed an issue where the FAST Sobol indices estimation method did not work if the analysis blackbox returns NaN or Inf response values.

15.6.20. pSeven Core 6.36

15.6.20.1. Updates and Changes

  • 267 GTDoE: for compatibility with Adaptive Design, most GTDoE techniques now handle categorical variables in a similar manner — generate a DoE for each possible combination of categories, so the generation result may be used as an initial sample in Adaptive Design without issues. See section Types of Variables for more details.

15.6.20.2. Documentation

15.6.20.3. Compatibility Issues

This release changes handling of categorical variables in GTDoE for compatibility between the Adaptive Design technique and other techniques, which may be used to obtain an initial sample for Adaptive Design. If your design includes categorical variables, generation results in 6.36 differ from those in 6.35 when you use build_doe(); generate() keeps the 6.35 behavior by default, and it is also possible to reproduce the 6.35 behavior in build_doe(). See sections Version Compatibility Issues and Types of Variables for details.

15.6.20.4. Bugfixes

  • 276 GTOpt: fixed an issue in tasks with initial samples, where the result samples returned by Result.designs() and Result.solutions() contained duplicates of initial sample points.
  • 276 GTOpt, GTDoE: fixed an issue with pre-processing initial samples that contain points with missing response values, where evaluating such a point to fill in the missing values replaced the known response values at that point with NaN.
  • 262 GTOpt, GTDoE: fixed an issue where getting problem properties in gtopt.ProblemGeneric.prepare_problem() or gtdoe.ProblemGeneric.prepare_problem() raised an exception.
  • 264 GTApprox: fixed a logging issue where parts of the internal validation log were missing from the model training log.
  • 272 GTDoE: fixed an issue with build_doe() results where you could not get response values from Result.designs().
  • 235 GTDoE: fixed an issue with the Adaptive Design technique where it generated more feasible points than requested (more than count in build_doe()), if you prohibit evaluation of some response by setting the @GT/EvaluationLimit hint for that response to 0.
  • 262 GTDoE: fixed an issue where results could contain invalid response values, if you set a low evaluation limit for some response using the @GT/EvaluationLimit hint.
  • 206 GTDoE: fixed incorrect behavior of the Adaptive Design of Experiments technique in discrete tasks (no continuous variables), where generation did not stop if it reached the maximum possible DoE size — that is, generated a full factorial DoE including all possible combinations of variable levels.
  • 267 GTDoE: fixed an issue with the Adaptive Design of Experiments technique where its internal model was missing from the result (p7core.Result.model was None), if there is a constant variable in the task.
  • 267 GTDoE: fixed an issue with the Adaptive Design technique where using the GTDoE/CategoricalVariables option caused an exception.
  • 206 GTDoE: fixed an issue with integer variables in the Orthogonal Array technique that caused an exception unless you specify the GTDoE/OrthogonalArray/LevelsNumber option.
  • 206 GTDoE: fixed several issues where using constant variables caused errors in the Adaptive Design, LHS, OLHS, Orthogonal Array, and Fractional Factorial techniques.
  • 257 GTDoE: fixed an issue where using a GTApprox model with categorical outputs as a DoE blackbox caused an exception.

15.6.21. pSeven Core 6.35

15.6.21.1. New Features

  • 82 GTOpt: added the support for maximization problems: specify maximization responses with the new @GT/ObjectiveType hint.
  • 82 GTOpt, GTDoE: added the support for evaluation responses. This response type is intended for satellite computations: evaluation responses are never analyzed by optimization and DoE algorithms but their values are stored in evaluation history and included into results. Specify evaluation responses with the new @GT/ObjectiveType hint.

15.6.21.2. Updates and Changes

  • 82 GTOpt, GTDoE: added more solution filters in p7core.Result.solutions(): for example, you can get only the solutions that were present in an initial data sample, or exclude such points to get only new solutions.
  • 270 GTOpt, GTDoE: you can specify constant variables (freeze a variable) using the new @GT/FixedValue hint.
  • 270 Added methods that you can use to update variable, objective, and constraint hints instead of completely resetting them: see update_variable_hints(), update_objective_hints(), and update_constraint_hints().
  • 82 269 GTDoE: space-filling DoE techniques no longer ignore the @GT/EvaluationLimit hint. Evaluation limits set by that hint now apply when evaluating responses for the generated DoE points.
  • 82 GTDoE: in tasks with objectives, the objectives are now evaluated at all resulting DoE points regardless of point feasibility (subject to the evaluation limits set by the @GT/EvaluationLimit hint). Previously, GTDoE checked point feasibility and, if possible, evaluated objectives at feasible points only.
  • 253 GTApprox: the set_remote_build() method is deprecated; using it is not recommended, as it may be removed in future versions.

15.6.21.3. Documentation

15.6.21.4. Bugfixes

15.6.22. pSeven Core 6.34

15.6.22.1. Updates and Changes

  • 252 GTOpt, GTDoE: minor performance improvements in some kinds of optimization and adaptive design tasks.

15.6.22.2. Bugfixes

  • 246 GTOpt, GTDoE: fixed an issue with re-using the same problem instance in different solve() or build_doe() runs, where designs() in an optimization or adaptive design Result always returned all designs obtained in all runs that had used the same problem instance, instead of returning only designs from the last run as intended.

15.6.23. pSeven Core 6.33

15.6.23.1. New Features

15.6.23.2. Updates and Changes

  • 249 GTOpt: added support for categorical variables. A problem with categorical variables is solved as a number of subproblems, one for each possible combination of categories (levels of categorical variables).
  • 221 GTApprox: improved quality of MoA models with categorical variables.
  • 212 GTApprox: improved smart training (build_smart()) performance when training RSM models.
  • 249 GTDoE: added support for categorical variables in the Adaptive Design technique. An adaptive design that includes categorical variables is generated by running an independent subtask for each possible combination of categories.
  • 247 GTDoE: blackboxes with constraints are now supported by all DoE techniques.
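
The per-combination decomposition mentioned for categorical variables can be pictured with a generic enumeration (an illustrative sketch with hypothetical names, not GTOpt or GTDoE internals): each combination of category levels fixes the categorical variables and defines one independent subproblem.

```python
from itertools import product

def solve_by_categories(levels, solve_subproblem):
    """Solve one subproblem per combination of categorical levels and merge results.

    levels: dict mapping a categorical variable name to its list of levels.
    solve_subproblem: callable taking a dict {name: level}, returning a result.
    """
    names = sorted(levels)
    results = {}
    for combo in product(*(levels[n] for n in names)):
        fixed = dict(zip(names, combo))
        results[combo] = solve_subproblem(fixed)
    return results

# Toy example: minimize (x - offset)^2 over a small grid for each combination.
levels = {"material": ["steel", "alu"], "shape": [1, 2]}

def subproblem(fixed):
    offset = len(fixed["material"]) + fixed["shape"]  # toy objective parameter
    return min((x - offset) ** 2 for x in range(10))

results = solve_by_categories(levels, subproblem)
print(len(results))  # 2 levels x 2 levels -> 4 subproblems
```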

15.6.23.3. Documentation

15.6.23.4. Bugfixes

  • 242 GTOpt: fixed an issue in robust optimization problems with computationally cheap responses, where GTOpt could produce a solution that clearly violates a chance constraint.
  • 221 236 GTApprox: fixed an issue where a MoA model with categorical variables could evaluate some inputs incorrectly and output NaN.
  • 212 GTApprox: fixed an issue with RSM technique settings in smart training (build_smart()) where the GTApprox/RSMFeatureSelection option was internally set to "RidgeLS" by default.
  • 85 GTDoE: fixed issues with watcher support.

15.6.24. pSeven Core 6.32

15.6.24.1. Updates and Changes

  • 86 redmine-19155 GTOpt, GTDoE: in optimization and Adaptive Design, you can now limit the number of evaluations individually for any response (objective or constraint) regardless of its computational cost using the new @GT/EvaluationLimit response hint. In Adaptive Design, you can also use the new hint to require a certain number of linear response evaluations to train a linear response model. See the GTOpt and GTDoE hint references for full details.

    This update deprecates the @GTOpt/ExpensiveEvaluations response hint, which works only with computationally expensive responses. GTOpt will keep supporting @GTOpt/ExpensiveEvaluations for compatibility with previous pSeven Core versions, but you are advised to avoid using that deprecated hint in favor of @GT/EvaluationLimit.

  • 231 GTApprox: optimized operations with models that have a high number of inputs or outputs. You may notice an increase in performance when loading such models from disk or accessing their details.

15.6.24.2. Documentation

15.6.24.3. Bugfixes

  • 234 GTOpt, GTDoE: fixed an issue where you could get an empty result if variable or response names contain Unicode characters — for example, when using a GTApprox model with Unicode names of inputs and outputs as a blackbox in build_doe().
  • 219 231 GTApprox: fixed an issue where training a model with a high number of outputs could fail at the finalization stage with a MemoryError exception.
  • 243 GTApprox: fixed an issue where you could not compile a model exported to C (program source, MEX source, Excel-compatible DLL source) or one of the supported FMI formats due to a syntax error in the exported C code (undefined symbol).

15.6.25. pSeven Core 6.31.1

15.6.25.1. Updates and Changes

  • 217 GTOpt, GTDoE: improved accuracy of linear response models trained internally when you enable the GTOpt/RestoreAnalyticResponses or GTDoE/AdaptiveDesign/RestoreAnalyticResponses option.
  • 227 GTOpt, GTDoE: improved stability and accuracy when training internal models of quadratic constraints, in particular, equality type constraints, which previously could lead to an infeasible problem.

15.6.25.2. Bugfixes

  • 230 227 GTOpt, GTDoE: fixed an issue with determining the evaluation limit (the maximum number of iterations allowed) when it is not set by the user: in problems where all variables are non-continuous, meaning that there is a finite number of possible combinations of variable values, the evaluation limit determined by optimization or Adaptive Design techniques could get higher than the number of possible combinations, leading to duplicate evaluations and other unwanted effects.

15.6.26. pSeven Core 6.31

15.6.26.1. New Features

  • 176 FMI 2.0 support: you can export a GTApprox model to a Functional Mock-up Unit (FMU) for Model Exchange and Co-Simulation — a new type of FMU introduced by the FMI 2.0 standard. See section Model Export and export_fmi_20() for details. GTApprox also keeps the support for exporting FMI 1.0 compliant FMU for Model Exchange (export_fmi_me()) and FMU for Co-Simulation (export_fmi_cs()).
  • 170 GTApprox provides a new function to split a data sample into train and test subsets optimized for model training and validation — see train_test_split().
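
For comparison, a plain random holdout split is easy to sketch in pure Python (a generic illustration with hypothetical names; the actual train_test_split() in GTApprox additionally optimizes which points go into each subset, so refer to its API description for the real signature):

```python
import random

def simple_train_test_split(sample, test_ratio=0.2, seed=0):
    """Randomly split a sample into train and test subsets (plain holdout)."""
    rng = random.Random(seed)
    indices = list(range(len(sample)))
    rng.shuffle(indices)
    n_test = max(1, int(round(len(sample) * test_ratio)))
    test_idx = set(indices[:n_test])
    train = [p for i, p in enumerate(sample) if i not in test_idx]
    test = [p for i, p in enumerate(sample) if i in test_idx]
    return train, test

sample = [(float(i), float(i) ** 2) for i in range(10)]
train, test = simple_train_test_split(sample, test_ratio=0.3)
print(len(train), len(test))  # 7 3
```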

15.6.26.2. Updates and Changes

  • 98 GTApprox: improved quality of models with many categorical variables, or a few categorical variables that have large sets of levels.

15.6.26.3. Documentation

15.6.26.4. Compatibility Issues

This release changes handling of categorical variables in GTApprox, which results in improved quality of models with categorical variables but creates a minor compatibility issue in RSM models with categorical variables. For such models, details may now omit the "Regression Model" key, which previously was present in most cases. See sections Version Compatibility Issues and Regression Model Information for details.

15.6.26.5. Bugfixes

  • 219 GTApprox: fixed a “maximum recursion depth exceeded” error in smart training (build_smart()) when training a model with a high number of outputs.
  • 223 GTDoE: fixed an issue in Adaptive Design of Experiments where it ignored the batch size settings (GTDoE/ResponsesScalability or GTDoE/Adaptive/OneStepCount) in tasks with non-continuous variables.
  • 215 GTDoE: fixed an issue with the Adaptive Design technique where the number of designs generated could be less than expected, if some responses evaluate to NaN.

15.6.27. pSeven Core 6.30

15.6.27.1. Updates and Changes

  • 135 216 136 GTApprox: several updates in smart training aimed at improving the balance between training performance and final model quality, and providing better support for incremental training (training with an initial model). In particular, the MoA and GBRT techniques are now enabled in smart training by default.
  • 136 GTApprox: smart training models are no longer required to support gradients. This change was necessary to enable the GBRT technique in smart training by default. To train a model with gradient support, add the gradient requirement using the @GTApprox/ModelFeatures hint (see section Model Features for details).
  • 134 GTApprox: improved performance of the RSM technique when training quadratic models.

15.6.27.2. Documentation

15.6.27.3. Bugfixes

  • 203 GTOpt, GTDoE: fixed an issue where optimization or adaptive DoE results could contain NaN values of linear responses.
  • 216 GTOpt, GTDoE, GTApprox: fixed an internal issue which sometimes caused performance degradation of optimization, adaptive DoE, or smart training in the Windows version of pSeven Core. That issue did not affect the quality of optimization or adaptive DoE results, or quality of GTApprox models.
  • 134 201 GTApprox: fixed a few issues with training parallelization, which could cause performance degradation with large training samples.
  • 195 GTApprox: fixed an issue with train and test subsets selection in smart training, which could negatively affect the final model quality.
  • 209 GTApprox: fixed a deadlock issue in smart training, which could appear if the GBRT technique is enabled.
  • 135 GTApprox: fixed an issue where details of a MoA model obtained from smart training did not contain information about MoA option values used in training.
  • 135 GTApprox: fixed an issue where it was impossible to interrupt training when the MoA technique is used.
  • 134 GTApprox: fixed an issue where smoothing was enabled for RSM models even though that feature is practically useless for RSM.

15.6.28. pSeven Core 6.29

15.6.28.1. New Features

  • 169 The Adaptive Design technique in GTDoE now supports batch mode, which enables efficient usage of blackboxes that support concurrent response evaluations. To set up concurrency, use the new GTDoE/ResponsesScalability option.
  • 186 When training a GTApprox model with linear dependency between outputs, you can specify an error threshold for the internal model of that dependency using the new GTApprox/PartialDependentOutputs/RRMSThreshold option. For details, see section Output Dependency Modes and the option description.
  • 189 150 Introduced a new type of variables — stepped variables, supported in GTOpt and GTDoE. This type is intended for a rather common case of a variable that represents a continuous quantity, but can be changed only in certain increments due to various practical reasons. To specify variable type, set the @GT/VariableType hint when adding a variable. For more information about stepped variables, see section Types of Variables in the GTDoE guide.
  • 189 In optimization, you can specify resolution for variables, which enables GTOpt to adjust precision of optimization algorithms accordingly and often improves performance. For details, see the @GT/Resolution hint description. A similar hint is also supported in GTDoE.
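
The RRMS threshold mentioned above is a relative error measure; a common definition divides the RMS error of a model by the standard deviation of the data. A small sketch under that assumption (illustrative only; check the GTApprox/PartialDependentOutputs/RRMSThreshold option description for the exact definition used by GTApprox):

```python
import math

def rrms(actual, predicted):
    """Relative RMS error: RMS error divided by the std of the actual data."""
    n = len(actual)
    mean = sum(actual) / n
    rms_err = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
    std = math.sqrt(sum((a - mean) ** 2 for a in actual) / n)
    return rms_err / std

# Hypothetical check: accept an internal linear output-dependency model only if
# its RRMS error is below a user-specified threshold.
y_true = [1.0, 2.0, 3.0, 4.0]
y_linear_fit = [1.1, 1.9, 3.05, 3.95]
threshold = 0.1
print(rrms(y_true, y_linear_fit) < threshold)  # True for this near-linear fit
```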

15.6.28.2. Updates and Changes

  • 90 GTOpt, GTDoE: you can set up a strict requirement to train internal models of linear responses, which requires initial sampling of such responses prior to starting optimization but makes the behavior regarding linear responses more predictable: if modeling succeeds, the algorithm continues to use the obtained internal response model; if linear response modeling fails, the algorithm stops immediately. See the GTOpt/RestoreAnalyticResponses and GTDoE/AdaptiveDesign/RestoreAnalyticResponses option descriptions for more details.
  • 177 GTDoE: unified behavior of DoE techniques in cases when it is not possible to generate the requested number of points. Space-filling techniques no longer raise an exception in such cases but issue a warning and return a result, which contains all points the technique could generate.
  • 188 GTApprox, GTDF, GTDR: the following methods are deprecated and will be removed in future versions:

15.6.28.3. Documentation

15.6.28.4. Bugfixes

  • 191 GTOpt: fixed an issue where getting the string representation of an infeasible problem result (for example, printing the result) raised an exception.
  • 199 GTOpt, GTDoE: fixed an issue where optimization and adaptive DoE techniques in tasks with discrete variables that have non-uniform distribution of levels (for example: [0.0, 0.5, 1.0, 2.0]) incorrectly handled linear constraints that do not depend on those discrete variables. Due to internal variable and response type conversions, all linear constraints were treated as generic functions, which often led to their violation when it could be avoided. Now linear constraints that do not depend on discrete variables of the said kind are always processed as linear functions. Note that if a constraint is defined as linear, and it depends on a discrete variable, behavior does not change from the previous version: such constraints are internally treated as generic.
  • 177 GTApprox: fixed a compatibility issue in export of TA models with BSPL factors to the Octave format.
  • 197 GTApprox: fixed an issue where details of a model with categorical variables could contain incorrect information about input constraints.
  • 187 GTDoE: fixed an issue where result designs() returned an empty array if the Adaptive Design technique was used in a task with no responses.
  • 177 GTDoE: fixed issues with frozen variables (constants) in the Optimal Design technique.
  • 161 GTDoE: fixed an issue where using a GTApprox model with a neglected variable as a blackbox in build_doe() caused an exception.
  • 196 GTDoE: fixed an issue where using a GTApprox model with categorical outputs as a blackbox in build_doe() raised an exception in space-filling DoE techniques that actually support such models.
  • 192 GTDoE: fixed incorrect behavior of the GTDoE/Sequential/Leap and GTDoE/Sequential/Skip options in the Random Sampling technique.

15.6.29. pSeven Core 6.28

15.6.29.1. Updates and Changes

  • 158 GTApprox: improved time limit implementation (@GTApprox/TimeLimit) in smart training (build_smart()) to take advantage of the time-quality trade-off provided by the GTApprox/Accelerator option. Smart training now automatically adjusts GTApprox/Accelerator individually for every approximation technique it tries. With lower time limits, internal models created during smart training are less accurate but train faster, which allows GTApprox to try more techniques in given time. This potentially yields better results than training internal models with maximum quality but limiting the set of used techniques, as in previous versions.
  • 141 GTDoE: improved automatic selection of levels for continuous variables in the Full Factorial technique. Levels are now selected with the aim to generate a full factorial design whose size is as close as possible to the requested point count, and the technique is now able to select a different number of levels for each continuous variable. For example: in previous versions, the Full Factorial technique always selected the same number of levels for each continuous variable, so for a design with 5 continuous variables and a requested point count of 242 (\(3^5-1\)) it generated 32 points (\(2^5\)), because it could select only 2 levels for each of the 5 variables. The new version generates 216 points in this case; the variables in the resulting design have a varying number of levels (\(4 \times 3 \times 3 \times 3 \times 2 = 216\)).
  • 141 GTDoE: similarly, improved selection of levels for continuous variables in the Orthogonal Array technique in the case when the number of levels is not specified. The technique now selects a varying number of levels for continuous variables in order to generate a proper or balanced array (see Array Types and Requirements) with size close to the requested point count. Previously it often generated a full factorial DoE with a minimum number of levels.
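
The level selection example from the Full Factorial entry above can be reproduced with a simple greedy sketch (illustrative only, not the actual GTDoE algorithm): starting from 2 levels per variable, keep incrementing level counts while the full factorial size stays within the requested count.

```python
def pick_levels(n_vars, requested, max_levels=10):
    """Greedily choose per-variable level counts whose product approaches
    the requested full factorial size without exceeding it."""
    levels = [2] * n_vars  # minimum meaningful number of levels
    improved = True
    while improved:
        improved = False
        for i in range(n_vars):
            trial = levels[:]
            trial[i] += 1
            size = 1
            for v in trial:
                size *= v
            if size <= requested and trial[i] <= max_levels:
                levels = trial
                improved = True
    return levels

# 5 continuous variables, requested point count 242 (3**5 - 1):
levels = pick_levels(5, 242)
size = 1
for v in levels:
    size *= v
print(sorted(levels, reverse=True), size)  # [4, 3, 3, 3, 2] 216
```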

15.6.29.2. Bugfixes

  • 158 GTApprox: fixed an issue with using the GBRT technique in smart training (build_smart()) where it stopped prematurely after training a candidate GBRT model, if you did not use a test sample.
  • 185 GTApprox: fixed an issue where internal validation information was missing from a model trained in the partial linear output dependency mode, if the training sample contained duplicates.
  • 158 GTApprox: fixed an issue where parallel threads sometimes were not used in training even though parallelization was enabled.
  • 141 GTDoE: fixed various issues with categorical variables in the Adaptive Design of Experiments technique.
  • 141 GTDoE: fixed various issues with discrete variables in the Optimal Design technique.
  • 141 GTDoE: fixed an issue where the Parametric Study technique incorrectly selected levels for continuous variables, using one of the variable bounds instead of a central point placed between bounds.
  • 141 GTDoE: fixed various issues related to using constant variables in different DoE techniques (discrete and categorical variables defined with 1 level or “frozen” continuous variables defined with equal bounds).

15.6.30. pSeven Core 6.27

15.6.30.1. Updates and Changes

  • 111 GTDoE: the following techniques now have a default for the number of DoE points they can generate: Full Factorial, Fractional Factorial, Parametric Study, Optimal Design, Box-Behnken, Orthogonal Array. To use the default, set the count of points to generate to 0.
  • 151 You can now get all available problem data from p7core.Result using the new designs() method.
  • 122 GTApprox: exported models with categorical inputs or outputs now support string values of those inputs and outputs.
  • 122 GTApprox: models exported to C can now load samples from CSV files.
  • 122 GTApprox, GTDR: models exported to C now output only model values (no gradient values) to simplify data piping: for example, now you can redirect a GTDR model output to a GTApprox model input as is. To output gradient values as in previous versions, add the -d option to the model’s command line.

15.6.30.2. Documentation

15.6.30.3. Bugfixes

  • 174 GTApprox: fixed an issue where training a model in the partial linear output dependency mode caused an error if point weights were specified.
  • 146 GTApprox: fixed an issue where smart training with an initial model sometimes failed because it could not train a model with specified options and hints.
  • 160 GTApprox: fixed several issues with training a model with categorical outputs, which could cause errors in training.
  • 123 GTApprox: fixed issues with incorrect statistics for categorical outputs in model details.
  • 122 GTApprox: fixed several issues related to using string values with models that have categorical inputs or outputs.
  • 122 GTApprox: fixed an issue where Unicode names of model variables could cause malformed log messages and other errors in logging.
  • 151 GT: fixed an issue where da.p7core.Result.solutions() returned result data with field order not as specified by its fields argument.

15.6.31. pSeven Core 6.26

15.6.31.1. Updates and Changes

  • 106 GTOpt: improved stability of surrogate-based optimization algorithms in problems where local search methods are applicable to improve solution accuracy.
  • 95 GTDoE: discrete and categorical variables are now supported in most DoE techniques with some technique-specific limitations — see section Types of Variables in the GTDoE guide for details.
  • 95 GTDoE: the build_doe() method now supports all GTDoE techniques.

15.6.31.2. Documentation

15.6.31.3. Bugfixes

  • 137 GTOpt: fixed an issue with batch evaluation support where the GTOpt/ResponsesScalability option was sometimes ignored in multi-objective problems or if local search is enabled.
  • 173 GTApprox: fixed an issue with internal validation where specifying the size of validation subsets (GTApprox/IVSubsetSize) could lead to an error if the training sample contains ambiguous points (different response values for the same inputs).
  • 163 GTApprox: fixed an issue with smart training where build_smart() produced a warning if the GBRT technique is enabled in training settings.
  • 147 GTDoE: fixed an issue with the Adaptive Design of Experiments technique where it could raise an exception if the initial sample consists of a single point or contains a constant input column.

15.6.32. pSeven Core 6.25

15.6.32.1. New Features

  • 94 GTApprox, GTDF and GTDR now support using pandas DataFrame and Series as data samples when training and evaluating models. In GTApprox in particular, pandas data samples may be used to specify categorical inputs and outputs — see sections Categorical Variables and Categorical Outputs in the GTApprox guide for details. Model evaluation methods now return pandas data samples if the sample to evaluate is a pandas.DataFrame or pandas.Series.
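A sketch of the kind of data layout now accepted: a pandas DataFrame as the input sample, with an object-dtype string column serving as a categorical input, and a Series as the output sample. The column names and values here are illustrative only, not taken from the pSeven Core documentation:

```python
import pandas as pd

# Input sample: one continuous column, one string-valued column that
# a trainer could treat as a categorical input.
x = pd.DataFrame({
    "temperature": [300.0, 350.0, 400.0, 450.0],
    "material":    ["steel", "steel", "alloy", "alloy"],
})
# Output sample as a named Series.
y = pd.Series([1.2, 1.5, 2.1, 2.4], name="stress")

print(x.shape, y.name)
```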

15.6.32.2. Updates and Changes

  • 119 GTDoE: improved quality of the Adaptive Design technique in cases where objective evaluations are prohibited (sample-based Adaptive Design with objectives), or the blackbox returns NaN values of an objective.
  • 116 GTApprox: improved parallelization of the stepwise regression algorithm in RSM, resulting in increased performance in cases where stepwise regression is used as the RSM feature selection method (see GTApprox/RSMFeatureSelection), and also increased performance of smart training (build_smart()) with stepwise regression enabled (default).
  • 120 GTApprox: improved tuning of GBRT and MoA technique parameters in smart training.
  • 140 GTApprox: introduced new model input and output variability types: set (input) and piecewise constant (output). Variability type is automatically determined by GTApprox when training a model. See section Input and Output Descriptions in the GTApprox guide for details.
  • 120 140 GTApprox: the TBL technique now supports model update (training with an initial model). Note that the TBL technique can update only TBL models.
  • 120 GTApprox: clarified various error and warning messages related to handling categorical data in training samples.
  • 144 GTOpt: the GTOpt/Techniques option, which limits the set of allowed optimization techniques, is now deprecated and kept for version compatibility only. It should no longer be used, as it will be removed in a future version.

15.6.32.3. Documentation

15.6.32.4. Bugfixes

  • 125 116 GTApprox: fixed several issues in smart training (build_smart()), due to which it did not behave as intended when interrupted by a watcher (see Watchers) and could ignore interrupts, become unresponsive, or enter a deadlock.
  • 116 GTApprox: fixed an issue where using the @GTApprox/TimeLimit hint in smart training could lead to a deadlock.
  • 149 GTDoE: fixed an issue in the Adaptive Design technique where it ignored the initial sample in problems with categorical or discrete variables.
  • 140 GTApprox: fixed several issues in determining types of model inputs and outputs, which could lead to inability to use an initial model in training.
  • 140 GTApprox: fixed an issue with editing metainformation in models with categorical outputs, where values of output levels (class labels) could be replaced with indexes of levels (class indexes) if you use modify().
  • 94 GTApprox: fixed an issue with updating a model with categorical inputs (using an initial model in training), where values of categorical inputs in the training sample were not checked against the levels of categorical inputs in the initial model.
  • 94 GTApprox: fixed an issue with GBRT models where categorical inputs were described as continuous in model details.
  • 132 GTOpt: fixed an issue where a problem sometimes could be incorrectly qualified as a NaN/Inf problem if it returns NaN values of objectives or constraints at some points.

15.6.33. pSeven Core 6.24

15.6.33.1. New Features

  • 52 GTApprox, GTDR: added the support for exporting a model to a set of source files, which helps avoid compilation problems when exporting large models. Also you can now pack source files into an archive upon export. See the export_to(), compress_export_to(), and decompress_export_to() method descriptions for details.

15.6.33.2. Updates and Changes

  • 127 GTOpt: improved solver behavior when the GTOpt/ResponsesScalability option is specified so that sizes of point batches sent for evaluation are better adjusted to the GTOpt/ResponsesScalability value. Note that it does not guarantee that point batch size is a multiple of GTOpt/ResponsesScalability at every solving iteration; see the option description for details.
  • 128 GTOpt: all problem classes now support skipped response evaluations, so you are no longer required to implement gtopt.ProblemGeneric.evaluate() just for this one feature. Skipped evaluations are indicated by None values in evaluation results. In addition, evaluate() now also supports None response values.
  • 52 GTApprox, GTDR: model export to C and Octave now generates less code.

15.6.33.3. Documentation

15.6.33.4. Bugfixes

  • 107 GTApprox: fixed an issue with multithreading on processors with a high number of cores (10 or more), which caused training performance degradation compared to pSeven Core 6.16.
  • 96 GTApprox: fixed an issue where you could not update a GBRT model (use incremental training), if the initial model was trained with a sample that contains duplicates.
  • 129 GTOpt: fixed an error when using analytical gradients in an optimization problem with a combination of cheap and expensive responses.

15.6.34. pSeven Core 6.23

15.6.34.1. New Features

  • 72 GTApprox: you can now get a progress estimate while training a model, using a watcher attached to Builder (see Watchers).

15.6.34.2. Updates and Changes

  • 89 GTApprox: improved smart training algorithms to avoid loss of model quality in cases where the training sample is a Cartesian product of several factor sets (as in Tensor Approximation).
  • 114 GTApprox: improved smart training algorithms to avoid performance degradation when training models with a high number of input variables.
  • 114 GTApprox: the GTApprox/Accelerator option now affects RSM parameter estimation in smart training.

15.6.34.3. Documentation

15.6.34.4. Bugfixes

  • 103 GTApprox: fixed an issue where training a model with named inputs and outputs in Python 2.7 raised a UnicodeEncodeError exception if the names contain local language characters.
  • 115 GTApprox: fixed an issue where you could not use a model with categorical outputs as an initial model in training (update a model).
  • 99 GTApprox: fixed issues with compatibility of GTApprox/CategoricalOutputs with the GTApprox/Componentwise and GTApprox/DependentOutputs options.
  • 99 GTApprox: fixed an issue where the training_sample of a model with categorical outputs contained enumerations of output levels instead of actual output values.
  • 99 GTApprox: fixed meaningless values in output sample statistics for models with categorical outputs.
  • 72 GTApprox: fixed a rare error that occurred when training a model with categorical variables with the @GTApprox/EnabledTechniques smart training hint.
  • 104 GTDoE: fixed the Adaptive Design technique behavior in cases where the blackbox returns NaN values of an objective.

15.6.35. pSeven Core 6.22

15.6.35.1. New Features

  • GTApprox now supports training models with categorical (discrete) outputs, which can take values only from a predefined set (the output levels) and can be strings. See section Categorical Outputs for details.

15.6.35.2. Updates and Changes

  • 78 GTOpt, GTApprox: improved the support for interactive Python environments, such as JupyterLab or interactive Python consoles. You can now press Ctrl+C to stop optimization or model training that runs in an interactive session.
  • 67 GTApprox: added the GTApprox/CategoricalOutputs option.
  • 67 GTApprox: updated details:
    • For a categorical output indexed j, details["Output Variables"][j]["variability"] is "enumeration", and details["Output Variables"][j]["enumerators"] contains the list of output levels.
    • Added the cross-entropy loss metric for categorical outputs. In dictionaries containing model accuracy information, values of this metric are stored under the "LogLoss" key (smaller values are better). For continuous outputs, this metric is always NaN.
  • 67 GTApprox, GTDF: updates similar to the above appear in gtapprox.Model.validate(), gtapprox.Model.iv_info, gtdf.Model.validate(), and gtdf.Model.iv_info — see their descriptions for details.
  • 67 GTApprox: if the model has string categorical outputs, calc() returns an ndarray with dtype=object to accommodate string values.
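The cross-entropy ("LogLoss") metric introduced above is the mean negative log-probability that a classifier assigns to the true class; smaller is better, and a perfect classifier approaches 0. A minimal generic sketch (not the GTApprox implementation; the labels and probabilities below are invented):

```python
import math

def log_loss(true_labels, predicted_probs):
    """Mean negative log-probability assigned to the true class."""
    eps = 1e-15  # guard against log(0)
    total = 0.0
    for label, probs in zip(true_labels, predicted_probs):
        total -= math.log(max(probs[label], eps))
    return total / len(true_labels)

labels = ["cat", "dog", "cat"]
probs = [{"cat": 0.9, "dog": 0.1},
         {"cat": 0.2, "dog": 0.8},
         {"cat": 0.6, "dog": 0.4}]
print(round(log_loss(labels, probs), 4))  # 0.2798
```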

15.6.35.3. Documentation

15.6.35.4. Bugfixes

  • 83 GTOpt: fixed a bad allocation error when solving a large-scale problem with integer variables.
  • 83 GTOpt: fixed slow initialization of large-scale problems.
  • 67 GTApprox: fixed a bug in the GBRT technique, which caused an error when training a GBRT model with many categorical variables with large sets of levels.

15.6.36. pSeven Core 6.21

15.6.36.1. New Features

  • 63 GTDoE: improved Adaptive Design of Experiments to minimize the RRMS error of Gaussian process (GP) models trained using a DoE generated by this technique. Added two new options to control the technique behavior.
  • 63 GTDoE: the Adaptive DoE generation result now includes the internal model trained during generation (see gtdoe.Result.model).

15.6.36.2. Updates and Changes

  • 26 GTApprox: increased performance and improved stability of the PLA training technique.
  • 26 GTApprox: added extrapolation support to models trained using the PLA technique.

15.6.36.3. Documentation

15.6.36.4. Bugfixes

  • 71 GT: fixed an issue with multithreading in the Windows version of pSeven Core, due to which, since version 6.17, it did not utilize all available CPU cores on systems with several processor groups (only one processor group was used).
  • 75 GTApprox: fixed an issue in build_smart() due to which it performed internal validation twice if a testing sample is specified.
  • 75 GTApprox: fixed a bug in internal validation, due to which it produced incorrect results if any of the training samples is a slice of a NumPy array.

15.6.37. pSeven Core 6.20

15.6.37.1. New Features

15.6.37.2. Updates and Changes

  • 5 GT: all blackbox-based techniques now accept GTApprox and GTDF models as a blackbox, in addition to the generic Blackbox. See section Blackbox for details.

15.6.37.3. Documentation

15.6.37.4. Bugfixes

  • 43 GTOpt: fixed an issue where solving a valid robust optimization problem raised an UnsupportedProblemError.
  • 57 GTApprox: fixed an error that could occur when loading a model from a file, if the model was trained with an accuracy evaluation requirement (GTApprox/AccuracyEvaluation enabled).

15.6.38. pSeven Core 6.19

15.6.38.1. New Features

  • 16 The Adaptive Design technique in GTDoE (build_doe()) now supports two essential settings: the aim for the number of feasible points to generate (the count argument) and the response evaluation limit (see GTDoE/AdaptiveDesign/MaximumIterations, GTDoE/AdaptiveDesign/MaximumExpensiveIterations). Generation stops either when it finds the aimed number of feasible points, or when it reaches the evaluation limit — whichever happens first. You can also use the method with such parameter combinations as:
    • Aim for the maximum possible number of feasible points within some evaluation limits: set count to 0 and impose limits with the above options.
    • Aim for a specific number of feasible points without predefined evaluation limits: set count to the aimed number of points and leave the limiting options default.
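The stopping logic described above can be sketched in generic Python (this is an illustration of the semantics, not the GTDoE API; is_feasible is a stand-in for a real constraint check):

```python
def generate(aim_feasible, max_evaluations, candidates):
    """Stop when the aimed number of feasible points is found or the
    evaluation budget is exhausted, whichever happens first.
    aim_feasible == 0 means: no aim, run until the limit."""
    feasible = []
    evaluations = 0
    for point in candidates:
        if aim_feasible and len(feasible) >= aim_feasible:
            break  # reached the aimed number of feasible points
        if evaluations >= max_evaluations:
            break  # reached the evaluation limit
        evaluations += 1
        if is_feasible(point):
            feasible.append(point)
    return feasible, evaluations

def is_feasible(point):
    return point % 2 == 0  # toy constraint: even numbers are "feasible"

# Aim for 3 feasible points within at most 10 evaluations.
points, used = generate(3, 10, range(100))
print(points, used)  # [0, 2, 4] 5
```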

15.6.38.2. Updates and Changes

  • 16 GTOpt: the solver can now handle evaluation failures — evaluate() may return a “could not calculate” response, and optimization continues, if the number of failures is not too high. See the method description for details.
  • 16 GTOpt: when optimization finishes or is interrupted, and there are no results to return, solve() now returns an empty solution with an appropriate result status, instead of raising an exception.
  • 34 GTApprox, GTDF: more flexible internal validation settings. Set either the number or size of cross validation data subsets with GTApprox/IVSubsetCount or GTApprox/IVSubsetSize (and similar GTDF options). A suitable number of training sessions is determined automatically, or you can limit it with GTApprox/IVTrainingCount (GTDF/IVTrainingCount). See the option descriptions for full details.
  • 9 GTApprox: the GBRT technique now handles NaN values of variables in the training sample in a specific way. If you set the GTApprox/InputNanMode option, it keeps points where some (but not all) variables are NaN and actually uses them in training.
  • 9 GTApprox: if the model validation sample contains invalid data, validate() now issues a warning instead of raising an exception.
  • 25 GTDoE: improved uniformity and variability of DoE generated by the Adaptive Design technique in tasks with constraint responses and no objectives.
  • 16 GTDoE: due to the recent improvements in performance and quality of the OLHS technique, it is now the default initial DoE technique in the blackbox-based adaptive mode in generate() (see GTDoE/Adaptive/InitialDoeTechnique).
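The partial-NaN handling described for GBRT above can be illustrated with generic NumPy (this is not the GTApprox implementation): rows where only some inputs are NaN are kept, while fully-NaN rows are excluded.

```python
import numpy as np

x = np.array([[1.0, 2.0],         # complete row: kept
              [np.nan, 3.0],      # partially NaN: kept for training
              [np.nan, np.nan]])  # fully NaN: dropped

# Keep rows where at least one input value is present.
keep = ~np.isnan(x).all(axis=1)
print(x[keep].shape[0])  # 2 rows remain
```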

15.6.38.3. Documentation

15.6.38.4. Bugfixes

  • 16 GTOpt, GTDoE: fixed some issues in validation of variable bounds, which rarely led to incorrectly qualifying the problem as infeasible.
  • 17 GTApprox: fixed an issue where a TA or iTA model training failed if GTApprox/TAModelReductionRatio is 1 and exact fit is required. Reduction ratio 1 means no reduction, so the exact fit requirement is valid in this case.
  • 54 GTApprox: fixed a crash when smoothing an SGP model trained on noisy data with given output noise variance (see Data with Errorbars).
  • 19 GTDF: fixed the inability to export some GTDF models to C#. Note that exporting a GTDF model requires converting it to a GTApprox model first, by loading the GTDF model from a file via the gtapprox.Model constructor.
  • 8 GT: fixed various issues with incorrect or excessively strict argument type checks, which did not agree with common Python practices.

15.6.39. pSeven Core 6.18

Note

This release drops the support for old versions of Python 2 and the support for 32-bit Linux platforms.

  • Using pSeven Core in Python 2 now requires at least Python 2.7.
  • pSeven Core for Linux no longer supports 32-bit platforms. pSeven Core for Windows continues to support both 32-bit and 64-bit Windows editions.

15.6.39.1. Updates and Changes

  • 4 33 37 18293 GTApprox: the Tensor Approximation (TA) technique now supports piecewise linear approximation (PLA) as a technique that you can use for TA factors. In particular, this enables TA to provide n-linear approximations on a grid.
  • 18423 GTApprox: improved compatibility of exported model code, added more validity checks to avoid compilation issues.
  • 18293 GTApprox: updated training sample size requirements for the HDA, GP, HDAGP, SGP, and MoA techniques. See section Sample Size Requirements in the GTApprox guide for details.
  • 18552 GTApprox: models converted from GTDF models (by loading a GTDF model from a file) now support accuracy evaluation.
  • 19226 GTDoE: added the da.p7core.gtdoe.measures.discrepancy() function to calculate the DoE discrepancy metric, which is a robust uniformity metric (see section Uniformity).
  • 18552 GTDF: added the gtdf.Model.grad_ae() method to calculate model’s accuracy evaluation gradients.
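Discrepancy measures like the one added to gtdoe.measures quantify how uniformly a sample fills the unit hypercube, with lower values meaning a more uniform design. As an illustration of the idea (not necessarily the metric da.p7core implements), the widely used centered L2 discrepancy can be computed as follows:

```python
import numpy as np

def centered_l2_discrepancy(points):
    """Centered L2 discrepancy of a sample in the unit hypercube;
    lower values indicate a more uniform space-filling design."""
    x = np.asarray(points, dtype=float)
    n, d = x.shape
    dev = np.abs(x - 0.5)
    term1 = (13.0 / 12.0) ** d
    term2 = (2.0 / n) * np.prod(1.0 + 0.5 * dev - 0.5 * dev ** 2, axis=1).sum()
    # Pairwise kernel, multiplied over dimensions.
    diff = np.abs(x[:, None, :] - x[None, :, :])
    g = 1.0 + 0.5 * dev[:, None, :] + 0.5 * dev[None, :, :] - 0.5 * diff
    term3 = np.prod(g, axis=2).sum() / n ** 2
    return np.sqrt(term1 - term2 + term3)

spread = [[0.125], [0.5], [0.875]]    # evenly spread in [0, 1]
clustered = [[0.10], [0.11], [0.12]]  # bunched together
print(centered_l2_discrepancy(spread) < centered_l2_discrepancy(clustered))  # True
```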

15.6.39.2. Documentation

15.6.39.3. Bugfixes

  • 18796 GTApprox: fixed an extrapolation issue in some MoA models, which returned 0 everywhere outside of the training sample domain.
  • 10 GTApprox: fixed an issue where evaluating a MoA model exported to Octave (.m) raised an error if the input point is outside of the training sample domain.

15.6.40. pSeven Core 6.17

Note

pSeven Core 6.17 is going to be the last major release that provides support for old versions of Python 2 and for 32-bit Linux platforms.

  • Future versions of pSeven Core will not support Python 2.5 and 2.6. Using pSeven Core in Python 2 will require at least Python 2.7.
  • Future versions of pSeven Core for Linux will not provide a 32-bit setup package. This concerns only the Linux version: pSeven Core for Windows will continue to support both its 32-bit and 64-bit editions.

15.6.40.1. New Features

  • 18958 GTApprox: you can now select which techniques are enabled or disabled in smart training using the new @GTApprox/EnabledTechniques hint. See section Training Features in Smart Training for details.
  • 17845 GTApprox: added the support for model output thresholds — another kind of output bounds. When thresholds are set for a model, the model guarantees that its outputs are always within bounds: if it calculates some output value which exceeds a threshold, it returns the threshold value instead. See sections Model Metainformation and Output Constraints for details.
  • 19166 GTDoE: the OA technique now supports the property-preservation mode: given an initial sample which is an orthogonal array, it updates the sample with the requested number of points in such a way that the resulting sample is also an orthogonal array.
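The output-threshold behaviour described above amounts to clamping: any computed output beyond a threshold is replaced by the threshold value. A generic NumPy sketch (not the GTApprox API; the threshold values are hypothetical):

```python
import numpy as np

lower, upper = 0.0, 100.0  # hypothetical output thresholds
raw_outputs = np.array([-5.0, 42.0, 250.0])

# Values exceeding a threshold are replaced by the threshold itself,
# so outputs always stay within bounds.
bounded = np.clip(raw_outputs, lower, upper)
print(bounded.tolist())  # [0.0, 42.0, 100.0]
```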

15.6.40.2. Updates and Changes

  • 19494 GTOpt, GTDoE: added more robust checks for the initial sample data passed to Surrogate-Based Optimization and Adaptive Design. When analyzing the initial sample, these techniques now update it as needed to ensure that the sample properly covers the design space defined by variable bounds. In particular, this helps avoid unwanted localization in tasks where the initial sample is clustered in a relatively small area of the design space.
  • 19508 GTOpt: added an optional local search mode to the Surrogate-Based Optimization technique. In this mode, GTOpt searches an area close to the current optimum, adjusting the location and size of this area for new iterations. It uses local response models, which take less time to train, thus making the algorithm less time-consuming. See the GTOpt/LocalSearch option for details.
  • 18797 18946 19083 GTApprox: several internal improvements in smart training (build_smart()), which in many cases result in increased model quality and higher training performance.
  • 19083 GTApprox: updated the GP and HDAGP techniques to better detect overtraining and adjust the algorithm so it preserves model quality.
  • 18954 GTApprox: improved the parallel training implementation so its performance under Linux now scales better with the number of CPU cores.
  • 18807 GTApprox: improved performance of the PLA, RSM, SPLT, TA, and iTA techniques in the case when the model is trained with an option to discover and keep linear dependencies between outputs (GTApprox/DependentOutputs set to "PartialLinear").
  • 18422 GTApprox: when the GP technique automatically selects its training algorithm version (GTApprox/GPLearningMode is default), it now takes the GTApprox/Accelerator setting into account.
  • 17845 GTApprox: when you train with an initial model, its metainformation is now re-used — GTApprox copies it to the trained model and updates it with new information if you want to make changes. See Model Metainformation for details.
  • 18946 GTApprox: the model accuracy and internal validation statistics in smart training are now calculated using the entire training dataset, even if the data was split into train and test subsets using the @GTApprox/TrainingSubsampleRatio hint. Previously, smart training did not use the test subset data when calculating statistics.
  • 19165 GTApprox: changed the formatting of model accuracy and internal validation information printed to the training log for better readability.
  • 18551 GTApprox: optimized the output constraints formula for brevity (see Constraints Formula).
  • 19113 GTApprox: the GTApprox/GPType, GTApprox/RSMFeatureSelection, GTApprox/RSMType, and GTApprox/SPLTContinuity options now support the "Auto" value, which is their new default. This value is primarily intended to explicitly “unlock” an option for automatic tuning, which is a part of smart training.
  • 19120 18927 19500 GTApprox: improved compatibility with GTDF models that are loaded from a file with the gtapprox.Model constructor in order to convert a GTDF model to GTApprox.
  • 18927 GTDF: gtdf.Model.details now contain information about the training technique, options, model accuracy, and training sample statistics.
  • 18914 GTDoE: significantly increased the OA technique performance and stability for big samples and designs where variables have many levels (about 10 or more). Added an auto setting for the GTDoE/OrthogonalArray/MultistartIterations option, which is now default.
  • 19196 GTDoE: noticeably increased performance of the OLHS technique in cases where it is used to generate big samples (thousands of points).
  • 19166 GTDoE: improved the support for categorical variables in the LHS and OLHS techniques, resulting in better sample point distribution when generating a DoE with a combination of continuous and categorical variables.
  • 19147 GTDoE: improved OLHS generation quality in the property-preservation mode.
  • 19022 GTDR: updated the NLPCA technique to better avoid overtraining.
  • 19219 GTDR: model details now contain information about options used on training.
  • 18435 GTApprox, GTDF, GTDR: improved the string representation of models.
  • 18425 GTApprox, GTDF, GTDR: leading and trailing whitespaces are now stripped from strings stored to model metainformation. Also, Unicode strings in metainformation are NFKC-normalized.
  • 19262 GTApprox, GTDF, GTDR: updated the methods that export model C code for compatibility with the Tiny C Compiler.
  • 19162 GT: changed the upper limit for the number of parallel processes set by the GTApprox/MaxParallel, GTDF/MaxParallel, GTDR/MaxParallel, GTDoE/MaxParallel, GTOpt/MaxParallel, and GTSDA/MaxParallel options to 512.

15.6.40.3. Documentation

15.6.40.4. Compatibility Issues

This release changes the upper limit for the options that set the maximum number of parallel threads created by pSeven Core (GTApprox/MaxParallel and similar), as well as the upper limit for GTDoE/OrthogonalArray/MultistartIterations. These changes may require minor updates in your code — see section Version Compatibility Issues for details.

15.6.40.5. Bugfixes

  • 19190 19195 GTOpt: fixed incorrect processing of user-defined gradients in mean variance problems.
  • 19157 GTApprox: fixed a bug in smart training (build_smart()), which could cause a training error when @GTApprox/TryOutputTransformations is enabled. In such cases, training continued but some submodels were not trained completely, so the final model returned by build_smart() could miss some information or contain other issues.
  • 19127 GTApprox: fixed an error when training is configured to limit the model input domain (GTApprox/InputDomainType is set), and the training sample contains NaN values of variables which should be ignored (GTApprox/InputNanMode is set to "ignore").
  • 19269 GTApprox: added correct error handling for the case when export_to() cannot export a GBRT model to C code due to its size. It will now raise an exception when out of memory, instead of generating incorrect C code that does not compile.
  • 19203 GTApprox: fixed an issue where model training could freeze if the training sample contains many NaN output values, and the GTApprox/OutputNanMode option is set to "Predict".
  • 19131 GTApprox: fixed an issue where the GP technique could enter an infinite loop if the training sample contains degenerate data.
  • 19165 GTApprox: fixed an issue where build_smart() raised an exception if the HDAGP technique was selected manually.
  • 19165 GTApprox: fixed a bug in the HDAGP technique which could lead to overtraining in some cases.
  • 18807 GTApprox: fixed the GTApprox/InputNanMode option compatibility with the SPLT technique.
  • 18807 GTApprox: fixed the support for output transformation (GTApprox/OutputTransformation) in the PLA technique.
  • 18946 GTApprox: fixed a bug that rarely led to incorrect calculation of model accuracy and internal validation statistics, if the training sample contains many NaN output values.
  • 17879 GTApprox: fixed an issue where model details contained incorrect information about constant outputs.
  • 18807 GTApprox: fixed an issue where the training log contained incorrect values of the GTApprox/OutputTransformation option.
  • 19041 GTApprox: fixed an issue where GTApprox could not export a MoA model, which was trained in a deprecated pSeven Core version, to C# code.
  • 19111 GTApprox: fixed a bug in model export to the Octave code, due to which GTApprox failed to export a model that was trained with GTApprox/OutputNanMode set to "Predict".
  • 19512 GTApprox: fixed a compatibility issue in the exported Octave code.
  • 19137 GTApprox: fixed an incorrect error message in the case when smart training fails to train any model.
  • 19165 GTApprox: fixed several Python 2.5 compatibility issues.
  • 18432 GTDF: fixed incorrect training option information in model details.
  • 19153 GTDoE: fixed a few problems with the adaptive DoE algorithms in generate() which negatively affected their performance, stability, and result quality.
  • 19185 GTDoE: fixed an issue where exception messages, which report a technique error, contained a technique number instead of its name.
  • 19184 GT: fixed several cases where pSeven Core did not correctly release a license, which caused incorrect license usage count and other license issues.

15.6.41. pSeven Core 6.16.4

This is a maintenance release, which does not contain any functional changes or updates.

15.6.41.1. Documentation

15.6.42. pSeven Core 6.16.3

15.6.42.1. Updates and Changes

  • 18983 GTOpt: increased performance of surrogate-based optimization algorithms, resulting in noticeably faster problem solving.
  • 19019 GTOpt: increased solver performance in problems with a high number (about 500 or more) of variables or constraints.
  • 19036 GTApprox: improved stability of several approximation techniques, including MoA and GBRT.
  • 19092 GTApprox: clarified some error messages.
  • 19047 19079 GTDoE: increased performance of the Adaptive Design technique (build_doe()) and improved its stability and result quality in certain kinds of tasks, in particular when generating a uniform adaptive DoE with linear constraints.
  • 18811 GT: a LicenseError exception message now contains the text of a FlexNet Publisher license server error message when available.

15.6.42.2. Documentation

15.6.42.3. Bugfixes

  • 19112 GTOpt: fixed incorrect behavior in cases when the initial sample (the sample_x argument to solve()) contains an invalid value of a discrete or integer variable. The method now informs about such errors by raising an InvalidProblemError exception.
  • 19068 GTApprox: fixed a bug in internal validation of models trained on samples with a grid-like structure (multigrid mesh), which could cause an access violation error in the case when the training sample contains a high number (order of \(10^5\)) of points.
  • 18902 GTApprox: fixed an issue where using an initial model in training caused an error, if this initial model was a GTDF model trained in the componentwise mode (see GTDF/DependentOutputs) and loaded to GTApprox with the gtapprox.Model constructor.
  • 19104 GTApprox: fixed an issue where export to the Octave format was not available for GTDF models trained with the MFGP or DA techniques and loaded to GTApprox with the gtapprox.Model constructor.
  • 19057 GTDoE: fixed incorrect behavior of adaptive DoE algorithms in tasks with linear constraints when any of the following conditions is met:
    • the design includes discrete variables,
    • an initial sample is given, or
    • the design includes variables with different orders of magnitude, and design space normalization (GTDoE/Normalize) is disabled.
  • 19104 GTDF: fixed the GTDF/HFA/SurrogateModelType option not recognizing "GBRT" and "PLA" (actually supported approximation techniques) as its valid values.
  • 18811 GT: fixed license requests performed by pSeven Core to make them thread-safe.

15.6.43. pSeven Core 6.16.2

15.6.43.1. Updates and Changes

  • 19003 GTApprox: added a summary of model output properties to the end of the model training log. This summary lists the training options applied to each output and contains a few other details, such as the approximation techniques used and any output dependencies and constraints.
  • 18992 GTApprox: the automatic sample splitting algorithm, which is used in smart training (enabled by setting @GTApprox/TrainingSubsampleRatio to 0, see Training Features), was adjusted for the case of a large training sample (several thousand points). This change helps to avoid certain negative effects on model quality, which were observed in previous versions when automatic splitting was used with large samples.

15.6.43.2. Bugfixes

  • 19024 GTApprox: fixed an issue where the trained model could show unexpectedly high errors for some outputs, if the search for linear dependencies between outputs is enabled (GTApprox/DependentOutputs is set to "PartialLinear"), and internal validation is on.

15.6.44. pSeven Core 6.16.1

15.6.44.1. Updates and Changes

  • 18509 GTApprox: the C# source code export format is now supported for all GTApprox models (not yet supported for GTDF models loaded to gtapprox.Model). Previously, it was available only for GTApprox models trained with the HDA, RSM, or TBL technique. Note that C# source export requires an up-to-date license valid for pSeven Core 6.16 and above.
  • 18962 GTApprox: improved quality of approximation techniques based on Gaussian processes (GP, SGP and other) for noisy training data and for models with high input dimension.
  • 18888 GTDoE: changed the default value of the GTDoE/OrthogonalArray/MultistartIterations option to 500. The old default (10) was often too low to generate a proper orthogonal array.
  • 18888 GTDoE: removed the deprecated GTDoE/OrthogonalArray/MaxIterations option. Use GTDoE/OrthogonalArray/MultistartIterations to control the time-quality trade-off.

15.6.44.2. Documentation

15.6.44.3. Compatibility Issues

This release removes the deprecated GTDoE/OrthogonalArray/MaxIterations option, which may require updates in your code if you use the Orthogonal Array DoE technique. See section Version Compatibility Issues for details.

15.6.44.4. Bugfixes

  • 18918 GTOpt: fixed an issue with occasionally long solving time of the Surrogate-Based Optimization technique in problems with a high number of linear constraints.
  • 18913 GTApprox: fixed an error in the internal validation procedure, which occurred when the training sample contains many non-numeric values in the input part (values of variables), and GTApprox/InputNanMode is set to "ignore".
  • 18888 GTDoE: fixed incorrect behavior of the GTDoE/OrthogonalArray/MultistartIterations option.

15.6.45. pSeven Core 6.16

15.6.45.1. New Features

  • 17970 GTApprox: you can now limit the model’s input domain, so it returns NaN outputs for input points which do not satisfy the input constraints. The constraints can be specified manually or determined automatically by GTApprox, based on the bounding box of the training sample. See the GTApprox/InputDomainType option for details.
  • 18469 18494 GTDoE: the Adaptive Design technique and space-filling DoE techniques, which use internal quality criteria based on spatial measures (OLHS in particular), now by default perform design space normalization internally to ensure that their criteria work correctly. Also added an option to control this behavior, see GTDoE/Normalize.
  • 17455 18403 GTDoE: completely reimplemented the Orthogonal Array technique. The new version is significantly faster, more convenient to use, and more stable. One of its distinctive features is the ability to adjust to any sample size by generating a “nearly orthogonal” array — in contrast with the old implementation, which always required a specific sample size. See section Orthogonal Array for details.
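
The input domain feature above can be sketched in a few lines of plain Python (a hypothetical illustration of the idea behind GTApprox/InputDomainType, not the pSeven Core implementation): the domain is the bounding box of the training inputs, and evaluations outside it return NaN.

```python
# Hypothetical sketch: restrict a model's input domain to the bounding box of
# the training sample; points outside the box get NaN outputs.
import math

def bounding_box(train_x):
    """Component-wise lower and upper bounds of the training inputs."""
    lower = [min(col) for col in zip(*train_x)]
    upper = [max(col) for col in zip(*train_x)]
    return lower, upper

def evaluate_with_domain(model, point, lower, upper):
    """Return NaN if the point violates the input constraints."""
    if any(v < lo or v > hi for v, lo, hi in zip(point, lower, upper)):
        return math.nan
    return model(point)

train_x = [[0.0, 0.0], [1.0, 2.0], [2.0, 1.0]]
lower, upper = bounding_box(train_x)
model = lambda p: p[0] + p[1]  # stand-in for a trained model
inside = evaluate_with_domain(model, [1.0, 1.0], lower, upper)   # 2.0
outside = evaluate_with_domain(model, [3.0, 1.0], lower, upper)  # nan
```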

15.6.45.2. Updates and Changes

  • 18023 16990 17770 18500 18463 18248 GTOpt: several improvements and corrections in the internal solving algorithms resulting in higher stability and performance for certain problem types, in particular problems with no objectives (constraint satisfaction) or without constraints (unconstrained problems), and problems where design variables have significantly different orders of magnitude.
  • 18467 GTOpt: added an option to disable the special treatment of response evaluation failures, which is used by default in surrogate-based optimization (in the current and previous pSeven Core versions). Note that disabling it is useful only in rare cases where evaluation failures can occur at random. See GTOpt/DetectNaNClusters for details.
  • 17348 GTOpt: the minimum allowed number of evaluations for computationally expensive responses is now 1 instead of 3 (see GTOpt/MaximumExpensiveIterations).
  • 17881 GTApprox: improved the model validation procedure used internally in smart training to estimate quality of intermediate (candidate) models in the case when you do not supply a separate test sample. The new method provides more accurate quality estimates, ultimately resulting in more accurate final models. See section Smart Training for more details.
  • 18320 18460 18462 GTApprox: several performance and stability improvements for the MoA technique. In particular, increased performance when parallelization is enabled (see GTApprox/MaxParallel).
  • 17443 GTApprox: extended the set of GP technique options tuned by the smart training algorithm (build_smart()), potentially allowing it to create more accurate GP models.
  • 18394 GTApprox: added the initial support for periodic kernel to Gaussian processes-based techniques, see GTApprox/GPType.
  • 17443 GTApprox: further improvements (continuing from pSeven Core 6.14) in the data analysis algorithm, which is used in smart training to prevent oscillations of model outputs when a smooth model is required.
  • 17265 GTApprox: you can now convert a GTDF model into a GTApprox model by loading it from a file with the gtapprox.Model constructor. In particular, this allows exporting GTDF models to various formats using gtapprox.Model.export_to().
  • 18351 GTApprox: models trained with the HDA, RSM, or TBL technique can now be exported to C# source code (see Model Export and export_to()). Note that C# export requires an updated pSeven Core license.
  • 17968 17908 GTApprox: added information on model input and output constraints to gtapprox.Model.details. Also details now stores a copy of warnings extracted from the model’s training log (build_log). See sections Model Details, Input Constraints, and Output Constraints for details.
  • 18063 GTDoE: when an initial sample with missing values of objectives or constraints is given to adaptive DoE, it calculates the missing values when possible and includes these calculations in the final result.
  • 18467 GTDoE: the Adaptive Design technique (provided by build_doe()) now supports special treatment of response evaluation failures (similar to GTOpt), which enables it to avoid design space areas where response functions are undefined. Also added an option to control this behavior, see GTDoE/AdaptiveDesign/DetectNaNClusters.
  • 18444 GTDoE: categorical variables with only 1 level are now allowed (see GTDoE/CategoricalVariables).
  • 17908 GTDF, GTDR: gtdf.Model.details and gtdr.Model.details now store a copy of warnings extracted from the model’s training log.
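
The initial-sample behavior described above (adaptive DoE filling in missing response values) can be illustrated with a small sketch. This is a hypothetical helper in plain Python, not the pSeven Core API: the blackbox is evaluated only at the points whose response values are missing (NaN), and the completed sample is included in the result.

```python
# Hypothetical sketch: complete an initial sample whose response column has
# missing (NaN) entries by evaluating the blackbox only where needed.
import math

def complete_initial_sample(sample_x, sample_f, blackbox):
    completed = []
    evaluations = 0
    for x, f in zip(sample_x, sample_f):
        if f is None or math.isnan(f):
            f = blackbox(x)       # calculate the missing value
            evaluations += 1
        completed.append((x, f))
    return completed, evaluations

sample_x = [0.0, 0.5, 1.0]
sample_f = [0.0, math.nan, 1.0]   # the middle response is missing
completed, n_eval = complete_initial_sample(sample_x, sample_f, lambda x: x * x)
```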

15.6.45.3. Documentation

15.6.45.4. Compatibility Issues

This release contains a new implementation of the Orthogonal Array DoE technique, which is generally compatible with the old version and does not require changes in code, but introduces some changes in options and features. It may also slightly affect results of the Taguchi ranking technique in GTSDA, which uses orthogonal arrays internally. See sections Version Compatibility Issues and Orthogonal Array for more details.

15.6.45.5. Bugfixes

  • 18450 GTOpt: fixed the support for integer variables in problems with no objectives (constraint satisfaction problems).
  • 18478 GTApprox: fixed an exception in the MoA technique when it is used with an initial model and a training sample which has a grid-like structure (multigrid mesh).
  • 18050 GTApprox: fixed the incompatibility of exported model C code with 32-bit compilers.
  • 18066 GTApprox: fixed build_smart() sometimes not saving option values to model details for those options that were tuned automatically during model training.
  • 18461 GTApprox: fixed a cross-validation bug in smart training, which could lead to an error when training a model with categorical variables and point weights without a separate test sample.
  • 18461 GTApprox: fixed a bug in smart training, which could negatively affect the quality of models with categorical variables if GTApprox/IVSavePredictions is disabled.
  • 18404 GTApprox: fixed the inability to use an initial model with output transformation (GTApprox/OutputTransformation) when training a model with the GBRT or HDAGP technique.
  • 18806 GTApprox: fixed the RSM, SPLT, TA, iTA, and PLA techniques ignoring the linear output dependency option (see GTApprox/DependentOutputs).
  • 18556 GTApprox: fixed a bug in the HDA technique due to which an HDA model could produce false predictions of NaN outputs when the NaN prediction feature is enabled.
  • 18798 GTApprox: fixed a bug in MoA model export to the C source code format, which could lead to incorrect behavior of exported MoA models.
  • 18348 GTApprox: fixed a bug in model export to C due to which exported TA and iTA models with SPLT factors could return NaN outputs for points which are out of the region covered by the training sample, instead of using linear extrapolation as intended.
  • 17968 GTApprox: fixed incomplete model decomposition information in details for models trained to keep linear dependencies between outputs (GTApprox/DependentOutputs set to "PartialLinear").
  • 18392 GTApprox: fixed a bug in distributed training (set_remote_build()) due to which it sometimes did not delete temporary directories on the remote host. Also fixed a bug due to which it sometimes did not close the SSH connection properly, so the training never finished.
  • 17994 GTDoE: fixed the GTDoE/Adaptive/OneStepCount option behavior in generate(): the blackbox-based adaptive DoE algorithm used by generate() always generated 1 point per iteration, even if more points were allowed by GTDoE/Adaptive/OneStepCount — so it was effectively ignoring this option. Note that the intended option behavior remains the same: it sets an upper limit for the number of points generated per iteration, but does not require exactly this number.
  • 18444 GTDoE: fixed incorrect behavior of some techniques when generating a DoE with “frozen” variables (with equal lower and upper bounds).
  • 18452 GTOpt, GTDoE: removed some redundant messages from run logs.
  • 18379 GT: fixed pSeven Core for Windows becoming unresponsive when the system it runs on has a long uptime, or the system event log was cleared recently.
  • 18387 GT: fixed several misleading exception messages.
  • 16644 GT: fixed some inconsistencies in option descriptions returned by the Options interface.

15.6.46. pSeven Core 6.15.1

15.6.46.1. New Features

  • 14017 GTApprox: the Mixture of Approximators technique now supports using an initial model, so it can update existing models with new data or improve their accuracy by retraining. See Initial Model for details.

15.6.46.2. Updates and Changes

  • 14017 GTApprox: added several internal improvements to the Mixture of Approximators technique, in many cases making it faster and more accurate.

15.6.46.3. Documentation

15.6.46.4. Bugfixes

15.6.47. pSeven Core 6.15

15.6.47.1. New Features

  • 15764 GTOpt: significantly improved performance of several internal algorithms used in constrained optimization problems.
  • 17376 GTOpt: added the support for discrete variables — non-integer variables which take values from a predefined set.
  • 17723 GTApprox: added a special training mode, in which GTApprox tests the training sample to find linear dependencies between responses, and trains a special model which keeps these dependencies. See section Output Dependency Modes for details.
  • 17724 GTApprox: the smart training mode was updated with a new, even more accurate method which decides whether to apply log transformation to the outputs data in the training sample. Log transformation may improve model accuracy when training outputs are exponentially distributed. The new method essentially compares various models trained with and without the transformation and selects the best one. Note that this testing can noticeably increase model training time. See section Training Features and the @GTApprox/TryOutputTransformations hint for details.
  • 17989 17886 GTApprox: improved quality and stability of all techniques based on Gaussian processes. In particular, the robust version of the Gaussian processes algorithm was significantly improved and now provides both high accuracy and robustness. This version of the algorithm is now default, see the GTApprox/GPLearningMode option for details.
  • GTDoE: improvements in the Adaptive Design technique (da.p7core.gtdoe.Generator.build_doe()), including:
    • 15764 Improved technique performance and stability when generating a DoE with constraints.
    • 17955 Significantly increased performance when generating a high-dimensional DoE with no objectives but with a high number of linear constraints, solving the specific problem of generating a uniform DoE in a narrow subset of the design space in high dimension.
    • 17935 18042 Better support for sample-based generation when response evaluation is restricted. The technique will now generate uniformly distributed points instead of concentrating them in a limited design space area.
  • 17811 GTSDA: data points containing non-numeric values can now be automatically removed from the analysis (see GTSDA/NanMode).
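
The output dependency detection mentioned above can be sketched with a standard rank-style test (an illustration of the idea behind GTApprox/DependentOutputs, not pSeven Core's algorithm): check whether one output column is an affine combination of the others by examining the least-squares residual.

```python
# Illustrative sketch: detect whether output column k depends linearly
# (affinely) on the other output columns.
import numpy as np

def is_linearly_dependent(outputs, k, tol=1e-8):
    y = outputs[:, k]
    others = np.delete(outputs, k, axis=1)
    # append a constant column so affine dependencies are also detected
    basis = np.hstack([others, np.ones((outputs.shape[0], 1))])
    coef, *_ = np.linalg.lstsq(basis, y, rcond=None)
    residual = np.linalg.norm(basis @ coef - y)
    return residual < tol

rng = np.random.default_rng(0)
f1 = rng.random(20)
f2 = rng.random(20)
f3 = 2.0 * f1 - f2 + 3.0            # f3 depends linearly on f1 and f2
outputs = np.column_stack([f1, f2, f3])
```

A model trained in this mode would predict only the independent outputs and restore the dependent ones from the detected linear relation.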

15.6.47.2. Updates and Changes

  • 9767 GTOpt: removed compatibility with OpenTURNS, since the supported version (OpenTURNS 0.15) is long outdated and incompatible with current OpenTURNS releases, and the feature itself was rarely used. This simplifies the configuration of robust optimization problems: set_stochastic() now requires only the distribution argument, which can be an object of any custom class implementing the probability distribution. This change does not affect compatibility with code based on previous versions of pSeven Core.
  • 18042 GTOpt: NaN values in the response part of an initial sample are now processed as special values indicating that a response failed to evaluate. Previously such points were simply excluded from consideration.
  • 15711 GTOpt: global search mode now always requires bounds for variables. Previously, variables with no bounds were allowed in global search, but GTOpt silently switched to the local search mode in this case. Now it will return a Result with the unsupported problem status (see Statuses).
  • 17366 GTOpt: improved stability when solving problems with stochastic variables (robust optimization methods).
  • 17817 17821 17922 GTOpt: improved stability in mixed-integer linear problems which include variables bound to a very small range (order of \(10^{-6}\) and less).
  • 16693 GTOpt: noticeably reduced memory consumption for some problems.
  • 17959 GTApprox: improved overall quality and performance of smart training, in particular thanks to the latest updates in the HDA and GP techniques.
  • 17846 GTApprox: improved load balancing and computational resource utilization when training models in parallel mode.
  • 17959 GTApprox: improved quality of the HDA technique when used with default settings.
  • 17487 GTApprox: improved the support for categorical variables in the SPLT and GBRT techniques.
  • 17842 GTApprox: when loading models trained in versions prior to 6.8, GTApprox now automatically calculates the \(R^2\) error metric, which was not stored to the model in those old versions, and adds it to model details and iv_info.
  • 17415 GTApprox: when exporting a model, descriptions of its inputs and outputs are now added to the comment in the exported code along with other model information.
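
The automatically calculated \(R^2\) metric mentioned above follows the standard coefficient-of-determination formula; a from-scratch sketch (not pSeven Core code) is:

```python
# R^2 (coefficient of determination): 1 minus the ratio of residual sum of
# squares to total sum of squares. 1.0 means a perfect fit; 0.0 means the
# model is no better than predicting the mean.
def r_squared(actual, predicted):
    mean = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean) ** 2 for a in actual)
    return 1.0 - ss_res / ss_tot

actual = [1.0, 2.0, 3.0, 4.0]
perfect = r_squared(actual, [1.0, 2.0, 3.0, 4.0])       # 1.0
mean_only = r_squared(actual, [2.5, 2.5, 2.5, 2.5])     # 0.0
```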

15.6.47.3. Documentation

15.6.47.4. Bugfixes

  • 18042 17375 GTOpt: fixed a bug in processing response evaluation results in specific cases when a limit is imposed on the allowed number of evaluations (for example, GTOpt/MaximumExpensiveIterations is set). Due to this bug, when reaching the evaluation limit, GTOpt stopped evaluating the response, but further internally treated new points where the response was not evaluated as points where it failed to evaluate. This did not affect the solution quality, but the skipped points were incorrectly attributed as infeasible, which led to misinterpretation of results.
  • 15711 GTOpt: fixed a minor bug in the surrogate-based multistart method due to which its behavior depended on the GTOpt/ResponsesScalability option while it should not.
  • 15711 GTOpt: fixed several inconsistencies in the GTOpt/Techniques option behavior.
  • 17989 17886 GTApprox: fixed several inconsistencies related to interaction of the GTApprox/ExactFitRequired option with other options and techniques.
  • 17967 GTApprox: fixed a bug in build_smart() which caused a crash with a training sample containing categorical variables, when @GTApprox/TrainingSubsampleRatio is specified and internal validation is disabled.
  • 17817 GTDF, GTDR: fixed gtdf.Builder and gtdr.Builder sometimes ignoring the log level options (GTDF/LogLevel, GTDR/LogLevel).
  • 17672 GTDF: fixed an error in automatic technique selection.
  • 17376 17901 GTDoE: fixed the support for discrete variables in the Adaptive Design technique.
  • 17901 GTDoE: fixed the Adaptive Design technique sometimes ignoring the initial samples.
  • 17916 GTDoE: fixed the Adaptive Design technique (build_doe()) crashing when functional constraints in the generation space are defined in such a way that there is only one feasible point.
  • 17441 GTDoE: fixed the Orthogonal Array technique ignoring the GTDoE/Deterministic option.
  • 17383 GTDoE: fixed incorrect error messages in the blackbox-based DoE mode regarding insufficient blackbox budget.

15.6.48. pSeven Core 6.14.4

15.6.48.1. Updates and Changes

  • 17284 This version changes the behavior of the pseudorandom generators used in adaptive DoE. These changes were required for the GTDoE/Adaptive/OneStepCount fix described below. They do not break compatibility (no code updates are needed), but they affect adaptive DoE results in the deterministic mode (with GTDoE/Deterministic enabled): deterministic adaptive DoE results from versions 6.14.3 and below cannot be exactly reproduced in 6.14.4, because the adaptive DoE algorithms in 6.14.4 and 6.14.3 generate different DoE for the same GTDoE/Seed. Otherwise, deterministic adaptive DoE in 6.14.4 is unchanged: with a fixed seed, generation in 6.14.4 is fully reproducible, although its results do not match those of previous pSeven Core versions. Note also that other DoE techniques (space-filling DoE) were not affected.
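
The reproducibility guarantee described above can be sketched in plain Python (hypothetical generator, not pSeven Core code): with a fixed seed, the same algorithm revision always reproduces the same design, while a different seed (or a revised draw order, as in 6.14.4 vs 6.14.3) yields a different design.

```python
# Illustrative sketch: seeded generation is reproducible across runs of the
# same algorithm, but not across algorithm revisions or seeds.
import random

def generate_doe(seed, n_points, dim):
    rng = random.Random(seed)   # fixed seed -> fixed pseudorandom sequence
    return [[rng.random() for _ in range(dim)] for _ in range(n_points)]

run1 = generate_doe(seed=42, n_points=5, dim=2)
run2 = generate_doe(seed=42, n_points=5, dim=2)   # identical to run1
other = generate_doe(seed=7, n_points=5, dim=2)   # different design
```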

15.6.48.2. Bugfixes

  • 17500 GTApprox: fixed a bug in the heteroscedastic Gaussian processes algorithm (see Heteroscedastic data) due to which the GTApprox/Heteroscedastic option had no effect.
  • 17500 GTApprox: fixed a bug in output noise variance handling which could cause the “invalid point weights” error when the output noise data is not available for some output component (the outputNoiseVariance argument to build() contains a column filled with NaN values).
  • 17468 GTApprox: fixed build() requiring numeric values of output noise variance (elements of outputNoiseVariance) for NaN values of outputs (elements of the y training sample).
  • 17284 GTDoE: fixed incorrect behavior of the GTDoE/Adaptive/OneStepCount option, due to which the number of points added to DoE per iteration was always 1 even if a greater number was allowed.

15.6.49. pSeven Core 6.14.3

15.6.49.1. New Features

  • 17496 GTApprox now supports model export to a Functional Mock-up Unit for Model Exchange (FMI standard) in addition to the FMU for Co-Simulation export support added in version 6.9. See section Model Export and export_fmi_me() for details.
  • 17486 You can now remove the accuracy evaluation and smoothing information from GTApprox models. For some models, this can noticeably reduce the model size (memory consumption) or the volume of exported code. See modify() for details.

15.6.49.2. Updates and Changes

  • 17716 GTApprox, GTDF, GTDR: added training time to model details (see section Training Time in Model Details). The training time is now also printed to the training log.
  • 17678 GTDoE: slightly improved generation speed and quality of adaptive DoE produced by build_doe().
  • 17764 GT: removed needless details from some exception messages for clarification.

15.6.49.3. Documentation

15.6.50. pSeven Core 6.14.2

15.6.50.1. Documentation

15.6.50.2. Bugfixes

  • 17766 17779 GT: Fixed a crash when trying to get license information (see License Usage) for a standalone license.

15.6.51. pSeven Core 6.14.1

This is a maintenance release, which does not contain any functional changes or updates.

Note

Since this release, the updates following a release of a stable pSeven Core version will have short version numbers, for example 6.14.1 instead of 6.14 Service Pack 1. This change does not affect the internal version format used when importing the pSeven Core modules.

15.6.52. pSeven Core 6.14

Note

17492 This release requires an updated license file. Licenses issued for the previous versions of pSeven Core are not compatible with versions 6.14 and above. For more details on how to obtain a new license file, see section License Setup.

15.6.52.1. New Features

  • 17074 GTApprox is updated with a new method of model training parallelization, which can provide a significant increase in performance when training composite models on multi-core CPUs. See section Submodels and Parallel Training for details.
  • 15542 Added the support for input and output metainformation to GTApprox, GTDF and GTDR models. See section Model Metainformation for details.
  • 16153 11821 17268 Added the Adaptive Design technique, provided by build_doe(). This technique further improves the blackbox-based adaptive DoE methods.

15.6.52.2. Updates and Changes

  • GTOpt:
    • 17476 Improved multi-objective optimization algorithms for high-dimensional problems.
    • 17012 When provided with an initial data sample, GTOpt now does not request additional evaluations for linear and quadratic functions but restores their analytical form using the initial sample data only.
    • 16688 Some performance improvements in surrogate-based optimization methods.
    • 16153 11821 Added the support for the new common p7core.Result class.
  • GTApprox:
    • 17326 Smart training (build_smart()) now supports the automatic output transformation feature (applies GTApprox/OutputTransformation when needed), which helps in cases when values of some training outputs are exponentially distributed. Also improved accuracy of the statistical tests used to determine whether transformation should be applied when GTApprox/OutputTransformation is manually set to "auto".
    • 16175 Improved the special data analysis algorithm used in smart training to prevent oscillations of model outputs when a smooth model is required.
    • 17419 Optimized the exported C code for the accuracy evaluation (AE) information in Gaussian Processes models. Compiled GP models with AE are now up to 2 times smaller. This change does not affect the models which do not store AE information.
    • 17326 An existing HDA model can now be used as an initial model for the HDAGP technique (see the initial_model parameter to build()).
    • 16951 GTApprox is now more responsive to interrupts when training a model with the GP, HDAGP, or SGP technique.
    • 16951 Added an upper limit of 4000 for the GTApprox/SGPNumberOfBasePoints option.
    • 16951 The maximum training sample size supported by the GP and HDAGP techniques is now limited to 4000 points.
    • 17367 Added new special values for several HDA technique options which enable automatic tuning for them; these values are now the defaults.
    • 17074 Added the "Auto" setting for the GTApprox/RSMStepwiseFit/inmodel option which selects the type of initial RSM model when using the stepwise regression method. This setting (now default) performs automatic selection based on the number of model terms. This modification noticeably speeds up stepwise regression RSM model training in some cases, for example, in case of quadratic dependency with multiple inputs or in presence of categorical variables.
    • 17436 Improved GTApprox stability by addressing a few potential memory issues.
  • 15542 GTDR: added the annotations and comment attributes and the modify() method to the GTDR Model.
  • 16661 GTDoE: Improved quality of the OLHS technique in high-dimension cases.
  • 17329 GTSDA: Improved accuracy and performance of outlier detection algorithms.
  • GT:
    • 17389 Increased the timeout between watcher calls (see da.p7core.watchers) so that a watcher now receives no more than 1-2 messages per second.
    • 16222 Added a new common p7core.Result which is now used by GTDoE (build_doe() only) and GTOpt when compatibility mode is disabled (see compatibility in solve()).
    • 16153 16222 ProblemGeneric is now used as the base blackbox class both in GTOpt and GTDoE in order to support the constrained DoE generation method provided by build_doe().
    • 16153 16222 Added the new "@GT/VariableType" hint in order to support discrete and categorical variables in ProblemGeneric (required by build_doe()). See Hint Reference for details.
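
The watcher throttling above can be sketched as a simple rate limiter (hypothetical wrapper, not the pSeven Core implementation): messages arriving closer together than a minimum interval are dropped, so the wrapped watcher receives at most roughly one call per interval.

```python
# Hypothetical sketch: throttle watcher callbacks to at most one message per
# `min_interval` seconds. Timestamps are passed in explicitly for testability.
class RateLimitedWatcher:
    def __init__(self, watcher, min_interval):
        self.watcher = watcher
        self.min_interval = min_interval
        self.last_time = None

    def __call__(self, message, now):
        if self.last_time is not None and now - self.last_time < self.min_interval:
            return True  # drop the message, keep the process running
        self.last_time = now
        return self.watcher(message)

received = []
watcher = RateLimitedWatcher(lambda msg: received.append(msg) or True,
                             min_interval=0.5)
for t, msg in [(0.0, "a"), (0.1, "b"), (0.6, "c"), (0.7, "d"), (1.2, "e")]:
    watcher(msg, now=t)
```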

15.6.52.3. Documentation

15.6.52.4. Compatibility Issues

This release contains a number of changes in GTApprox which do not create major compatibility issues but may require some updates in your code and may change its behavior in some cases. For details, see section Version Compatibility Issues.

15.6.52.5. Bugfixes

  • GTApprox:
    • 16174 Fixed incorrect clustering of data samples having tensor structure in the Mixture of Approximators technique, which could lead to inability to train a model if GTApprox/MoATechnique is set to "TA" or "TGP".
    • 17450 Fixed a bug in the Mixture of Approximators technique, which could cause an unhandled error when GTApprox/MoATechnique is set to "iTA".
    • 17404 Fixed a numerical issue which could cause the InvalidProblemError exception when using the Gaussian Process technique with a small training sample containing a constant input and specifying different sample point weights.
    • 17353 Fixed a slowdown of the Tensor Approximation technique in the case when some tensor factors are processed with the HDA technique and GTApprox/MaxAxisRotations is used.
    • 17669 Fixed incorrect (sometimes crashing) internal validation for Tensor Approximation models in the case when the training sample contains ambiguous points.
    • 17673 Fixed a crash in internal validation when all point weights are set to 0.
    • 16174 Several minor fixes in distributed model training (see set_remote_build()).
    • 17436 Minor internal algorithm fixes.
  • GTDoE:
    • 17276 Fixed a bug in the OLHS technique due to which it sometimes did not preserve properties of the initial LHS data sample (see Property Preservation Mode).
    • 17289 Fixed incorrect behavior of the Adaptive DoE technique with initial sample in presence of categorical variables.
  • 17329 GTSDA: fixed incorrect calculation of correlation coefficients in some cases, in particular when the input sample contains constant components or duplicated points.

15.6.53. pSeven Core 6.13 Service Pack 1

15.6.53.1. New Features

  • GTApprox can now automatically apply log transformation to outputs in the training sample, which improves model accuracy in cases when values of some outputs are exponentially distributed. You can also select transformations for specific outputs manually — see GTApprox/OutputTransformation for details.
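
The idea behind the log transformation can be shown with a minimal sketch (plain Python, not pSeven Core code): when outputs grow exponentially, fitting in log space turns the dependency into a straight line, and predictions are mapped back with exp.

```python
# Illustrative sketch: fit log(y) = a*x + b by least squares (closed form for
# one input), then invert the transformation when predicting.
import math

xs = [0.0, 1.0, 2.0, 3.0]
ys = [math.exp(2.0 * x) for x in xs]   # outputs spanning orders of magnitude

log_ys = [math.log(y) for y in ys]     # transform: y -> log(y)
n = len(xs)
mx, my = sum(xs) / n, sum(log_ys) / n
a = sum((x - mx) * (ly - my) for x, ly in zip(xs, log_ys)) / \
    sum((x - mx) ** 2 for x in xs)
b = my - a * mx

predict = lambda x: math.exp(a * x + b)  # inverse transform of the model
```

Here the fit recovers the exact slope because log(y) is exactly linear in x; on real data the transformation simply makes the output scale more amenable to approximation.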

15.6.53.3. Bugfixes

  • GTOpt: fixed incorrect behavior in some cases when the direct surrogate-based optimization method (see Direct SBO) is applied to a robust optimization problem: the solving process could not finish for a problem which defines an objective with the @GTOpt/EvaluationCostType hint set to "Expensive", adds stochastic variables (see set_stochastic()), and sets GTOpt/GlobalPhaseIntensity to 0.

15.6.54. pSeven Core 6.13

15.6.54.1. New Features

  • pSeven Core for Windows x64 now supports systems with more than 64 logical processors or several processor groups (see the Processor Groups section in Microsoft Docs for details). In practice, this means that pSeven Core 6.13 shows a significant performance increase with highly parallel workloads on such Windows systems, compared to previous versions of pSeven Core for Windows, where ineffective distribution of threads among logical processors could degrade performance.

15.6.54.2. Updates and Changes

  • GTOpt:
    • Added the @GTOpt/ExpensiveEvaluations hint, which allows specifying the evaluation limit individually for any expensive objective or constraint.
    • Optimization results are now always sorted by values of variables.
  • GTApprox:
    • Improved the algorithm that splits the data sample given to build_smart() into the training and test subsamples when @GTApprox/TrainingSubsampleRatio is specified. The test sample points now better cover the function (response) space.
  • General:
    • Exceptions raised when input samples contain invalid data now provide more details explaining why the input was not accepted.
    • The FlexNet License Finder dialog will no longer be shown when a license is not found.
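
The improved train/test split mentioned above can be illustrated with a simple response-space-aware strategy (a hypothetical sketch, not pSeven Core's algorithm): sort points by response value and send every k-th point to the test set, so the test points cover the whole response range instead of a random slice of it.

```python
# Hypothetical sketch: split a sample so the test subset covers the response
# range. `test_ratio` is the desired fraction of test points.
def split_by_response(points, responses, test_ratio):
    k = max(2, round(1.0 / test_ratio))
    # indices ordered by response value, low to high
    order = sorted(range(len(points)), key=lambda i: responses[i])
    test_idx = set(order[::k])          # every k-th point along the response axis
    train = [points[i] for i in range(len(points)) if i not in test_idx]
    test = [points[i] for i in sorted(test_idx)]
    return train, test

points = list(range(10))
responses = [9.0, 1.0, 4.0, 7.0, 0.0, 3.0, 8.0, 2.0, 6.0, 5.0]
train, test = split_by_response(points, responses, test_ratio=0.3)
```

By construction the point with the smallest response always lands in the test set, so extreme response values are represented there.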

15.6.54.3. Documentation

15.6.54.4. Bugfixes

  • GTOpt:
    • Fixed a bug which in some cases led to exceeding the evaluation budget set by GTOpt/MaximumIterations.
    • Fixed incorrect calculation of the budget for expensive evaluations in some cases: evaluations of initial guesses expended the budget set by GTOpt/MaximumExpensiveIterations even though they should not.
    • Fixed incorrect caching of expensive response values when working in batch mode: the solver lost useful information because it did not cache the values of those responses which it did not request to evaluate, even if these responses were returned by the problem.
    • Fixed a bug in handling NaN response values in batch mode which could negatively affect the solution quality.
    • Fixed a bug which sometimes caused a stack corruption warning in Python 2.5.
  • GTDF:
    • Fixed a bug in the low-fidelity model bias compensation algorithm (see GTDF/UnbiasLowFidelityModel) due to which it worked incorrectly when parallelization is enabled.
  • GTApprox, GTDF, GTDR:
    • Fixed a malformed exception on loading a corrupt model.
  • General:
    • Fixed a FutureWarning exception when using pSeven Core with NumPy 1.14.

15.6.55. pSeven Core 6.12 Service Pack 2

15.6.55.1. Updates and Changes

  • GT: if you are using the Linux version of pSeven Core, you may notice a general performance increase thanks to recent pSeven Core for Linux build optimizations.
  • GTOpt: the GTOpt/BatchSize option is now respected when evaluating points from the initial sample (the sample_x argument to solve()). Previously the initial sample points were never evaluated in batches even if batch mode was enabled for the solver.
  • GTApprox: slightly improved algorithm stability for the Gaussian Processes approximation technique.
  • GTSDA: Mutual Information, Distance Correlation, and Partial Distance Correlation techniques now work significantly faster.
  • GTSDA: reduced memory consumption when calculating Distance Correlation.
  • GTSDA: improved the dependency-based feature selection algorithm (see Dependency-based feature selection).
  • GTSDA: if GTSDA/Checker/PValues/Method is "Auto", GTSDA now selects the p-values calculation algorithm taking sample size into account (previously the selection was based only on the technique specified by GTSDA/Checker/Technique).

15.6.55.3. Bugfixes

  • GT: fixed a bug in handling invalid option names.
  • GT: fixed some exceptions causing an additional warning (“During handling of the above exception, another exception occurred”) in Python 3.
  • GTOpt: fixed a bug due to which some of the evaluation results for points from the initial guesses sample (the sample_x argument to solve()) were sometimes ignored by the solver.
  • GTOpt: fixed a small discrepancy between the values of variables provided in the initial guesses sample and their values actually sent to evaluate() to get values of objectives and constraints when the latter are not provided as sample_f, sample_c to solve().
  • GTOpt: fixed a rare memory violation when a large number of parallel threads is used by the solver.
  • GTApprox: fixed a bug in Gaussian Processes model evaluation due to which model output for a single input point could be slightly different from the output for the same input point passed as a part of an input sample.
  • GTSDA: fixed GTSDA/Ranker/NoiseCorrection affecting the Screening Indices technique (this option is intended for the Sobol indices only).
  • GTSDA: fixed incorrect Sobol indices calculation for very noisy inputs in case when GTSDA/Ranker/NoiseCorrection is enabled.
  • GTSDA: fixed incomplete RankResult.info (info["Ranker"]["Detailed info"] missing) for Sobol indices calculated using the EASI technique.

15.6.56. pSeven Core 6.12 Service Pack 1

15.6.56.1. Updates and Changes

  • GTSDA: added the GTSDA/MaxParallel option.
  • GTSDA: improved string representation of GTSDA result objects.

15.6.56.2. Documentation

15.6.56.3. Bugfixes

  • GTOpt: fixed a bug due to which GTOpt and some DoE techniques progressively reduced the default number of parallel threads when the GTOpt solver (or DoE generator) was used in a loop, even if the optimizer (or generator) was re-created at each iteration.
  • GTOpt: fixed numerical instability in solving some linear problems, which led to infeasible results even though a feasible solution had been found internally.

15.6.57. pSeven Core 6.12

15.6.57.1. Updates and Changes

15.6.57.3. Bugfixes

  • GTApprox: fixed SmartSelection not being able to train a model if the test sample contained NaN values. User-provided test samples are now checked for NaN values and constant columns; depending on GTApprox/InputNanMode and GTApprox/OutputNanMode, either an exception is raised or the test point is removed from the test sample.
  • GTApprox: fixed the bug due to which internal validation ignored the GTApprox/MaxParallel option value.
  • GTApprox: fixed a crash that could occur when calling select_subsample function.
  • GTSDA: fixed a bug which sometimes resulted in miscalculated p_values in check().

15.6.58. pSeven Core 6.11 Service Pack 1

15.6.58.1. New Features

  • GT: introduced official Python 3.6 support. The current implementation of pSeven Core will also support Python 2 versions from 2.5 to 2.7. See section System Requirements for details.
  • GTApprox: the smart training procedure was extended with a special analysis that helps prevent model oscillations between training points when the smooth model requirement is specified (see section Model Features for details).
  • GTApprox: added a new GTApprox/MaxAxisRotations option that enables using model gradients during training in order to obtain smoother models.
  • GTOpt: added a new ProblemFitting class to solve fitting problems (see section Data Fitting Problem).
  • GTOpt: added the GTOpt/ResponsesScalability option that changes the computational model used by surrogate-based and robust optimization methods.
  • GTOpt: added a new scheduler taking into account the value of GTOpt/ResponsesScalability option and using this information to facilitate calculations.

15.6.58.2. Updates and Changes

  • GTOpt: added a watcher implementation to interrupt the optimization process using the Ctrl+C hotkey.
  • GTOpt: in batch mode, the optimizer now determines the batch size automatically, unless GTOpt/BatchSize is set to a non-default value. See the option description for details.
  • GTOpt: removed the gtopt.ProblemGeneric.elements_unused_hints() method.

15.6.58.3. Documentation

15.6.58.4. Compatibility Issues

15.6.58.5. Bugfixes

  • GTOpt: eliminated the possible influence of initial designs that violate confluent bounds on the initial training sample — such initial designs are now ignored as intended.
  • GTOpt: fixed incorrect behavior when received evaluation results are greater than the maximum floating point value.

15.6.59. pSeven Core 6.11

15.6.59.1. New Features

  • GTOpt: added Direct Surrogate-Based Optimization (Direct SBO) technique which uses an alternative criterion to select new points for evaluation (see section Direct SBO for details). The new technique can be useful for constrained or multi-objective optimization problems when searching for non-trivial trade-offs between different expensive objectives and constraints.
  • GTApprox: introduced weighted descriptive statistics in approximation models. See Quality Assessment for details.

15.6.59.2. Updates and Changes

  • GTOpt: increased performance of stochastic optimization algorithms.
  • GTOpt: intermediate optimization result can now be requested using a watcher. See section Watchers for details.
  • GTApprox: increased robustness of the MoA clustering algorithm.
  • GTApprox: added the ability to stop PLA model building at any time with no intermediate result.
  • GTApprox: updated VBA wrapper code to provide better compatibility with Excel.
  • GTApprox: added details about output noise variance to model info.
  • GTApprox, GTDR: improved compatibility of models exported to the Octave format.
  • GTApprox, GTDF, GTDR: accelerated calculations in case of batch operations.
  • GTApprox, GTDF: added new methods that return the list of sections available in a model or a model file.
  • GTSDA: improved performance of the Kendall Correlation technique.
  • GTDoE: improved OLHS performance to enhance the space-filling criterion of generated DoE.
  • GTDoE: changed the generation result structure — the result now contains the initial design extended with the newly generated points.
  • GTDR: added support for the models exported to the Excel DLL format.

15.6.59.3. Documentation

15.6.59.4. Compatibility Issues

  • The GTOpt/GlobalPhaseIntensity option behavior has changed due to the addition of the Direct SBO technique. The default is now "Auto", and setting it to 0 enables Direct SBO, which does not consider error estimates of surrogate models. See section Direct SBO for more details.

15.6.59.5. Bugfixes

  • GTOpt: fixed a rare crash that could occur when solving an expensive mixed-integer problem with frozen variables.
  • GTOpt: fixed a crash when a problem had analytical gradients and some frozen variables.
  • GTOpt: fixed incorrect treatment of initial designs violating confluent bounds — now such initial designs are ignored.
  • GTOpt: fixed potential problems with DoE in highly eccentric polytopes.
  • GTApprox: fixed a bug in the PLA technique that caused NaN values to be returned unexpectedly.
  • GTApprox: fixed a bug that occurred when training PLA models in case of one-dimensional sorted input data.
  • GTApprox: fixed validation of categorical variables when training models.
  • GTApprox: fixed incorrect handling of point weights in MoA technique.
  • GTApprox: fixed Unicode symbols issue in model annotations.
  • GTDoE: fixed an error which resulted in wrong behavior of adaptive DoE in case of small initial sample.

15.6.60. pSeven Core 6.10

15.6.60.1. Updates and Changes

  • GTOpt: improved the support for NaN objective and constraint values and missing values in the initial designs data. Such points are now taken into consideration if values of variables are within bounds defined by the problem. Also, GTOpt requests additional evaluations to fill the design points with missing values, and these evaluations do not consume the solving budget.
  • GTApprox, GTDR: reduced memory usage when exporting models.
  • GTApprox, GTDR: added str aliases for export formats in export_to(), compress_export_to(), and decompress_export_to().
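A string alias of this kind can be illustrated with a stand-in normalizer — a minimal sketch, assuming a hypothetical ExportFormat enumeration; the member names, alias strings, and helper below are not the actual pSeven Core API:

```python
from enum import Enum

class ExportFormat(Enum):
    """Stand-in for an export format enumeration (names are hypothetical)."""
    OCTAVE = "octave"
    C_SOURCE = "c_source"

def normalize_format(fmt):
    """Accept either an ExportFormat member or its lowercase string alias."""
    if isinstance(fmt, ExportFormat):
        return fmt
    # Look the enum member up by its string value.
    return ExportFormat(str(fmt).strip().lower())

# Both calls resolve to the same format:
assert normalize_format("octave") is normalize_format(ExportFormat.OCTAVE)
```

The convenience is purely ergonomic: callers can pass a plain string such as "octave" instead of importing and referencing an enumeration member.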

15.6.60.2. Documentation

15.6.60.3. Bugfixes

  • GTOpt: fixed non-deterministic behavior of surrogate based optimization algorithms in multithreaded environment.
  • GTOpt: fixed incorrect behavior when a floating point number is specified as an initial guess for an integer variable. GTOpt now raises the InvalidProblemError exception in this case.
  • GTApprox: fixed internal validation (see section Internal Validation) ignoring GTApprox options when training the cross-validation models.
  • GTApprox: fixed a bug in the HDA technique that caused accuracy degradation of HDA models.
  • GTApprox: fixed a bug in tensor approximation due to which model smoothing could not be applied to some TA and TGP models.
  • GTApprox: fixed incorrect behavior of the @GTApprox/TrainingSubsampleRatio smart training hint in some cases.
  • GTApprox, GTDF, GTDR: correct exceptions are now raised when trying to load a model from an invalid file.
  • GTDR: fixed incorrect export of Feature Extraction models to C.
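The integer initial-guess rule above can be sketched as a stand-in validation check. This is not the GTOpt API — the function name is hypothetical, the real solver raises InvalidProblemError rather than ValueError, and whether integral-valued floats such as 2.0 are accepted is an assumption:

```python
def check_initial_guess(value, is_integer):
    """Reject a non-integral initial guess for an integer variable.

    Stand-in for the validation described in the changelog entry above;
    the real solver raises InvalidProblemError instead of ValueError.
    """
    if is_integer and float(value) != int(float(value)):
        raise ValueError("non-integer initial guess for an integer variable")
    return value

check_initial_guess(3, is_integer=True)    # accepted
check_initial_guess(2.0, is_integer=True)  # assumed accepted: integral value
try:
    check_initial_guess(1.5, is_integer=True)
except ValueError as exc:
    print("rejected:", exc)
```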

15.6.61. pSeven Core 6.9 Service Pack 1

This is a maintenance release, which does not contain any functional changes or updates.

15.6.61.2. Bugfixes

  • Fixed incompatibility with some old Linux distributions that was caused by a build inconsistency: pSeven Core required a newer version of the GNU C Library (glibc). Note that the current requirements (Linux kernel 2.6.18, glibc 2.5) are still different from older pSeven Core versions — see System Requirements for details.
  • Fixed a bug that could cause conflicts with third-party Python modules.

15.6.62. pSeven Core 6.9

Note

This release depends on the updated licensing system: floating licenses for pSeven Core 6.9 and above require the updated version of the vendor daemon. To get the updated daemon, re-download the License Server package for pSeven Core. The new version of the daemon is compatible with earlier versions of the license server and existing pSeven Core licenses. For more details on the server and daemon setup, see Server Configuration.

15.6.62.1. New Features

  • GTApprox: added the support for exporting an approximation model as a Functional Mock-up Unit for Co-Simulation (FMI standard). See section Model Export and export_fmi_cs() for details.
  • GTApprox: optimized model training on HPC clusters for less load and higher performance.
  • GTApprox: added the capability to train a model on a remote host which is not a cluster submit node (remote training not using a HPC cluster). See details in set_remote_build().
  • GTOpt: you can now specify which optimization methods to use when solving the problem. By default, GTOpt selects algorithms automatically, but you can also explicitly configure the problem as single- or multi-objective, enable robust or surrogate based optimization, select global search methods, and more. See GTOpt/Techniques for details.

15.6.62.2. Updates and Changes

  • GTApprox: added the capability to set remote environment variables in remote model training (see set_remote_build()).
  • GTApprox: added the support for calculating all model outputs and writing them to a cell range for the models exported to the Excel DLL format.
  • GTApprox: improved performance of GTApprox models exported to the Excel DLL format.
  • GTApprox: if the model includes submodels, the technique name in model details will now be "Composite", not "Auto".
  • GTApprox: improved handling of categorical variables in SmartSelection algorithms.
  • GTApprox: clarified error messages in SmartSelection.
  • GTOpt: reworked the computational budget allocation policies in various GTOpt algorithms, making them more consistent and, as a result, increasing the stability of the affected optimization methods. This change primarily affects methods involving global optimization stages, such as surrogate based optimization, which deals with computationally expensive problems (including usage of these methods in robust optimization problems).
  • GTOpt: improved results filtering in multi-objective robust optimization.
  • GTOpt: added proper support for NaN objective and constraint values and missing values in the initial designs data.

15.6.62.3. Documentation

15.6.62.4. Compatibility Issues

  • Due to updates in computational budget allocation policies in GTOpt, you can observe changes in optimization results as compared to the results from previous versions. This is not a compatibility issue at the user level and should not require changes in code, except possibly adjusting optimization options in specific cases — for details, see sections Version Compatibility Issues and Local and Global Methods.

15.6.62.5. Bugfixes

  • GTApprox: fixed SmartSelection not being able to train the model if given a test sample that contains only one point.
  • GTApprox: fixed incorrect SmartSelection behavior in some tasks where the training sample includes categorical variables.
  • GTApprox: fixed smoothing methods accepting NaN smoothness factor values.
  • GTApprox: fixed incorrect order of compiler options in the Model Export example.
  • GTApprox: fixed incorrect export of model to C in case when the model has many independent outputs.
  • GTApprox: fixed model details sometimes including option values that were not set by user (showing defaults).
  • GTApprox: fixed a bug in data type conversion that could crash the builder.
  • GTApprox: fixed the builder always raising an exception when interrupted by a watcher, even if the model was already built.
  • GTOpt: fixed a bug due to which optimization log could appear in the console even if a logger is not set.
  • GTSDA: fixed a bug in Kendall correlation due to which it could be less than -1.

15.6.63. pSeven Core 6.8

15.6.63.1. New Features

  • GTApprox: approximation models can now predict undefined function behavior, and training samples including non-numeric values are supported. Depending on option settings, data points with non-numeric values can be either accepted (and used in prediction) or automatically removed from the training sample — see section Sample Cleanup for details.
  • GTApprox: smart training can now select train and test subsets from the given data sample. See section Training Features for details.
  • GTApprox: added coefficient of determination (\(R^2\)) to model error metrics. See section Error metrics for details.
  • GTApprox: added the support for exporting models into a special format that provides better compatibility with Excel. It exports C code ready to be compiled into a DLL that can be imported in Excel without manual changes to the code.
  • GTApprox: added the Table Function technique (simple table lookup). See section Table Function for details.
  • GTDF, GTDR: data points containing non-numeric values can now be automatically removed from the training sample (see GTDF/InputNanMode, GTDR/InputNanMode).

15.6.63.2. Updates and Changes

  • GTApprox: significantly improved performance of cross validation in smart training. Also increased speed of smart training in general thanks to internal improvements in technique parameter selection.
  • GTApprox: pre- and post-processing functionality is now completely integrated into Smart Selection (see Smart Training) and is no longer available as a separate tool.
  • GTApprox: training data saved to the model (training_sample) now also contains the test sample if it was used when training the model.
  • GTApprox: model details now include smart training hints, test sample statistics, and detailed information about component-wise models. See section Model Details.
  • GTApprox: improved performance of the High-Dimensional Approximation (HDA) technique.
  • GTApprox: removed the deprecated Linear Regression (LR) technique.
  • GTApprox: the number of numeric values in componentwise iv_info (iv_info["Componentwise"]["Count"]) for constant outputs is now set to 0 to indicate that the validation sample is empty in this case.
  • GTOpt: improved solving algorithm for mixed-integer surrogate based optimization (SBO) problems — the points to evaluate that are generated at later solving stages are now more evenly distributed in the design space.
  • GTOpt: narrowed the selection of optimum solutions in robust optimization problems to avoid adding dominated points into the optimal set.

15.6.63.3. Documentation

15.6.63.4. Compatibility Issues

  • The deprecated Linear Regression (LR) technique in GTApprox is finally removed in this release. It can no longer be selected; if you used LR, replace it with the RSM technique, setting GTApprox/RSMType to "Linear". This trains the same model as before, since LR internally used the linear RSM technique.
  • Since GTApprox pre- and post-processing functionality is now a part of the Smart Selection algorithm (see Smart Training), the older methods were removed in this release as deprecated. See section Version Compatibility Issues for details.
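The LR-to-RSM migration above amounts to an option change. A minimal sketch of the replacement option set — the GTApprox/Technique key is assumed to be the usual technique selector, and how the dictionary is passed to the builder depends on your pSeven Core version:

```python
# Options assumed to reproduce the behavior of the removed LR technique
# (option names are taken from the changelog entry above; this is a sketch,
# not a verified call into the pSeven Core API).
lr_replacement = {
    "GTApprox/Technique": "RSM",
    "GTApprox/RSMType": "Linear",
}
```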

15.6.63.5. Bugfixes

  • GTApprox: fixed incorrect smart training behavior in the case when no suitable training technique can be found. build_smart() now raises a proper warning.
  • GTApprox: fixed a bug in anisotropic smoothing (smooth_anisotropic()) that led to an exception when the x_weights argument contained a row of zeros.
  • GTApprox: fixed a few bugs in error-based smoothing (smooth_errbased()) related to incorrect handling of method arguments.
  • GTApprox: fixed empty build_log after using modify().
  • GTApprox: fixed incorrect calculation of internal validation errors for models trained with the TA, iTA, or TGP technique in component-wise mode.
  • GTApprox: corrected the text of exception message from validate().
  • GTApprox: fixed the private key request when remote model training is used (set_remote_build()).
  • GTDF: corrected the text of exception message from validate().
  • GTOpt: fixed a bug in DoE generation at the model training stage in multi-objective surrogate based optimization (SBO) problems due to which the generated DoE did not fill the design space properly if problem variables had significantly different scales.
  • GTOpt: fixed a bug in multi-objective surrogate based optimization due to which it did not count NaN results as performed evaluations, consequently exceeding the GTOpt/MaximumExpensiveIterations budget.
  • GTOpt: fixed incorrect behavior (exceeding the budget) of the robust surrogate based optimization algorithm when NaN values are found in results of evaluations performed at the model training stage.
  • GTOpt: fixed a bug in surrogate based optimization (SBO) due to which it could crash when training an internal approximation model.
  • GTOpt: fixed the possibility of repeating evaluations in the global optimization mode.
  • GTOpt: fixed silent ignoring of invalid names or values of optimization hints. Proper warnings are now raised.
  • GTOpt: fixed a rare crash that could occur when problem solving is finished.
  • GTOpt: fixed incompatibility with Python 2.5 in the Surrogate Based Robust Optimization example.
  • GTSDA: fixed a bug in the Mutual Information technique that could cause an index error.

15.6.64. pSeven Core 6.7

Note

Since this release, MACROS is renamed to pSeven Core. The algorithms and methods implemented in MACROS are the algorithmic core of the pSeven design space exploration platform, hence the name. The new name, pSeven Core, is now used throughout this manual.

15.6.64.1. Updates and Changes

  • GTApprox: improved smart training performance, in particular when training GBRT models.
  • GTApprox: you can now limit the total time of smart training using the @GTApprox/TimeLimit hint. See section Smart Training for details.
  • GTApprox: significantly reduced memory usage of high-dimensional RSM and HDA models.
  • GTApprox: added the capability to control the median, the 95th percentile, and the 99th percentile of absolute error when performing error-based model smoothing (see the error_type and error_thresholds arguments to smooth_errbased()).
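A minimal sketch of passing the new time limit hint. The dictionary shape and the unit of the value (assumed to be seconds) are assumptions — see section Smart Training for the actual interface:

```python
# Hypothetical smart-training hints dictionary limiting total training time.
hints = {
    "@GTApprox/TimeLimit": 600,  # assumed unit: seconds
}

# Illustrative call shape only (builder and argument name are assumptions):
# model = builder.build_smart(x, y, hints=hints)
```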

15.6.64.2. Documentation

15.6.64.3. Compatibility Issues

  • Due to the project rename, the main macros module was also renamed to p7core. The old module name can still be used as an alias, so this rename does not affect compatibility with scripts implemented for previous versions. However, the names macros and p7core cannot be used interchangeably in the same scope — see section Version Compatibility Issues for details.

15.6.64.4. Bugfixes

  • GTApprox: fixed an error in smart training when the effective size of the input training sample is 1 point and this point is ambiguous.
  • GTApprox: fixed incorrect smoothing of RSM models trained using the ElasticNet algorithm (see section Parameters Estimation in Response Surface Model).
  • GTApprox: fixed incomplete model details (the "Regression Model" key missing) for RSM models trained using the ElasticNet algorithm.
  • GTApprox: fixed incorrect handling of very large point weight values which could result in training invalid models when GBRT, iTA, HDA, HDAGP, or RSM technique is used.
  • GTApprox: fixed a rare bug due to which invalid data could be saved in iv_info or training_sample for models with multidimensional output or models including categorical variables.
  • GTOpt: fixed a bug in the surrogate based optimization algorithm which could cause it to run indefinitely in some problems.

15.6.65. MACROS 6.6

15.6.65.1. New Features

  • GTOpt: implemented a new robust optimization algorithm for problems that include expensive objectives or constraints. The new method aims to significantly reduce the required number of function evaluations at the cost of increased computational time spent solving internal subproblems. Note that this overhead can be significant, since the method is intended for cases where function evaluations are time consuming or limited in number.
  • GTApprox: added a new model training method that automatically chooses an approximation technique and tunes values of its options in order to obtain the most accurate model. See build_smart() and section Smart Training for details.
  • GTApprox: introduced the new model storage format that allows storing training samples with the model, adding detailed comments, and working with specific parts of the model to save or load it faster and consume less memory. See section Approximation Model Structure for full details.
  • GTApprox: distributed model training (see set_remote_build()) has become even more effective thanks to the support for parallel training of componentwise models (which is now the default mode) and a higher degree of parallelization possible for models that include categorical variables (sub-models for different combinations of categorical variables can be trained in parallel).
  • GTDF: data fusion models now use a storage format similar to GTApprox, providing the same capabilities. See section Data Fusion Model Structure for full details.
  • GTSDA: added score2rank() — a convenience method to transform ranker scores returned by rank() into a sorted list which can be directly passed to select() as the ranking argument.
  • GTSDA: added an option to calculate confidence intervals for the screening and first-order Sobol indices — see GTSDA/Ranker/VarianceEstimateRequired.
  • GTSDA: added noise correction capability to the algorithm calculating Sobol indices. See sections Sobol Indices and Noise Correction for details.

15.6.65.2. Updates and Changes

  • GTOpt: solve() now accepts initial samples containing points with values of variables that are outside of bounds set in add_variable(). Previously these points were automatically removed from the initial sample. Now they are used, in particular, to build internal approximation models thus increasing model accuracy and stability.
  • GTOpt: improved result filtering — now it excludes solutions that are insufficiently close to optimum even if they satisfy problem constraints. Due to this the number of points in optimization result can decrease compared to previous versions, but the result will contain more high-quality points.
  • GTOpt: the @GTOpt/LinearityType hint for objectives and constraints is no longer ignored in robust optimization problems. However, GTOpt now assumes that objectives hinted as linear or quadratic do not depend on any stochastic variable. Note that it can lead to unexpected behavior in some problems with invalid formulation that were solved in previous versions of MACROS (see section Version Compatibility Issues for details).
  • GTOpt: clarified the messages describing the reasons for stopping the optimization process when it is interrupted by a watcher (see Watchers).
  • GTSDA: added p-value correction in correlation checks in order to avoid overconfident results (zero p-values).
  • GTSDA: the screening method in GTSDA ranker (Screening Indices) no longer sets all scores to NaN when it encounters a NaN value in analyzed outputs.

15.6.65.3. Documentation

15.6.65.4. Compatibility Issues

  • A potential issue is the change in the treatment of the @GTOpt/LinearityType hint for objectives and constraints in robust optimization problems. It does not directly affect compatibility but can lead to unexpected behavior in some problems with invalid formulation that were solved in previous versions of MACROS. See section Version Compatibility Issues for details.

15.6.65.5. Bugfixes

  • GTApprox: fixed the incompatibility of models exported to the Octave format with older versions of Octave (3.0.5 and below).
  • GTApprox: fixed incorrect behavior of postprocess() when validation is requested but the test sample is not given.
  • GTApprox: fixed a crash when the training sample includes only 1 point and this point is passed as an array slice.
  • GTApprox: fixed the Training with Limited Memory example not working in Python 2.5.
  • GTApprox: fixed a runtime warning when the training sample contains NaN values.
  • GTDF: fixed build_MF() mutating the list passed as the samples argument.
  • GTDoE: fixed an incorrect error message in the case when the initial sample provided for adaptive DoE is too small.
  • GTOpt: fixed a bug which could cause incorrect initial sample generation in multi-objective and surrogate based optimization algorithms.
  • GTSDA: fixed a bug due to which Sobol indices calculated using the CSTA method (see Sobol Indices: CSTA Method) could have values greater than 1.
  • GTSDA: fixed incorrect calculation of main Sobol indices (see Sobol Indices) for very noisy data.
  • GTSDA: fixed runtime warnings when using partial Pearson correlation.
  • GTSDA: fixed some minor typos in error messages.
  • Statistical utilities: fixed incorrect handling of memory overflow errors in detect_outliers().
  • General: fixed incorrect error messages about invalid option values for options that have special default values that are outside the valid range.

15.6.66. MACROS 6.5 Service Pack 1

This is a maintenance release, which does not contain any functional changes or updates.

15.6.67. MACROS 6.5

15.6.67.1. New Features

  • GTApprox: added the support for model export, sample weighting and categorical variables to the piecewise-linear approximation (PLA) technique.

15.6.67.2. Updates and Changes

  • GTOpt: the internal limit on the maximum number of expensive function evaluations will no longer override GTOpt/MaximumExpensiveIterations if the latter is set to a higher value.
  • GTOpt: added several internal changes in multi-objective and surrogate based optimization algorithms that improve their stability in certain cases.

15.6.67.3. Documentation

15.6.67.4. Bugfixes

  • GTOpt: fixed a bug due to which a solution to a robust optimization problem or a problem with an expensive function (surrogate based optimization) could contain a non-optimal point if an initial sample containing values of variables and objectives is given.

15.6.68. MACROS 6.4

15.6.68.1. New Features

  • GTApprox: the GBRT technique now supports memory overflow prevention, allowing you to use the GBRT incremental training feature to automatically process very large training samples — see GTApprox/MaxExpectedMemory and the Training with Limited Memory example for details.

15.6.68.2. Documentation

15.6.68.3. Bugfixes

  • GTApprox: fixed the GBRT technique crashing when the minimum weight of points in a leaf set by GTApprox/GBRTMinChildWeight is too high for the given (small) sample size.
  • GTSDA: fixed excessive memory consumption when performing sensitivity analysis on high-dimensional data samples.

15.6.69. MACROS 6.3

15.6.69.1. New Features

  • GTApprox: added new piecewise-linear approximation technique. See section Piecewise Linear Approximation for details.
  • GTApprox: added an option to tolerate deviations of input values from a grid-like DoE, applying input rounding and making it possible to use tensor approximation techniques with noisy samples that have an “almost factorial” DoE. See GTApprox/InputsTolerance and section Sample Cleanup (in particular, Input Rounding) for more details.
  • GTApprox: added the support for discrete (categorical) variables in the HDA, GP, SGP, HDAGP, TGP, and iTA techniques. Discrete variables are specified by GTApprox/CategoricalVariables; note that the same option now also specifies discrete variables for the RSM and TA techniques, deprecating GTApprox/RSMCategoricalVariables and GTApprox/TADiscreteVariables. The latter options are kept for version compatibility only and will be removed in future versions.
  • GTDoE: added the support for grid-based adaptive DoE generation. Setting GTDoE/CategoricalVariables now forces the adaptive generation algorithm to work on a user-defined grid, so the generated sample includes only grid points.
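The categorical-variable option migration described above can be summarized as a sketch. The index-list value format is an assumption, and the variable indices below are purely illustrative:

```python
# Deprecated options, kept for version compatibility only:
#   "GTApprox/RSMCategoricalVariables"  (RSM technique)
#   "GTApprox/TADiscreteVariables"      (TA technique)
# Unified replacement for all supported techniques:
options = {
    # Assumed format: indices of the discrete (categorical) inputs.
    "GTApprox/CategoricalVariables": [1, 4],
}
```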

15.6.69.2. Updates and Changes

  • GTApprox, GTDF: componentwise approximation is now enabled by default to avoid problems when training models with many independent outputs. The old behavior (disabled componentwise approximation) can be restored using the GTApprox/DependentOutputs option (or GTDF/DependentOutputs, respectively). Note that the GTApprox/Componentwise and GTDF/Componentwise options are now deprecated, kept for version compatibility only, and will be removed in future versions.
  • General: added the new exception type OutOfMemoryError to indicate memory allocation problems that previously raised generic exceptions such as InternalError. This makes it possible to detect whether a problem is actually related to data size — for example, a training data set that is too big for approximation.
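The new exception type lends itself to targeted handling. A minimal sketch — the classes below are local stand-ins mirroring the names in the entry above, not imports from pSeven Core, and the exception hierarchy and threshold are assumptions:

```python
class InternalError(Exception):
    """Stand-in for the generic pSeven Core exception."""

class OutOfMemoryError(Exception):
    """Stand-in for the new memory-allocation exception."""

def build_model(n_points):
    # Pretend a very large training sample exhausts memory
    # (the threshold is arbitrary, for illustration only).
    if n_points > 10**6:
        raise OutOfMemoryError("cannot allocate the training data set")
    return "model"

try:
    result = build_model(10**7)
except OutOfMemoryError:
    # The dedicated type lets the caller react specifically to data size.
    result = "retry with a smaller training sample"
except InternalError:
    result = "unrelated internal failure"
```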

15.6.69.3. Documentation

15.6.69.4. Bugfixes

  • General: fixed incorrect import of MACROS modules if the path to the current working directory contains Unicode characters.
  • General: fixed a thread safety issue that could cause a crash when running multiple MACROS threads in parallel — for example, using the threading module (this issue does not affect the built-in parallelization, such as with GTApprox/MaxParallel).
  • GTOpt: fixed a fatal error when solving a mixed-integer problem without constraints.
  • GTApprox: fixed a bug that sometimes made it impossible to train a tensor approximation model when model reduction (GTApprox/TAModelReductionRatio) is enabled.

15.6.70. MACROS 6.2

15.6.70.1. New Features

  • GTApprox: added an option to reduce the complexity of tensor approximation models — see GTApprox/TAModelReductionRatio and section Model Complexity Reduction for details.
  • GTDoE: the adaptive DoE method now supports updating a sample generated using the LHS or OLHS technique in a way that preserves the sample’s space-filling properties — that is, ensures that the updated sample is also an (optimized) Latin hypercube. See generate() for details.
  • GTSDA: added the robust Pearson correlation technique, see section Robust Pearson Correlation.
  • GTSDA: added calculation of second order Sobol indices in the CSTA method, see section Sobol Indices: CSTA Method for details.
  • GTSDA: added Scott’s method of determining the bin size for histogram-based estimation of mutual information. This method noticeably improves the performance of the mutual information technique and is now the default. To switch back to the previously used full search method, use the GTSDA/Checker/MutualInformation/BinsMethod option.

15.6.70.2. Updates and Changes

  • GTOpt: an improvement in the surrogate based optimization (SBO) method enables solving problems with a lower limit for GTOpt/MaximumExpensiveIterations if an initial sample containing variable, objective, and constraint values is supplied.

    This update finalizes improvement of SBO for large-scale optimization that was gradually implemented in previous MACROS versions and now includes:

    • The support for high dimensional problems with hundreds of design variables (see sections High-Dimensional SBO and Parallel SBO for details).
    • Noticeable runtime reduction thanks to hierarchical and multilevel surrogate based optimization.
    • Improved support for NaN responses.
  • GTOpt: specific methods of handling cusp-like singularities in problem functions are no longer used if analytical gradients are enabled in the problem.

  • GTApprox: significantly reduced size of models trained using the GBRT technique.

  • GTApprox: reduced size of componentwise (GTApprox/Componentwise on) models trained using the GP technique.

  • GTApprox: improved the algorithm that selects the subset of the training sample to be stored into a GBRT model. The stored subset is now more smoothly distributed over the initial training sample, which positively affects quality of incremental model training.

  • GTApprox: added the support for SGP (sparse Gaussian processes) factors to the Tensor Approximation technique.

15.6.70.3. Documentation

15.6.70.4. Bugfixes

  • GTOpt: fixed a bug which could lead to incorrect behavior of the surrogate based optimization method in mixed-integer problems with small values of the GTOpt/MaximumExpensiveIterations option.
  • GTApprox: fixed a bug in determining the GTApprox/HDAMultiMax bounds that did not allow setting a value less than 5 (the default value for GTApprox/HDAMultiMin).
  • GTDoE: fixed a bug in the OLHS technique due to which it could generate a sample which, in terms of the \(\phi_p\) metric (see phi_p()), is worse than a non-optimized LHS with the same random seed.
  • GTSDA: fixed incorrect calculation of total Sobol indices if the function (output) is constant.
  • GTSDA: fixed a bug in the mutual information technique due to which it produced different results if the order of inputs is changed.
  • GTSDA: fixed a bug in select() due to which it could enter an infinite loop.
  • GTSDA: fixed incorrect behavior of the rank() and check() methods regarding watchers (see section Watchers).

15.6.71. MACROS 6.1

15.6.71.1. New Features

15.6.71.2. Updates and Changes

  • GTOpt: solving mixed-integer problems no longer requires using the surrogate based optimization (SBO) method. Previously, to solve a mixed-integer problem, GTOpt required defining at least one objective or constraint as expensive using the @GTOpt/EvaluationCostType hint.
  • GTOpt: improved handling of cusp-like singularities in objective and constraint functions.
  • GTApprox: GBRT models now store a portion of the training sample, which improves quality of incremental training (see the initial_model argument to build()).
  • GTApprox: internal improvements in the GBRT algorithm.
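
    The incremental training flow enabled by the initial_model argument can be sketched as a batch loop. This is only an illustration: da.p7core is not used here, the builder call is shown as a comment, and the placeholder model object is purely hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(size=(100, 3))        # training inputs, one row per point
    y = np.sum(x, axis=1, keepdims=True)  # training outputs

    model = None
    batches = np.array_split(np.arange(len(x)), 4)  # data arrives in four batches
    for idx in batches:
        # model = builder.build(x[idx], y[idx],
        #                       options={"GTApprox/Technique": "GBRT"},
        #                       initial_model=model)  # None on the first batch
        # Placeholder standing in for the trained model object:
        seen = 0 if model is None else model["points_seen"]
        model = {"technique": "GBRT", "points_seen": seen + len(idx)}

    assert model["points_seen"] == len(x)  # every batch contributed to the model
    ```

    Each build() call receives the previous model via initial_model, so the model accumulates information from all batches without retraining on the full sample.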

15.6.71.3. Documentation

15.6.71.4. Bugfixes

  • GTOpt: fixed malformed header in the evaluation history file if analytical constraint gradients are enabled in the problem.
  • GTOpt: fixed excessive logger output when solving a mixed-integer problem with multi-threading enabled.
  • GTSDA: fixed incorrect behavior of the partial Pearson correlation technique in the case when the analyzed sample contains constant columns.
  • GTDF: fixed an algorithmic bug in the MFGP technique due to which it could be unable to train a model if the difference in responses in the training samples of different fidelity is low enough.
  • GTDF: fixed GTDF/MaxParallel not being applied to the approximator used internally by GTDF.

15.6.72. MACROS 6.0

  • GTApprox
    • More information added to model details — see the attribute’s description.
    • Parallel training is now disabled for small training samples by default due to computational inefficiency. See GTApprox/MaxParallel description for more details.
    • Fixed a bug in pre-processing due to which it crashed when the input dimension is higher than the number of sample points.
    • Updated non-regression test results — see section GTApprox Test Report.
  • GTDoE
    • Added a new technique for generating mixed orthogonal arrays — see section Orthogonal Array for details.
  • GTOpt
    • All gradient-based methods can now deal with cusp-like singularities in objective and constraint functions (codimension-one discontinuities of first derivatives). In particular, this can result in better performance when solving problems like structural optimization, where cusp singularities are common.
    • Updated non-regression test results — see section GTOpt Test Report.
  • GTSDA
    • Significantly improved overall performance.
    • Updated the FAST method of calculating Sobol sensitivity indices so it provides more accurate estimates.
    • Corrected partial Pearson correlation coefficients calculation in the case when only the input dataset is provided.
    • Added a guide on using GTSDA instead of GTIVE – see section GTIVE to GTSDA Migration Guide.
    • Various smaller documentation updates and fixes.

15.6.73. MACROS 6.0 Release Candidate 1

Warning

This release removes the deprecated GTIVE tool. Sensitivity analysis methods are available in GTSDA.

15.6.74. MACROS 5.3

  • GTApprox
    • GBRT models now support incremental training (see the initial_model argument to build()).
    • Distributed training on an HPC cluster is now supported for all componentwise models (previously available for MoA models only). See set_remote_build() for details.
    • Fixed a bug in the SPLT technique which sometimes caused discontinuities in SPLT models.
  • GTSDA
    • Significantly improved the performance of the Kendall correlation technique in check().
    • Added a fast correlation-based feature selection algorithm. See GTSDA/Selector/QualityMeasure and section Dependency-based feature selection for details.
    • Added a correlation-based method to compute Sobol sensitivity indices. See section Sobol Indices: CSTA Method for details.
    • Correlation scores calculated using the Mutual Information technique are now normalized to \([0, 1]\) by default. Normalization can be disabled by GTSDA/Checker/MutualInformation/Normalize.
    • Fixed a bug in the partial Pearson correlation technique which led to incorrect results when the analyzed sample contains the input part only.

15.6.75. MACROS 5.2

  • GTApprox
    • RSM models now provide detailed information that allows obtaining the model in analytical form (see details).
    • Added the support for model pickling and unpickling.
    • Fixed the GBRT technique incorrectly treating deterministic mode options (GTApprox/Deterministic, GTApprox/Seed).
    • Fixed exception when GTApprox/GBRTNumberOfTrees is set to 0 and allowed 0 as a special value (auto setting, default).
    • Fixed GTApprox/Accelerator working incorrectly for the HDAGP technique.
    • Prohibited infinite point weights in the training sample (see the weights argument to build()) because they can lead to numerical instability in certain cases.
  • GTDF
    • Added the support for model pickling and unpickling.
    • Added option GTDF/Deterministic to switch between the deterministic and non-deterministic training modes.
    • Added option GTDF/Seed that sets the fixed seed used in the deterministic training mode.
    • Fixed GTDF/Accelerator not working for the SVFGP and MFGP techniques.
  • GTDR
    • Added the support for model pickling and unpickling.
  • GTOpt
    • Fixed a bug due to which surrogate based optimization (SBO) could end prematurely and without notable improvement when GTOpt/MaximumExpensiveIterations is limited.

15.6.76. MACROS 5.1

  • GTDF
    • GTDF no longer has separate sample- and blackbox-based model classes. Instead, build_BB() returns an instance of Model which supports blackbox-based calculations (see the updated calc(), grad(), calc_ae(), validate() methods and has_bb, has_ae_bb). This change fixes several inconsistencies related to blackbox-based training techniques support. Version compatibility was not affected: ModelWithBlackbox is kept as an alias for Model, and ModelWithBlackbox methods are added to Model (though considered deprecated).
  • GTOpt
    • Improved evaluation history: added designs, a more compact and convenient representation of evaluation data; both designs and history now distinguish between None and NaN special values (see attribute descriptions for details).

15.6.77. MACROS 5.0

  • GTApprox

    • Reworked options related to various randomized algorithms used in GTApprox:

      • Added option GTApprox/Deterministic to switch between the deterministic and non-deterministic training modes.
      • Added option GTApprox/Seed that sets the fixed seed used in the deterministic training mode.
      • The GTApprox/HDAInitialization, GTApprox/HDARandomSeed, and GTApprox/SGPSeedValue options were removed. Randomized aspects of the HDA, HDAGP and SGP techniques are now controlled with GTApprox/Deterministic and GTApprox/Seed.
      • Added GTApprox/IVDeterministic to switch between the deterministic and non-deterministic cross validation modes.
      • Renamed GTApprox/IVRandomSeed to GTApprox/IVSeed and changed it to work with GTApprox/IVDeterministic.
      • Renamed the GTApprox/Postprocess/IVRandomSeed hint to GTApprox/Postprocess/IVSeed.

      For more details, see Version Compatibility Issues and option descriptions.
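
      As an illustration, the reworked settings can be collected into an ordinary options dictionary. The option names below come from this changelog; the values are arbitrary examples, and the dictionary itself is a sketch, not a prescribed configuration.

      ```python
      # Hypothetical GTApprox options dictionary illustrating the reworked
      # deterministic-mode settings; values are examples only.
      options = {
          "GTApprox/Deterministic": True,    # deterministic vs non-deterministic
                                             # training mode
          "GTApprox/Seed": 100,              # fixed seed for the deterministic mode
          "GTApprox/IVDeterministic": True,  # same switch for cross validation
          "GTApprox/IVSeed": 200,            # renamed from GTApprox/IVRandomSeed
      }

      # The removed options (GTApprox/HDAInitialization, GTApprox/HDARandomSeed,
      # GTApprox/SGPSeedValue) have no direct replacements: randomized aspects of
      # the HDA, HDAGP, and SGP techniques now obey the two settings above.
      assert all(name.startswith("GTApprox/") for name in options)
      ```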

15.6.78. MACROS 5.0 Release Candidate 3

  • GTSDA
    • Fixed incorrect behavior of the Kendall Correlation technique.
    • Reworked GTSDA options interface. Most options were renamed to make their functions more clear; see GTSDA Option Reference for details.
  • Statistical Utilities
    • Fixed incorrect calculation of Kendall rank correlation coefficient for rank components in calculate_statistics().

15.6.79. MACROS 5.0 Release Candidate 2

  • GT
    • Added an option to set the maximum number of parallel threads MACROS tools can use, so there is no need to change the OMP_NUM_THREADS environment variable. This option is currently available in GTApprox, GTDF, GTDoE, GTDR and GTOpt (see GTApprox/MaxParallel, GTDF/MaxParallel, GTDoE/MaxParallel, GTDR/MaxParallel, and GTOpt/MaxParallel). Note that the option applies to a specific tool instance only, which is also more convenient than the system-wide OMP_NUM_THREADS setting.
  • GTApprox
    • Extended sample point weighting support: point weights are now supported in the LR, RSM, HDA, GP, SGP, HDAGP, iTA, and MoA techniques (previously available in the iTA technique only). Point weights affect the model fit to the training sample — see build() for details (the weights argument).
  • GTDF
    • Added sample point weighting support similar to GTApprox. See build() and build_MF() for details.
  • GTOpt
    • Performance improvements and significant internal bugfixes in mixed-integer linear problems.
    • Fixed undefined behavior in case of low GTOpt/TimeLimit.
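
    A minimal sketch of the point-weights data layout assumed by build(): one finite, non-negative weight per training point, with larger weights pulling the model closer to that point. The arrays and the commented build() call are illustrative only.

    ```python
    import numpy as np

    x = np.array([[0.0], [0.5], [1.0], [1.5]])  # training inputs, one row per point
    y = np.sin(x)                               # training outputs
    weights = np.array([1.0, 1.0, 4.0, 1.0])    # emphasize the third point

    assert weights.shape[0] == x.shape[0]  # one weight per sample point
    assert np.all(np.isfinite(weights))    # infinite weights are prohibited
                                           # (since MACROS 5.2)
    # model = builder.build(x, y, weights=weights)  # illustrative call pattern
    ```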

15.6.80. MACROS 5.0 Release Candidate 1

  • GTSDA
    • Added mutual information technique in correlation analysis. See section Mutual Information for details.
  • GTOpt
    • Fixed a crash when solving a robust constraint satisfaction problem (a special problem type which includes no objectives, and constraint functions depend on some stochastic variables).

15.6.81. MACROS 4.3

  • GTApprox
    • Added initial support for running approximation model training on a remote host or an HPC cluster (currently only LSF clusters are supported). See set_remote_build() for details.

15.6.82. MACROS 4.2 Service Pack 1

This is a maintenance release, which does not contain any functional changes or updates.

15.6.83. MACROS 4.2

  • GTApprox
    • Improved the Tensor Approximation (TA) model quality in certain cases with noisy training data. This update can have an effect if the technique is configured to use only BSPL and DV factors (see GTApprox/TensorFactors), and GTApprox/ExactFitRequired is off.
    • Improved the incomplete Tensor Approximation (iTA) model quality in similar cases of noisy training data. Note that this update can have an effect only if GTApprox/ExactFitRequired is off. However, it is not affected by GTApprox/TensorFactors (iTA always uses BSPL factors and ignores the latter option).
    • Added exact fit support to the iTA technique. Note it is now affected by the GTApprox/ExactFitRequired option.
    • Fixed a bug in the Tensor Approximation (TA) technique due to which grad() returned nonsensical gradient values for discrete variables. All partial derivatives with respect to discrete variables are now NaN.
  • GTOpt
    • Updated the Surrogate Based Optimization (SBO) algorithm to include a finalization stage where convergence to the optimum is smoother and the search becomes more localized. This can potentially result in better solutions because the algorithm is now able to “push” to the optimum more actively.

15.6.84. MACROS 4.1

  • GTOpt
    • Added mixed integer linear programming support. Single-objective problems with a mix of integer and continuous variables can now be solved without using the Surrogate Based Optimization (SBO) method, provided that all objectives and constraints are linear functions (see @GTOpt/LinearityType in Hint Reference) and the problem supports analytical gradients (see enable_objectives_gradient(), enable_constraints_gradient()).
    • Overall performance increased thanks to significant improvements of internal algorithms — in particular, the methods to solve saddle point systems and quadratic problems.
    • The trace level log output (see GTOpt/VerboseOutput) is now more readable.

15.6.85. MACROS 4.0

  • GTDF
    • New technique, Multiple Fidelity Gaussian Processes (MFGP) which allows using more than two samples of different fidelity to train a data fusion model. MFGP also supports additional output noise variance data in training samples — see build_MF() description for details.
  • GTDoE
    • Fixed a bug which could cause generate() to freeze when used in the blackbox-based adaptive mode without an initial sample.
  • GTSDA
    • GTSDA is no longer in beta. Note that this means GTIVE will be deprecated in future versions; it is now recommended to use GTSDA for sensitivity analysis and related tasks. For details on features and methods implemented in GTSDA, see the GTSDA Guide.

15.6.86. MACROS 4.0 Release Candidate 1

  • GTDoE
    • Fixed a bug in the sample-based adaptive DoE algorithm that works with a training sample (both init_x and init_y, see generate()). Due to this bug, the algorithm performed iterative retraining of the internal approximation model, updating the training sample with data obtained from the model itself.

15.6.87. MACROS 4.0 Beta 1

  • GT
    • Example scripts illustrating MACROS usage are now automatically installed with the package and can be run without manually copying the .py files, using the python -m option. See the updated section Running Examples for details.
  • GTDF
    • Fixed a bug in preprocessing of the high-fidelity sample for the blackbox-based VFGP_BB technique, due to which presence of duplicates in the sample could alter the trained model.
  • GTDoE
    • The Optimal Design technique is now available as an initial sampling technique for the blackbox-based adaptive DoE (see GTDoE/Adaptive/InitialDoeTechnique).
    • Fixed blackbox-based adaptive DoE always including all points of the initial sample into the final result, even if some of them violate specified generation bounds. Now the result will include only those initial points that do not violate the bounds.
    • Fixed adaptive DoE crashing with certain values of GTDoE/Adaptive/TrainIterations.
  • GTOpt

    • Major improvements in Surrogate Based Optimization (SBO):

      • The method now uses a new family of internal surrogate models with greatly decreased training time and complexity which are specifically tuned for SBO.
      • Internal SBO algorithms are now multi-scale and multi-resolution capable, so functions with intricate landscapes can be modeled more easily.

      Note that the updated SBO algorithms are slightly more demanding with respect to the required budget: the total time to solve usually decreases due to the increased algorithm efficiency, but the new algorithms require more function evaluations.

15.6.88. MACROS 3.4

  • GT
    • Lowered the NumPy requirement to version 1.6.0 (was 1.6.1). As before, MACROS installation can proceed without NumPy — see the System Requirements section for details.
    • Changed the licensing system to count license features on a per-process basis. See section License Usage for details.
  • GTApprox
    • MACROS now provides a simple C interface to evaluate GTApprox models. See the GTModel Guide for details.
    • Improved pre-processing: added an option to average output values for coincident input points and to calculate variance for averaged output (useful for build() with outputNoiseVariance). See GTApprox/Preprocess/AverageCoincidentPointValues for details.

15.6.89. MACROS 3.3

  • GTApprox
    • Added smoothing support to response surface models (the models built with the RSM technique).
    • Fixed a bug in model export to C due to which exported models performed the following evaluations incorrectly:

15.6.90. MACROS 3.2

  • GTApprox
    • Allowed automatic selection of the Tensor Gaussian Processes (TGP) technique.
    • The updated gradient accuracy evaluation method (see grad_ae()) provides significantly faster gradient AE calculation for Tensor Gaussian Processes (TGP) models.
    • Added quadratic trend support to the Gaussian Processes (GP) technique. Due to this, the GTApprox/GPLinearTrend option is deprecated and replaced by GTApprox/GPTrendType.
    • GTApprox/RSMType default changed to "purequadratic".
    • The Linear Regression (LR) technique is added back to automatic selection (was excluded in MACROS 1.11.1). This is due to changing the default GTApprox/RSMType option value.
  • GTDoE
  • GTOpt
    • Added information on applied hints (see Hint Reference) to the string representation of GTOpt problem classes.

15.6.91. MACROS 3.1

  • GTDoE
    • The adaptive DoE technique now supports functions with multidimensional output (in generate(): init_y with 2 or more columns, and/or blackbox with size_f() 2 or greater).
    • Adaptive DoE can now handle NaN values in the response part of an initial sample and in function responses (init_y and blackbox in generate(), respectively).
  • GTOpt
    • Fixed an error in results processing due to which the infeasible set was always empty for an infeasible problem, while in fact (assuming GTOpt/OptimalSetType is "Extended") it should contain the points that do not violate the threshold set by GTOpt/OptimalSetRigor, even if there are no feasible points (that is, no evaluated points satisfy the problem constraints).

15.6.92. MACROS 3.0

  • GT
    • Added the da.p7core.gtsda module. GTSDA performs global sensitivity analysis, correlation tests, and forward/backward feature selection, and is meant to replace GTIVE in future releases. Note that it is currently in beta.

  • GTApprox

  • GTOpt
    • Implemented mixed-integer optimization support in SBO (see Surrogate Based Optimization). SBO now allows solving single- and multi-objective problems with a mix of integer and continuous variables.
    • Removed the converged point set from Result since it is intended for internal purposes and is usually of no interest to the end user. See Version Compatibility Issues for more details.

15.6.93. MACROS 3.0 Release Candidate 2

  • GTApprox
    • New technique, Tensor Gaussian Processes (TGP), which is a further development of the methods first introduced in the Tensor Approximation (TA) technique. TGP modifies the Gaussian Processes (GP) algorithm so it is able to handle a very large data set, provided it was obtained using a Cartesian product DoE, and also provides the accuracy evaluation support not available in TA.
    • Fixed a bug in Octave code generated by export_to() in case of a Tensor Approximation model featuring discrete variables, which caused errors when evaluating the exported model.

15.6.94. MACROS 3.0 Release Candidate 1

  • GT

  • GTOpt
    • Improved quality of internal approximations used in Surrogate Based Optimization (SBO).
    • Changed queryx argument type in evaluate() from list[list[float]] to ndarray. As a result, evaluate() no longer supports name indexing of variables (such as queryx[0]["var_name"] or queryx[0].var_name). See section Version Compatibility Issues for details.
    • All Result attributes now contain NumPy arrays instead of array-like data type. Name indexing for variables is no longer supported.
    • Fixed exception type in case of too low GTOpt/MaximumIterations value.

15.6.95. MACROS 3.0 Beta 2

  • GTApprox
    • Added the support for incomplete output noise variance data. Missing values can now be specified using NaN elements in noise variance array (see the outputNoiseVariance argument to build()).
    • Removed the GTApprox/InterpolationRequired option (deprecated since 1.8.0) in favor of GTApprox/ExactFitRequired.
    • Removed save_to_octave() method (deprecated since 1.8.0) in favor of export_to().
    • Fixed smoothing methods not working for some models trained with GTApprox/Componentwise on.
    • Fixed a bug in the Gaussian Processes technique which could cause an exception if the training sample contains values greater than \(10^{15}\).
  • GTDF
    • Removed the GTDF/InterpolationRequired option (deprecated since 1.8.1) in favor of GTDF/ExactFitRequired.
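
    The incomplete noise variance data described above can be sketched as a NumPy array that mirrors the output sample, with NaN marking the points whose variance is unknown. The concrete values and the commented build() call are illustrative only.

    ```python
    import numpy as np

    y = np.array([[1.0], [2.1], [2.9], [4.2]])    # training outputs
    noise_variance = np.array([[0.01], [np.nan],  # NaN = variance unknown
                               [0.04], [np.nan]]) # for that output value

    assert noise_variance.shape == y.shape  # one variance entry per output value
    assert np.isnan(noise_variance).any()   # missing values allowed since 3.0 Beta 2
    # model = builder.build(x, y, outputNoiseVariance=noise_variance)  # illustrative
    ```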

15.6.96. MACROS 3.0 Beta 1

  • GTApprox
    • Fixed a defect in the internal validation procedure due to which all RRMS error values in iv_info appeared to be NaN in case of leave-one-out cross-validation (GTApprox/IVSubsetCount set equal to the effective size of training sample).
    • Better support for 64-bit compilation in the Model Export example. The script will now compile a shared library type model properly if run by a 64-bit Python interpreter under 64-bit Windows.
  • GTOpt
    • Optimization result can now include additional points that satisfy optimality criteria but violate problem constraints and feasibility measures to a certain extent (the infeasible point set). Related new options are:
    • New option, GTOpt/RestoreAnalyticResponses — allows restoring analytic forms of problem objectives and constraints hinted as linear or quadratic, so they are evaluated internally by the solver without calling evaluate().

15.6.97. MACROS 2.4

This is a maintenance release, which does not contain any functional changes or updates.

15.6.98. MACROS 2.3

This is a maintenance release, which does not contain any functional changes or updates.

15.6.99. MACROS 2.2

  • GTOpt
    • Fixed an important bug in multi-objective Surrogate Based Optimization. Incorrect processing of constraints at anchor search stage in this mode could lead to selecting infeasible solutions as anchor points, potentially causing problems in solving.

15.6.100. MACROS 2.1

  • GTOpt
    • Fixed a bug in evaluated set filtering which could sometimes lead to incorrect identification of optimal points in multi-objective problems (a point that is marginally worse than optimal could be included in result).

15.6.101. MACROS 2.1 Release Candidate 2

This is a maintenance release, which does not contain any functional changes or updates.

15.6.102. MACROS 2.1 Release Candidate 1

  • GTOpt
    • Fixed a bug that could lead to incorrect results in case of multi-objective Surrogate Based Optimization in a badly scaled design space.

15.6.103. MACROS 2.0

  • GTDoE
    • Implemented the sample-based adaptive DoE technique, which performs the adaptive DoE process without a blackbox (see option GTDoE/Technique and combinations of arguments in generate()).

15.6.104. MACROS 2.0 Release Candidate 2

  • GTApprox
    • Added the possibility to save model values calculated during internal validation (see option GTApprox/IVSavePredictions).
    • Fixed an internal bug in the Mixture of Approximators (MoA) technique which created version compatibility issues in side-by-side installations.
    • Fixed model smoothing being unavailable in Mixture of Approximators (MoA) models that provide accuracy evaluation (trained with GTApprox/AccuracyEvaluation on).

15.6.105. MACROS 2.0 Release Candidate 1

  • GT
    • Since this release, MACROS provides separate setup packages for 32-bit and 64-bit platforms.
  • GTApprox
    • Fixed incorrect training cache size calculation which could cause excessive memory consumption when using the Sparse Gaussian Processes (SGP) technique.
    • Fixed a bug in model evaluation which resulted in calc() returning incorrect values when evaluating a Gaussian Processes (GP) model built using a large training sample (more than 1024 points). Techniques other than GP were not affected. Also, since the evaluation method is not stored within a model when saving it to a file with save(), models saved in previous MACROS versions will now evaluate correctly even if earlier they produced unexpected results due to this bug. Note there is no need to rebuild such models.
    • When using internal validation (see GTApprox/InternalValidation option), predicted IV outputs are now saved in the iv_info model property.
  • GTDoE
    • Fixed incorrect handling of NumPy slices when using Adaptive DoE which corrupted initial sample data if init_x, init_y arguments to generate() are NumPy slices.
  • GTOpt
    • Implemented initial data sample support, see the sample_x, sample_f, and sample_c arguments to da.p7core.gtopt.Solver.solve(). Initial sample allows to specify multiple initial guesses for solver (if only sample_x is specified) or use cached data when solving a problem (if sample_f and/or sample_c is specified in addition to sample_x).
    • Increased maximum points batch size in the batch optimization mode to 16384 (see GTOpt/BatchSize).
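
    The initial sample arguments described above can be sketched as plain row-per-point arrays. Shapes and values here are illustrative assumptions, and the commented solve() call only shows the argument pattern.

    ```python
    import numpy as np

    sample_x = np.array([[0.1, 0.9],
                         [0.5, 0.5],
                         [0.8, 0.2]])  # three initial guesses for two variables
    sample_f = np.sum(sample_x**2, axis=1, keepdims=True)  # cached objective values
    sample_c = None                    # no cached constraint values in this sketch

    assert sample_f.shape == (sample_x.shape[0], 1)  # one objective value per point
    # result = solver.solve(problem, sample_x=sample_x,
    #                       sample_f=sample_f)  # illustrative call pattern
    ```

    Passing sample_x alone supplies multiple initial guesses; adding sample_f and/or sample_c lets the solver reuse previously evaluated data instead of re-evaluating those points.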

15.6.106. MACROS 1.11.1

  • GTApprox
    • The Tensor Approximation (TA) and incomplete Tensor Approximation (iTA) techniques now may be selected automatically if the training sample fits TA/iTA technique requirements. Note that due to this the GTApprox/EnableTensorFeature option is now on by default.
    • Removed the Linear Regression (LR) technique from automatic selection. In cases when LR was selected previously, the RSM technique will be selected instead. Note that RSM options (for example, GTApprox/RSMType) do apply when RSM is selected automatically. This change does not affect the manual LR technique selection.
    • Corrected the exported function name generation in export_to().
  • GTOpt
    • Implemented the support for stochastic variables in Surrogate Based Optimization (SBO), making it possible to combine the robust optimization and SBO approaches.
    • Enabled using analytical gradients in SBO problems. Note that in this case GTOpt requests gradient values for cheap functions only, so there is no need to calculate expensive gradients in evaluate(). However, if gradients were enabled as dense (see the arguments for enable_objectives_gradient() and enable_constraints_gradient()), then evaluate() should still return some placeholder values to preserve the response structure. The placeholder value itself is ignored; only the return structure is meaningful.
    • Fixed budget violation in multi-objective SBO.

15.6.107. MACROS 1.11.0

  • GT
    • All blackbox-based techniques now use the common Blackbox class. Deprecated blackbox classes (da.p7core.blackbox.InteractiveBlackbox, GTDR blackboxes) were removed, and their functionality has been moved to the common blackbox. This change breaks compatibility with previous versions, see the Version Compatibility Issues section for details.
    • Added the support for analytical gradients and evaluation history to Blackbox. See enable_gradients() and enable_history().
  • GTDF
    • Blackbox-based data fusion now uses the common Blackbox class; da.p7core.blackbox.InteractiveBlackbox was removed.
    • Fixed an algorithmic bug which in rare cases could freeze the builder.
    • Fixed the GTDF/IVSubsetCount option not being applied correctly.
  • GTDoE
    • Adaptive DoE now uses the common Blackbox class; da.p7core.blackbox.InteractiveBlackbox was removed.
  • GTDR
    • Blackbox-based Feature Extraction now uses the common Blackbox class; da.p7core.gtdr.Blackbox was removed.
  • GTOpt
  • Statistical Utilities

15.6.108. MACROS 1.10.5

  • GT
    • Corrected NumPy version checking, which previously allowed using MACROS with an older NumPy version, leading to unexpected behavior in various cases. From now on, if MACROS detects an older version of NumPy on initialization, it will raise an exception. See the System Requirements section for the required version.
  • GTApprox
    • Fixed old GP-based models (trained using versions 1.8.4 and older) setting their has_smoothing attribute to True on load even though they do not support smoothing.
    • Fixed a bug in model export to C which resulted in an assertion from export_to() in a specific case when the exported model was trained using the MoA technique, and the training sample contained constant input or output components.
  • GTDF
    • Fixed RRMS value not being found in validate() result.
  • GTOpt
    • Improved global search: finer tuning is now available via the GTOpt/GlobalPhaseIntensity option, which controls the complexity of applied globalization algorithms. This option also deprecates GTOpt/GlobalSearch.

15.6.109. MACROS 1.10.4

  • GTApprox
    • Corrected processing of strings, NaN and Inf values in training samples when using the MoA technique.
    • Fixed MoA technique to support the output noise variance properly (see outputNoiseVariance argument to build()).
    • Updated automatic technique selection logic: in the 1-dimensional case, if the sample size is less than 5 points, the LR technique is now selected provided that both GTApprox/AccuracyEvaluation and GTApprox/ExactFitRequired are off (previously the selection tried SPLT and stopped).
    • Corrected exception type and message in case of duplicate values in X sample corresponding to different values in Y.
  • GTDF
  • GTDoE
    • Adaptive DoE example added.
    • Updated measures module documentation.
  • GTOpt
    • Fixed an internal solver bug which could cause a fatal error if GTOpt/TimeLimit is set.

15.6.110. MACROS 1.10.3

  • GT
    • Better exception handling in user functions: exceptions from user-defined methods will now contain informative error messages.
  • GTApprox
    • The GP technique was updated to support Gaussian processes with additive kernel. Primarily, this feature improves model quality in high-dimensional cases or when the functional dependence contains interaction terms. To use it, the additive covariance function type has to be specified via the GTApprox/GPType option.
    • Updated automatic technique selection logic. Selecting the HDAGP technique now depends on the response dimensionality: if it is greater than 15, GTApprox will select the GP technique where it selected HDAGP before. For more details, see Section 5.2 in the GTApprox User Manual.
  • GTDoE
    • Fixed a bug in the measures module which did not allow calculating metrics if the input sample included only two points.
  • GTOpt
    • Fixed set_stochastic() always requiring a name for the stochastic distribution (the name argument).
  • Statistical Utilities

15.6.111. MACROS 1.10.2

  • GTApprox
    • Lowered the sample size requirements; in particular, even 1 point is now enough to build a model when using the LR or RSM techniques with the accuracy evaluation and exact fit options off. See the GTApprox Sample Size Requirements section for details.
    • Fixed a bug which led to a division by zero error in the MoA technique when GTApprox/MoAWeightsConfidence is explicitly set to its default value.
    • Fixed GTApprox/Technique appearing in the options list twice.

15.6.112. MACROS 1.10.1

  • GT
    • The seed for random number generation in Generic Tools modules is now received from the system random device instead of using the time-based seed. This fixes the weakness in the behavior of non-deterministic random number generators under Windows (previously it was possible to occasionally initialize different generators with the same seed).

15.6.113. MACROS 1.10.0

  • GTApprox
    • Fixed a bug in model gradients calculation (grad()) which made the results not reproducible.
    • Fixed the GTApprox/GPLearningMode option not appearing in the public interface.
  • GTDF
    • Fixed a bug in internal validation in the blackbox-based mode which caused the error estimates to be NaN when some of the high-fidelity sample points were outside the blackbox domain. There is still a possibility that internal validation results are NaN, but it happens only when cross-validation subsets were generated in such a way that all sample points in at least one of them are outside the blackbox domain. See section Version Compatibility Issues for details and workarounds.
  • GTDR
    • The GTDR/SurrogateModelType option for sample-based Feature Extraction now allows selecting any approximation technique except LR and SPLT.
    • Fixed a bug in the Feature Extraction algorithm which led to incorrect results in classification tasks.
    • Fixed the decompress() method mutating the compressed vector which was given as an argument.

15.6.114. MACROS 1.9.6

  • GT
    • MACROS now requires NumPy version 1.6.1 or newer to run. Installation will proceed without NumPy, but the installer will issue a warning; see the System Requirements section for details. NumPy was added to the system requirements because it is a widely used package in scientific computing and is usually already installed on target systems, while MACROS benefits from the capabilities it provides. This update does not break version compatibility: scripts made for previous MACROS versions will work as is.
  • GTApprox
    • Implemented several significant improvements to the Gaussian Processes (GP) technique, which:
      • Resolved the problem of GP models degrading to a constant in certain parameter space investigation tasks.
      • Improved handling of noisy sample data, in particular avoiding automatic exact fit when it is not wanted.
      • Added a new option to control the trade-off between model accuracy and robustness in GP models — see GTApprox/GPLearningMode for details.
    • GTApprox models exported to C now contain two dedicated methods to calculate model values and accuracy estimation. Both methods also allow calculating values and gradients separately.
    • Fixed incorrect processing of point weights in the incomplete Tensor Approximation technique which sometimes led to undefined Builder behavior.
  • GTDF
    • Updated the data fusion algorithm to prevent degenerate behavior of Data Fusion models in the cases when the high- and low-fidelity training samples are located in different design space regions.
  • GTDR
    • Added an option to specify the algorithm for the internal approximator in the sample-based Feature Extraction mode — see GTDR/SurrogateModelType.
  • GTOpt
    • Fixed unstable methods in the multithreading implementation which could lead to crashes.

15.6.115. MACROS 1.9.5

  • GTApprox
    • Added sample point weighting support to the incomplete Tensor Approximation technique (iTA). Point weights affect iTA model fit to the training sample — see the build() method description for details (new optional parameter, weights).
  • Mixture of Approximators
    • Added the method to evaluate model gradients to MoA models.
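
Conceptually, point weights such as those accepted by build() enter a fit as multipliers on each point's squared residual. A weighted linear least-squares sketch (illustrative only, not the iTA algorithm):

```python
import numpy as np

def weighted_linear_fit(x, f, weights):
    # Minimize sum_i w_i * (a*x_i + b - f_i)^2 by scaling rows with sqrt(w_i).
    A = np.column_stack([x, np.ones_like(x)])
    sw = np.sqrt(weights)
    coef, *_ = np.linalg.lstsq(A * sw[:, None], f * sw, rcond=None)
    return coef  # (slope, intercept)

x = np.array([0.0, 1.0, 2.0, 3.0])
f = np.array([0.0, 1.0, 2.0, 10.0])  # the last point is an outlier

slope_eq, _ = weighted_linear_fit(x, f, np.ones(4))
slope_dw, _ = weighted_linear_fit(x, f, np.array([1.0, 1.0, 1.0, 1e-6]))
# Down-weighting the outlier pulls the slope back towards the y = x trend.
```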

15.6.116. MACROS 1.9.4

  • GTApprox
    • Implemented linear extrapolation support for BSPL factors in Tensor Approximation models. See options GTApprox/TALinearBSPLExtrapolation and GTApprox/TALinearBSPLExtrapolationRange.
    • Model export to C: fixed a bug in C99 header generation which made the generated headers unusable in C++ code.
    • Fixed a few bugs with parameter types in smoothing methods which could lead to incorrect parameter interpretation in smooth_anisotropic() and smooth_errbased().
    • Fixed the Builder not properly updating internal validation accuracy data in the model info, which, when the same Builder instance was used to train several models in succession, led to the same data (actually related to the first model) appearing in the internal validation accuracy information for all models.
    • Certain internal dependencies of the GTApprox techniques required allowing degenerate smoothing for linear models. To clarify: linear models no longer throw an exception on an attempt to use smoothing (has_smoothing is set to True for linear models), but all smoothing methods simply return a copy of the original model. This includes all models built by the LR technique, and RSM models if they include only constant and linear terms. Note the latter may happen even if GTApprox/RSMType was set to something other than "linear", because this option only restricts the term types allowed in the model and, for example, will not create a quadratic model if the original dependency is linear. Proper smoothing for interaction and quadratic RSM models is to be implemented in future releases.

15.6.117. MACROS 1.9.3

  • GTApprox
    • Corrected the Model Export example so that it works on Windows (provided that gcc is installed).

15.6.118. MACROS 1.9.2

  • GTApprox
    • A new technique, incomplete Tensor Approximation (iTA), allows applying the tensor approximation approach when the training sample was obtained using an incomplete Cartesian product DoE (such as a full factorial with some points missing) or a combination of several complete Cartesian product DoE sets.
    • The new GTApprox/TAReducedBSPLModel option allows a trade-off between model size and accuracy for Tensor Approximation models.
    • Fixed a bug in Tensor Approximation model smoothing which made smooth() fail randomly when called on a TA model.
    • Fixed a Tensor Approximation bug which resulted in an exception from build() if the training sample contained repeating values.
    • Fixed incorrect parsing of the array form of the x_weights parameter in smooth_anisotropic() which caused the method to apply wrong smoothing settings.
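
The kind of training sample iTA targets can be described with a short sketch: a Cartesian product (full factorial) DoE with some points missing:

```python
from itertools import product

# Factor levels of a full factorial DoE over two variables.
x1_levels = [0.0, 0.5, 1.0]
x2_levels = [10.0, 20.0]

full_factorial = set(product(x1_levels, x2_levels))  # 3 x 2 = 6 points

# An incomplete Cartesian product: the same grid with some points missing,
# e.g. runs that failed or were never evaluated.
missing = {(1.0, 20.0)}
incomplete_sample = full_factorial - missing  # 5 points; classic TA would reject this
```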

15.6.119. MACROS 1.9.1

  • GTDoE
    • Added the support for classic DoE methods (discrete parameters sampling) to a greater number of GTDoE techniques: now supported by Full Factorial, Latin Hypercube Sampling, Optimal Latin Hypercube Sampling, and Optimal Design for RSM. See also the GTDoE/CategoricalVariables option.

15.6.120. MACROS 1.9.0

  • GT

    • In addition to Generic Tools modules, the MACROS package will now include derived tools for specific tasks — the MACROS Extras. The first of these is added in this release: Mixture of Approximators (approximation based on space partitioning).

      Changed in version 1.10.0: the Mixture of Approximators functionality has been moved to the GTApprox module, making it available as one of the approximation techniques (see GTApprox/Technique).

    • MACROS switched to a new build system, increasing the number of compatible platforms (see the System Requirements section for details).

  • GTApprox
    • The dynamic smoothing feature of GTApprox models (the one which used the smoothness parameter of the calc(), grad(), calc_ae() and grad_ae() methods) was completely replaced by a more convenient and easy-to-use smooth() method. In addition, two advanced smoothing methods are implemented — see smooth_anisotropic() and smooth_errbased(). This change breaks compatibility with previous versions; see the Version Compatibility Issues section for details.
    • Increased the number of pre- and postprocessing hints.
    • Implemented a new version of the GP and HDAGP techniques (based on Gaussian Processes) for processing input samples with heteroscedastic noise variance, see the GTApprox/Heteroscedastic option description.
    • Added an option to limit the number of points selected from the training set to calculate model accuracy, because this test may be time-consuming with large training samples. See the GTApprox/TrainingAccuracySubsetSize option description for details.
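
The shape of the smoothing API change can be illustrated with a toy model class (hypothetical names; only the calling convention mirrors the change): instead of passing a smoothness parameter to every evaluation call, smooth() produces a new, persistently smoothed model:

```python
import statistics

class ToyModel:
    """Toy stand-in for a trained model; only the API shape matters here."""

    def __init__(self, values):
        self._values = list(values)

    def calc(self, i):
        # Evaluation no longer takes a smoothness argument.
        return self._values[i]

    def smooth(self, window):
        # Returns a NEW smoothed model; the original stays untouched.
        n = len(self._values)
        smoothed = [
            statistics.fmean(self._values[max(0, i - window):i + window + 1])
            for i in range(n)
        ]
        return ToyModel(smoothed)

model = ToyModel([0.0, 10.0, 0.0, 10.0, 0.0])
smooth_model = model.smooth(window=1)  # a separate, smoothed model object
```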

15.6.121. MACROS 1.8.5

  • GTApprox
    • BSPL technique (one of the techniques available in Tensor Approximation, see the GTApprox/TensorFactors option description) is now affected by the GTApprox/ExactFitRequired option: the technique will not try to fit training data exactly when the option is off.
    • Corrected exception messages for the Tensor Approximation technique.
    • Fixed a bug in the implementation of the Gaussian Processes based techniques (GP, HDAGP, SGP) which made GTApprox build different models depending on the number of OpenMP threads set by the user.
  • GTDF
    • Removed superfluous training set accuracy information from the GTDF model info. It now contains accuracy values only for the model it belongs to.

15.6.122. MACROS 1.8.4

  • GTApprox
    • Fixed wrong exception type when the vector of mean values specified by the GTApprox/GPMeanValue option has incorrect dimension.
    • Fixed an internal bug in the compactness calculation of an array which could lead to a segmentation fault when building a model.
    • Preprocessing/postprocessing fixes:
      • Fixed incorrect parsing of the GTApprox/Postprocess/Artifacts hint.
      • Corrected the mentions of default option values in the recommendations received as the result of preprocessing to avoid confusion.
      • Corrected exception types in pre- and postprocessing functions.
  • GTDF
    • Fixed a bug in the blackbox-based version of the Difference Approximation technique which could lead to a segmentation fault when building a model.
  • GTOpt
    • Fixed Surrogate Based Optimization bug which caused a noticeable slowdown when solving 1-dimensional SBO problems.

15.6.123. MACROS 1.8.3

  • GTApprox
    • Fixed the pre- and postprocessing functions to remove the internal NumPy dependency.

15.6.124. MACROS 1.8.2

  • GTApprox
    • Improved model export to C:
      • Now supports exporting models built using the Splines with Tension (SPLT) technique.
      • Supports exporting model gradient and accuracy evaluation features if they are available in the model.
  • GTOpt
    • Global search now works in multi-objective optimization problems.

15.6.125. MACROS 1.8.1

  • GT
    • Added section “Regression Tests”.
    • Fixed an installation bug which could lead to a file access violation error under Windows when reinstalling the same version of MACROS on top of the installed one (without prior uninstall).
  • GTApprox
    • Updated the documentation on the pre- and postprocessing modes of the builder and added examples.
    • Added the Model Export example to illustrate the functionality of exporting a GTApprox model to C code implemented before in 1.8.0.
    • Fixed a bug in the GTApprox/Accelerator option processing which made it apply wrong settings if accelerator was set to 5 while Gaussian Processes or another GP-based technique was used. This bugfix also affects other tools using GTApprox internally (for example, GTDF).
  • GTDF
    • The GTDF/InterpolationRequired option was renamed to GTDF/ExactFitRequired for the name to be consistent with the GTApprox/ExactFitRequired option and for the same disambiguation purposes (see the MACROS 1.8.0 changelog). The old name is kept for compatibility but is considered deprecated.
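
Keeping a renamed option working under its old, deprecated name typically follows an alias-mapping pattern like the sketch below (hypothetical helper; the actual MACROS option handling is internal):

```python
import warnings

# Old option names kept as deprecated aliases of the new ones.
_DEPRECATED_ALIASES = {
    "GTDF/InterpolationRequired": "GTDF/ExactFitRequired",
}

def normalize_option(name):
    """Map a possibly deprecated option name to its current form."""
    if name in _DEPRECATED_ALIASES:
        new_name = _DEPRECATED_ALIASES[name]
        warnings.warn(
            "%s is deprecated, use %s instead" % (name, new_name),
            DeprecationWarning,
            stacklevel=2,
        )
        return new_name
    return name
```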

15.6.126. MACROS 1.8.0

  • GTApprox
    • Added model export to Octave, MEX file, and C (see the export_to() function description). This makes save_to_octave() deprecated in GTApprox.
    • The Builder now allows saving user comments with the model - see the comment parameter in the build() function description.
    • Added RRMS factor to the da.p7core.gtapprox.Model.validate() output.
    • Significantly increased the quality of noisy functions approximation with the Gaussian Process technique.
    • Due to an ambiguity in the meaning of “interpolation”, the GTApprox/InterpolationRequired option was renamed to GTApprox/ExactFitRequired. The old name is kept for compatibility but is considered deprecated.
    • Sample pre- and postprocessing functionality added, see preprocess() and postprocess().
  • GTOpt

    • New option to force searching for the global optimum, GTOpt/GlobalSearch.

      Changed in version 1.10.5: GTOpt/GlobalSearch replaced by the advanced GTOpt/GlobalPhaseIntensity option.

15.6.127. MACROS 1.7.7

  • GTApprox
    • Fixed the strong dependency of internal validation results on the random seed.
  • GTDF
    • Fixed a bug in the internal validation procedure which led to a crash in the blackbox mode.
  • GTDoE
    • Fixed da.p7core.gtdoe.Generator.generate() not accepting a NumPy array as an initial sample in adaptive DoE mode.
    • In adaptive DoE mode, if the generator and the blackbox have different bounds, the generator now correctly requests values only for points which belong to the intersection of the two domains.

15.6.128. MACROS 1.7.6

Warning

This release drops Python 2.4 support (see the System Requirements section).


15.6.129. MACROS 1.7.5

  • GTApprox
    • Added response noise variance information to the internal validation output.
    • Added method to calculate accuracy evaluation gradients, see da.p7core.gtapprox.Model.grad_ae().

15.6.130. MACROS 1.7.4

  • GTApprox
    • Added an optional parameter to specify the variance of response values in the training sample (improves the quality of noisy approximations when given an estimate of noise variance over the sample data). See da.p7core.gtapprox.Builder.build() description.
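
Known response noise variance usually enters a fit as inverse-variance weights, which is why supplying it improves noisy approximations. A textbook sketch (not the GTApprox algorithm itself), estimating a constant response:

```python
import numpy as np

def noise_aware_mean(f, noise_variance):
    """Inverse-variance weighted estimate of a constant response.

    Points with larger known noise variance contribute less to the fit.
    """
    w = 1.0 / np.asarray(noise_variance, dtype=float)
    return float(np.sum(w * np.asarray(f)) / np.sum(w))

f = np.array([1.0, 1.0, 5.0])
variance = np.array([0.01, 0.01, 100.0])  # the third measurement is very noisy

plain = float(np.mean(f))                 # dragged towards the noisy point
weighted = noise_aware_mean(f, variance)  # stays close to 1.0
```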

15.6.131. MACROS 1.7.3

This is a maintenance release, which does not contain any functional changes or updates.

15.6.132. MACROS 1.7.2

  • GT
    • Improved the support for non-English system locales.

15.6.133. MACROS 1.7.1

  • GT
    • The Examples section was reworked and now contains annotated examples. More examples can also be found in the Code Samples section.
  • GTApprox

    • Implemented a new smoothing method which allows creating and saving models with default smoothness; see ironing().

      Changed in version 1.9.0: this method was replaced with more advanced smoothing methods, see the MACROS 1.9.0 changelog.

    • Multiple Tensor Approximation technique improvements:

      • Sped up TA value calculation.
      • Added export to a human-readable format (Octave).
      • Implemented internal validation for TA.
      • Implemented discrete variables support, see the GTApprox/TADiscreteVariables option.
      • Fixed automatic technique selection logic in case of conflicting options.
    • Fixed a bug that made smoothing unusable for exact fit models.

  • GTDF
    • Fixed multithreading bug that could make GTDF stop or crash.
    • Fixed a gradient calculation problem for models trained on a sample with a constant output value.
    • Fixed a bug in automatic technique selection in blackbox mode.
    • Evaluating a model built by a blackbox technique no longer requires input from the blackbox used when creating the model.
    • Implemented cross-validation for blackbox techniques.
  • GTOpt
    • Implemented a new internal solver algorithm (SQCQP, Sequential Quadratic Constraints Quadratic Programming) which further reduces the number of objective function and constraint evaluations.
    • Updated GTOpt documentation to disambiguate using optimizer hints — see add_variable(), add_objective(), add_constraint() descriptions and the Hint Reference.
    • New example in the Optimization tutorial.

15.6.134. MACROS 1.7.0

  • GTApprox
    • Added new technique: TA (Tensor Approximation).
    • Added capability to perform model validation using separate test set in addition to internal validation.
    • New options for Gaussian Processes technique: GTApprox/GPLinearTrend and GTApprox/GPMeanValue.
  • GTDF

    • Completely reworked tool. Added support for large training sets on the order of 100 000 points, and added blackbox-based model training.

    • Implemented internal validation; a number of corresponding options were added (see GTDF/InternalValidation, GTDF/IVRandomSeed, GTDF/IVSubsetCount, GTDF/IVTrainingCount).

      Changed in version 5.0: GTDF/IVRandomSeed renamed to GTDF/IVSeed.

    • Added the capability to perform model validation using a separate test set in addition to internal validation.

    • Documentation updated.

15.6.135. MACROS 1.6.3

  • GTApprox
    • Significantly reduced the memory footprint of the SGP technique training process for big samples (100 000 points and more).
    • Fixed wrong model serialization on some platforms.
  • GTOpt
    • Bugfixes in internal algorithms.

15.6.136. MACROS 1.6.2

  • GT
    • Fixed warning-level log messages sometimes appearing at the error log level.
    • Fixed more wrong exception type bugs.
    • The documentation has undergone a complete revision, fixing more inconsistencies with the current development state, expanding descriptions, and disambiguating various issues.
  • GTDR
    • Added the has_variable_compression() method which allows checking whether the model supports a variable compressed vector size.
    • Models can now be serialized to and from a string using the tostring() and fromstring() methods.
    • Applied corrections to the dimension estimation procedure.

15.6.137. MACROS 1.6.1

  • GT
    • Fixed multiple wrong exception type bugs.
    • Fixed a problem where a model trained and saved on a 32-bit OS could not be loaded on a 64-bit platform.
    • Fixed multiple minor documentation inconsistencies.
    • All interfaces are now stricter and safer: they check argument types and values, for instance rejecting NaN and Inf.
  • GTApprox
    • Introduced a technique based on a sparse Hessian for additional high-precision tuning of HDA approximation with big samples. The new GTApprox/HDAHessianReduction option was added for HDA; it controls the trade-off between time and accuracy. The GTApprox/HDAHPMode option is removed as obsolete.
    • Implemented an algorithm for sparse approximation of the inverse covariance matrix and its determinant to accelerate the GP training process.
    • Additionally tuned the acceleration switch for the GPHDA technique.
    • Fixed the lower bound of GTApprox/SGPNumberOfBasePoints.
    • Setting the GTApprox/SGPSeedValue or GTApprox/SGPNumberOfBasePoints option to -1 now causes an exception to be thrown immediately instead of at the time of building the approximation.
    • Fixed the allowed range for GTApprox/SGPNumberOfBasePoints.
  • GTDR
    • DR in FE mode now outputs the cumulative loadings matrix into the model info. This provides qualitative information about which input coordinate has the most influence on the output.
    • The dimensionality of the compressed space in FE mode is now an optional parameter; it is chosen automatically if omitted.
    • Added a check ensuring that the number of points in X equals the number of points in F.
  • GTOpt
    • It is now possible to hint the optimizer with additional information about variables, objectives, and constraints, such as ‘LinearityType’ for objectives and constraints.

15.6.138. MACROS 1.6.0

  • GT
    • Trained models now store the build log inside.
    • Multiple improvements in GenericTools algorithms.
    • All calc-like methods now accept both scalar and vector inputs.
    • Multiple bugfixes.
  • GTApprox
    • New approximation technique added: SGP - Sparse Gaussian Processes. It makes accuracy evaluation available for large data samples.
    • Improved handling of constant columns in the training set, especially for the GP and HDAGP techniques.
    • Fixed AccuracyEvaluation not working correctly in Componentwise modes.
    • Multiple bugfixes in option handling.
  • GTDF
    • Introduced a new public interface.
    • Completely reworked the DF tool.
    • Added new DF algorithms (techniques): DA (Difference Approximation), HFA (High Fidelity Approximation) and VFGP (Variable Fidelity Gaussian Process).
    • Implemented Decision Tree for automatic selection of DF algorithms.
  • GTDoE
    • Several minor bugfixes.
  • GTDR
    • DR models now provide gradients.
    • It is now possible to force DR to work in PCA mode by setting the GTDR/Technique option.
    • A Feature Extraction (FE) model can now be built on a blackbox.
    • Improved the algorithms of sample-based techniques. In particular, additional gradient estimation iterations are performed to improve FE model accuracy.
  • GTOpt
    • Added robust optimization functionality.
    • Robust optimization functionality uses well-known OpenTURNS distribution generation mechanisms.
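
Accepting both a single point and a batch in calc-like methods is commonly done by normalizing the input shape up front; a sketch of the pattern (toy response function, not the MACROS implementation):

```python
import numpy as np

def calc(points):
    """Toy calc-like method evaluating f(x) = sum(x**2) per point.

    Accepts a single point (scalar or 1D) or a batch of points (2D).
    """
    batch = np.atleast_2d(np.asarray(points, dtype=float))
    values = np.sum(batch ** 2, axis=1)
    # Return a scalar for single-point input, a vector for a batch.
    return float(values[0]) if np.ndim(points) <= 1 else values

single = calc([1.0, 2.0])               # a scalar result
batch = calc([[1.0, 2.0], [3.0, 4.0]])  # a vector of results
```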

15.6.139. MACROS 1.5.3

  • GT
    • Fixed a minor bug causing a zero byte to appear in the log output.
    • Overall improvement of exception handling: tools now throw more adequate exceptions on errors.
  • GTApprox
    • Constant columns are now removed automatically in the HDA technique.
    • Fixed rare crashes on model save/load.
  • GTDoE
    • Fixed a bug which caused the second request for generated points to return an empty list (for the batch technique).
  • GTOpt
    • Initial guess values are now validated.

15.6.140. MACROS 1.5.2

  • GT
    • Default log level set to ‘Info’.
  • GTDR
    • Zero-dimensional compressed space is forbidden.
  • GTOpt
    • The intermediate result callback was simplified and is now only used to interrupt the optimizer.

15.6.141. MACROS 1.5.1

  • GTDF
    • Added the new GTDF/Accelerator option. It allows controlling the trade-off between speed and accuracy.
  • GTDR
    • The model builder now accepts the dimensionality of the compressed space even in the case of the FE technique; it is treated as a default value in the model.
  • GTOpt
    • Optimizer options are now split into two groups, Basic and Advanced, for convenience.
    • Added two basic options, GTOpt/ObjectivesSmoothness and GTOpt/ConstraintsSmoothness. They allow hinting the optimizer about the kind of optimization problem it deals with.
    • Fixed some CUTEr issues.
    • The number of objective and constraint evaluations was decreased significantly (approximately by half).
    • Added examples of solving problems with DFO methods.
    • Added the Optimization tutorial.
    • Fixed a bug which led to a failure when NumPy arrays were used in objective functions.
    • Fixed missing status messages.