Model Evaluation

This tutorial shows how to use an existing approximation model: given a model file, evaluate the model's responses for input values from a test sample and save the results.

Note

This tutorial requires the approximation model trained in the Sample-Based Approximation tutorial.

Before You Begin

This tutorial requires an existing prjTutorials project. If you have not created this project yet, see Tutorial Project first. You will also need the approximation model trained in the Sample-Based Approximation tutorial (model.gtapprox should be found in the prjTutorials project directory).

  • Open the prjTutorials project.
  • Create a new empty workflow. If you have never created a workflow before, see the Simple Workflow tutorial.
  • Save the workflow as wfApproxEvaluation.
  • Switch to Workspace and verify that the workflow file (wfApproxEvaluation.p7wf) is added to the project. You can see it in the Project pane.
  • While in Workspace, verify that a file named model.gtapprox exists in the project.
  • Switch to Edit and select the workflow you have just created to continue with the tutorial.

Task

The task in this tutorial is to use an existing approximation model to predict function values for the input values from a test sample (that is, to evaluate model responses).

The test sample can be prepared beforehand (stored in a file) or generated by the workflow itself. This tutorial uses a randomly generated sample, since loading sample files is already described in the Sample-Based Approximation tutorial.

The end result is two samples, input and model responses, which are stored to the project database and can be used to study the model.

Solution

Evaluating an approximation model in pSeven can be divided into the following general steps:

  1. Prepare a test sample: load it from a file or generate it in the workflow.
  2. Configure an ApproxPlayer to load the model from the file and send the test sample to this block to get responses.
  3. Save the results for analysis.

Test Sample Generation

To generate a random test sample, you can use a DoE block.

  • Add a DoE block. Name it Test Sample.

Basic settings for DoE are the number of points to generate and the generation space bounds. These settings have to be sent to the count and bounds input ports, respectively. Since in general they are specified by the user, Test Sample.bounds and Test Sample.count should be added to workflow parameters.

Note

For more details on workflow parameters, see Workflow as a Tool.

../_images/page_tutorials_approx_eval_01_doe.png
  • Open Test Sample configuration and switch to the Ports tab.
  • Select the bounds and count ports as parameters.
  • Add parameter defaults:
    • Count: any integer; 500-1000 is recommended so that you get enough data for plotting.
    • Bounds: should be a 2x2 matrix, for example ((-5.0, 0.0), (10.0, 15.0)). The first pair sets the lower bounds, the second sets the upper bounds, so the first variable ranges over [-5.0, 10.0] and the second over [0.0, 15.0]. Note that this matrix implicitly sets the number of input variables (the sample dimension): it is equal to the number of matrix columns.
  • Click b_ok to save settings and close the configuration dialog.

When the workflow starts, Test Sample reads parameter values from Run or, if they are not specified, uses the parameter defaults above. The generated sample is a RealMatrix output to the Test Sample.points port; it has to be sent to an ApproxPlayer block for evaluation.
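
If you want to reproduce this step outside pSeven, a uniform random sample of the same shape can be generated with NumPy. The sketch below mirrors the parameter defaults above; it is a minimal illustration, not the DoE block's actual generation algorithm.

    import numpy as np

    # Bounds matrix as in the parameter defaults above:
    # first row = lower bounds, second row = upper bounds.
    bounds = np.array([[-5.0, 0.0],
                       [10.0, 15.0]])
    count = 500  # number of points to generate

    dim = bounds.shape[1]  # sample dimension = number of matrix columns

    # Uniform random sample within the bounds, one point per row
    points = np.random.uniform(bounds[0], bounds[1], size=(count, dim))
    print(points.shape)  # (500, 2)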

Model

Approximation models are loaded and evaluated by ApproxPlayer.

  • Add an ApproxPlayer block. Name it Model.

The input sample has to be sent to Model.x.

  • Link Test Sample.points to Model.x.

The block also has to load the model from a file (alternatively, you can send the model to Model.model if you get it from an ApproxBuilder block).

  • Open Model configuration and click b_browse in the Model File pane to bring up the file selection dialog.
../_images/page_tutorials_approx_eval_02_modelfile.png
  • In the File Origin pane choose the Project origin.
  • Enter model.gtapprox in the File path field (or click b_browse and navigate to this file).
  • Leave the other settings at their defaults and click b_ok to close the dialog.

After selecting the model file you can see model details in the Model Info pane.

../_images/page_tutorials_approx_eval_03_modelinfo.png

Verify that the model is loaded and click b_ok to close the configuration dialog.
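
For reference, loading and evaluating a model can also be done in a few lines with the pSeven Core Python API. This is a sketch of the same operation, assuming the da.p7core package is available; it is not the code ApproxPlayer runs internally.

    import numpy as np
    from da.p7core import gtapprox

    # Load the model trained in the Sample-Based Approximation tutorial
    model = gtapprox.Model("model.gtapprox")

    # Evaluate responses for a batch of points (one point per row)
    x = np.random.uniform([-5.0, 0.0], [10.0, 15.0], size=(500, 2))
    f = model.calc(x)
    print(f.shape)  # one row of responses per input point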

Saving Results

To save the results, enable monitoring for the Model.x and Model.f ports.

../_images/page_tutorials_approx_eval_04_runconf.png
  • Open workflow configuration b_runconf and switch to the Monitoring tab.
  • Click b_blconf_add in the toolbar to open the Add Port dialog. Select Model.x and Model.f.
  • Set monitoring aliases, for example, Input for Model.x and Response for Model.f.
  • Click b_ok to save settings and close the dialog.

Monitored data (model inputs and responses) will be stored to the project database when the workflow runs, allowing you to process the results in Analyze.
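
If you prefer to post-process results outside pSeven, the same input/response pairs can be written to a plain CSV file. A minimal sketch reusing the pSeven Core calls from the snippet above (the output file name is arbitrary):

    import numpy as np
    from da.p7core import gtapprox

    model = gtapprox.Model("model.gtapprox")
    x = np.random.uniform([-5.0, 0.0], [10.0, 15.0], size=(500, 2))
    f = np.asarray(model.calc(x)).reshape(len(x), -1)

    # One row per point: input components followed by the response
    np.savetxt("evaluation_results.csv", np.hstack([x, f]),
               delimiter=",", header="x1,x2,f", comments="")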

Workflow

The finished workflow is very simple: it contains only two blocks. Test Sample generates a random data sample with the specified number of points and value bounds; Model loads the model from model.gtapprox in the project directory and evaluates it.

../_images/page_tutorials_approx_eval_05_wf.png

The properties of the generated sample can be changed in Run (Configuration tab).

../_images/page_tutorials_approx_eval_06_run.png

Note that model input and output ports are monitored; in this workflow, the results are saved to the project database only.

  • Verify that input and response monitors are enabled.
  • Save the workflow and run it with default parameter settings.

When the workflow finishes, switch to Analyze to view the results.

Results

The data from the Model.x and Model.f ports is saved to the project database as the Input and Response records, respectively (records in Monitoring are named after the monitoring aliases set in the workflow configuration).

../_images/page_tutorials_approx_eval_07_pdb.png

To visualize the results you can, for example, plot the input sample in 2D and plot the model's response surface (a Matplotlib equivalent of both plots is sketched after the steps below).

  • In Analyze, create a new report and open the Data Series pane to add report data.
../_images/page_tutorials_approx_eval_08_pdb2ds.png
  • Select the Input and Response records in the project database and drag them to the Data Series pane. Note that the Input record creates 2 data series because it contains 2-dimensional data, and data series are always 1-dimensional.

Add a 2D input sample plot (for more details on how to add a 2D plot, see the Results and Reports tutorial):

../_images/page_tutorials_approx_eval_09_inplot.png
  • Dataset: input components as X and Y.

Add a 3D response surface plot:

../_images/page_tutorials_approx_eval_10_surface.png
  • Dataset: input components as X and Y, response as Z.
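
Equivalent plots can be drawn outside Analyze with Matplotlib. A sketch under the same assumptions as the evaluation snippets above; plot_trisurf builds the response surface directly from the scattered evaluation points.

    import numpy as np
    import matplotlib.pyplot as plt
    from da.p7core import gtapprox

    model = gtapprox.Model("model.gtapprox")
    x = np.random.uniform([-5.0, 0.0], [10.0, 15.0], size=(500, 2))
    f = np.asarray(model.calc(x)).ravel()

    fig = plt.figure(figsize=(10, 4))

    # 2D scatter plot of the input sample
    ax1 = fig.add_subplot(121)
    ax1.scatter(x[:, 0], x[:, 1], s=5)
    ax1.set_xlabel("x1")
    ax1.set_ylabel("x2")

    # 3D response surface triangulated from the evaluated points
    ax2 = fig.add_subplot(122, projection="3d")
    ax2.plot_trisurf(x[:, 0], x[:, 1], f, cmap="viridis")
    ax2.set_xlabel("x1")
    ax2.set_ylabel("x2")
    ax2.set_zlabel("f")

    plt.tight_layout()
    plt.show()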

Conclusion

This tutorial explains the basics of evaluating an approximation model in pSeven in batch mode: the input contains a number of sample points, and all responses are evaluated at once. ApproxPlayer, in fact, is not limited to batch calculations and can also process sequential input. For example, you can remove the Test Sample block and uplink Model.x and Model.f instead to study the model manually, or use this block in an iterative workflow (such as an optimization cycle); a point-by-point evaluation sketch follows below.
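
The loop below illustrates sequential evaluation with the pSeven Core Python API. It is a sketch under the same assumptions as the earlier snippets; inside a workflow, ApproxPlayer handles the iteration for you.

    import numpy as np
    from da.p7core import gtapprox

    model = gtapprox.Model("model.gtapprox")

    # Evaluate one point at a time, as an iterative workflow would
    for _ in range(10):
        point = np.random.uniform([-5.0, 0.0], [10.0, 15.0])  # a single input point
        response = model.calc(point.reshape(1, -1))  # batch of size 1
        print(point, "->", response)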