.. _model-surrogate-global-experimental_python-class_path_and_name:

"""""""""""""""""""
class_path_and_name
"""""""""""""""""""


Specify the module and class names of the Python surrogate


.. toctree::
   :hidden:
   :maxdepth: 1


**Specification**

- *Alias:* None

- *Arguments:* STRING


**Description**

Specify the module and class names of the Python surrogate. At minimum, the
class must provide ``construct`` and ``predict`` methods. Dakota passes the
former a matrix (num samples x num variables) of variable values and a matrix
(num samples x num components) of responses with which to train the surrogate,
where num components is 1 for scalar surrogates. The ``predict`` method
receives a matrix (num samples x num variables) of evaluation points and must
return a matrix (num samples x num components) of predicted values. Optional
``gradient`` and ``hessian`` methods are supported as well.

**Examples**

Use of Python-based surrogates in Dakota is enabled via the model block:

.. code-block::

    model,
      id_model = 'SURR'
      surrogate global,
        dace_method_pointer = 'DACE'
        experimental_python
          class_path_and_name = "surrogate_polynomial.Surrogate"

The following example Python class implements a linear regression surrogate
that works for both scalar and vector-valued responses and includes gradient
and (trivial) Hessian methods:

.. code-block:: python

    import numpy as np

    class Surrogate:

        def __init__(self):
            self.coeffs = None

        def construct(self, var, resp):
            # Prepend a column of ones so the first coefficient is the intercept
            var2 = np.hstack((np.ones((var.shape[0], 1)), var))
            # Solve the normal equations for the least-squares coefficients
            z = np.linalg.inv(np.dot(var2.T, var2))
            self.coeffs = np.dot(z, np.dot(var2.T, resp))

        def predict(self, pts):
            return self.coeffs[0] + pts.dot(self.coeffs[1:])

        def gradient(self, pts):
            # The gradient of a linear model is constant: the non-intercept coefficients
            grad = self.coeffs[1:]
            return grad.T

        def hessian(self, pts):
            # The Hessian of a linear model is identically zero
            num_vals = pts.shape[1]
            hess = np.zeros(shape=(num_vals, num_vals))
            return hess
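Such a class can be sanity-checked outside of Dakota by exercising the
required API directly with synthetic data. The following is a minimal
standalone sketch, assuming the class above is saved as
``surrogate_polynomial.py`` (the module name referenced in the model block);
the data and expected values are illustrative only:

.. code-block:: python

    import numpy as np

    from surrogate_polynomial import Surrogate

    # Training data drawn from a known linear function: f(x) = 1 + 2*x0 + 3*x1
    rng = np.random.default_rng(0)
    var = rng.uniform(-1.0, 1.0, size=(20, 2))                 # num samples x num variables
    resp = (1.0 + var @ np.array([2.0, 3.0])).reshape(-1, 1)   # num samples x 1

    surr = Surrogate()
    surr.construct(var, resp)

    pts = np.array([[0.5, -0.25]])   # one evaluation point
    print(surr.predict(pts))         # ~[[1.25]]: 1 + 2*0.5 + 3*(-0.25)
    print(surr.gradient(pts))        # ~[[2., 3.]]
    print(surr.hessian(pts))         # 2x2 zero matrix

At run time Dakota must be able to import the named module, typically by
placing the file in the run directory or on ``PYTHONPATH``.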