method
Begins Dakota method selection and behavioral settings.
Topics
block
Specification
Alias: None
Arguments: None
Child Keywords:

Optional keywords:

- Name the method block; helpful when there are multiple
- Control how much method information is written to the screen and output file
- Number of designs returned as the best solutions

Required (Choose One): Method (Iterative Algorithm). Exactly one of the following must be selected:

- Strategy in which a set of methods synergistically seek an optimal design
- Multi-Start Optimization Method
- Pareto set optimization
- (Experimental Capability) Solves a mixed integer nonlinear optimization problem
- Local Surrogate-Based Optimization
- Adaptive Global Surrogate-Based Optimization
- DOT conjugate gradient optimization method
- DOT modified method of feasible directions
- DOT BFGS optimization method
- DOT Sequential Linear Program
- DOT Sequential Quadratic Program
- CONMIN conjugate gradient optimization method
- CONMIN method of feasible directions
- (Experimental) Dynamically-loaded solver
- NPSOL Sequential Quadratic Program
- Sequential Quadratic Program for nonlinear least squares
- NLPQL Sequential Quadratic Program
- A conjugate gradient optimization method
- Quasi-Newton optimization method
- Finite Difference Newton optimization method
- Newton method based least-squares calibration
- Newton method based optimization
- Simplex-based derivative-free optimization method
- Third-party optimization library integration demonstration
- Rapid Optimization Library (ROL), a large-scale optimization package within Trilinos
- Pattern search, derivative-free optimization method
- Finds optimal variable values using adaptive mesh-based search
- Gradient-free, inequality-constrained optimization using Nonlinear Optimization With Path Augmented Constraints (NOWPAC)
- Stochastic version of NOWPAC that incorporates error estimates and noise mitigation
- Multi-objective Genetic Algorithm (a.k.a. Evolutionary Algorithm)
- Single-objective Genetic Algorithm (a.k.a. Evolutionary Algorithm)
- Pattern search, derivative-free optimization method
- Simple greedy local search method
- Constrained Optimization BY Linear Approximations (COBYLA)
- DIviding RECTangles method
- Evolutionary Algorithm
- (Experimental) Coliny beta solver
- Trust-region method for nonlinear least squares
- (Experimental) Nonlinear conjugate gradient optimization
- DIviding RECTangles method
- Voronoi-based high-dimensional global Lipschitzian optimization
- Classical high-dimensional global Lipschitzian optimization
- Global Surrogate-Based Optimization, a.k.a. EGO
- Generic UQ method for constructing and interrogating a surrogate model
- UQ method leveraging a functional tensor train surrogate model
- Multifidelity uncertainty quantification using function train expansions
- Multilevel uncertainty quantification using function train expansions
- Uncertainty quantification using polynomial chaos expansions
- Multifidelity uncertainty quantification using polynomial chaos expansions
- Multilevel uncertainty quantification using polynomial chaos expansions
- Uncertainty quantification with stochastic collocation
- Multifidelity uncertainty quantification using stochastic collocation
- Randomly samples variables according to their distributions
- Multilevel Monte Carlo (MLMC) sampling method for UQ
- Multifidelity Monte Carlo sampling method for UQ
- Multilevel-multifidelity sampling methods for UQ
- Approximate control variate (ACV) sampling methods for UQ
- The multilevel best linear unbiased estimator (ML BLUE) sampling method for UQ
- Importance sampling
- Gaussian Process Adaptive Importance Sampling
- (Experimental) Adaptively refine a Gaussian process surrogate
- Probability-of-Failure (POF) darts, a method for estimating the probability of failure based on random sphere-packing
- Recursive k-d (RKD) darts: recursive hyperplane sampling for numerical integration of high-dimensional functions
- Evidence theory with evidence measures computed with global optimization methods
- Interval analysis using global optimization methods
- Bayesian calibration
- Design and Analysis of Computer Experiments
- Design of Computer Experiments: Centroidal Voronoi Tessellation
- Morris One-at-a-Time
- Evidence theory with evidence measures computed with local optimization methods
- Interval analysis using local optimization
- Local reliability method
- Global reliability methods
- Design of Computer Experiments: Quasi-Monte Carlo sampling
- Samples variables along a user-defined vector
- Samples variables at specified values
- Samples variables along points moving out from a center point
- Samples variables on a full factorial grid of study points
- Estimate order of convergence of a response as model fidelity increases
Description
The method keyword signifies the start of a block in the Dakota input file. A method block contains the various keywords necessary to select a method and to control its behavior.
Method Block Requirements
At least one method block must appear in the Dakota input file. Multiple method blocks may be needed to fully define advanced analysis approaches.
Each method block must specify one method and, optionally, any associated keywords that govern the behavior of the method.
The Methods
Each method block must select one method.
Starting with Dakota v6.0, the methods are grouped into two types: standard methods and multi-component methods.
The standard methods are stand-alone and self-contained in the sense that they only require a model to perform a study; they do not call other methods. While methods such as polynomial_chaos and efficient_global internally utilize multiple iterator and surrogate model components, these components are generally hidden from user control due to restrictions on modularity; thus, these methods are stand-alone.
The multi-component group of methods provides a higher-level "meta-algorithm" that points to other methods and models that support sub-iteration. For example, in a sequential hybrid method, the hybrid method specification must identify a list of subordinate methods, and the "meta-algorithm" executes these methods in sequence and transfers information between them. Surrogate-based minimizers provide another example in that they point both to other methods (e.g., what optimization method is used to solve the approximate subproblem) as well as to models (e.g., what type of surrogate model is employed). Multi-component methods generally provide some level of "plug and play" modularity through their flexible support of a variety of method and model selections.
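For instance, a sequential hybrid might polish the result of a global search with a local optimizer. The following input sketch is illustrative only; keyword spellings such as method_pointer_list should be checked against the hybrid method specification:

# Sketch of a sequential hybrid "meta-algorithm" chaining two
# subordinate methods; an environment block would typically set
# top_method_pointer = 'HYBRID' to select it as the top-level method.
method
  id_method = 'HYBRID'
  hybrid sequential
    method_pointer_list = 'GLOBAL' 'LOCAL'

method
  id_method = 'GLOBAL'
  soga

method
  id_method = 'LOCAL'
  optpp_q_newton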
Component-Based Iterator Commands
Component-based iterator specifications include hybrid, multi-start, Pareto set, surrogate-based local, surrogate-based global, and branch-and-bound methods. Whereas a standard iterator specification only needs an optional model pointer string (specified with model_pointer), component-based iterator specifications can include method pointer, method name, and model pointer specifications in order to define the components employed in the "meta-iteration." In particular, these specifications identify one or more methods (by pointer or by name) to specify the subordinate iterators that will be used in the top-level algorithm. Identifying a sub-iterator by name instead of by pointer is a lightweight option that relaxes the need for a separate method specification for the sub-iterator; however, a model pointer may be required in this case to provide the specification connectivity normally supported by the method pointer. Refer to the individual method descriptions for specific requirements for these advanced methods.
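For example, a multi-start study might identify its sub-iterator by name rather than by pointer, as in the sketch below; the multi_start keywords shown (method_name, model_pointer, random_starts) should be checked against the multi_start specification:

method
  multi_start
    # Sub-iterator identified by name: no separate method block is
    # needed, but a model pointer supplies the connectivity a method
    # pointer would normally carry.
    method_name = 'optpp_q_newton'
      model_pointer = 'M1'
    random_starts = 5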
Method Independent Controls
In addition to the method, there are 10 optional keywords, referred to as method independent controls. These controls are valid for enough methods that it was reasonable to pull them out of the method dependent blocks and consolidate their specifications; however, they are NOT universally respected by all methods.
Examples
Several examples follow. The first example shows a minimal specification for an optimization method.
method
  dot_sqp
This example uses all of the defaults for this method.
A more sophisticated example would be
method,
  id_method = 'NLP1'
  dot_sqp
    max_iterations = 50
    convergence_tolerance = 1.e-4
    output verbose
    model_pointer = 'M1'
This example demonstrates the use of identifiers and pointers as well as some method independent and method dependent controls for the sequential quadratic programming (SQP) algorithm from the DOT library. The max_iterations, convergence_tolerance, and output settings are method independent controls, in that they are defined for a variety of methods.
The next example shows a specification for a least squares method.
method
  optpp_g_newton
    max_iterations = 10
    convergence_tolerance = 1.e-8
    search_method trust_region
    gradient_tolerance = 1.e-6
Some of the same method independent controls are present, along with several method dependent controls (search_method and gradient_tolerance) which are only meaningful for OPT++ methods (see Package: OPT++).
The next example shows a specification for a nondeterministic method with several method dependent controls (refer to sampling).
method
  sampling
    samples = 100
    seed = 12345
    sample_type lhs
    response_levels = 1000. 500.
The last example shows a specification for a parameter study method where, again, each of the controls is method dependent (refer to vector_parameter_study).
method
  vector_parameter_study
    step_vector = 1. 1. 1.
    num_steps = 10