.. _method:

""""""
method
""""""

Begins Dakota method selection and behavioral settings.

**Topics**

block

.. toctree::
   :hidden:
   :maxdepth: 1

   method-id_method
   method-output
   method-final_solutions
   method-hybrid
   method-multi_start
   method-pareto_set
   method-branch_and_bound
   method-surrogate_based_local
   method-surrogate_based_global
   method-dot_frcg
   method-dot_mmfd
   method-dot_bfgs
   method-dot_slp
   method-dot_sqp
   method-conmin_frcg
   method-conmin_mfd
   method-dl_solver
   method-npsol_sqp
   method-nlssol_sqp
   method-nlpql_sqp
   method-optpp_cg
   method-optpp_q_newton
   method-optpp_fd_newton
   method-optpp_g_newton
   method-optpp_newton
   method-optpp_pds
   method-demo_tpl
   method-rol
   method-asynch_pattern_search
   method-mesh_adaptive_search
   method-nowpac
   method-snowpac
   method-moga
   method-soga
   method-coliny_pattern_search
   method-coliny_solis_wets
   method-coliny_cobyla
   method-coliny_direct
   method-coliny_ea
   method-coliny_beta
   method-nl2sol
   method-nonlinear_cg
   method-ncsu_direct
   method-genie_opt_darts
   method-genie_direct
   method-efficient_global
   method-surrogate_based_uq
   method-function_train
   method-multifidelity_function_train
   method-multilevel_function_train
   method-polynomial_chaos
   method-multifidelity_polynomial_chaos
   method-multilevel_polynomial_chaos
   method-stoch_collocation
   method-multifidelity_stoch_collocation
   method-sampling
   method-multilevel_sampling
   method-multifidelity_sampling
   method-multilevel_multifidelity_sampling
   method-approximate_control_variate
   method-importance_sampling
   method-gpais
   method-adaptive_sampling
   method-pof_darts
   method-rkd_darts
   method-global_evidence
   method-global_interval_est
   method-bayes_calibration
   method-dace
   method-fsu_cvt
   method-psuade_moat
   method-local_evidence
   method-local_interval_est
   method-local_reliability
   method-global_reliability
   method-fsu_quasi_mc
   method-vector_parameter_study
   method-list_parameter_study
   method-centered_parameter_study
   method-multidim_parameter_study
   method-richardson_extrap

**Specification**

- *Alias:* None
- *Arguments:* None

**Child Keywords:**

.. list-table::
   :header-rows: 1
   :widths: 15 15 30 40

   * - Required/Optional
     - Description of Group
     - Dakota Keyword
     - Dakota Keyword Description
   * - Optional
     -
     - `id_method`__
     - Name the method block; helpful when there are multiple
   * - Optional
     -
     - `output`__
     - Control how much method information is written to the screen and output file
   * - Optional
     -
     - `final_solutions`__
     - Number of designs returned as the best solutions
   * - Required (Choose One)
     - Method (Iterative Algorithm)
     - `hybrid`__
     - Strategy in which a set of methods synergistically seek an optimal design
   * -
     -
     - `multi_start`__
     - Multi-Start Optimization Method
   * -
     -
     - `pareto_set`__
     - Pareto set optimization
   * -
     -
     - `branch_and_bound`__
     - (Experimental Capability) Solves a mixed integer nonlinear optimization problem
   * -
     -
     - `surrogate_based_local`__
     - Local Surrogate Based Optimization
   * -
     -
     - `surrogate_based_global`__
     - Adaptive Global Surrogate-Based Optimization
   * -
     -
     - `dot_frcg`__
     - DOT conjugate gradient optimization method
   * -
     -
     - `dot_mmfd`__
     - DOT modified method of feasible directions
   * -
     -
     - `dot_bfgs`__
     - DOT BFGS optimization method
   * -
     -
     - `dot_slp`__
     - DOT Sequential Linear Program
   * -
     -
     - `dot_sqp`__
     - DOT Sequential Quadratic Program
   * -
     -
     - `conmin_frcg`__
     - CONMIN conjugate gradient optimization method
   * -
     -
     - `conmin_mfd`__
     - CONMIN method of feasible directions
   * -
     -
     - `dl_solver`__
     - (Experimental) Dynamically-loaded solver
   * -
     -
     - `npsol_sqp`__
     - NPSOL Sequential Quadratic Program
   * -
     -
     - `nlssol_sqp`__
     - Sequential Quadratic Program for nonlinear least squares
   * -
     -
     - `nlpql_sqp`__
     - NLPQL Sequential Quadratic Program
   * -
     -
     - `optpp_cg`__
     - A conjugate gradient optimization method
   * -
     -
     - `optpp_q_newton`__
     - Quasi-Newton optimization method
   * -
     -
     - `optpp_fd_newton`__
     - Finite Difference Newton optimization method
   * -
     -
     - `optpp_g_newton`__
     - Newton method based least-squares calibration
   * -
     -
     - `optpp_newton`__
     - Newton method based optimization
   * -
     -
     - `optpp_pds`__
     - Simplex-based derivative free optimization method
   * -
     -
     - `demo_tpl`__
     - Third-party optimization library integration demonstration
   * -
     -
     - `rol`__
     - Rapid Optimization Library (ROL) is a large-scale optimization package within Trilinos
   * -
     -
     - `asynch_pattern_search`__
     - Pattern search, derivative free optimization method
   * -
     -
     - `mesh_adaptive_search`__
     - Finds optimal variable values using adaptive mesh-based search
   * -
     -
     - `nowpac`__
     - Gradient-free inequality-constrained optimization using Nonlinear Optimization With Path Augmented Constraints (NOWPAC)
   * -
     -
     - `snowpac`__
     - Stochastic version of NOWPAC that incorporates error estimates and noise mitigation
   * -
     -
     - `moga`__
     - Multi-objective Genetic Algorithm (a.k.a. Evolutionary Algorithm)
   * -
     -
     - `soga`__
     - Single-objective Genetic Algorithm (a.k.a. Evolutionary Algorithm)
   * -
     -
     - `coliny_pattern_search`__
     - Pattern search, derivative free optimization method
   * -
     -
     - `coliny_solis_wets`__
     - Simple greedy local search method
   * -
     -
     - `coliny_cobyla`__
     - Constrained Optimization BY Linear Approximations (COBYLA)
   * -
     -
     - `coliny_direct`__
     - DIviding RECTangles method
   * -
     -
     - `coliny_ea`__
     - Evolutionary Algorithm
   * -
     -
     - `coliny_beta`__
     - (Experimental) Coliny beta solver
   * -
     -
     - `nl2sol`__
     - Trust-region method for nonlinear least squares
   * -
     -
     - `nonlinear_cg`__
     - (Experimental) nonlinear conjugate gradient optimization
   * -
     -
     - `ncsu_direct`__
     - DIviding RECTangles method
   * -
     -
     - `genie_opt_darts`__
     - Voronoi-based high-dimensional global Lipschitzian optimization
   * -
     -
     - `genie_direct`__
     - Classical high-dimensional global Lipschitzian optimization
   * -
     -
     - `efficient_global`__
     - Global Surrogate Based Optimization, a.k.a. EGO
   * -
     -
     - `surrogate_based_uq`__
     - Generic UQ method for constructing and interrogating a surrogate model
   * -
     -
     - `function_train`__
     - UQ method leveraging a functional tensor train surrogate model
   * -
     -
     - `multifidelity_function_train`__
     - Multifidelity uncertainty quantification using function train expansions
   * -
     -
     - `multilevel_function_train`__
     - Multilevel uncertainty quantification using function train expansions
   * -
     -
     - `polynomial_chaos`__
     - Uncertainty quantification using polynomial chaos expansions
   * -
     -
     - `multifidelity_polynomial_chaos`__
     - Multifidelity uncertainty quantification using polynomial chaos expansions
   * -
     -
     - `multilevel_polynomial_chaos`__
     - Multilevel uncertainty quantification using polynomial chaos expansions
   * -
     -
     - `stoch_collocation`__
     - Uncertainty quantification with stochastic collocation
   * -
     -
     - `multifidelity_stoch_collocation`__
     - Multifidelity uncertainty quantification using stochastic collocation
   * -
     -
     - `sampling`__
     - Randomly samples variables according to their distributions
   * -
     -
     - `multilevel_sampling`__
     - Multilevel sampling methods for UQ
   * -
     -
     - `multifidelity_sampling`__
     - Multifidelity sampling methods for UQ
   * -
     -
     - `multilevel_multifidelity_sampling`__
     - Multilevel-Multifidelity sampling methods for UQ
   * -
     -
     - `approximate_control_variate`__
     - Approximate control variate (ACV) sampling methods for UQ
   * -
     -
     - `importance_sampling`__
     - Importance sampling
   * -
     -
     - `gpais`__
     - Gaussian Process Adaptive Importance Sampling
   * -
     -
     - `adaptive_sampling`__
     - (Experimental) Adaptively refine a Gaussian process surrogate
   * -
     -
     - `pof_darts`__
     - Probability-of-Failure (POF) darts is a novel method for estimating the probability of failure based on random sphere-packing
   * -
     -
     - `rkd_darts`__
     - Recursive k-d (RKD) Darts: Recursive Hyperplane Sampling for Numerical Integration of High-Dimensional Functions
   * -
     -
     - `global_evidence`__
     - Evidence theory with evidence measures computed with global optimization methods
   * -
     -
     - `global_interval_est`__
     - Interval analysis using global optimization methods
   * -
     -
     - `bayes_calibration`__
     - Bayesian calibration
   * -
     -
     - `dace`__
     - Design and Analysis of Computer Experiments
   * -
     -
     - `fsu_cvt`__
     - Design of Computer Experiments - Centroidal Voronoi Tessellation
   * -
     -
     - `psuade_moat`__
     - Morris One-at-a-Time
   * -
     -
     - `local_evidence`__
     - Evidence theory with evidence measures computed with local optimization methods
   * -
     -
     - `local_interval_est`__
     - Interval analysis using local optimization
   * -
     -
     - `local_reliability`__
     - Local reliability method
   * -
     -
     - `global_reliability`__
     - Global reliability methods
   * -
     -
     - `fsu_quasi_mc`__
     - Design of Computer Experiments - Quasi-Monte Carlo sampling
   * -
     -
     - `vector_parameter_study`__
     - Samples variables along a user-defined vector
   * -
     -
     - `list_parameter_study`__
     - Samples variables at specified values
   * -
     -
     - `centered_parameter_study`__
     - Samples variables along points moving out from a center point
   * -
     -
     - `multidim_parameter_study`__
     - Samples variables on a full factorial grid of study points
   * -
     -
     - `richardson_extrap`__
     - Estimate order of convergence of a response as model fidelity increases

.. __: method-id_method.html
__ method-output.html
__ method-final_solutions.html
__ method-hybrid.html
__ method-multi_start.html
__ method-pareto_set.html
__ method-branch_and_bound.html
__ method-surrogate_based_local.html
__ method-surrogate_based_global.html
__ method-dot_frcg.html
__ method-dot_mmfd.html
__ method-dot_bfgs.html
__ method-dot_slp.html
__ method-dot_sqp.html
__ method-conmin_frcg.html
__ method-conmin_mfd.html
__ method-dl_solver.html
__ method-npsol_sqp.html
__ method-nlssol_sqp.html
__ method-nlpql_sqp.html
__ method-optpp_cg.html
__ method-optpp_q_newton.html
__ method-optpp_fd_newton.html
__ method-optpp_g_newton.html
__ method-optpp_newton.html
__ method-optpp_pds.html
__ method-demo_tpl.html
__ method-rol.html
__ method-asynch_pattern_search.html
__ method-mesh_adaptive_search.html
__ method-nowpac.html
__ method-snowpac.html
__ method-moga.html
__ method-soga.html
__ method-coliny_pattern_search.html
__ method-coliny_solis_wets.html
__ method-coliny_cobyla.html
__ method-coliny_direct.html
__ method-coliny_ea.html
__ method-coliny_beta.html
__ method-nl2sol.html
__ method-nonlinear_cg.html
__ method-ncsu_direct.html
__ method-genie_opt_darts.html
__ method-genie_direct.html
__ method-efficient_global.html
__ method-surrogate_based_uq.html
__ method-function_train.html
__ method-multifidelity_function_train.html
__ method-multilevel_function_train.html
__ method-polynomial_chaos.html
__ method-multifidelity_polynomial_chaos.html
__ method-multilevel_polynomial_chaos.html
__ method-stoch_collocation.html
__ method-multifidelity_stoch_collocation.html
__ method-sampling.html
__ method-multilevel_sampling.html
__ method-multifidelity_sampling.html
__ method-multilevel_multifidelity_sampling.html
__ method-approximate_control_variate.html
__ method-importance_sampling.html
__ method-gpais.html
__ method-adaptive_sampling.html
__ method-pof_darts.html
__ method-rkd_darts.html
__ method-global_evidence.html
__ method-global_interval_est.html
__ method-bayes_calibration.html
__ method-dace.html
__ method-fsu_cvt.html
__ method-psuade_moat.html
__ method-local_evidence.html
__ method-local_interval_est.html
__ method-local_reliability.html
__ method-global_reliability.html
__ method-fsu_quasi_mc.html
__ method-vector_parameter_study.html
__ method-list_parameter_study.html
__ method-centered_parameter_study.html
__ method-multidim_parameter_study.html
__ method-richardson_extrap.html

**Description**

The ``method`` keyword signifies the start of a block in the Dakota input
file. A method block contains the various keywords necessary to select a
method and to control its behavior.

*Method Block Requirements*

At least one ``method`` block must appear in the Dakota input file. Multiple
``method`` blocks may be needed to fully define advanced analysis approaches.

Each ``method`` block must specify one method and, optionally, any associated
keywords that govern the behavior of the method.
*The Methods*

Each ``method`` block must select one method. Starting with Dakota v6.0, the
methods are grouped into two types: standard methods and multi-component
methods.

The standard methods are stand-alone and self-contained in the sense that
they require only a model to perform a study; they do not call other methods.
While methods such as ``polynomial_chaos`` and ``efficient_global``
internally utilize multiple iterator and surrogate model components, these
components are generally hidden from user control due to restrictions on
modularity; thus, these methods are stand-alone.

The multi-component group of methods provides a higher-level "meta-algorithm"
that points to other methods and models that support sub-iteration. For
example, in a sequential hybrid method, the ``hybrid`` method specification
must identify a list of subordinate methods, and the meta-algorithm executes
these methods in sequence and transfers information between them.
Surrogate-based minimizers provide another example in that they point both to
other methods (e.g., the optimization method used to solve the approximate
subproblem) and to models (e.g., the type of surrogate model employed).
Multi-component methods generally provide some level of "plug and play"
modularity through their flexible support of a variety of method and model
selections.

*Component-Based Iterator Commands*

Component-based iterator specifications include the hybrid, multi-start,
pareto set, surrogate-based local, surrogate-based global, and branch and
bound methods. Whereas a standard iterator specification needs only an
optional model pointer string (specified with ``model_pointer``),
component-based iterator specifications can include method pointer, method
name, and model pointer specifications in order to define the components
employed in the "meta-iteration."
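To illustrate the pointer-based connectivity, a sequential hybrid that chains
a global search into a local refinement might be specified along the
following lines. This is a hedged sketch, not a verified input file: the
identifiers ``'GA'`` and ``'NLP'`` are hypothetical, and the keyword
spellings (e.g., ``method_pointer_list``) follow typical usage of
:dakkw:`method-hybrid`; consult that page for the authoritative syntax.

.. code-block::

   method
     id_method = 'HYBRID_TOP'
     hybrid sequential
       method_pointer_list = 'GA' 'NLP'

   method
     id_method = 'GA'
     soga

   method
     id_method = 'NLP'
     optpp_newton

Here the top-level ``hybrid`` block names its subordinate iterators by
pointer, so each sub-iterator retains a full ``method`` block of its own in
which method dependent controls can be set.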
In particular, these specifications identify one or more methods (by pointer
or by name) to serve as the subordinate iterators in the top-level algorithm.
Identifying a sub-iterator by name instead of by pointer is a lightweight
option that removes the need for a separate method specification for the
sub-iterator; however, a model pointer may be required in this case to
provide the specification connectivity normally supported by the method
pointer. Refer to the individual method descriptions for the specific
requirements of these advanced methods.

*Method Independent Controls*

In addition to the method selection, there are 10 optional keywords referred
to as method independent controls. These controls are valid for enough
methods that it was reasonable to pull them out of the method dependent
blocks and consolidate their specifications; however, they are *not*
universally respected by all methods.

**Examples**

Several examples follow. The first example shows a minimal specification for
an optimization method:

.. code-block::

   method
     dot_sqp

This example uses all of the defaults for this method.

A more sophisticated example would be:

.. code-block::

   method,
     id_method = 'NLP1'
     dot_sqp
       max_iterations = 50
       convergence_tolerance = 1e-4
       output verbose
     model_pointer = 'M1'

This example demonstrates the use of identifiers and pointers as well as some
method independent and method dependent controls for the sequential quadratic
programming (SQP) algorithm from the DOT library. The ``max_iterations``,
``convergence_tolerance``, and ``output`` settings are method independent
controls, in that they are defined for a variety of methods.

The next example shows a specification for a least squares method:

.. code-block::

   method
     optpp_g_newton
       max_iterations = 10
       convergence_tolerance = 1.e-8
       search_method trust_region
       gradient_tolerance = 1.e-6

Some of the same method independent controls are present, along with several
method dependent controls (``search_method`` and ``gradient_tolerance``) that
are only meaningful for OPT++ methods (see :ref:`topic-package_optpp`).

The next example shows a specification for a nondeterministic method with
several method dependent controls (refer to :dakkw:`method-sampling`):

.. code-block::

   method
     sampling
       samples = 100
       seed = 12345
       sample_type lhs
       response_levels = 1000. 500.

The last example shows a specification for a parameter study method where,
again, each of the controls is method dependent (refer to
:dakkw:`method-vector_parameter_study`):

.. code-block::

   method
     vector_parameter_study
       step_vector = 1. 1. 1.
       num_steps = 10
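A component-based example rounds out the set: a multi-start study that
launches a local optimizer from several starting points. This is a hedged
sketch; the ``random_starts`` and ``seed`` keywords and the pointer-based
connectivity follow typical usage of :dakkw:`method-multi_start`, and the
identifier ``'NLP'`` is hypothetical.

.. code-block::

   method
     multi_start
       method_pointer = 'NLP'
       random_starts = 5
         seed = 123

   method
     id_method = 'NLP'
     optpp_q_newton

The top-level ``multi_start`` block is a meta-iterator: it repeatedly invokes
the sub-iterator named by ``method_pointer`` and collects the best solutions
found across the starting points.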