optpp_fd_newton
Finite Difference Newton optimization method
Topics
package_optpp, local_optimization_methods
Specification
Alias: None
Arguments: None
Child Keywords:
Required/Optional | Description of Group | Dakota Keyword | Dakota Keyword Description
---|---|---|---
Optional | | search_method | Select a search method for Newton-based optimizers
Optional | | merit_function | Balance goals of reducing objective function and satisfying constraints
Optional | | steplength_to_boundary | Controls how close to the boundary of the feasible region the algorithm is allowed to move
Optional | | centering_parameter | Controls how closely the algorithm should follow the “central path”
Optional | | max_step | Max change in design point
Optional | | gradient_tolerance | Stopping criterion based on the L2 norm of the gradient
Optional | | max_iterations | Number of iterations allowed for optimizers and adaptive UQ methods
Optional | | convergence_tolerance | Stopping criterion based on objective function or statistics convergence
Optional | | speculative | Compute speculative gradients
Optional | | max_function_evaluations | Number of function evaluations allowed for optimizers
Optional | | scaling | Turn on scaling for variables, responses, and constraints
Optional | | model_pointer | Identifier for model block to be used by a method
Description
This is a Newton method that expects a gradient and computes a finite-difference approximation to the Hessian. Each of the Newton-based methods is automatically bound to the appropriate OPT++ algorithm based on the user constraint specification (unconstrained, bound-constrained, or generally-constrained). In the generally-constrained case, the Newton methods use a nonlinear interior-point approach to manage the constraints.
See Package: OPT++ for information related to all optpp methods.
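A minimal method block sketch is shown below; the control values (iteration and evaluation limits, tolerances, and the line-search selection) are illustrative placeholders rather than recommended settings.

```
method
  optpp_fd_newton
    search_method gradient_based_line_search
    max_iterations = 100
    max_function_evaluations = 1000
    convergence_tolerance = 1.0e-4
    gradient_tolerance = 1.0e-4
```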
Expected HDF5 Output
If Dakota was built with HDF5 support and run with the hdf5 keyword, this method writes the following results to HDF5:

- Best Objective Functions (when objective_functions are specified)
- Calibration (when calibration_terms are specified)
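As a sketch of how that output is requested (keyword placement assumed from the Dakota environment specification; verify against your Dakota version), HDF5 results are enabled in the environment block:

```
environment
  results_output
    hdf5
```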