branch_and_bound
(Experimental Capability) Solves a mixed integer nonlinear optimization problem
Specification
Alias: None
Arguments: None
Child Keywords:
| Required/Optional | Description of Group | Dakota Keyword | Dakota Keyword Description |
|---|---|---|---|
| Required (Choose One) | Local Optimizer Selection | method_pointer | Pointer to sub-method to apply to a surrogate or branch-and-bound sub-problem |
| | | method_name | Specify sub-method by name |
| Optional | | scaling | Turn on scaling for variables, responses, and constraints |
Description
The branch-and-bound optimization method solves mixed integer nonlinear optimization problems. It does so by partitioning the parameter space along the integer or discrete variables according to some criterion. Each resulting sub-problem is relaxed (i.e., all variables are treated as continuous) and solved with a continuous nonlinear optimization method. The sub-problem results are then combined in a way that yields the solution to the original mixed integer problem.
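To make the partition/relax/solve loop concrete, the sketch below shows a minimal branch-and-bound loop in Python. It is not Dakota's implementation: it assumes scipy.optimize.minimize as the continuous sub-problem solver, and the names (branch_and_bound, integer_idx, the toy objective) are purely illustrative.

```python
# Minimal branch-and-bound loop (illustrative only, not Dakota's implementation).
# Each sub-problem is a box on the design space; its continuous relaxation is
# solved with scipy.optimize.minimize, and branching splits the box on a
# fractional integer variable until the relaxed optimum is integral.
import math

import numpy as np
from scipy.optimize import minimize


def branch_and_bound(objective, bounds, integer_idx, x0, tol=1e-6):
    """bounds: list of (low, high); integer_idx: indices of integer variables."""
    best_x, best_f = None, math.inf
    stack = [list(bounds)]                      # each entry is one sub-problem's box
    while stack:
        box = stack.pop()
        start = np.clip(x0, [lo for lo, _ in box], [hi for _, hi in box])
        res = minimize(objective, start, bounds=box)   # continuous relaxation
        if not res.success or res.fun >= best_f:
            continue                            # prune: failed, or bound cannot beat incumbent
        fractional = [i for i in integer_idx
                      if abs(res.x[i] - round(res.x[i])) > tol]
        if not fractional:                      # relaxed optimum already satisfies integrality
            best_x, best_f = res.x, res.fun
            continue
        i = fractional[0]                       # branch on the first fractional variable
        lo, hi = box[i]
        left, right = list(box), list(box)
        left[i] = (lo, math.floor(res.x[i]))    # child with x_i <= floor(x_i*)
        right[i] = (math.ceil(res.x[i]), hi)    # child with x_i >= ceil(x_i*)
        stack.extend(b for b in (left, right) if b[i][0] <= b[i][1])
    return best_x, best_f


if __name__ == "__main__":
    # Toy problem: minimize (x0 - 2.3)^2 + (x1 - 1.7)^2 with x1 restricted to integers.
    objective = lambda x: (x[0] - 2.3) ** 2 + (x[1] - 1.7) ** 2
    x_best, f_best = branch_and_bound(
        objective, [(-10.0, 10.0), (-10.0, 10.0)], integer_idx=[1],
        x0=np.array([0.0, 0.0]))
    print(x_best, f_best)   # expect roughly [2.3, 2.0] and 0.09
```

In Dakota terms, the outer loop corresponds to branch_and_bound itself, while the call to the continuous optimizer corresponds to the sub-method selected with method_pointer or method_name.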
Default Behavior
Branch-and-bound expects all discrete variables to be relaxable. If your problem has categorical or otherwise non-relaxable discrete variables, then this is not the optimization method you are looking for.
Expected Output
The optimal solution and associated parameters will be printed to the screen output.
Usage Tips
The user must choose a nonlinear optimization method to solve the sub-problems. We recommend a method that would also be a reasonable choice for a purely continuous problem of similar form to the mixed integer problem.
Examples
environment
  # Top-level environment points to the branch-and-bound method.
  method_pointer = 'BandB'

method
  # Branch-and-bound method; relaxed sub-problems are delegated to 'SubNLP'.
  id_method = 'BandB'
  branch_and_bound
  output verbose
  method_pointer = 'SubNLP'

method
  # Continuous optimizer applied to each relaxed sub-problem.
  id_method = 'SubNLP'
  coliny_ea
  seed = 12345
  max_iterations = 100
  max_function_evaluations = 100

variables
  # Three continuous design variables plus two relaxable integer range variables.
  continuous_design = 3
    initial_point    -1.0   1.5   2.0
    upper_bounds     10.0  10.0  10.0
    lower_bounds    -10.0 -10.0 -10.0
    descriptors      'x1'  'x2'  'x3'
  discrete_design_range = 2
    initial_point  2 2
    lower_bounds   1 1
    upper_bounds   4 9
    descriptors    'y1' 'y2'

interface
  fork
    analysis_driver = 'text_book'

responses
  objective_functions = 1
  nonlinear_inequality_constraints = 2
  numerical_gradients
  no_hessians
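Assuming the input above is saved to a file (e.g., bandb.in; the name is arbitrary), it can be run from the command line with dakota -i bandb.in. The fork interface assumes an executable named text_book, one of Dakota's standard test drivers, is available on the path; the optimal solution is then reported in the screen output.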