Dakota  Version 6.21
NLPQLPTraits Class Reference

Wrapper class for the NLPQLP optimization library, Version 2.0.

Inheritance diagram for NLPQLPTraits: NLPQLPTraits derives from TraitsBase.

Public Member Functions

 NLPQLPTraits ()
 default constructor
 
virtual ~NLPQLPTraits ()
 destructor
 
virtual bool is_derived ()
 A temporary query used in the refactor.
 
bool supports_continuous_variables ()
 Return the flag indicating whether method supports continuous variables.
 
bool supports_linear_equality ()
 Return the flag indicating whether method supports linear equalities.
 
bool supports_linear_inequality ()
 Return the flag indicating whether method supports linear inequalities.
 
bool supports_nonlinear_equality ()
 Return the flag indicating whether method supports nonlinear equalities.
 
bool supports_nonlinear_inequality ()
 Return the flag indicating whether method supports nonlinear inequalities.
 
NONLINEAR_INEQUALITY_FORMAT nonlinear_inequality_format ()
 Return the format used for nonlinear inequality constraints.
 
- Public Member Functions inherited from TraitsBase
 TraitsBase ()
 default constructor
 
virtual ~TraitsBase ()
 destructor
 
virtual bool requires_bounds ()
 Return the flag indicating whether method requires bounds.
 
virtual LINEAR_INEQUALITY_FORMAT linear_inequality_format ()
 Return the format used for linear inequality constraints.
 
virtual NONLINEAR_EQUALITY_FORMAT nonlinear_equality_format ()
 Return the format used for nonlinear equality constraints.
 
virtual bool expects_nonlinear_inequalities_first ()
 Return the flag indicating whether method expects nonlinear inequality constraints followed by nonlinear equality constraints.
 
virtual bool supports_scaling ()
 Return the flag indicating whether method supports parameter scaling.
 
virtual bool supports_least_squares ()
 Return the flag indicating whether method supports least squares.
 
virtual bool supports_multiobjectives ()
 Return flag indicating whether method supports multiobjective optimization.
 
virtual bool supports_discrete_variables ()
 Return the flag indicating whether method supports discrete variables.
 
virtual bool provides_best_objective ()
 Return the flag indicating whether method provides best objective result.
 
virtual bool provides_best_parameters ()
 Return the flag indicating whether method provides best parameters result.
 
virtual bool provides_best_constraint ()
 Return the flag indicating whether method provides best constraint result.
 
virtual bool provides_final_gradient ()
 Return the flag indicating whether method provides final gradient result.
 
virtual bool provides_final_hessian ()
 Return the flag indicating whether method provides final hessian result.
 

Detailed Description

Wrapper class for the NLPQLP optimization library, Version 2.0.


  AN IMPLEMENTATION OF A SEQUENTIAL QUADRATIC PROGRAMMING
   METHOD FOR SOLVING NONLINEAR OPTIMIZATION PROBLEMS BY
    DISTRIBUTED COMPUTING AND NON-MONOTONE LINE SEARCH

This subroutine solves the general nonlinear programming problem

      minimize    F(X)
      subject to  G(J,X)   =  0  ,  J=1,...,ME
                  G(J,X)  >=  0  ,  J=ME+1,...,M
                  XL  <=  X  <=  XU

and is an extension of the code NLPQLD. NLPQLP is specifically tuned to run under distributed systems. A new input parameter L is introduced for the number of parallel processors, that is, the number of function calls to be executed simultaneously. In the case L=1, NLPQLP is identical to NLPQLD; otherwise, the line search is modified to allow L parallel function calls in advance. Moreover, the user has the opportunity to use distributed function calls for evaluating gradients.

The algorithm is a modification of the method of Wilson, Han, and Powell. In each iteration step, a linearly constrained quadratic programming problem is formulated by approximating the Lagrangian function quadratically and by linearizing the constraints. Subsequently, a one-dimensional line search is performed with respect to an augmented Lagrangian merit function to obtain a new iterate. The modified line search algorithm also guarantees convergence under the same assumptions as before.
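
The quadratic programming subproblem solved in iteration K can be sketched as follows (standard SQP notation, not taken verbatim from the NLPQLP documentation; D is the search direction and B_K the current approximation of the Hessian of the Lagrangian):

      minimize    dF(X_K)'D  +  1/2 D'B_K D
      subject to  dG(J,X_K)'D  +  G(J,X_K)   =  0  ,  J=1,...,ME
                  dG(J,X_K)'D  +  G(J,X_K)  >=  0  ,  J=ME+1,...,M

where dF and dG denote gradients with respect to X. The next iterate is X_{K+1} = X_K + ALPHA*D, with the step length ALPHA determined by the line search on the augmented Lagrangian merit function.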

For the new version, a non-monotone line search is implemented which allows the merit function to increase in case of instabilities caused, for example, by round-off errors or errors in gradient approximations.

The subroutine contains the option to predetermine initial guesses for the multipliers or the Hessian of the Lagrangian function and is called by reverse communication.

A version of TraitsBase specialized for NLPQLP optimizers.


The documentation for this class was generated from the following file: