Optimization Under Uncertainty (OUU)
Reliability-Based Design Optimization (RBDO)
Reliability-based design optimization (RBDO) methods are used to perform design optimization accounting for reliability metrics. The reliability analysis capabilities described in Section Local Reliability Methods provide a substantial foundation for exploring a variety of gradient-based RBDO formulations. [EAP+07] investigated bi-level, fully-analytic bi-level, and first-order sequential RBDO approaches employing underlying first-order reliability assessments. [EB06] investigated fully-analytic bi-level and second-order sequential RBDO approaches employing underlying second-order reliability assessments. These methods are overviewed in the following sections.
Bi-level RBDO
The simplest and most direct RBDO approach is the bi-level approach in which a full reliability analysis is performed for every optimization function evaluation. This involves a nesting of two distinct levels of optimization within each other, one at the design level and one at the MPP search level.
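To make the nesting concrete, the following minimal Python sketch (not Dakota input or source; the limit state, distributions, and targets are hypothetical) wraps a FORM-style MPP search inside an outer design optimization, so that every evaluation of the reliability constraint at the design level triggers a full inner-level reliability analysis.

```python
# Bi-level RBDO sketch (illustrative only): an outer design optimization
# nests a full FORM-style MPP search for every constraint evaluation.
# The limit state g, distributions, and targets below are hypothetical.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

BETA_TARGET = 2.0

def g(x, d):
    # Hypothetical limit state: failure when g <= 0.
    return d[0] * x[0] + d[1] * x[1] - 5.0

def x_of_u(u):
    # Transform standard normals u to physical variables x (independent normals here).
    return np.array([2.0 + 0.5 * u[0], 1.0 + 0.3 * u[1]])

def beta(d):
    # Inner level: MPP search in u-space, min ||u||^2 subject to g(x(u), d) = 0.
    res = minimize(lambda u: u @ u, x0=np.zeros(2), method="SLSQP",
                   constraints={"type": "eq", "fun": lambda u: g(x_of_u(u), d)})
    return np.sqrt(max(res.fun, 0.0))

def cost(d):
    return d[0] ** 2 + d[1] ** 2          # hypothetical deterministic objective

# Outer level: design optimization with the reliability constraint beta >= target.
outer = minimize(cost, x0=np.array([2.0, 2.0]), method="SLSQP",
                 constraints={"type": "ineq", "fun": lambda d: beta(d) - BETA_TARGET},
                 bounds=[(0.1, 10.0)] * 2)
print(outer.x, beta(outer.x), norm.cdf(-beta(outer.x)))  # design, reliability index, p_f
```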
Since an RBDO problem will typically specify both a response level target $\bar{z}$ and a probability/reliability level target $\bar{p}$/$\bar{\beta}$, either reliability analysis formulation can be nested within the design optimization. RIA reliability analysis maps $\bar{z}$ to $p$/$\beta$, so RIA-based RBDO constrains the computed probability or reliability level:

\[
\begin{aligned}
\text{minimize } & f \\
\text{subject to } & \beta \ge \bar{\beta} \ \text{ or } \ p \le \bar{p}
\end{aligned}
\]

And PMA reliability analysis maps $\bar{p}$/$\bar{\beta}$ to $z$, so PMA-based RBDO constrains the computed response level:

\[
\begin{aligned}
\text{minimize } & f \\
\text{subject to } & z \ge \bar{z}
\end{aligned}
\]

where the sense of the $z$ constraint depends on whether the failure domain is defined through a cumulative ($g \le \bar{z}$) or complementary cumulative ($g \ge \bar{z}$) probability.
An important performance enhancement for bi-level methods is the use of sensitivity analysis to analytically compute the design gradients of probability, reliability, and response levels. When design variables are separate from the uncertain variables (i.e., they are not distribution parameters), the following first-order expressions may be used [AM04, HR86, KC92]:

\[
\nabla_{\mathbf{d}} z = \nabla_{\mathbf{d}} g
\]
\[
\nabla_{\mathbf{d}} \beta_{\mathrm{cdf}} = \frac{1}{\lVert \nabla_{\mathbf{u}} G \rVert} \nabla_{\mathbf{d}} g
\]
\[
\nabla_{\mathbf{d}} p_{\mathrm{cdf}} = -\varphi(-\beta_{\mathrm{cdf}}) \, \nabla_{\mathbf{d}} \beta_{\mathrm{cdf}}
\]

where $\varphi$ is the standard normal density and all quantities are evaluated at the converged MPP. It is evident from these expressions that the CCDF sensitivities follow directly from the CDF sensitivities, i.e., $\nabla_{\mathbf{d}} \beta_{\mathrm{ccdf}} = -\nabla_{\mathbf{d}} \beta_{\mathrm{cdf}}$ and $\nabla_{\mathbf{d}} p_{\mathrm{ccdf}} = -\nabla_{\mathbf{d}} p_{\mathrm{cdf}}$.

To capture second-order probability estimates within an RIA RBDO formulation using the well-behaved generalized reliability index (eq:beta_cdf), the probability sensitivity is converted back to a reliability sensitivity through

\[
\nabla_{\mathbf{d}} \beta^{*}_{\mathrm{cdf}} = -\frac{1}{\varphi(-\beta^{*}_{\mathrm{cdf}})} \nabla_{\mathbf{d}} p_{\mathrm{cdf}}
\]

for second-order $p_{\mathrm{cdf}}$, where $\beta^{*}_{\mathrm{cdf}} = -\Phi^{-1}(p_{\mathrm{cdf}})$.
When the design variables are distribution parameters of the uncertain variables, the response gradient with respect to the design variables is obtained through the chain rule, and the first two expressions above become

\[
\nabla_{\mathbf{d}} z = \frac{d\mathbf{x}}{d\mathbf{d}}^{T} \nabla_{\mathbf{x}} g
\]
\[
\nabla_{\mathbf{d}} \beta_{\mathrm{cdf}} = \frac{1}{\lVert \nabla_{\mathbf{u}} G \rVert} \frac{d\mathbf{x}}{d\mathbf{d}}^{T} \nabla_{\mathbf{x}} g
\]

where the design Jacobian of the transformation, $\frac{d\mathbf{x}}{d\mathbf{d}}$, is obtained by differentiating the transformations (eq:trans_zx and eq:trans_zu) with respect to the distribution parameters. The probability, CCDF, and generalized reliability sensitivity relations remain the same as before. For this design variable case, all required information for the sensitivities is available from the MPP search.

Since these sensitivity expressions are derived using the Karush-Kuhn-Tucker optimality conditions for a converged MPP, they are appropriate for RBDO employing MPP search methods that converge the MPP (e.g., AMV+, AMV$^2$+, TANA, FORM, and SORM), but not for methods that lack a converged MPP (e.g., MVFOSM or AMV).
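As an illustration of how these analytic expressions are assembled, the short sketch below evaluates the design gradients of $\beta$ and $p$ from quantities an MPP search would provide; all numerical values are hypothetical placeholders.

```python
# Sketch: first-order design sensitivities of reliability and probability
# from MPP-search byproducts (values below are hypothetical placeholders).
import numpy as np
from scipy.stats import norm

grad_d_g  = np.array([0.8, -0.3])   # gradient of limit state g w.r.t. design variables d
norm_gu_G = 2.5                     # ||grad_u G|| at the converged MPP
beta_cdf  = 2.1                     # converged CDF reliability index

grad_d_beta      = grad_d_g / norm_gu_G                 # d(beta_cdf)/dd
grad_d_p         = -norm.pdf(-beta_cdf) * grad_d_beta   # d(p_cdf)/dd via p = Phi(-beta)
grad_d_beta_ccdf = -grad_d_beta                         # CCDF sensitivities flip sign
print(grad_d_beta, grad_d_p, grad_d_beta_ccdf)
```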
Sequential/Surrogate-based RBDO
An alternative RBDO approach is the sequential approach, in which additional efficiency is sought through breaking the nested relationship of the MPP and design searches. The general concept is to iterate between optimization and uncertainty quantification, updating the optimization goals based on the most recent probabilistic assessment results. This update may be based on safety factors [WSSC01] or other approximations [DC04].
A particularly effective approach for updating the optimization goals is to use the $p$/$\beta$/$z$ sensitivity analysis of the previous section in combination with local surrogate models of the reliability metrics, managed within a trust-region framework.

In particular, RIA trust-region surrogate-based RBDO employs surrogate models of $f$ and $p$/$\beta$ within a trust region $\Delta^k$ centered at $\mathbf{d}_c$:

\[
\begin{aligned}
\text{minimize } & f(\mathbf{d}_c) + \nabla_{\mathbf{d}} f(\mathbf{d}_c)^T (\mathbf{d} - \mathbf{d}_c) \\
\text{subject to } & \beta(\mathbf{d}_c) + \nabla_{\mathbf{d}} \beta(\mathbf{d}_c)^T (\mathbf{d} - \mathbf{d}_c) \ge \bar{\beta}
\ \text{ or } \ p(\mathbf{d}_c) + \nabla_{\mathbf{d}} p(\mathbf{d}_c)^T (\mathbf{d} - \mathbf{d}_c) \le \bar{p} \\
& \lVert \mathbf{d} - \mathbf{d}_c \rVert_\infty \le \Delta^k
\end{aligned}
\]

and for second-order surrogates, the linearizations are augmented with Hessian terms of the form $\frac{1}{2}(\mathbf{d} - \mathbf{d}_c)^T \nabla^2_{\mathbf{d}}(\cdot)(\mathbf{d} - \mathbf{d}_c)$.

For PMA trust-region surrogate-based RBDO, surrogate models of $f$ and $z$ are employed:

\[
\begin{aligned}
\text{minimize } & f(\mathbf{d}_c) + \nabla_{\mathbf{d}} f(\mathbf{d}_c)^T (\mathbf{d} - \mathbf{d}_c) \\
\text{subject to } & z(\mathbf{d}_c) + \nabla_{\mathbf{d}} z(\mathbf{d}_c)^T (\mathbf{d} - \mathbf{d}_c) \ge \bar{z} \\
& \lVert \mathbf{d} - \mathbf{d}_c \rVert_\infty \le \Delta^k
\end{aligned}
\]

again with additional Hessian terms for second-order surrogates, where the sense of the $z$ constraint may vary as described previously (cumulative versus complementary cumulative failure definitions).
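The following sketch illustrates the sequential idea under simplifying assumptions: the "truth" reliability analysis is replaced by a closed-form stand-in, the constraint is linearized at the trust-region center, and a simplified acceptance test compares the surrogate prediction against a verification analysis. It is a schematic, not Dakota's trust-region logic.

```python
# Sequential (trust-region surrogate-based) RBDO sketch: linearize the
# reliability constraint at the trust-region center, solve the cheap
# subproblem, then verify with a full reliability analysis.  The "truth"
# beta(d) below is a closed-form stand-in for an MPP search.
import numpy as np
from scipy.optimize import minimize

BETA_TARGET = 2.0

def beta_truth(d):                      # stand-in for a converged MPP search
    return (2*d[0] + d[1] - 5.0) / np.sqrt(0.25*d[0]**2 + 0.09*d[1]**2)

def grad_beta(d, h=1e-6):               # stand-in for analytic MPP sensitivities
    return np.array([(beta_truth(d + h*e) - beta_truth(d - h*e)) / (2*h)
                     for e in np.eye(2)])

def cost(d):
    return d[0]**2 + d[1]**2

d_c, radius = np.array([3.0, 3.0]), 1.0
for k in range(20):
    b_c, g_c = beta_truth(d_c), grad_beta(d_c)       # one full UQ per iteration
    sub = minimize(cost, d_c, method="SLSQP",
                   constraints={"type": "ineq",
                                "fun": lambda d: b_c + g_c @ (d - d_c) - BETA_TARGET},
                   bounds=[(d_c[i] - radius, d_c[i] + radius) for i in range(2)])
    d_new = sub.x
    predicted = b_c + g_c @ (d_new - d_c)            # surrogate prediction
    actual = beta_truth(d_new)                       # verification UQ
    if abs(actual - predicted) < 0.05:               # surrogate trusted: accept, grow
        d_c, radius = d_new, min(2 * radius, 2.0)
    else:                                            # poor surrogate: shrink region
        radius *= 0.5
    if radius < 1e-3:
        break
print(d_c, beta_truth(d_c), cost(d_c))
```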
Stochastic Expansion-Based Design Optimization (SEBDO)
Stochastic Sensitivity Analysis
Section Expansion RVSA describes sensitivity analysis of the polynomial chaos expansion with respect to the expansion variables. Here we extend this analysis to include sensitivity analysis of probabilistic moments with respect to nonprobabilistic (i.e., design or epistemic uncertain) variables.
Local sensitivity analysis: first-order probabilistic expansions
With the introduction of nonprobabilistic variables $\mathbf{s}$, the chaos coefficients become functions of $\mathbf{s}$, i.e., $R(\boldsymbol{\xi}, \mathbf{s}) \cong \sum_{j=0}^{P} \alpha_j(\mathbf{s}) \Psi_j(\boldsymbol{\xi})$, where the expansion is formed over the probabilistic variables $\boldsymbol{\xi}$ only. For computing sensitivities of response mean and variance, the moment expressions eq:mean_pce and eq:covar_pce are evaluated over only the probabilistic expansion variables, simplifying to

\[
\mu_R = \alpha_0, \qquad \sigma_R^2 = \sum_{k=1}^{P} \alpha_k^2 \langle \Psi_k^2 \rangle .
\]

Sensitivities of these moments with respect to the nonprobabilistic variables are then

\[
\frac{d\mu_R}{ds} = \frac{d\alpha_0}{ds}, \qquad
\frac{d\sigma_R^2}{ds} = \sum_{k=1}^{P} 2\,\alpha_k \frac{d\alpha_k}{ds} \langle \Psi_k^2 \rangle ,
\]

where independence of $\mathbf{s}$ and $\boldsymbol{\xi}$ and the coefficient sensitivities

\[
\frac{d\alpha_k}{ds} = \frac{\big\langle \frac{dR}{ds}\, \Psi_k \big\rangle}{\langle \Psi_k^2 \rangle}
\]

have been used. Due to independence, these coefficients may be interpreted as either the derivatives of the expectations or the expectations of the derivatives, or more precisely, the nonprobabilistic sensitivities of the chaos coefficients for the response expansion or the chaos coefficients of an expansion for the nonprobabilistic sensitivities of the response. The evaluation of integrals involving $\frac{dR}{ds}$ extends the data requirements for the probabilistic expansion approach to include response sensitivities at each of the evaluation points. Similarly for stochastic collocation, differentiating the interpolant $R \cong \sum_{j=1}^{N_p} r_j(\mathbf{s}) L_j(\boldsymbol{\xi})$ leads to

\[
\frac{d\mu_R}{ds} = \sum_{j=1}^{N_p} w_j \frac{dr_j}{ds}, \qquad
\frac{d\sigma_R^2}{ds} = \sum_{j=1}^{N_p} 2\, w_j r_j \frac{dr_j}{ds} - 2\, \mu_R \frac{d\mu_R}{ds},
\]

where $w_j$ are the quadrature weights associated with the collocation points.
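A small numerical sketch of these relations, with hypothetical chaos coefficients and coefficient sensitivities, is given below; it simply evaluates the mean/variance expressions and their derivatives with respect to a single nonprobabilistic variable $s$.

```python
# Sketch: design sensitivities of PCE moments from chaos-coefficient
# sensitivities (all coefficient values below are hypothetical).
import numpy as np

alpha     = np.array([1.20, 0.40, -0.10, 0.05])    # chaos coefficients alpha_k(s)
dalpha_ds = np.array([0.30, 0.08,  0.02, -0.01])   # d(alpha_k)/ds from <dR/ds Psi_k>/<Psi_k^2>
psi_sq    = np.array([1.0, 1.0, 2.0, 6.0])         # <Psi_k^2> (e.g., Hermite He_k: k!)

mean     = alpha[0]
var      = np.sum(alpha[1:]**2 * psi_sq[1:])
dmean_ds = dalpha_ds[0]
dvar_ds  = np.sum(2.0 * alpha[1:] * dalpha_ds[1:] * psi_sq[1:])
dstd_ds  = dvar_ds / (2.0 * np.sqrt(var))
print(mean, var, dmean_ds, dvar_ds, dstd_ds)
```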
Local sensitivity analysis: zeroth-order combined expansions
Alternatively, a stochastic expansion can be formed over both the probabilistic variables $\boldsymbol{\xi}$ and the nonprobabilistic variables $\mathbf{s}$. In this case, design sensitivities for the mean and variance do not require response sensitivity data, but this comes at the cost of forming the PCE over additional dimensions. For this combined variable expansion, the mean and variance are evaluated by performing the expectations over only the probabilistic expansion variables, which eliminates the polynomial dependence on $\boldsymbol{\xi}$ and leaves behind polynomials in $\mathbf{s}$ alone. The remaining polynomials may then be differentiated with respect to $\mathbf{s}$ to obtain the design sensitivities of the mean and variance. Similarly for stochastic collocation, performing the expectation over the probabilistic collocation dimensions and then differentiating with respect to $\mathbf{s}$ leads to the moment sensitivities, where the remaining polynomials not eliminated by the expectation over $\boldsymbol{\xi}$ are the interpolation polynomials in the nonprobabilistic dimensions.
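The sketch below illustrates the combined-expansion route on a toy two-variable expansion (one standard normal $\xi$, one design variable $s$, hypothetical coefficients): the expectation over $\xi$ leaves a polynomial in $s$, which is then differentiated directly.

```python
# Sketch of the combined-expansion route: form a PCE over (xi, s) jointly,
# take the expectation over xi analytically (only the j = 0 Hermite term
# survives in the mean), and differentiate the remaining polynomial in s.
# Coefficients and the choice of a Legendre basis in s are hypothetical.
import math
import numpy as np
from numpy.polynomial import legendre as Le

# c[j, k] multiplies He_j(xi) * P_k(s): xi ~ N(0,1), s in [-1, 1] (design).
c = np.array([[1.0, 0.5, 0.1],     # j = 0 row carries the mean's s-dependence
              [0.4, 0.2, 0.0],
              [0.1, 0.0, 0.0]])

s = 0.3
# Mean over xi: E[He_j] = 0 for j > 0, so mu(s) = sum_k c[0, k] P_k(s).
mu_of_s = Le.legval(s, c[0])
dmu_ds  = Le.legval(s, Le.legder(c[0]))              # differentiate remaining polynomial
# Variance over xi: sum_{j>0} <He_j^2> * (sum_k c[j, k] P_k(s))^2, with <He_j^2> = j!.
var_of_s = sum(math.factorial(j) * Le.legval(s, c[j])**2 for j in range(1, c.shape[0]))
dvar_ds  = sum(math.factorial(j) * 2 * Le.legval(s, c[j]) * Le.legval(s, Le.legder(c[j]))
               for j in range(1, c.shape[0]))
print(mu_of_s, dmu_ds, var_of_s, dvar_ds)
```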
Inputs and outputs
There are two types of nonprobabilistic variables for which
sensitivities must be calculated: “augmented,” where the
nonprobabilistic variables are separate from and augment the
probabilistic variables, and “inserted,” where the nonprobabilistic
variables define distribution parameters for the probabilistic
variables. Any inserted nonprobabilistic variable sensitivities must be
handled using the probabilistic expansion sensitivity relations above (Eqs. (281) and (285)), where $\frac{dR}{ds}$ is computed as $\frac{dR}{dx}\frac{dx}{ds}$ and $\frac{dx}{ds}$ is the Jacobian of the variable transformation $\mathbf{x} = T^{-1}(\mathbf{u})$ with respect to the inserted nonprobabilistic variables.
While moment sensitivities directly enable robust design optimization
and interval estimation formulations which seek to control or bound
response variance, control or bounding of reliability requires
sensitivities of tail statistics. In this work, sensitivities of simple moment-based approximations to cumulative distribution function (CDF) and complementary cumulative distribution function (CCDF) mappings (Eqs. eq:mv_ria – eq:mv_pma) are employed for this purpose, such that it is straightforward to form approximate design sensitivities of the reliability index $\beta$ (forward reliability mapping $\bar{z} \rightarrow \beta$) or the response level $z$ (inverse reliability mapping $\bar{\beta} \rightarrow z$) from the moment sensitivities and the prescribed levels.
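For example, under the mean-value mapping $\beta = (\mu - \bar{z})/\sigma$, the moment sensitivities translate into $\beta$ and $z$ sensitivities as in the following sketch (all moment values and derivatives are hypothetical placeholders).

```python
# Sketch: approximate design sensitivity of a moment-based (mean value)
# reliability index beta = (mu - zbar) / sigma.  Moment values and their
# design sensitivities below are hypothetical placeholders.
import numpy as np

zbar      = 0.0                      # prescribed response threshold
mu, sigma = 1.8, 0.6                 # response mean and standard deviation
dmu_dd    = np.array([0.50, -0.20])  # design sensitivities of the mean
dsig_dd   = np.array([0.05,  0.10])  # design sensitivities of the std deviation

beta     = (mu - zbar) / sigma
dbeta_dd = dmu_dd / sigma - (mu - zbar) * dsig_dd / sigma**2
# Inverse mapping: z(betabar) = mu - betabar * sigma, so
betabar  = 2.0
dz_dd    = dmu_dd - betabar * dsig_dd
print(beta, dbeta_dd, dz_dd)
```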
Optimization Formulations
Given the capability to compute analytic statistics of the response along with design sensitivities of these statistics, Dakota supports bi-level, sequential, and multifidelity approaches for optimization under uncertainty (OUU). The latter two approaches apply surrogate modeling approaches (data fits and multifidelity modeling) to the uncertainty analysis and then apply trust region model management to the optimization process.
Bi-level SEBDO
The simplest and most direct approach is to employ these analytic statistics and their design derivatives from Section SEBDO SSA directly within an optimization loop. This approach is known as bi-level OUU, since there is an inner level uncertainty analysis nested within an outer level optimization.
Consider the common reliability-based design example of a deterministic objective function with a reliability constraint:

\[
\begin{aligned}
\text{minimize } & f(\mathbf{d}) \\
\text{subject to } & \beta(\mathbf{d}) \ge \bar{\beta}
\end{aligned}
\]

where $\beta$ is computed relative to a prescribed threshold response value $\bar{z}$ (e.g., a failure threshold), is constrained by a prescribed reliability level $\bar{\beta}$ (minimum allowable reliability in the design), and may be a CDF or CCDF reliability index depending on the definition of the failure domain.

Another common example is robust design in which the constraint enforcing a reliability lower-bound has been replaced with a constraint enforcing a variance upper-bound $\bar{\sigma}^2$ (maximum allowable variance in the design):

\[
\begin{aligned}
\text{minimize } & f(\mathbf{d}) \\
\text{subject to } & \sigma^2(\mathbf{d}) \le \bar{\sigma}^2
\end{aligned}
\]

Solving these problems using a bi-level approach involves computing $\beta$ and $\nabla_{\mathbf{d}}\beta$ for the reliability-constrained formulation, or $\sigma^2$ and $\nabla_{\mathbf{d}}\sigma^2$ for the variance-constrained formulation, at every design point visited by the optimizer.
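A minimal sketch of the bi-level robust-design variant is shown below; the inner UQ is represented by a closed-form stand-in that returns the response variance and its analytic design gradient (all functions and numbers are hypothetical, not Dakota's API).

```python
# Bi-level SEBDO sketch for the robust-design variant: the optimizer consumes
# the response variance and its analytic design gradient from the inner UQ
# (here a closed-form stand-in for a PCE/SC analysis; numbers hypothetical).
import numpy as np
from scipy.optimize import minimize

VAR_BOUND = 0.5                       # maximum allowable response variance

def uq_variance(d):
    # Stand-in: variance of g = d0*X0 + d1*X1 with X0 ~ N(., 0.5^2), X1 ~ N(., 0.3^2).
    var  = 0.25 * d[0]**2 + 0.09 * d[1]**2
    dvar = np.array([0.5 * d[0], 0.18 * d[1]])      # analytic design gradient
    return var, dvar

res = minimize(lambda d: -(2*d[0] + d[1]),          # maximize performance (hypothetical)
               x0=np.array([1.0, 1.0]), jac=lambda d: np.array([-2.0, -1.0]),
               method="SLSQP",
               constraints={"type": "ineq",
                            "fun": lambda d: VAR_BOUND - uq_variance(d)[0],
                            "jac": lambda d: -uq_variance(d)[1]},
               bounds=[(0.0, 10.0)] * 2)
print(res.x, uq_variance(res.x)[0])
```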
Sequential/Surrogate-Based SEBDO
An alternative OUU approach is the sequential approach, in which additional efficiency is sought through breaking the nested relationship of the UQ and optimization loops. The general concept is to iterate between optimization and uncertainty quantification, updating the optimization goals based on the most recent uncertainty assessment results. This approach is common with the reliability methods community, for which the updating strategy may be based on safety factors [WSSC01] or other approximations [DC04].
A particularly effective approach for updating the optimization goals is to use data fit surrogate models, and in particular, local Taylor series models allow direct insertion of stochastic sensitivity analysis capabilities. In Ref. [EAP+07], first-order Taylor series approximations were explored, and in Ref. [EB06], second-order Taylor series approximations were investigated. In both cases, a trust-region model management framework [ED06] is used to adaptively manage the extent of the approximations and ensure convergence of the OUU process. Surrogate models are used for both the objective and the constraint functions, although surrogates are only required for the functions containing statistical results; deterministic functions may remain explicit if desired.
In particular, trust-region surrogate-based optimization for reliability-based design employs surrogate models of $f$ and $\beta$ within a trust region $\Delta^k$ centered at $\mathbf{d}_c$:

\[
\begin{aligned}
\text{minimize } & f(\mathbf{d}_c) + \nabla_{\mathbf{d}} f(\mathbf{d}_c)^T (\mathbf{d} - \mathbf{d}_c) \\
\text{subject to } & \beta(\mathbf{d}_c) + \nabla_{\mathbf{d}} \beta(\mathbf{d}_c)^T (\mathbf{d} - \mathbf{d}_c) \ge \bar{\beta} \\
& \lVert \mathbf{d} - \mathbf{d}_c \rVert_\infty \le \Delta^k
\end{aligned}
\]

and trust-region surrogate-based optimization for robust design employs surrogate models of $f$ and $\sigma^2$:

\[
\begin{aligned}
\text{minimize } & f(\mathbf{d}_c) + \nabla_{\mathbf{d}} f(\mathbf{d}_c)^T (\mathbf{d} - \mathbf{d}_c) \\
\text{subject to } & \sigma^2(\mathbf{d}_c) + \nabla_{\mathbf{d}} \sigma^2(\mathbf{d}_c)^T (\mathbf{d} - \mathbf{d}_c) \le \bar{\sigma}^2 \\
& \lVert \mathbf{d} - \mathbf{d}_c \rVert_\infty \le \Delta^k
\end{aligned}
\]

Second-order local surrogates may also be employed, where the Hessians are typically approximated from an accumulation of curvature information using quasi-Newton updates [NJ99] such as Broyden-Fletcher-Goldfarb-Shanno (BFGS, Eq. (75)) or symmetric rank one (SR1, Eq. eq:sr1). The sequential approach is available for probabilistic expansions using PCE and SC.
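For reference, the quasi-Newton updates mentioned above can be sketched as follows; the step and gradient-change vectors would come from successive design iterates and design gradients of the surrogated statistic (values here are hypothetical).

```python
# Sketch of the quasi-Newton curvature accumulation mentioned above: BFGS and
# SR1 updates of an approximate Hessian from successive gradient differences.
import numpy as np

def bfgs_update(B, s, y):
    """BFGS update of Hessian approximation B given the design step
    s = d_new - d_old and gradient change y = grad_new - grad_old."""
    Bs = B @ s
    if s @ y <= 1e-12:                 # skip update if curvature condition fails
        return B
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

def sr1_update(B, s, y, tol=1e-8):
    """Symmetric rank-one alternative; skipped when the denominator is tiny."""
    r = y - B @ s
    denom = r @ s
    if abs(denom) < tol * np.linalg.norm(r) * np.linalg.norm(s):
        return B
    return B + np.outer(r, r) / denom

B = np.eye(2)
B = bfgs_update(B, s=np.array([0.1, -0.05]), y=np.array([0.3, 0.02]))
print(B)
```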
Multifidelity SEBDO
The multifidelity OUU approach is another trust-region surrogate-based
approach. Instead of the surrogate UQ model being a simple data fit (in
particular, first-/second-order Taylor series model) of the truth UQ
model results, distinct UQ models of differing fidelity are now
employed. This differing UQ fidelity could stem from the fidelity of the
underlying simulation model, the fidelity of the UQ algorithm, or both.
In this section, we focus on the fidelity of the UQ algorithm. For
reliability methods, this could entail varying fidelity in approximating
assumptions (e.g., Mean Value for low fidelity, SORM for high fidelity),
and for stochastic expansion methods, it could involve differences in
selected levels of refinement for the stochastic expansion (e.g., expansion order or sparse grid level).
Here, we define UQ fidelity as point-wise accuracy in the design space
and take the high fidelity truth model to be the probabilistic expansion
PCE/SC model, with validity only at a single design point. The low
fidelity model, whose validity over the design space will be adaptively
controlled, will be either the combined expansion PCE/SC model, with
validity over a range of design parameters, or the MVFOSM reliability
method, with validity only at a single design point. The combined
expansion low fidelity approach will span the current trust region of
the design space and will be reconstructed for each new trust region.
Trust region adaptation will ensure that the combined expansion approach
remains sufficiently accurate for design purposes. By taking advantage
of the design space spanning, one can eliminate the cost of multiple low
fidelity UQ analyses within the trust region, with fallback to the
greater accuracy and higher expense of the probabilistic expansion
approach when needed. The MVFOSM low fidelity approximation must be
reformed for each change in design variables, but it only requires a
single evaluation of a response function and its derivative to
approximate the response mean and variance from the input mean and
covariance
(Eqs. eq:mv_mean1 – eq:mv_std_dev), from which forward/inverse CDF/CCDF reliability mappings can be generated using Eqs. eq:mv_ria – eq:mv_pma. This is
the least expensive UQ option, but its limited accuracy [1] may dictate
the use of small trust regions, resulting in greater iterations to
convergence. The expense of optimizing a combined expansion, on the
other hand, is not significantly less than that of optimizing the high
fidelity UQ model, but its representation of global trends should allow
the use of larger trust regions, resulting in reduced iterations to
convergence. The design derivatives of each of the PCE/SC expansion
models provide the necessary data to correct the low fidelity model to
first-order consistency with the high fidelity model at the center of
each trust region, ensuring convergence of the multifidelity
optimization process to the high fidelity optimum. Design derivatives of
the MVFOSM statistics are currently evaluated numerically using forward
finite differences.
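A sketch of the MVFOSM low-fidelity model and its finite-difference design derivatives is given below, using a hypothetical response function and input statistics.

```python
# Sketch of the MVFOSM low-fidelity model: one response evaluation plus its
# gradient give approximate mean/variance, and design derivatives of those
# statistics are obtained by forward finite differences (inputs hypothetical).
import numpy as np

def response(x, d):                       # hypothetical simulation response
    return d[0] * x[0]**2 + d[1] * x[1]

def grad_x_response(x, d):                # gradient w.r.t. uncertain variables
    return np.array([2 * d[0] * x[0], d[1]])

mu_x  = np.array([1.0, 2.0])              # input means
cov_x = np.diag([0.2**2, 0.5**2])         # input covariance

def mvfosm_stats(d):
    g    = grad_x_response(mu_x, d)
    mean = response(mu_x, d)              # mu_R ~= R(mu_x)
    var  = g @ cov_x @ g                  # sigma_R^2 ~= grad^T Sigma grad
    return np.array([mean, var])

def d_stats_dd(d, h=1e-6):                # forward finite differences in design space
    base = mvfosm_stats(d)
    return np.array([(mvfosm_stats(d + h * e) - base) / h
                     for e in np.eye(len(d))]).T

d = np.array([1.5, 0.8])
print(mvfosm_stats(d), d_stats_dd(d))
```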
Multifidelity optimization for reliability-based design can be formulated as:

\[
\begin{aligned}
\text{minimize } & f(\mathbf{d}) \\
\text{subject to } & \hat{\beta}_{hi}(\mathbf{d}) \ge \bar{\beta} \\
& \lVert \mathbf{d} - \mathbf{d}_c \rVert_\infty \le \Delta^k
\end{aligned}
\]

and multifidelity optimization for robust design can be formulated as:

\[
\begin{aligned}
\text{minimize } & f(\mathbf{d}) \\
\text{subject to } & \hat{\sigma}^2_{hi}(\mathbf{d}) \le \bar{\sigma}^2 \\
& \lVert \mathbf{d} - \mathbf{d}_c \rVert_\infty \le \Delta^k
\end{aligned}
\]

where the deterministic objective function is not approximated, $\hat{\beta}_{hi}$ and $\hat{\sigma}^2_{hi}$ are approximated high-fidelity statistics formed by applying corrections to the low-fidelity statistics, and where correction functions enforce value and gradient consistency with the high-fidelity statistics at the center of each trust region.
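The first-order consistency requirement can be sketched as an additive correction, as below; the low-fidelity model and all numerical values are hypothetical stand-ins.

```python
# Sketch of an additive first-order correction enforcing consistency between
# low- and high-fidelity statistics at the trust-region center d_c (values and
# the linear "models" below are hypothetical stand-ins).
import numpy as np

d_c = np.array([1.0, 2.0])

beta_hi_c, grad_hi_c = 2.10, np.array([0.40, -0.10])                 # truth UQ at center
beta_lo   = lambda d: 1.80 + np.array([0.25, -0.05]) @ (d - d_c)     # low-fidelity model
grad_lo_c = np.array([0.25, -0.05])

# alpha(d) matches value and gradient of (beta_hi - beta_lo) at d_c.
alpha       = lambda d: (beta_hi_c - beta_lo(d_c)) + (grad_hi_c - grad_lo_c) @ (d - d_c)
beta_hat_hi = lambda d: beta_lo(d) + alpha(d)          # corrected low-fidelity model

assert np.isclose(beta_hat_hi(d_c), beta_hi_c)         # consistency at the center
print(beta_hat_hi(np.array([1.2, 1.9])))
```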
Sampling-based OUU
Gradient-based OUU can also be performed using random sampling methods. In this case, the sample-average approximations to the design derivatives of the mean and standard deviation are:

\[
\frac{d\mu_Q}{d\mathbf{d}} \approx \frac{1}{N} \sum_{i=1}^{N} \frac{dQ_i}{d\mathbf{d}}, \qquad
\frac{d\sigma_Q}{d\mathbf{d}} \approx \frac{1}{(N-1)\,\sigma_Q} \sum_{i=1}^{N} (Q_i - \mu_Q)\, \frac{dQ_i}{d\mathbf{d}},
\]

where $Q_i$ and $\frac{dQ_i}{d\mathbf{d}}$ are the response value and its design gradient at the $i$-th sample.
This enables design sensitivities for the mean, standard deviation, or variance (based on the final_moments type), and for forward/inverse reliability index mappings ($\bar{z} \rightarrow \beta$, $\bar{\beta} \rightarrow z$) based on these moments.
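A sketch of these sample-average derivative estimates, using synthetically generated response values and per-sample design gradients, is shown below.

```python
# Sketch of the sample-average design derivatives of the mean and standard
# deviation, given per-sample responses q_i and per-sample design gradients
# dq_i/dd (both generated synthetically here for illustration).
import numpy as np

rng = np.random.default_rng(0)
N, n_d = 1000, 2
q     = rng.normal(3.0, 0.8, size=N)           # sampled response values
dq_dd = rng.normal(0.5, 0.1, size=(N, n_d))    # sampled response design gradients

mu    = q.mean()
sigma = q.std(ddof=1)

dmu_dd    = dq_dd.mean(axis=0)                          # (1/N) sum_i dq_i/dd
dsigma_dd = ((q - mu) @ dq_dd) / ((N - 1) * sigma)      # (1/((N-1) sigma)) sum_i (q_i - mu) dq_i/dd
dvar_dd   = 2.0 * sigma * dsigma_dd                     # chain rule for the variance
print(dmu_dd, dsigma_dd, dvar_dd)
```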