mutual_info
Calculate the mutual information between prior and posterior
Specification
Alias: None
Arguments: None
Child Keywords:
Required/Optional | Description of Group | Dakota Keyword | Dakota Keyword Description
---|---|---|---
Optional | | ksg2 | Use second Kraskov algorithm to compute mutual information
Description
The mutual information quantifies how much information two random variables
contain about each other; it is a measure of their mutual dependence. It is
non-negative, with zero indicating that the two random variables are
independent. For continuous random variables \(X\) and \(Y\) with joint density
\(p(x,y)\) and marginal densities \(p(x)\) and \(p(y)\), the mutual information is

\[
I(X; Y) = \int \int p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)} \, dx \, dy .
\]
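The same definition applies to discrete random variables with the integral replaced by a sum. As an illustrative sketch (standalone Python, not part of Dakota), the mutual information of a small discrete joint distribution can be computed directly from the definition:

```python
import numpy as np

def mutual_information(p_xy):
    """Mutual information (in nats) of a discrete joint distribution.

    p_xy: 2-D array of joint probabilities summing to 1.
    """
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    mask = p_xy > 0                         # skip zero cells to avoid log(0)
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x * p_y)[mask])))

# Perfectly dependent variables: X == Y, each uniform on {0, 1}
dependent = np.array([[0.5, 0.0],
                      [0.0, 0.5]])
# Independent variables: the joint factors into uniform marginals
independent = np.full((2, 2), 0.25)

print(mutual_information(dependent))    # ln(2), approximately 0.6931
print(mutual_information(independent))  # 0.0
```

Complete dependence yields one bit (ln 2 nats) of shared information, while the factored joint yields exactly zero, matching the independence property stated above.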
The mutual information can also be interpreted as the reduction in
uncertainty of one random variable due to knowledge of another. Specifying
mutual_info
calculates the mutual information between the posterior
parameters and the prior parameters.
The mutual information is calculated with a k-nearest-neighbor estimator; the
optional ksg2
keyword selects the second Kraskov algorithm. Further details can be found in
the theory section of the User's Guide.
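To make the k-nearest-neighbor approach concrete, the following is a minimal standalone sketch of the first Kraskov-Stögbauer-Grassberger (KSG) estimator, the variant that ksg2 modifies; it assumes nothing about Dakota's internal implementation and handles only 1-D variables:

```python
import numpy as np
from scipy.special import digamma
from scipy.spatial import cKDTree

def ksg1_mutual_information(x, y, k=6):
    """First KSG estimator of I(X;Y) in nats, for 1-D samples x and y."""
    n = len(x)
    xy = np.column_stack([x, y])
    # Distance to the k-th nearest neighbor in the joint space (max norm);
    # k + 1 because the query point itself is returned as its own neighbor.
    tree = cKDTree(xy)
    eps = tree.query(xy, k=k + 1, p=np.inf)[0][:, -1]
    # Count points strictly within eps in each marginal space (excluding self)
    nx = np.array([np.sum(np.abs(x - x[i]) < eps[i]) - 1 for i in range(n)])
    ny = np.array([np.sum(np.abs(y - y[i]) < eps[i]) - 1 for i in range(n)])
    # KSG formula: psi(k) + psi(N) - <psi(n_x + 1) + psi(n_y + 1)>
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = 0.8 * x + 0.6 * rng.normal(size=2000)   # correlated Gaussians, rho = 0.8
# Analytic value for bivariate Gaussians: -0.5 * ln(1 - rho^2) ~ 0.511 nats
print(ksg1_mutual_information(x, y))
```

The estimator converges to the analytic Gaussian value as the sample size grows; the choice of k trades bias against variance.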
Expected Output
If mutual_info
is specified, the calculated value will be reported
to the screen at the end of the calibration.
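For context, a hypothetical Dakota input fragment activating this keyword might look like the following; the placement under posterior_stats within a Bayesian calibration method follows the reference manual, but the surrounding method options are illustrative and should be checked against the manual for your Dakota version:

```
method
  bayes_calibration queso
    chain_samples = 5000
    posterior_stats mutual_info
      ksg2
```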
Additional Discussion
Due to the necessary approximation of the multidimensional integral above, a
negative mutual information may be reported for applications whose true value
is close to or equal to zero.
As of Dakota 6.6, mutual information calculations are primarily used in the
implementation of the experimental_design
algorithm.