ksg2

Use the second Kraskov algorithm to compute mutual information

Specification

  • Alias: None

  • Arguments: None

Description

This algorithm is derived in [KStogbauerG04]. The mutual information between m random variables is approximated by

I^{(2)}(X_1, X_2, \ldots, X_m) = \psi(k) + (m-1)\,\psi(N) - \frac{m-1}{k} - \left\langle \psi(n_{x_1}) + \psi(n_{x_2}) + \cdots + \psi(n_{x_m}) \right\rangle,

where ψ is the digamma function, k is the number of nearest neighbors being used, and N is the number of samples available from the joint distribution of the random variables. For each point z_i = (x_{1,i}, x_{2,i}, …, x_{m,i}) in the joint distribution, z_i and its k nearest neighbors are projected into each marginal subspace. For each subspace j = 1, …, m, ϵ_{j,i} is defined as the radius of the ℓ∞-ball containing all k+1 projected points. Then n_{x_j,i} is the number of points in the j-th subspace within a distance of ϵ_{j,i} from the point x_{j,i}. The angular brackets denote that the average of ψ(n_{x_j,i}) is taken over all points i = 1, …, N.
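
The estimator described above can be sketched in a few lines of Python. This is a minimal, brute-force illustration of the KSG2 formula for m = 2 variables (using the integer identity ψ(n) = H_{n−1} − γ for the digamma function), not Dakota's implementation; the function names and interface here are hypothetical.

```python
import numpy as np

EULER_GAMMA = 0.5772156649015329

def digamma_int(n):
    # Digamma at a positive integer: psi(n) = H_{n-1} - Euler-Mascheroni gamma.
    return -EULER_GAMMA + np.sum(1.0 / np.arange(1, int(n)))

def ksg2_mutual_info(X, Y, k=3):
    # Second Kraskov (KSG2) estimate of I(X;Y) for m = 2 marginal blocks.
    # X, Y: (N, d) sample arrays; k: number of nearest neighbors.
    N, m = len(X), 2
    Z = np.hstack([X, Y])
    # l-infinity distances between all pairs of points in the joint space.
    d_joint = np.max(np.abs(Z[:, None, :] - Z[None, :, :]), axis=-1)
    avg = 0.0
    for i in range(N):
        nn = np.argsort(d_joint[i])[1:k + 1]  # k nearest neighbors of z_i
        # eps_{j,i}: radius of the l-infinity ball in each marginal subspace
        # containing the projections of z_i and its k neighbors.
        eps_x = np.max(np.abs(X[nn] - X[i]))
        eps_y = np.max(np.abs(Y[nn] - Y[i]))
        # n_{x_j,i}: points within eps_{j,i} of the projection (excluding i).
        n_x = np.sum(np.max(np.abs(X - X[i]), axis=-1) <= eps_x) - 1
        n_y = np.sum(np.max(np.abs(Y - Y[i]), axis=-1) <= eps_y) - 1
        avg += digamma_int(n_x) + digamma_int(n_y)
    return (digamma_int(k) + (m - 1) * digamma_int(N)
            - (m - 1) / k - avg / N)
```

For jointly Gaussian samples with correlation ρ, the exact mutual information is −½ ln(1 − ρ²), which such an estimate should approach as N grows.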

Examples

method
  bayes_calibration queso
    dram
    seed = 34785
    chain_samples = 1000
    posterior_stats mutual_info
      ksg2

method
  bayes_calibration queso
    dram
    chain_samples = 1000
    seed = 348
    experimental_design
      initial_samples = 5
      num_candidates = 10
      max_hifi_evaluations = 3
      ksg2