1-D kernel density-based estimation of Kullback-Leibler divergence
Source: R/kld-estimation-kernel-density.R
kld_est_kde1.Rd
This estimation method approximates the densities of the unknown distributions \(P\) and \(Q\) by kernel density estimates, using the function 'density' from package 'stats'. Only the two-sample problem, not the one-sample problem, is implemented.
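The idea can be illustrated with a short sketch (not the package's actual implementation; the function name and grid/bandwidth choices below are illustrative assumptions): both densities are estimated with stats::density on a common grid, and the integral \(\int p \log(p/q)\,dx\) is approximated with the trapezoidal rule.

# Illustrative sketch only, assuming a shared evaluation grid and default bandwidths
kld_kde1_sketch <- function(X, Y, n_grid = 512) {
  rng <- range(c(X, Y))
  dP <- stats::density(X, from = rng[1], to = rng[2], n = n_grid)  # estimate of p
  dQ <- stats::density(Y, from = rng[1], to = rng[2], n = n_grid)  # estimate of q
  p <- dP$y
  q <- dQ$y
  # integrand p * log(p/q), guarding against zero or underflowing density values
  h <- ifelse(p > 0, p * log(p / pmax(q, .Machine$double.eps)), 0)
  # trapezoidal rule on the regular grid dP$x
  sum((h[-1] + h[-n_grid]) / 2 * diff(dP$x))
}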
Arguments
- X, Y
Numeric vectors or single-column matrices, representing samples from the true distribution \(P\) and the approximate distribution \(Q\), respectively.
- MC
A boolean: use a Monte Carlo approximation instead of numerical integration via the trapezoidal rule (default: FALSE)? A sketch of this variant follows the argument list.
- ...
Further parameters passed on to stats::density (e.g., argument bw).
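For MC = TRUE, a possible Monte Carlo variant (again only a hypothetical sketch; the interpolation step is an assumption, not the package's documented behaviour) averages the estimated log density ratio over the sample from \(P\):

kld_kde1_mc_sketch <- function(X, Y) {
  rng <- range(c(X, Y))
  dP <- stats::density(X, from = rng[1], to = rng[2])
  dQ <- stats::density(Y, from = rng[1], to = rng[2])
  # evaluate both density estimates at the sample points X by linear interpolation
  pX <- stats::approx(dP$x, dP$y, xout = X)$y
  qX <- stats::approx(dQ$x, dQ$y, xout = X)$y
  # Monte Carlo estimate of E_P[log(p/q)]
  mean(log(pX / pmax(qX, .Machine$double.eps)))
}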
Examples
# KL-D between two samples from 1D Gaussians:
set.seed(0)
X <- rnorm(100)
Y <- rnorm(100, mean = 1, sd = 2)
# analytical KL divergence between the generating Gaussians:
kld_gaussian(mu1 = 0, sigma1 = 1, mu2 = 1, sigma2 = 2^2)
#> [1] 0.4431472
# kernel density-based estimates: trapezoidal rule (default) and Monte Carlo
kld_est_kde1(X, Y)
#> [1] 0.3773503
kld_est_kde1(X, Y, MC = TRUE)
#> [1] 0.4347723