Performance

All benchmarks were run on an Arch Linux x86_64 machine with an Intel(R) Core(TM) i7 CPU (1.90 GHz).

Empirical likelihood computation

We illustrate the performance of evaluating empirical likelihood with el_mean(). We measure computation speed with simulated data sets in two settings: 1) the number of observations increases with the number of parameters fixed, and 2) the number of parameters increases with the number of observations fixed.

Increasing the number of observations

We fix the number of parameters at p = 10 and simulate the parameter value and the n × p matrices using rnorm(). To ensure convergence with large n, we set a large threshold value with el_control().

library(ggplot2)
library(melt)
library(microbenchmark)
set.seed(3175775)
p <- 10
par <- rnorm(p, sd = 0.1)
ctrl <- el_control(th = 1e+10)
result <- microbenchmark(
  n1e2 = el_mean(matrix(rnorm(100 * p), ncol = p), par = par, control = ctrl),
  n1e3 = el_mean(matrix(rnorm(1000 * p), ncol = p), par = par, control = ctrl),
  n1e4 = el_mean(matrix(rnorm(10000 * p), ncol = p), par = par, control = ctrl),
  n1e5 = el_mean(matrix(rnorm(100000 * p), ncol = p), par = par, control = ctrl)
)

Below are the results:

result
#> Unit: microseconds
#>  expr        min         lq        mean     median         uq        max neval cld
#>  n1e2    432.377    464.752    499.0994    480.095    527.269    807.306   100 a
#>  n1e3   1175.823   1344.602   1519.6058   1434.640   1551.794   5391.225   100 a
#>  n1e4  10460.709  12636.301  14433.3757  14804.760  15869.997  19244.580   100  b
#>  n1e5 159966.649 184297.491 220401.4271 214413.050 248981.007 342243.547   100   c
autoplot(result)

Increasing the number of parameters

This time, we fix the number of observations at n = 1000 and evaluate empirical likelihood at zero vectors of increasing dimension.

n <- 1000
result2 <- microbenchmark(
  p5 = el_mean(matrix(rnorm(n * 5), ncol = 5),
    par = rep(0, 5),
    control = ctrl
  ),
  p25 = el_mean(matrix(rnorm(n * 25), ncol = 25),
    par = rep(0, 25),
    control = ctrl
  ),
  p100 = el_mean(matrix(rnorm(n * 100), ncol = 100),
    par = rep(0, 100),
    control = ctrl
  ),
  p400 = el_mean(matrix(rnorm(n * 400), ncol = 400),
    par = rep(0, 400),
    control = ctrl
  )
)
result2
#> Unit: microseconds
#>  expr        min         lq        mean      median         uq        max neval cld
#>    p5    707.890    740.140    807.9447    771.6395    815.311   3598.171   100 a
#>   p25   2701.378   2754.948   2986.4114   2783.7660   2850.315   9651.599   100 a
#>  p100  21054.605  23589.879  25708.7416  24000.3195  28588.897  44322.719   100  b
#>  p400 238951.282 261733.630 295717.7222 282131.8765 311443.441 467720.946   100   c
autoplot(result2)

On average, evaluating empirical likelihood with a 100000×10 or 1000×400 matrix at a parameter value satisfying the convex hull constraint takes less than a second.
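
As a rough check of this claim, the mean timings can be extracted from the benchmark objects and converted to seconds with summary() from microbenchmark (a sketch, assuming the result and result2 objects from the runs above are available):

```r
# Summarize both benchmarks in seconds. summary() returns a data frame
# with columns expr, min, lq, mean, median, uq, max, and neval.
s1 <- summary(result, unit = "s")
s2 <- summary(result2, unit = "s")

# Mean time for the largest case in each setting, in seconds
s1[s1$expr == "n1e5", "mean"]  # 100000 x 10 matrix
s2[s2$expr == "p400", "mean"]  # 1000 x 400 matrix
```

Both means are well below one second on this machine (roughly 0.22 s and 0.30 s from the tables above).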