R/posterior_samples.R
posterior_samples.Rd
Extract posterior samples from a rater fit object
posterior_samples(fit, pars = c("pi", "theta"))
fit: A rater fit object.
pars: A character vector of parameter names to return. By default,
c("pi", "theta").
A named list of the posterior samples for each parameter. For each
parameter, the samples are in the form returned by rstan::extract().
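For example, the shape of the returned draws can be inspected directly. This is a sketch using the fit object created in the examples below; the dimensions depend on the data passed to rater().
smps <- posterior_samples(fit)
names(smps)      # "pi" "theta"
dim(smps$pi)     # draws x K (number of latent categories)
dim(smps$theta)  # draws x J x K x K (raters x true category x rated category)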
Posterior samples can only be returned for models fit using MCMC, not optimisation. In addition, posterior samples cannot be returned for the latent class due to the marginalisation technique used internally.
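For illustration, extracting samples from an optimisation fit should fail. This is a sketch assuming rater() accepts method = "optim"; see the rater() documentation for details.
# Sketch: posterior_samples() is expected to error for an optimisation fit.
opt_fit <- rater(anesthesia, "dawid_skene", method = "optim")
try(posterior_samples(opt_fit))  # should signal an error: MCMC draws are required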
For the class conditional model, the 'full' theta parameterisation (i.e. appearing to have the same number of parameters as the standard Dawid-Skene model) is calculated and returned. This is designed to allow easier comparison with the full Dawid-Skene model.
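A sketch of the comparison this enables; it assumes the model can be specified with the string "class_conditional_dawid_skene" (check the package documentation for the exact model names).
# Sketch: the class conditional fit returns theta with the same draws x J x K x K
# shape as the full Dawid-Skene model, so the arrays can be compared directly.
ccds_fit <- rater(anesthesia, "class_conditional_dawid_skene")
ccds_theta <- posterior_samples(ccds_fit, pars = "theta")$theta
dim(ccds_theta)  # should match the dimensions of theta from the full model fit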
# \donttest{
fit <- rater(anesthesia, "dawid_skene")
#>
#> SAMPLING FOR MODEL 'dawid_skene' NOW (CHAIN 1).
#> Chain 1:
#> Chain 1: Gradient evaluation took 0.000198 seconds
#> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 1.98 seconds.
#> Chain 1: Adjust your expectations accordingly!
#> Chain 1:
#> Chain 1:
#> Chain 1: Iteration: 1 / 2000 [ 0%] (Warmup)
#> Chain 1: Iteration: 200 / 2000 [ 10%] (Warmup)
#> Chain 1: Iteration: 400 / 2000 [ 20%] (Warmup)
#> Chain 1: Iteration: 600 / 2000 [ 30%] (Warmup)
#> Chain 1: Iteration: 800 / 2000 [ 40%] (Warmup)
#> Chain 1: Iteration: 1000 / 2000 [ 50%] (Warmup)
#> Chain 1: Iteration: 1001 / 2000 [ 50%] (Sampling)
#> Chain 1: Iteration: 1200 / 2000 [ 60%] (Sampling)
#> Chain 1: Iteration: 1400 / 2000 [ 70%] (Sampling)
#> Chain 1: Iteration: 1600 / 2000 [ 80%] (Sampling)
#> Chain 1: Iteration: 1800 / 2000 [ 90%] (Sampling)
#> Chain 1: Iteration: 2000 / 2000 [100%] (Sampling)
#> Chain 1:
#> Chain 1: Elapsed Time: 2.555 seconds (Warm-up)
#> Chain 1: 2.623 seconds (Sampling)
#> Chain 1: 5.178 seconds (Total)
#> Chain 1:
#>
#> SAMPLING FOR MODEL 'dawid_skene' NOW (CHAIN 2).
#> Chain 2:
#> Chain 2: Gradient evaluation took 0.000183 seconds
#> Chain 2: 1000 transitions using 10 leapfrog steps per transition would take 1.83 seconds.
#> Chain 2: Adjust your expectations accordingly!
#> Chain 2:
#> Chain 2:
#> Chain 2: Iteration: 1 / 2000 [ 0%] (Warmup)
#> Chain 2: Iteration: 200 / 2000 [ 10%] (Warmup)
#> Chain 2: Iteration: 400 / 2000 [ 20%] (Warmup)
#> Chain 2: Iteration: 600 / 2000 [ 30%] (Warmup)
#> Chain 2: Iteration: 800 / 2000 [ 40%] (Warmup)
#> Chain 2: Iteration: 1000 / 2000 [ 50%] (Warmup)
#> Chain 2: Iteration: 1001 / 2000 [ 50%] (Sampling)
#> Chain 2: Iteration: 1200 / 2000 [ 60%] (Sampling)
#> Chain 2: Iteration: 1400 / 2000 [ 70%] (Sampling)
#> Chain 2: Iteration: 1600 / 2000 [ 80%] (Sampling)
#> Chain 2: Iteration: 1800 / 2000 [ 90%] (Sampling)
#> Chain 2: Iteration: 2000 / 2000 [100%] (Sampling)
#> Chain 2:
#> Chain 2: Elapsed Time: 2.525 seconds (Warm-up)
#> Chain 2: 2.658 seconds (Sampling)
#> Chain 2: 5.183 seconds (Total)
#> Chain 2:
#>
#> SAMPLING FOR MODEL 'dawid_skene' NOW (CHAIN 3).
#> Chain 3:
#> Chain 3: Gradient evaluation took 0.000179 seconds
#> Chain 3: 1000 transitions using 10 leapfrog steps per transition would take 1.79 seconds.
#> Chain 3: Adjust your expectations accordingly!
#> Chain 3:
#> Chain 3:
#> Chain 3: Iteration: 1 / 2000 [ 0%] (Warmup)
#> Chain 3: Iteration: 200 / 2000 [ 10%] (Warmup)
#> Chain 3: Iteration: 400 / 2000 [ 20%] (Warmup)
#> Chain 3: Iteration: 600 / 2000 [ 30%] (Warmup)
#> Chain 3: Iteration: 800 / 2000 [ 40%] (Warmup)
#> Chain 3: Iteration: 1000 / 2000 [ 50%] (Warmup)
#> Chain 3: Iteration: 1001 / 2000 [ 50%] (Sampling)
#> Chain 3: Iteration: 1200 / 2000 [ 60%] (Sampling)
#> Chain 3: Iteration: 1400 / 2000 [ 70%] (Sampling)
#> Chain 3: Iteration: 1600 / 2000 [ 80%] (Sampling)
#> Chain 3: Iteration: 1800 / 2000 [ 90%] (Sampling)
#> Chain 3: Iteration: 2000 / 2000 [100%] (Sampling)
#> Chain 3:
#> Chain 3: Elapsed Time: 2.454 seconds (Warm-up)
#> Chain 3: 2.742 seconds (Sampling)
#> Chain 3: 5.196 seconds (Total)
#> Chain 3:
#>
#> SAMPLING FOR MODEL 'dawid_skene' NOW (CHAIN 4).
#> Chain 4:
#> Chain 4: Gradient evaluation took 0.000181 seconds
#> Chain 4: 1000 transitions using 10 leapfrog steps per transition would take 1.81 seconds.
#> Chain 4: Adjust your expectations accordingly!
#> Chain 4:
#> Chain 4:
#> Chain 4: Iteration: 1 / 2000 [ 0%] (Warmup)
#> Chain 4: Iteration: 200 / 2000 [ 10%] (Warmup)
#> Chain 4: Iteration: 400 / 2000 [ 20%] (Warmup)
#> Chain 4: Iteration: 600 / 2000 [ 30%] (Warmup)
#> Chain 4: Iteration: 800 / 2000 [ 40%] (Warmup)
#> Chain 4: Iteration: 1000 / 2000 [ 50%] (Warmup)
#> Chain 4: Iteration: 1001 / 2000 [ 50%] (Sampling)
#> Chain 4: Iteration: 1200 / 2000 [ 60%] (Sampling)
#> Chain 4: Iteration: 1400 / 2000 [ 70%] (Sampling)
#> Chain 4: Iteration: 1600 / 2000 [ 80%] (Sampling)
#> Chain 4: Iteration: 1800 / 2000 [ 90%] (Sampling)
#> Chain 4: Iteration: 2000 / 2000 [100%] (Sampling)
#> Chain 4:
#> Chain 4: Elapsed Time: 2.481 seconds (Warm-up)
#> Chain 4: 2.717 seconds (Sampling)
#> Chain 4: 5.198 seconds (Total)
#> Chain 4:
samples <- posterior_samples(fit)
# Look at first 6 samples for each of the pi parameters
head(samples$pi)
#>
#> iterations [,1] [,2] [,3] [,4]
#> [1,] 0.3995005 0.3874546 0.13791766 0.07512725
#> [2,] 0.3931214 0.3851786 0.08253312 0.13916692
#> [3,] 0.4636171 0.3602178 0.15750993 0.01865515
#> [4,] 0.4447642 0.3757060 0.11184783 0.06768193
#> [5,] 0.2541869 0.4726756 0.21493228 0.05820518
#> [6,] 0.2701043 0.4984787 0.16514536 0.06627164
# Look at the first 6 samples for the theta[1, 1, 1] parameter
head(samples$theta[, 1, 1, 1])
#> [1] 0.8722952 0.8664702 0.8717628 0.9103630 0.8823822 0.8932618
# Only get the samples for the pi parameter:
pi_samples <- posterior_samples(fit, pars = "pi")
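# A further sketch: because the draws are plain arrays, they can be
# summarised with base R. Posterior means and 90% credible intervals for pi;
# exact values vary by run.
colMeans(pi_samples$pi)
apply(pi_samples$pi, 2, quantile, probs = c(0.05, 0.95))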
# }