class_probabilities: Extract latent class probabilities from a rater fit object

Source: R/point_estimate.R
Usage

class_probabilities(fit, ...)

# S3 method for mcmc_fit
class_probabilities(fit, ...)

# S3 method for optim_fit
class_probabilities(fit, ...)
Arguments

fit: A rater fit object.

...: Extra arguments.
Value

An I * K matrix where element (i, k) is the probability that item i is of class k, where I is the number of items and K is the number of classes. Each row sums to 1.
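A hypothetical usage sketch: given a class-probability matrix of the shape described above, the most probable class for each item can be read off row-wise. The matrix below is made up for illustration, not taken from a real fit.

```r
# Made-up I x K class-probability matrix (2 items, 4 classes);
# each row sums to 1, as returned by class_probabilities().
p <- rbind(
  c(0.97, 0.01, 0.01, 0.01),
  c(0.02, 0.90, 0.05, 0.03)
)

# Most probable class per item: the column index of the row maximum.
most_probable <- apply(p, 1, which.max)
most_probable  # 1 2
```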
Details

The latent class probabilities are obtained by marginalising out the latent class and then calculating, for each draw of pi and theta, the conditional probability of the latent class given the other parameters and the data. Averaging these conditional probabilities over the draws gives the (unconditional) latent class probabilities returned by this function.
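The averaging step above can be sketched as follows. This is a simplified illustration of the idea, not the package's internal code: it uses fake posterior draws, a single item, two classes, and one shared confusion matrix per draw, rather than the full per-rater Dawid-Skene parameterisation.

```r
# Sketch: marginal class probabilities as an average over posterior draws,
# assuming fake draws of pi (class prevalence) and a simplified theta.
set.seed(42)

K <- 2          # number of latent classes
n_draws <- 100  # number of posterior draws

# Ratings given to one item by three raters (values in 1..K).
ratings <- c(1, 1, 2)

# Fake draws of pi: an n_draws x K matrix of class prevalences.
pi_draws <- matrix(rbeta(n_draws, 5, 5), ncol = 1)
pi_draws <- cbind(pi_draws, 1 - pi_draws)

# Simplification: one confusion matrix per draw, shared by all raters,
# with diagonal (accuracy) drawn with mean 0.8.
acc <- rbeta(n_draws, 8, 2)

cond_prob <- matrix(NA_real_, n_draws, K)
for (d in seq_len(n_draws)) {
  theta <- matrix((1 - acc[d]) / (K - 1), K, K)
  diag(theta) <- acc[d]
  # Unnormalised p(z = k | pi, theta, data) = pi_k * prod_j theta[k, rating_j]
  unnorm <- vapply(seq_len(K), function(k) {
    pi_draws[d, k] * prod(theta[k, ratings])
  }, numeric(1))
  cond_prob[d, ] <- unnorm / sum(unnorm)
}

# Averaging the per-draw conditional probabilities gives the marginal
# (unconditional) class probabilities for this item.
marginal <- colMeans(cond_prob)
marginal  # a length-K probability vector summing to 1
```

Because two of the three ratings favour class 1, the averaged probabilities lean toward class 1; with the full model, this computation runs over every item and every posterior draw of pi and theta.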
Examples

# \donttest{
fit <- rater(anesthesia, "dawid_skene")
#>
#> SAMPLING FOR MODEL 'dawid_skene' NOW (CHAIN 1).
#> Chain 1:
#> Chain 1: Gradient evaluation took 0.0002 seconds
#> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 2 seconds.
#> Chain 1: Adjust your expectations accordingly!
#> Chain 1:
#> Chain 1:
#> Chain 1: Iteration: 1 / 2000 [ 0%] (Warmup)
#> Chain 1: Iteration: 200 / 2000 [ 10%] (Warmup)
#> Chain 1: Iteration: 400 / 2000 [ 20%] (Warmup)
#> Chain 1: Iteration: 600 / 2000 [ 30%] (Warmup)
#> Chain 1: Iteration: 800 / 2000 [ 40%] (Warmup)
#> Chain 1: Iteration: 1000 / 2000 [ 50%] (Warmup)
#> Chain 1: Iteration: 1001 / 2000 [ 50%] (Sampling)
#> Chain 1: Iteration: 1200 / 2000 [ 60%] (Sampling)
#> Chain 1: Iteration: 1400 / 2000 [ 70%] (Sampling)
#> Chain 1: Iteration: 1600 / 2000 [ 80%] (Sampling)
#> Chain 1: Iteration: 1800 / 2000 [ 90%] (Sampling)
#> Chain 1: Iteration: 2000 / 2000 [100%] (Sampling)
#> Chain 1:
#> Chain 1: Elapsed Time: 2.745 seconds (Warm-up)
#> Chain 1: 2.913 seconds (Sampling)
#> Chain 1: 5.658 seconds (Total)
#> Chain 1:
#>
#> SAMPLING FOR MODEL 'dawid_skene' NOW (CHAIN 2).
#> Chain 2:
#> Chain 2: Gradient evaluation took 0.000177 seconds
#> Chain 2: 1000 transitions using 10 leapfrog steps per transition would take 1.77 seconds.
#> Chain 2: Adjust your expectations accordingly!
#> Chain 2:
#> Chain 2:
#> Chain 2: Iteration: 1 / 2000 [ 0%] (Warmup)
#> Chain 2: Iteration: 200 / 2000 [ 10%] (Warmup)
#> Chain 2: Iteration: 400 / 2000 [ 20%] (Warmup)
#> Chain 2: Iteration: 600 / 2000 [ 30%] (Warmup)
#> Chain 2: Iteration: 800 / 2000 [ 40%] (Warmup)
#> Chain 2: Iteration: 1000 / 2000 [ 50%] (Warmup)
#> Chain 2: Iteration: 1001 / 2000 [ 50%] (Sampling)
#> Chain 2: Iteration: 1200 / 2000 [ 60%] (Sampling)
#> Chain 2: Iteration: 1400 / 2000 [ 70%] (Sampling)
#> Chain 2: Iteration: 1600 / 2000 [ 80%] (Sampling)
#> Chain 2: Iteration: 1800 / 2000 [ 90%] (Sampling)
#> Chain 2: Iteration: 2000 / 2000 [100%] (Sampling)
#> Chain 2:
#> Chain 2: Elapsed Time: 2.589 seconds (Warm-up)
#> Chain 2: 2.458 seconds (Sampling)
#> Chain 2: 5.047 seconds (Total)
#> Chain 2:
#>
#> SAMPLING FOR MODEL 'dawid_skene' NOW (CHAIN 3).
#> Chain 3:
#> Chain 3: Gradient evaluation took 0.000179 seconds
#> Chain 3: 1000 transitions using 10 leapfrog steps per transition would take 1.79 seconds.
#> Chain 3: Adjust your expectations accordingly!
#> Chain 3:
#> Chain 3:
#> Chain 3: Iteration: 1 / 2000 [ 0%] (Warmup)
#> Chain 3: Iteration: 200 / 2000 [ 10%] (Warmup)
#> Chain 3: Iteration: 400 / 2000 [ 20%] (Warmup)
#> Chain 3: Iteration: 600 / 2000 [ 30%] (Warmup)
#> Chain 3: Iteration: 800 / 2000 [ 40%] (Warmup)
#> Chain 3: Iteration: 1000 / 2000 [ 50%] (Warmup)
#> Chain 3: Iteration: 1001 / 2000 [ 50%] (Sampling)
#> Chain 3: Iteration: 1200 / 2000 [ 60%] (Sampling)
#> Chain 3: Iteration: 1400 / 2000 [ 70%] (Sampling)
#> Chain 3: Iteration: 1600 / 2000 [ 80%] (Sampling)
#> Chain 3: Iteration: 1800 / 2000 [ 90%] (Sampling)
#> Chain 3: Iteration: 2000 / 2000 [100%] (Sampling)
#> Chain 3:
#> Chain 3: Elapsed Time: 2.61 seconds (Warm-up)
#> Chain 3: 2.862 seconds (Sampling)
#> Chain 3: 5.472 seconds (Total)
#> Chain 3:
#>
#> SAMPLING FOR MODEL 'dawid_skene' NOW (CHAIN 4).
#> Chain 4:
#> Chain 4: Gradient evaluation took 0.000185 seconds
#> Chain 4: 1000 transitions using 10 leapfrog steps per transition would take 1.85 seconds.
#> Chain 4: Adjust your expectations accordingly!
#> Chain 4:
#> Chain 4:
#> Chain 4: Iteration: 1 / 2000 [ 0%] (Warmup)
#> Chain 4: Iteration: 200 / 2000 [ 10%] (Warmup)
#> Chain 4: Iteration: 400 / 2000 [ 20%] (Warmup)
#> Chain 4: Iteration: 600 / 2000 [ 30%] (Warmup)
#> Chain 4: Iteration: 800 / 2000 [ 40%] (Warmup)
#> Chain 4: Iteration: 1000 / 2000 [ 50%] (Warmup)
#> Chain 4: Iteration: 1001 / 2000 [ 50%] (Sampling)
#> Chain 4: Iteration: 1200 / 2000 [ 60%] (Sampling)
#> Chain 4: Iteration: 1400 / 2000 [ 70%] (Sampling)
#> Chain 4: Iteration: 1600 / 2000 [ 80%] (Sampling)
#> Chain 4: Iteration: 1800 / 2000 [ 90%] (Sampling)
#> Chain 4: Iteration: 2000 / 2000 [100%] (Sampling)
#> Chain 4:
#> Chain 4: Elapsed Time: 2.523 seconds (Warm-up)
#> Chain 4: 2.46 seconds (Sampling)
#> Chain 4: 4.983 seconds (Total)
#> Chain 4:
class_probabilities(fit)
#>
#> [,1] [,2] [,3] [,4]
#> 1 9.999993e-01 1.275314e-07 2.503712e-08 5.610626e-07
#> 2 9.109086e-08 2.511349e-05 9.724752e-01 2.749962e-02
#> 3 3.899988e-01 6.092180e-01 1.231041e-04 6.601339e-04
#> 4 5.421173e-03 9.940366e-01 3.871833e-04 1.550763e-04
#> 5 2.195843e-07 9.999694e-01 2.823555e-05 2.137849e-06
#> 6 1.414481e-06 9.994701e-01 5.025837e-04 2.588354e-05
#> 7 9.994963e-01 4.937372e-04 9.798016e-07 8.984391e-06
#> 8 1.118295e-08 2.771170e-05 9.995261e-01 4.461804e-04
#> 9 1.589944e-06 9.999397e-01 5.324895e-05 5.450668e-06
#> 10 2.676622e-06 9.977420e-01 2.218047e-03 3.725791e-05
#> 11 1.274383e-08 1.687643e-08 1.434031e-04 9.998566e-01
#> 12 2.930217e-05 8.852578e-01 1.073745e-01 7.338393e-03
#> 13 9.999993e-01 1.275314e-07 2.503712e-08 5.610626e-07
#> 14 3.651652e-05 9.991932e-01 7.236374e-04 4.662949e-05
#> 15 9.999872e-01 1.090749e-05 1.196273e-07 1.769692e-06
#> 16 9.999916e-01 5.800806e-06 2.041575e-07 2.433157e-06
#> 17 9.999993e-01 1.275314e-07 2.503712e-08 5.610626e-07
#> 18 9.999993e-01 1.275314e-07 2.503712e-08 5.610626e-07
#> 19 3.978020e-05 9.999505e-01 4.097372e-06 5.623851e-06
#> 20 2.242806e-04 9.993151e-01 3.365363e-04 1.240984e-04
#> 21 5.176194e-07 9.999970e-01 1.404793e-06 1.041193e-06
#> 22 3.978020e-05 9.999505e-01 4.097372e-06 5.623851e-06
#> 23 2.195843e-07 9.999694e-01 2.823555e-05 2.137849e-06
#> 24 7.439569e-05 9.999131e-01 5.459363e-06 7.053563e-06
#> 25 9.999993e-01 1.275314e-07 2.503712e-08 5.610626e-07
#> 26 9.999993e-01 1.275314e-07 2.503712e-08 5.610626e-07
#> 27 9.440780e-07 9.999125e-01 8.062360e-05 5.903065e-06
#> 28 9.999993e-01 1.275314e-07 2.503712e-08 5.610626e-07
#> 29 9.999993e-01 1.275314e-07 2.503712e-08 5.610626e-07
#> 30 9.977295e-01 2.241100e-03 2.909531e-06 2.648208e-05
#> 31 9.999993e-01 1.275314e-07 2.503712e-08 5.610626e-07
#> 32 2.399107e-08 9.673980e-04 9.989271e-01 1.054398e-04
#> 33 9.999993e-01 1.275314e-07 2.503712e-08 5.610626e-07
#> 34 5.176194e-07 9.999970e-01 1.404793e-06 1.041193e-06
#> 35 6.752394e-07 9.975831e-01 2.400187e-03 1.604949e-05
#> 36 3.511779e-07 1.002874e-04 7.802366e-01 2.196628e-01
#> 37 2.045770e-04 9.992671e-01 4.426335e-04 8.567261e-05
#> 38 2.995573e-06 7.109613e-01 2.887699e-01 2.658125e-04
#> 39 9.054916e-08 6.417169e-04 9.978566e-01 1.501601e-03
#> 40 9.999993e-01 1.275314e-07 2.503712e-08 5.610626e-07
#> 41 9.999993e-01 1.275314e-07 2.503712e-08 5.610626e-07
#> 42 9.994963e-01 4.937372e-04 9.798016e-07 8.984391e-06
#> 43 9.440780e-07 9.999125e-01 8.062360e-05 5.903065e-06
#> 44 9.999872e-01 1.090749e-05 1.196273e-07 1.769692e-06
#> 45 5.176194e-07 9.999970e-01 1.404793e-06 1.041193e-06
# }