Sparse Latent Class Model for Cognitive Diagnosis (SLCM)
Description
Performs the Gibbs sampling routine for a sparse latent class model as described in Chen et al. (2020) <doi:10.1007/s11336-019-09693-2>.
Usage
slcm(
  y,
  k,
  burnin = 1000L,
  chain_length = 10000L,
  psi_invj = c(1, rep(2, 2^k - 1)),
  m0 = 0,
  bq = 1
)
Arguments
y: Item response matrix.
k: Number of attributes (traits) to estimate for the Q matrix.
burnin: Number of draws to discard as burn-in.
chain_length: Number of iterations to keep in the chain.
psi_invj, m0, bq: Additional tuning parameters; the default construction of psi_invj is sketched below.
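As a point of reference, the default psi_invj shown under Usage expands to a vector with one entry per latent class pattern (2^k entries in total). A minimal sketch of that expansion for k = 4, which matches the l1 row printed in the example output below:

# Default psi_invj from Usage, expanded for k = 4 attributes.
k <- 4
psi_invj_default <- c(1, rep(2, 2^k - 1))

length(psi_invj_default)  # 16, one entry per latent class pattern
psi_invj_default          # 1 2 2 2 ... 2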
Details
The estimates list contains posterior mean information from the sampling procedure, the chain list contains the full MCMC draws, and the details list provides information regarding the estimation call.
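A minimal sketch of how these three lists can be inspected; here fit is a placeholder for an object returned by slcm(), and the exact element classes may differ:

# `fit` is a placeholder for the object returned by slcm().
str(fit, max.level = 1)   # three named lists: estimates, chain, details
names(fit$estimates)      # posterior means from the sampler
names(fit$chain)          # full MCMC draws
fit$details$n             # metadata about the estimation call
fit$details$runtime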
Value
An slcm object containing three named lists:

- estimates
  - beta: Average beta coefficients.
  - theta: Average theta coefficients.
  - delta: Average activeness of coefficients.
  - class: Average class membership.
  - pi: Average attribute class probabilities.
  - omega: Average omega.
  - q: Average activeness of Q matrix entries based on a heuristic transformation.
  - m2ll: Average negative two times log-likelihood.
- chain
  - theta: theta coefficients at each iteration.
  - beta: beta coefficients at each iteration.
  - class: class membership at each iteration.
  - pi: attribute class probabilities at each iteration.
  - omega: omega at each iteration.
  - m2ll: negative two times log-likelihood at each iteration (see the trace-plot sketch below).
- details
  - n: Number of subjects.
  - j: Number of items.
  - k: Number of traits.
  - l1: Slab parameter.
  - m0, bq: Additional tuning parameters.
  - burnin: Number of iterations discarded.
  - chain_length: Number of iterations kept.
  - runtime: Duration of the model run inside the C++ code. (Does not include summarization of the MCMC chain.)
  - package_version: Version of the package the SLCM model was fit with.
  - date_time: Date and time the model was fit.
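Because chain records the negative two times log-likelihood at every retained iteration, a quick trace plot gives an informal convergence check. A minimal sketch, assuming fit$chain$m2ll is a numeric vector of length chain_length (fit is a placeholder fitted object):

# Informal convergence check: trace of -2 log-likelihood over kept iterations.
# Assumes fit$chain$m2ll is a numeric vector with one entry per iteration.
m2ll_draws <- as.numeric(fit$chain$m2ll)
plot(m2ll_draws, type = "l",
     xlab = "Iteration (post burn-in)", ylab = "-2 log-likelihood",
     main = "Trace of m2ll")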
Examples
library("slcm")
# Use a demo data set from the paper
data("items_matrix_reasoning", package = "edmdata")
burnin = 50 # Set for demonstration purposes, increase to at least 1,000 in practice.
chain_length = 100 # Set for demonstration purposes, increase to at least 10,000 in practice.
model_reasoning = slcm(items_matrix_reasoning, k = 4,
                       burnin = burnin, chain_length = chain_length)
print(model_reasoning)
Model Details:
- Observations (n): 400
- Items (j): 25
- Attributes (k): 4
- Runtime: 0.259
- Date: 2025-01-22 19:09:08.608414
- Package Version: 0.1.0
Chain properties:
- Burn in: 50
- Chain Length: 100
- Total Iterations: 150
Hyperparameter Details:
- m0: 400
- bq: 25
- l1:
[,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [,9] [,10] [,11] [,12]
[1,] 1 2 2 2 2 2 2 2 2 2 2 2
[,13] [,14] [,15] [,16]
[1,] 2 2 2 2
Beta Coefficients:
B_0000 B_0001 B_0010 B_0011 B_0100 B_0101
Item1 -0.105437 0.634047 0.938089 0.203936 0.418808 0.405443
Item2 0.327991 1.250081 0.020893 0.000000 0.100457 0.000000
Item3 -0.325383 0.229148 0.859486 0.000000 0.144624 0.000000
Item4 -0.933881 0.197789 0.107433 0.000000 0.759459 0.749515
Item5 0.072516 0.120875 0.272322 0.000000 0.824102 0.000000
Item6 -0.960221 0.726635 0.043984 0.000000 1.664274 0.000000
Item7 -1.486333 0.508565 0.412377 0.000000 0.223361 0.000000
Item8 -0.994736 0.423924 0.088323 0.954039 0.382002 0.000000
Item9 0.085978 0.027074 0.715612 0.000000 0.031428 0.000000
Item10 -0.970255 0.847469 0.269275 0.000000 0.002132 0.379218
Item11 -0.391251 0.778134 0.265904 0.000000 0.020230 0.000000
Item12 -1.444042 0.081535 0.373768 0.537912 1.116943 0.276480
Item13 -1.251363 0.274225 0.230999 0.000000 0.858100 0.000000
Item14 -0.905463 0.564529 0.173227 0.791435 0.396992 0.000000
Item15 -0.736840 1.195832 0.046636 0.771102 0.054131 0.000000
Item16 -0.824829 1.232579 0.700176 0.513861 0.024033 0.000000
Item17 -1.294188 0.303253 0.108431 0.000000 0.429418 0.448714
Item18 -1.707823 1.124667 0.469598 0.000000 0.257002 0.000000
Item19 -1.119670 1.559696 0.095179 0.000000 0.081829 0.000000
Item20 -1.316838 0.852741 0.168318 0.000000 0.084819 0.000000
Item21 -1.605657 0.127899 0.076126 0.000000 0.080311 0.000000
Item22 -1.718581 0.094457 0.107974 0.176877 0.127674 0.111924
Item23 -1.289514 0.745688 1.043650 0.000000 0.006366 0.000000
Item24 -1.123770 1.085557 0.842161 0.000000 0.013718 0.000000
Item25 -0.466513 0.335825 1.489274 0.000000 0.006622 0.000000
B_0110 B_0111 B_1000 B_1001 B_1010 B_1011
Item1 0.355671 -0.119112 1.464123 0.000000 0.000000 0.000000
Item2 0.000000 0.000000 1.461466 0.000000 0.000000 0.000000
Item3 0.000000 0.000000 1.036120 0.000000 0.422886 0.000000
Item4 0.000000 0.000000 0.079085 0.316472 0.000000 0.000000
Item5 0.000000 0.000000 0.191849 0.000000 0.000000 0.000000
Item6 0.000000 0.000000 1.069284 0.000000 0.000000 0.000000
Item7 0.000000 0.000000 0.322626 0.000000 0.000000 0.000000
Item8 0.000000 0.000000 0.904709 0.000000 0.000000 0.000000
Item9 0.000000 0.000000 1.186251 0.000000 0.000000 0.000000
Item10 0.000000 0.000000 0.590121 0.402301 0.000000 0.000000
Item11 0.000000 0.000000 0.420271 0.677745 0.000000 0.000000
Item12 0.268377 0.072034 0.307652 0.000000 0.000000 0.000000
Item13 0.000000 0.000000 0.158909 0.000000 0.123811 0.000000
Item14 0.000000 0.000000 0.121574 0.626967 0.474383 0.557914
Item15 0.000000 0.000000 0.504834 1.866299 0.929267 0.671728
Item16 0.000000 0.000000 1.037444 0.997740 0.747404 0.385618
Item17 0.000000 0.000000 0.103422 0.000000 0.000000 0.000000
Item18 0.000000 0.000000 0.003558 0.000000 0.000000 0.000000
Item19 0.000000 0.000000 0.617053 0.000000 0.000000 0.000000
Item20 0.000000 0.000000 0.256822 0.000000 0.000000 0.000000
Item21 0.000000 0.000000 0.334232 0.000000 0.000000 0.000000
Item22 0.155974 0.074315 0.085390 0.000000 0.000000 0.000000
Item23 0.000000 0.000000 0.299854 0.000000 0.000000 0.000000
Item24 0.000000 0.000000 0.002486 0.000000 0.000000 0.000000
Item25 0.000000 0.000000 0.328696 0.000000 0.000000 0.000000
B_1100 B_1101 B_1110 B_1111
Item1 0.000000 0.000000 0.000000 0.000000
Item2 0.000000 0.000000 0.000000 0.000000
Item3 0.000000 0.000000 0.000000 0.000000
Item4 0.696403 0.624255 0.000000 0.000000
Item5 0.000000 0.000000 0.000000 0.000000
Item6 0.000000 0.000000 0.000000 0.000000
Item7 0.000000 0.000000 0.000000 0.000000
Item8 0.000000 0.000000 0.000000 0.000000
Item9 0.000000 0.000000 0.000000 0.000000
Item10 0.522305 0.317542 0.000000 0.000000
Item11 0.000000 0.000000 0.000000 0.000000
Item12 0.000000 0.000000 0.000000 0.000000
Item13 0.000000 0.000000 0.000000 0.000000
Item14 0.000000 0.000000 0.000000 0.000000
Item15 0.000000 0.000000 0.000000 0.000000
Item16 0.000000 0.000000 0.000000 0.000000
Item17 0.000000 0.000000 0.000000 0.000000
Item18 0.000000 0.000000 0.000000 0.000000
Item19 0.000000 0.000000 0.000000 0.000000
Item20 0.000000 0.000000 0.000000 0.000000
Item21 0.000000 0.000000 0.000000 0.000000
Item22 0.000000 0.000000 0.000000 0.000000
Item23 0.000000 0.000000 0.000000 0.000000
Item24 0.000000 0.000000 0.000000 0.000000
Item25 0.000000 0.000000 0.000000 0.000000
Delta activeness:
D_0000 D_0001 D_0010 D_0011 D_0100 D_0101 D_0110 D_0111 D_1000
Item1 1.00 0.72 1.00 1.00 0.32 1.00 1.00 1.00 0.79
Item2 1.00 1.00 0.13 0.00 0.28 0.00 0.00 0.00 1.00
Item3 1.00 0.60 1.00 0.00 0.35 0.00 0.00 0.00 1.00
Item4 1.00 0.03 0.02 0.00 0.97 1.00 0.00 0.00 0.01
Item5 1.00 0.10 0.41 0.00 1.00 0.00 0.00 0.00 0.15
Item6 1.00 1.00 0.17 0.00 1.00 0.00 0.00 0.00 1.00
Item7 1.00 0.21 0.61 0.00 0.45 0.00 0.00 0.00 0.19
Item8 1.00 0.66 0.10 1.00 0.60 0.00 0.00 0.00 1.00
Item9 1.00 0.14 0.97 0.00 0.15 0.00 0.00 0.00 1.00
Item10 1.00 1.00 0.71 0.00 0.03 1.00 0.00 0.00 1.00
Item11 1.00 1.00 0.53 0.00 0.11 0.00 0.00 0.00 0.94
Item12 1.00 0.24 0.66 1.00 1.00 1.00 1.00 1.00 0.66
Item13 1.00 0.17 0.63 0.00 0.37 0.00 0.00 0.00 0.04
Item14 1.00 0.34 0.02 1.00 0.01 0.00 0.00 0.00 0.67
Item15 1.00 1.00 0.11 1.00 0.17 0.00 0.00 0.00 0.86
Item16 1.00 1.00 1.00 1.00 0.13 0.00 0.00 0.00 1.00
Item17 1.00 0.81 0.07 0.00 0.31 1.00 0.00 0.00 0.00
Item18 1.00 1.00 0.86 0.00 0.44 0.00 0.00 0.00 0.05
Item19 1.00 1.00 0.13 0.00 0.18 0.00 0.00 0.00 0.78
Item20 1.00 1.00 0.11 0.00 0.03 0.00 0.00 0.00 0.27
Item21 1.00 0.07 0.01 0.00 0.02 0.00 0.00 0.00 1.00
Item22 1.00 0.03 0.80 1.00 0.01 1.00 1.00 1.00 0.18
Item23 1.00 0.97 1.00 0.00 0.07 0.00 0.00 0.00 0.57
Item24 1.00 1.00 1.00 0.00 0.09 0.00 0.00 0.00 0.04
Item25 1.00 0.56 1.00 0.00 0.09 0.00 0.00 0.00 0.71
D_1001 D_1010 D_1011 D_1100 D_1101 D_1110 D_1111
Item1 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Item2 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Item3 0.00 1.00 0.00 0.00 0.00 0.00 0.00
Item4 1.00 0.00 0.00 1.00 1.00 0.00 0.00
Item5 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Item6 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Item7 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Item8 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Item9 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Item10 1.00 0.00 0.00 1.00 1.00 0.00 0.00
Item11 1.00 0.00 0.00 0.00 0.00 0.00 0.00
Item12 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Item13 0.00 1.00 0.00 0.00 0.00 0.00 0.00
Item14 1.00 1.00 1.00 0.00 0.00 0.00 0.00
Item15 1.00 1.00 1.00 0.00 0.00 0.00 0.00
Item16 1.00 1.00 1.00 0.00 0.00 0.00 0.00
Item17 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Item18 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Item19 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Item20 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Item21 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Item22 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Item23 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Item24 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Item25 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Q matrix:
Q_1 Q_2 Q_3 Q_4
Item1 1 1 1 1
Item2 1 0 0 1
Item3 1 0 1 0
Item4 1 1 0 1
Item5 0 1 0 0
Item6 1 1 0 0
Item7 1 0 1 1
Item8 1 0 1 1
Item9 1 0 1 0
Item10 1 1 0 1
Item11 1 0 0 1
Item12 0 1 1 1
Item13 0 1 0 0
Item14 1 0 1 1
Item15 1 0 1 1
Item16 1 0 1 1
Item17 0 1 0 1
Item18 0 0 0 1
Item19 0 0 0 1
Item20 0 0 0 1
Item21 1 0 0 0
Item22 0 1 1 1
Item23 0 0 1 1
Item24 0 0 1 1
Item25 0 0 1 0
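The printed summary above is assembled from the named lists described under Value; the underlying averages can also be pulled from the fitted object directly. A minimal sketch, assuming estimates$q holds average activeness values in [0, 1] (the 0.5 cutoff below is illustrative, not necessarily the package's heuristic):

# Average activeness of Q matrix entries; values near 1 suggest an active entry.
q_activeness <- model_reasoning$estimates$q

# One plausible dichotomization into a binary Q matrix (illustrative cutoff).
q_binary <- 1 * (q_activeness > 0.5)

# Average beta coefficients and attribute class probabilities.
head(model_reasoning$estimates$beta)
model_reasoning$estimates$pi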