Sparse Latent Class Model for Cognitive Diagnosis (SLCM)

Description

Performs the Gibbs sampling routine for a sparse latent class model, as described in Chen et al. (2020) <doi:10.1007/s11336-019-09693-2>.

Usage

slcm(
  y,
  k,
  burnin = 1000L,
  chain_length = 10000L,
  psi_invj = c(1, rep(2, 2^k - 1)),
  m0 = 0,
  bq = 1
)

Arguments

y Matrix of item responses, with one row per subject and one column per item.
k Number of attributes (columns) to estimate for the Q matrix.
burnin Number of initial draws to discard as burn-in.
chain_length Number of iterations to keep in the chain.
psi_invj, m0, bq Additional tuning parameters.
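The default for psi_invj grows with k: it has one entry per latent class pattern, 2^k in total. A quick base-R illustration of the default construction (no package needed):

```r
# Default tuning vector for k = 2 attributes: 2^k entries in total,
# 1 for the first (intercept) entry and 2 for the remaining 2^k - 1.
k <- 2
psi_invj <- c(1, rep(2, 2^k - 1))
psi_invj
# [1] 1 2 2 2
```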

Details

The estimates list contains posterior means from the sampling procedure, the chain list contains the full set of MCMC draws, and the details list records information about the estimation call.
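For example, the three lists can be accessed with standard list extraction (a sketch; fit stands in for the return value of a prior slcm() call, and the component names follow the Value section of this page):

```r
# `fit` is a placeholder for an object returned by slcm().
fit <- slcm(y, k = 3)

fit$estimates$beta    # posterior mean beta coefficients
fit$chain$m2ll        # full chain of -2 log-likelihood values
fit$details$runtime   # sampler runtime (C++ portion only)
```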

Value

An slcm object containing three named lists:

  • estimates

    • beta: Average beta coefficients

    • theta: Average theta coefficients

    • delta: Average activeness of coefficients

    • class: Average class membership

    • pi: Average attribute class probability

    • omega: Average omega

    • q: Average activeness of Q matrix entries, based on a heuristic transformation

    • m2ll: Average of negative two times the log-likelihood

  • chain

    • theta: Chain of theta coefficient draws

    • beta: Chain of beta coefficient draws

    • class: Chain of class membership draws

    • pi: Chain of attribute class probability draws

    • omega: Chain of omega draws

    • m2ll: Chain of negative two times the log-likelihood

  • details

    • n: Number of subjects

    • j: Number of items

    • k: Number of traits

    • l1: Slab parameter

    • m0, bq: Additional tuning parameters

    • burnin: Number of burn-in iterations discarded

    • chain_length: Number of iterations kept

    • runtime: Duration of the model run inside the C++ code (does not include summarization of the MCMC chain)

    • package_version: Version of the package used to fit the SLCM model

    • date_time: Date and time the model was fit

Examples

library(slcm)

# Use a demo data set from the paper (requires the 'edmdata' package)
data("items_matrix_reasoning", package = "edmdata")
  
burnin = 50        # Set for demonstration purposes, increase to at least 1,000 in practice.
chain_length = 100 # Set for demonstration purposes, increase to at least 10,000 in practice.  
  
model_reasoning = slcm(items_matrix_reasoning, k = 4, 
                       burnin = burnin, chain_length = chain_length)
                         
print(model_reasoning)

Model Details:
- Observations (n): 400
- Items (j): 25
- Attributes (k): 4
- Runtime: 0.294
- Date: 2024-06-14 01:09:54.345644
- Package Version: 0.1.0

Chain properties:
- Burn in: 50
- Chain Length: 100
- Total Iterations: 150

Hyperparameter Details:
- m0: 400
- bq: 25
- l1:
      [,1]  [,2]  [,3]  [,4]  [,5]  [,6]  [,7]  [,8]  [,9]  [,10]  [,11]  [,12]
[1,]  1     2     2     2     2     2     2     2     2     2      2      2    
      [,13]  [,14]  [,15]  [,16]
[1,]  2      2      2      2    


Beta Coefficients:
        B_0000    B_0001    B_0010    B_0011    B_0100    B_0101    B_0110  
Item1    0.26461   1.85519   0.01551   0.00000   1.30027   0.00000   0.00000
Item2    0.52241   0.94994   0.00911   0.00000   1.24358   0.00000   0.00000
Item3   -0.08369   0.71393   0.01876   0.00000   0.93911   0.00000   0.00000
Item4   -0.95735   1.05575   0.44959   0.62507   0.08793   0.00000   0.00000
Item5    0.09590   0.74075   0.03248   0.00000   0.32075   0.00000   0.00000
Item6   -0.33117   1.14116   0.25630   0.51433   0.32420   0.00000   0.00000
Item7   -1.30225   0.18436   0.20987   0.00000   0.27076   0.29983   0.00000
Item8   -0.81947   0.35760   0.07715   0.00000   0.88605   0.00000   0.00000
Item9    0.41146   0.10650   0.13358   0.00000   0.67930   0.00000   0.23657
Item10  -0.98080   0.22379   0.34943   0.00000   1.16013   0.00000   0.00000
Item11  -0.38404   0.10237   0.06622   0.37789   0.79333   0.27238   0.52411
Item12  -1.13256   1.15886   0.04215   0.40463   0.03892   0.00000   0.00000
Item13  -1.08360   0.33101   0.37732   0.36036   0.16947   0.00000   0.00000
Item14  -1.15924   0.66064   0.43554   0.00000   1.03098   0.00000   0.00000
Item15  -0.90107   0.76546   0.42938   0.00000   1.46054   0.00000   0.00000
Item16  -0.68069   0.81902   0.06212   0.00000   1.88412   0.00000   0.00000
Item17  -1.31510   0.46214   0.70879   0.00000   0.09426   0.00000   0.61828
Item18  -1.96699   0.66817   1.17070   0.00000   0.91820   0.00000   0.00000
Item19  -1.09393   0.19039   0.98152   0.00000   1.17178   0.00000   0.00000
Item20  -1.51430   0.18138   0.68318   0.00000   0.89147   0.00000   0.00000
Item21  -1.63514   0.08250   0.11030   0.20641   0.33172   0.00000   0.00000
Item22  -1.83399   0.13639   0.33481   0.43933   0.19468   0.00000   0.00000
Item23  -1.16816   0.26479   0.11697   0.00000   0.88549   0.00000   0.30526
Item24  -1.31878   0.46969   1.36786   0.00000   0.80982   0.34477   0.00000
Item25  -0.30039   0.11887   0.15991   0.73650   0.69557   0.08755   0.43455
        B_0111    B_1000    B_1001    B_1010    B_1011    B_1100    B_1101  
Item1    0.00000   0.11702   0.00000   0.00000   0.00000   0.00000   0.00000
Item2    0.00000   0.06114   0.00000   0.00000   0.00000   0.00000   0.00000
Item3    0.00000   0.13846   0.00000   0.00000   0.00000   0.00000   0.00000
Item4    0.00000   0.67110   0.00000   0.00000   0.00000   0.00000   0.00000
Item5    0.00000   0.15448   0.00000   0.00000   0.00000   0.00000   0.00000
Item6    0.00000   0.31951   0.69434   0.30581   0.39519   0.00000   0.00000
Item7    0.00000   0.54547   0.00000   0.00000   0.00000   0.00000   0.00000
Item8    0.00000   1.20160   0.00000   0.00000   0.00000   0.00000   0.00000
Item9    0.00000   0.17019   0.00000   0.57586   0.00000   0.14731   0.00000
Item10   0.00000   0.84085   0.00000   0.00000   0.00000   0.00000   0.00000
Item11   0.38470   0.55809   0.00000   0.00000   0.00000   0.00000   0.00000
Item12   0.00000   1.12502   0.00000   0.00000   0.00000   0.00000   0.00000
Item13   0.00000   0.26543   0.94511   0.43879   0.51062   0.00000   0.00000
Item14   0.00000   0.98425   0.00000   0.00000   0.00000   0.00000   0.00000
Item15   0.00000   0.61470   0.00000   0.00000   0.00000   0.00000   0.00000
Item16   0.00000   0.12871   0.00000   0.00000   0.00000   0.00000   0.00000
Item17   0.00000   0.43279   0.00000   0.00000   0.00000   0.00000   0.00000
Item18   0.00000   0.14441   0.00000   0.00000   0.00000   0.00000   0.00000
Item19   0.00000   0.04506   0.71711   0.00000   0.00000   0.00000   0.00000
Item20   0.00000   0.03934   0.00000   0.00000   0.00000   0.00000   0.00000
Item21   0.00000   0.06780   0.12683   0.16430   0.40293   0.00000   0.00000
Item22   0.00000   0.11764   0.20380   0.43668   0.32933   0.00000   0.00000
Item23   0.00000   0.19111   0.00000   0.51972   0.00000   0.46495   0.00000
Item24   0.00000   0.14193   0.00000   0.00000   0.00000   0.00000   0.00000
Item25   0.15332   0.14647   0.17545   0.54895   0.28199   0.21980   0.01524
        B_1110    B_1111  
Item1    0.00000   0.00000
Item2    0.00000   0.00000
Item3    0.00000   0.00000
Item4    0.00000   0.00000
Item5    0.00000   0.00000
Item6    0.00000   0.00000
Item7    0.00000   0.00000
Item8    0.00000   0.00000
Item9    0.64524   0.00000
Item10   0.00000   0.00000
Item11   0.00000   0.00000
Item12   0.00000   0.00000
Item13   0.00000   0.00000
Item14   0.00000   0.00000
Item15   0.00000   0.00000
Item16   0.00000   0.00000
Item17   0.00000   0.00000
Item18   0.00000   0.00000
Item19   0.00000   0.00000
Item20   0.00000   0.00000
Item21   0.00000   0.00000
Item22   0.00000   0.00000
Item23   0.46100   0.00000
Item24   0.00000   0.00000
Item25   0.24957   0.07856

Delta activeness:
        D_0000  D_0001  D_0010  D_0011  D_0100  D_0101  D_0110  D_0111  D_1000
Item1   1.00    1.00    0.09    0.00    1.00    0.00    0.00    0.00    0.34  
Item2   1.00    1.00    0.11    0.00    1.00    0.00    0.00    0.00    0.17  
Item3   1.00    0.91    0.05    0.00    0.97    0.00    0.00    0.00    0.22  
Item4   1.00    1.00    0.17    1.00    0.10    0.00    0.00    0.00    0.37  
Item5   1.00    1.00    0.12    0.00    0.76    0.00    0.00    0.00    0.31  
Item6   1.00    1.00    0.08    1.00    0.32    0.00    0.00    0.00    0.07  
Item7   1.00    0.03    0.00    0.00    0.02    1.00    0.00    0.00    1.00  
Item8   1.00    0.76    0.22    0.00    1.00    0.00    0.00    0.00    1.00  
Item9   1.00    0.00    1.00    0.00    0.00    0.00    1.00    0.00    0.00  
Item10  1.00    0.55    0.55    0.00    1.00    0.00    0.00    0.00    0.87  
Item11  1.00    0.14    0.13    1.00    1.00    1.00    1.00    1.00    0.72  
Item12  1.00    1.00    0.20    1.00    0.19    0.00    0.00    0.00    1.00  
Item13  1.00    0.35    0.12    1.00    0.02    0.00    0.00    0.00    0.68  
Item14  1.00    0.98    0.76    0.00    1.00    0.00    0.00    0.00    1.00  
Item15  1.00    1.00    0.50    0.00    1.00    0.00    0.00    0.00    0.84  
Item16  1.00    1.00    0.22    0.00    1.00    0.00    0.00    0.00    0.30  
Item17  1.00    0.61    0.34    0.00    0.09    0.00    1.00    0.00    0.78  
Item18  1.00    0.91    0.98    0.00    1.00    0.00    0.00    0.00    0.36  
Item19  1.00    0.41    1.00    0.00    1.00    0.00    0.00    0.00    0.16  
Item20  1.00    0.48    0.93    0.00    1.00    0.00    0.00    0.00    0.14  
Item21  1.00    0.40    0.04    1.00    0.64    0.00    0.00    0.00    0.01  
Item22  1.00    0.00    1.00    1.00    0.00    0.00    0.00    0.00    0.00  
Item23  1.00    0.47    0.08    0.00    1.00    0.00    1.00    0.00    0.15  
Item24  1.00    0.77    1.00    0.00    1.00    1.00    0.00    0.00    0.35  
Item25  1.00    0.13    0.06    1.00    1.00    1.00    1.00    1.00    0.05  
        D_1001  D_1010  D_1011  D_1100  D_1101  D_1110  D_1111
Item1   0.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item2   0.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item3   0.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item4   0.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item5   0.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item6   1.00    1.00    1.00    0.00    0.00    0.00    0.00  
Item7   0.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item8   0.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item9   0.00    1.00    0.00    1.00    0.00    1.00    0.00  
Item10  0.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item11  0.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item12  0.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item13  1.00    1.00    1.00    0.00    0.00    0.00    0.00  
Item14  0.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item15  0.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item16  0.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item17  0.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item18  0.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item19  1.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item20  0.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item21  1.00    1.00    1.00    0.00    0.00    0.00    0.00  
Item22  1.00    1.00    1.00    0.00    0.00    0.00    0.00  
Item23  0.00    1.00    0.00    1.00    0.00    1.00    0.00  
Item24  0.00    0.00    0.00    0.00    0.00    0.00    0.00  
Item25  1.00    1.00    1.00    1.00    1.00    1.00    1.00  

Q matrix:
        Q_1  Q_2  Q_3  Q_4
Item1   0    1    0    1  
Item2   0    1    0    1  
Item3   0    1    0    1  
Item4   0    0    1    1  
Item5   0    0    0    1  
Item6   1    0    1    1  
Item7   1    1    0    1  
Item8   1    1    0    0  
Item9   1    1    1    0  
Item10  1    1    0    0  
Item11  0    1    1    1  
Item12  1    0    0    1  
Item13  1    0    1    1  
Item14  1    1    0    1  
Item15  0    1    0    1  
Item16  0    1    0    0  
Item17  0    1    1    0  
Item18  0    1    1    1  
Item19  1    1    1    1  
Item20  0    1    1    0  
Item21  1    0    1    1  
Item22  1    0    1    1  
Item23  1    1    1    0  
Item24  0    1    1    1  
Item25  1    1    1    1
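
Beyond print(), posterior summaries can be extracted from the fitted object directly (a sketch, assuming standard list access on the components documented under Value):

```r
# Continuing the example above: pull summaries out of model_reasoning.
est <- model_reasoning$estimates
est$q                             # estimated Q matrix, as printed above
est$delta                         # average activeness of each coefficient
mean(model_reasoning$chain$m2ll)  # mean -2 log-likelihood over kept draws
```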