mdirt
fits a variety of item response models with discrete latent variables.
These include, but are not limited to, latent class analysis, multidimensional latent
class models, multidimensional discrete latent class models, DINA/DINO models,
grade of membership models, C-RUM, and so on. If response models are not defined explicitly,
then customized models can be defined using the createItem
function.
Usage
mdirt(
data,
model,
customTheta = NULL,
structure = NULL,
item.Q = NULL,
nruns = 1,
method = "EM",
covdata = NULL,
formula = NULL,
itemtype = "lca",
optimizer = "nlminb",
return_max = TRUE,
group = NULL,
GenRandomPars = FALSE,
verbose = TRUE,
pars = NULL,
technical = list(),
...
)
Arguments
- data: a matrix or data.frame that consists of numerically ordered data, organized in the form of integers, with missing data coded as NA

- model: the number of mutually exclusive classes to fit, or alternatively a more specific mirt.model definition (which reflects the so-called Q-matrix). Note that when using a mirt.model, the order in which the syntax factors/attributes are defined is associated with the columns in the customTheta input

- customTheta: input passed to technical = list(customTheta = ...), but included directly in this function for convenience. This input is most interesting for discrete latent models because it allows customized patterns of latent classes (i.e., it defines the possible combinations of the latent attribute profile). The default builds the pattern customTheta = diag(model), which is the typical pattern for traditional latent class analysis, whereby class membership is mutually distinct and exhaustive. See thetaComb for a quick method to generate a matrix with all possible combinations

- structure: an R formula allowing the profile probability patterns (i.e., the structural component of the model) to be fitted according to a log-linear model. When NULL, all profile probabilities (except one) will be estimated. Use of this input requires that the customTheta input is supplied and that the column names in that matrix match the names found within this formula (see the sketch following this list)

- item.Q: a list of item-level Q-matrices indicating how the respective categories should be modeled by the underlying attributes. Each matrix must be a \(K_i \times A\) matrix, where \(K_i\) is the number of categories for the ith item and \(A\) is the number of attributes included in the Theta matrix; otherwise, a value of NULL will default to a matrix consisting of 1's for each \(K_i \times A\) element except for the first row, which contains only 0's for proper identification. Incidentally, the first row of each matrix must contain only 0's so that the first category represents the reference category for identification

- nruns: a numeric value indicating how many times the model should be fit to the data when using random starting values. If greater than 1, GenRandomPars is set to TRUE by default

- method: estimation method. Can be 'EM' or 'BL' (see mirt for more details)

- covdata: a data.frame of data used for latent regression models

- formula: an R formula (or list of formulas) indicating how the latent traits can be regressed using external covariates in covdata. If a named list of formulas is supplied (where the names correspond to the latent trait/attribute names in model), then specific regression effects can be estimated for each factor. Supplying a single formula will estimate the regression parameters for all latent variables by default

- itemtype: a vector indicating the itemtype associated with each item. For discrete models this is limited to 'lca' or items defined using a createItem definition

- optimizer: optimizer used for the M-step, set to 'nlminb' by default. See mirt for more details

- return_max: logical; when nruns > 1, return the model with the best maximum likelihood criterion? If FALSE, returns a list of all the estimated objects

- group: a factor variable indicating group membership used for multiple group analyses

- GenRandomPars: logical; use random starting values

- verbose: logical; print messages to the R console

- pars: used for modifying starting values; see mirt for details

- technical: a list of lower-level inputs. See mirt for details

- ...: additional arguments to be passed to the estimation engine. See mirt for more details and examples
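As a quick illustration of how these inputs fit together, the following is a minimal, hedged sketch (assuming binary response data such as the LSAT6 data used in the Examples, and hypothetical attribute names A1 and A2) of supplying a customTheta grid together with a log-linear structure formula:

library(mirt)
dat <- expand.table(LSAT6)
# hypothetical two-attribute profile grid; column names must match the formula below
Theta <- thetaComb(0:1, 2)
colnames(Theta) <- c('A1', 'A2')
# structural model with main effects only (independence of the two attributes)
mod <- mdirt(dat, 2, customTheta = Theta, structure = ~ A1 + A2)
coef(mod, simplify = TRUE)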
Details
Posterior classification accuracy for each response pattern may be obtained
via the fscores
function. The summary()
function will display
the category probability values given the class membership, which can also
be displayed graphically with plot()
, while coef()
displays the raw coefficient values (and their standard errors, if estimated). Finally,
anova()
is used to compare nested models, while
M2
and itemfit
may be used for model fitting purposes.
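For example, a compact sketch of this post-estimation workflow, assuming mod2 and mod3 are two nested mdirt() fits such as those created in the Examples below:

fs <- fscores(mod2, full.scores = FALSE)  # posterior classification by response pattern
summary(mod2)                             # category probabilities given class membership
plot(mod2)                                # the same information displayed graphically
coef(mod2, simplify = TRUE)               # raw coefficient values
anova(mod2, mod3)                         # compare nested models
M2(mod2)                                  # limited-information model fit
itemfit(mod2)                             # item-level fit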
'lca' model definition
The latent class IRT model with two latent classes has the form
$$P(x = k|\theta_1, \theta_2, a_1, a_2) = \frac{\exp(a_{1k} \theta_1 + a_{2k} \theta_2)}{\sum_{j}^{K} \exp(a_{1j} \theta_1 + a_{2j} \theta_2)}$$
where the \(\theta\) values generally take on discrete points (such as 0 or 1) and each category \(k\) has its own slope parameters. For proper identification, the slope parameters of the first category (\(a_{11}\) and \(a_{21}\)) are never freely estimated. Alternatively, supplying a different grid of \(\theta\) values will allow the estimation of similar models (multidimensional discrete models, grade of membership, etc.). See the examples below.
When the item.Q input is utilized, the above equation can be understood as
$$P(x = k|\theta_1, \theta_2, a_1, a_2) = \frac{\exp(a_{1k} \theta_1 Q_{k1} + a_{2k} \theta_2 Q_{k2})}{\sum_{j}^{K} \exp(a_{1j} \theta_1 Q_{j1} + a_{2j} \theta_2 Q_{j2})}$$
where by construction Q is a \(K_i \times A\) matrix indicating whether each category should
be modeled according to the latent class structure. For the standard latent class model, the Q-matrix
has as many rows as categories, as many columns as the number of classes/attributes modeled,
and consists of 0's in the first row and 1's elsewhere. This of course can be overridden by passing
an alternative item.Q definition for each respective item.
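To make the preceding notation concrete, here is a short self-contained sketch (illustrative only, not part of the package API; the function name lca_category_probs is hypothetical) that evaluates these category probabilities for a single item given its slope matrix, a latent attribute profile, and an item-level Q-matrix:

# a: K x A matrix of category slopes (first row fixed at 0 for identification)
# theta: length-A latent attribute profile, e.g. c(1, 0)
# Q: K x A item Q-matrix of 0/1 entries (first row all 0)
lca_category_probs <- function(a, theta, Q) {
  eta <- (a * Q) %*% theta                # linear predictor for each of the K categories
  as.numeric(exp(eta) / sum(exp(eta)))    # softmax across the K categories
}

# two-category item loading on both attributes
a <- rbind(c(0, 0), c(1.2, 0.8))
Q <- rbind(c(0, 0), c(1, 1))
lca_category_probs(a, theta = c(1, 0), Q = Q)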
References
Chalmers, R. P. (2012). mirt: A Multidimensional Item Response Theory Package for the R Environment. Journal of Statistical Software, 48(6), 1-29. doi:10.18637/jss.v048.i06
Proctor, C. H. (1970). A probabilistic formulation and statistical analysis for Guttman scaling. Psychometrika, 35, 73-78.
Author
Phil Chalmers rphilip.chalmers@gmail.com
Examples
# LSAT6 dataset
dat <- expand.table(LSAT6)
# fit with 2-3 latent classes
(mod2 <- mdirt(dat, 2))
#>
Iteration: 1, Log-Lik: -3209.317, Max-Change: 2.67790
Iteration: 2, Log-Lik: -2475.451, Max-Change: 0.21401
Iteration: 3, Log-Lik: -2470.563, Max-Change: 0.09205
... (iterations 4-361 omitted) ...
Iteration: 362, Log-Lik: -2467.408, Max-Change: 0.00041
Iteration: 363, Log-Lik: -2467.408, Max-Change: 0.00000
#>
#> Call:
#> mdirt(data = dat, model = 2)
#>
#> Latent class model with 2 classes and 2 profiles.
#> Converged within 1e-04 tolerance after 363 EM iterations.
#> mirt version: 1.43.2
#> M-step optimizer: nlminb
#> EM acceleration: Ramsay
#> Latent density type: discrete
#>
#> Log-likelihood = -2467.408
#> Estimated parameters: 11
#> AIC = 4956.816
#> BIC = 5010.802; SABIC = 4975.865
#> G2 (20) = 22.74, p = 0.3018, RMSEA = 0.012
if (FALSE) { # \dontrun{
(mod3 <- mdirt(dat, 3))
summary(mod2)
residuals(mod2)
residuals(mod2, type = 'exp')
anova(mod2, mod3)
M2(mod2)
itemfit(mod2)
# generate classification plots
plot(mod2)
plot(mod2, facet_items = FALSE)
plot(mod2, profile = TRUE)
# available for polytomous data
mod <- mdirt(Science, 2)
summary(mod)
plot(mod)
plot(mod, profile=TRUE)
# classification based on response patterns
fscores(mod2, full.scores = FALSE)
# classify individuals either with the largest posterior probability.....
fs <- fscores(mod2)
head(fs)
classes <- 1:2
class_max <- classes[apply(apply(fs, 1, max) == fs, 1, which)]
table(class_max)
# ... or by probability sampling (i.e., plausible value draws)
class_prob <- apply(fs, 1, function(x) sample(1:2, 1, prob=x))
table(class_prob)
# plausible value imputations for stochastic classification in both classes
pvs <- fscores(mod2, plausible.draws=10)
tabs <- lapply(pvs, function(x) apply(x, 2, table))
tabs[[1]]
# fit with random starting points (run in parallel to save time)
if(interactive()) mirtCluster()
mod <- mdirt(dat, 2, nruns=10)
#--------------------------
# Grade of membership model
# define a custom Theta grid for including a 'fuzzy' class membership
(Theta <- matrix(c(1, 0, .5, .5, 0, 1), nrow=3 , ncol=2, byrow=TRUE))
(mod_gom <- mdirt(dat, 2, customTheta = Theta))
summary(mod_gom)
#-----------------
# Multidimensional discrete latent class model
dat <- key2binary(SAT12,
key = c(1,4,5,2,3,1,2,1,3,1,2,4,2,1,5,3,4,4,1,4,3,3,4,1,3,5,1,3,1,5,4,5))
# define Theta grid for three latent classes
(Theta <- thetaComb(0:1, 3))
(mod_discrete <- mdirt(dat, 3, customTheta = Theta))
summary(mod_discrete)
# Located latent class model
model <- mirt.model('C1 = 1-32
C2 = 1-32
C3 = 1-32
CONSTRAIN = (1-32, a1), (1-32, a2), (1-32, a3)')
(mod_located <- mdirt(dat, model, customTheta = diag(3)))
summary(mod_located)
#-----------------
### DINA model example
# generate some suitable data for a two dimensional DINA application
# (first columns are intercepts)
set.seed(1)
Theta <- expand.table(matrix(c(1,0,0,0,
1,1,0,0,
1,0,1,0,
1,1,1,1), 4, 4, byrow=TRUE),
freq = c(200,200,100,500))
a <- matrix(c(rnorm(15, -1.5, .5), rlnorm(5, .2, .3), numeric(15), rlnorm(5, .2, .3),
numeric(15), rlnorm(5, .2, .3)), 15, 4)
guess <- plogis(a[11:15,1]) # population guess
slip <- 1 - plogis(rowSums(a[11:15,])) # population slip
dat <- simdata(a, Theta=Theta, itemtype = 'lca')
# first column is the intercept, 2nd and 3rd are attributes
theta <- cbind(1, thetaComb(0:1, 2))
theta <- cbind(theta, theta[,2] * theta[,3]) #DINA interaction of main attributes
model <- mirt.model('Intercept = 1-15
A1 = 1-5
A2 = 6-10
A1A2 = 11-15')
# last 5 items are DINA (first 10 are unidimensional C-RUMs)
DINA <- mdirt(dat, model, customTheta = theta)
coef(DINA, simplify=TRUE)
summary(DINA)
M2(DINA) # fits well (as it should)
cfs <- coef(DINA, simplify=TRUE)$items[11:15,]
cbind(guess, estguess = plogis(cfs[,1]))
cbind(slip, estslip = 1 - plogis(rowSums(cfs)))
### DINO model example
theta <- cbind(1, thetaComb(0:1, 2))
# define theta matrix with negative interaction term
(theta <- cbind(theta, -theta[,2] * theta[,3]))
model <- mirt.model('Intercept = 1-15
A1 = 1-5, 11-15
A2 = 6-15
Yoshi = 11-15
CONSTRAIN = (11,a2,a3,a4), (12,a2,a3,a4), (13,a2,a3,a4),
(14,a2,a3,a4), (15,a2,a3,a4)')
# last five items are DINOs (first 10 are unidimensional C-RUMs)
DINO <- mdirt(dat, model, customTheta = theta)
coef(DINO, simplify=TRUE)
summary(DINO)
M2(DINO) #doesn't fit as well, because not the generating model
## C-RUM (analogous to MIRT model)
theta <- cbind(1, thetaComb(0:1, 2))
model <- mirt.model('Intercept = 1-15
A1 = 1-5, 11-15
A2 = 6-15')
CRUM <- mdirt(dat, model, customTheta = theta)
coef(CRUM, simplify=TRUE)
summary(CRUM)
# good fit, but over-saturated (main effects for items 11-15 can be set to 0)
M2(CRUM)
#------------------
# multidimensional latent class model
dat <- key2binary(SAT12,
key = c(1,4,5,2,3,1,2,1,3,1,2,4,2,1,5,3,4,4,1,4,3,3,4,1,3,5,1,3,1,5,4,5))
# 5 latent classes within 2 different sets of items
model <- mirt.model('C1 = 1-16
C2 = 1-16
C3 = 1-16
C4 = 1-16
C5 = 1-16
C6 = 17-32
C7 = 17-32
C8 = 17-32
C9 = 17-32
C10 = 17-32
CONSTRAIN = (1-16, a1), (1-16, a2), (1-16, a3), (1-16, a4), (1-16, a5),
(17-32, a6), (17-32, a7), (17-32, a8), (17-32, a9), (17-32, a10)')
theta <- diag(10) # defined explicitly. Otherwise, this profile is assumed
mod <- mdirt(dat, model, customTheta = theta)
coef(mod, simplify=TRUE)
summary(mod)
#------------------
# multiple group with constrained group probabilities
dat <- key2binary(SAT12,
key = c(1,4,5,2,3,1,2,1,3,1,2,4,2,1,5,3,4,4,1,4,3,3,4,1,3,5,1,3,1,5,4,5))
group <- rep(c('G1', 'G2'), each = nrow(SAT12)/2)
Theta <- diag(2)
# the latent class parameters are technically located in the (nitems + 1) location
model <- mirt.model('A1 = 1-32
A2 = 1-32
CONSTRAINB = (33, c1)')
mod <- mdirt(dat, model, group = group, customTheta = Theta)
coef(mod, simplify=TRUE)
summary(mod)
#------------------
# Probabilistic Guttman Model (Proctor, 1970)
# example analysis can also be found in the sirt package (see ?prob.guttman)
data(data.read, package = 'sirt')
head(data.read)
Theta <- matrix(c(1,0,0,0,
1,1,0,0,
1,1,1,0,
1,1,1,1), 4, byrow=TRUE)
model <- mirt.model("INTERCEPT = 1-12
C1 = 1,7,9,11
C2 = 2,5,8,10,12
C3 = 3,4,6")
mod <- mdirt(data.read, model, customTheta=Theta)
summary(mod)
M2(mod)
itemfit(mod)
} # }