Refining Bayesian hierarchical MPT modeling: Integrating prior knowledge and ordinal expectations

Behav Res Methods. 2024 Apr 16. doi: 10.3758/s13428-024-02370-y. Online ahead of print.

ABSTRACT

Multinomial processing tree (MPT) models are a broad class of statistical models used to test sophisticated psychological theories. The research questions derived from these theories often go beyond simple condition effects on parameters and involve ordinal expectations (e.g., the same-direction effect on the memory parameter is stronger in one experimental condition than in another) or disordinal expectations (e.g., the effect reverses in one experimental condition). Here, we argue that, with refinements to common modeling practices, Bayesian hierarchical models are well suited to estimate and test these expectations. Concretely, we show that the default priors proposed in the literature lead to nonsensical predictions for individuals and the population distribution, causing problems not only in model comparison but also in parameter estimation. Rather than relying on these priors, we argue that MPT modelers should determine priors that are consistent with their theoretical knowledge. In addition, we demonstrate how Bayesian model comparison may be used to test ordinal and disordinal interactions by means of Bayes factors. We apply the techniques discussed to empirical data from Bell et al. (2015), Journal of Experimental Psychology: Learning, Memory, and Cognition, 41, 456-472.

PMID: 38627323 | DOI: 10.3758/s13428-024-02370-y
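To illustrate the kind of prior predictive check and ordinal Bayes factor the abstract refers to, here is a minimal sketch in Python (NumPy/SciPy). It assumes a probit-normal population distribution for a single MPT parameter; the prior scales mu_sd and sigma_upper, the function names, and the 0.05/0.95 cutoffs are illustrative stand-ins, not the paper's actual default priors or code.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2024)

def prior_predictive_theta(n_draws=100_000, mu_sd=10.0, sigma_upper=10.0):
    """Individual-level MPT parameters implied by a wide probit-normal prior.

    mu_sd and sigma_upper are illustrative stand-ins for "default" prior
    scales, not necessarily the exact defaults the paper critiques.
    """
    mu = rng.normal(0.0, mu_sd, n_draws)            # population mean (probit scale)
    sigma = rng.uniform(0.0, sigma_upper, n_draws)  # population SD (probit scale)
    eta = rng.normal(mu, sigma)                     # individual latent value
    return norm.cdf(eta)                            # back-transform to a probability

theta = prior_predictive_theta()
# With very wide priors, much of the prior mass piles up near 0 and 1,
# i.e., the prior "predicts" near-deterministic individuals:
print(np.mean((theta < 0.05) | (theta > 0.95)))

# Encompassing-prior style Bayes factor for an ordinal constraint
# (e.g., theta_A > theta_B): the ratio of the posterior to the prior
# probability that the constraint holds, estimated from MCMC draws of
# the unconstrained (encompassing) model.
def encompassing_bf(post_a, post_b, prior_a, prior_b):
    return np.mean(post_a > post_b) / np.mean(prior_a > prior_b)
```

The second function only sketches the encompassing-prior idea for testing an ordinal constraint from MCMC output of a fitted hierarchical MPT model; the Bayes factor computations reported in the paper may be implemented differently.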