Model evaluation and case series data

Research output: Contribution to journal › Article

Abstract

We discuss aggregation of data from neuropsychological patients and the process of evaluating models using data from a series of patients. We argue that aggregation can be misleading but not aggregating can also result in information loss. The basis for combining data needs to be theoretically defined, and the particular method of aggregation depends on the theoretical question and characteristics of the data. We present examples, often drawn from our own research, to illustrate these points. We also argue that statistical models and formal methods of model selection are a useful way to test theoretical accounts using data from several patients in multiple-case studies or case series. Statistical models can often measure fit in a way that explicitly captures what a theory allows; the parameter values that result from model fitting often measure theoretically important dimensions and can lead to more constrained theories or new predictions; and model selection allows the strength of evidence for models to be quantified without forcing this into the artificial binary choice that characterizes hypothesis testing methods. Methods that aggregate and then formally model patient data, however, are not automatically preferred to other methods. Which method is preferred depends on the question to be addressed, characteristics of the data, and practical issues like availability of suitable patients, but case series, multiple-case studies, single-case studies, statistical models, and process models should be complementary methods when guided by theory development.
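The abstract's point about model selection quantifying the strength of evidence, rather than forcing a binary accept/reject decision, can be illustrated with a minimal sketch. The example below uses Akaike's information criterion (AIC, from the keywords) and Akaike weights on hypothetical patient scores invented for illustration; it is not data or code from the paper, and the two candidate models (constant vs. linear accuracy) are assumptions for the sake of the example.

```python
import math

# Hypothetical case-series data (invented for illustration):
# accuracy for one patient across eight stimulus conditions.
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [0.91, 0.85, 0.80, 0.74, 0.71, 0.66, 0.60, 0.55]
n = len(x)

def aic(rss, k):
    """Gaussian-error AIC: n*ln(RSS/n) + 2k (additive constants dropped)."""
    return n * math.log(rss / n) + 2 * k

# Model 1: accuracy is constant across conditions (1 parameter).
mean_y = sum(y) / n
rss1 = sum((yi - mean_y) ** 2 for yi in y)

# Model 2: accuracy declines linearly (2 parameters), ordinary least squares.
mean_x = sum(x) / n
slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
         / sum((xi - mean_x) ** 2 for xi in x))
intercept = mean_y - slope * mean_x
rss2 = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))

aic1, aic2 = aic(rss1, 1), aic(rss2, 2)

# Akaike weights express relative evidence for each model on a
# continuous scale, rather than a binary significant/non-significant verdict.
best = min(aic1, aic2)
raw = [math.exp(-(a - best) / 2) for a in (aic1, aic2)]
weights = [r / sum(raw) for r in raw]
print(f"AIC constant={aic1:.1f}, linear={aic2:.1f}, "
      f"weights constant={weights[0]:.3f}, linear={weights[1]:.3f}")
```

With these invented scores the linear model fits far better, so nearly all of the Akaike weight falls on it; with noisier data the weights would split more evenly, directly expressing graded evidence in the way the abstract describes.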


Details

Original language: English
Pages (from-to): 486-499
Number of pages: 14
Journal: Cognitive Neuropsychology
Volume: 28
Issue: 7
State: Published - Oct 2011

Keywords

  • case series, model selection, aggregation, multiple-case studies, Akaike's information criterion, single-case studies
