In neuropsychological research, researchers can choose from a near-limitless number of approaches when designing studies. Here we showcase the multiverse/specification-curve technique for establishing the robustness of analytic choices in classic psychometric test validation, using an example test of executive function. We examined the impact of choices regarding sample group, sample size, test metric, and covariate inclusion on convergent validity correlations between tests of executive function. Data were available for 87 neurologically healthy adults and 117 stroke survivors, and a total of 2,220 different analyses were run in a multiverse analysis. We found that the sample group, sample size, and test metric used affected validation outcomes, whereas covariate inclusion did not affect the observed coefficients. The present analysis demonstrates the importance of carefully justifying every aspect of a psychometric test validation study a priori, with both theoretical and statistical factors in mind. It is essential to thoroughly consider the purpose and intended use of a new tool when designing validation studies.
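To illustrate the general logic of a multiverse/specification-curve analysis of this kind, the sketch below enumerates combinations of analytic choices (sample group, sample size, test metric, covariate adjustment) and computes a convergent validity correlation for each. This is a minimal, hypothetical example: the variable names, metrics, covariate, and simulated data are assumptions for illustration only and do not reproduce the study's actual pipeline or results.

```python
# Minimal multiverse / specification-curve sketch (illustrative only).
# Column names, groups, metrics, and data are hypothetical placeholders,
# simulated so the script runs standalone; they are not the study's data.
import itertools
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)

# Simulated dataset: two samples, two metrics for the new test, one covariate
n = 204  # mirrors 87 healthy adults + 117 stroke survivors in the abstract
df = pd.DataFrame({
    "group": ["healthy"] * 87 + ["stroke"] * 117,
    "age": rng.normal(55, 15, n),
    "new_test_accuracy": rng.normal(0, 1, n),
    "new_test_rt": rng.normal(0, 1, n),
    "reference_test": rng.normal(0, 1, n),
})

# Analytic choices that define the multiverse of specifications
specs = list(itertools.product(
    ["healthy", "stroke", "combined"],      # sample group
    [30, 60, "all"],                        # sample size
    ["new_test_accuracy", "new_test_rt"],   # test metric
    [False, True],                          # adjust for age?
))

results = []
for group, size, metric, adjust_age in specs:
    data = df if group == "combined" else df[df["group"] == group]
    if size != "all":
        data = data.sample(n=min(size, len(data)), random_state=0)
    x, y = data[metric], data["reference_test"]
    if adjust_age:
        # crude covariate adjustment: residualize both variables on age
        x = x - np.polyval(np.polyfit(data["age"], x, 1), data["age"])
        y = y - np.polyval(np.polyfit(data["age"], y, 1), data["age"])
    r, p = stats.pearsonr(x, y)
    results.append({"group": group, "n": len(data), "metric": metric,
                    "covariate_age": adjust_age, "r": r, "p": p})

# Sorting the coefficients across specifications gives the specification curve
curve = pd.DataFrame(results).sort_values("r").reset_index(drop=True)
print(curve)
```

Plotting the sorted coefficients, together with markers indicating which choice was made on each dimension for each estimate, yields the familiar specification-curve figure and shows which analytic decisions drive variation in the validity coefficients.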