The purpose of this study was to examine the psychometric properties of the Precalculus Concept Assessment (PCA), a 25-item multiple-choice instrument designed to assess student reasoning abilities and understanding of foundational calculus concepts (Carlson et al., 2010). When this study was conducted, the extant research on the PCA and the PCA Taxonomy lacked in-depth investigations of the instruments' psychometric properties. Most notable was the lack of studies examining the validity of the internal structure of PCA response data implied by the PCA Taxonomy. This study specifically investigated the psychometric properties of the three reasoning constructs found in the PCA Taxonomy, namely, Process View of Function (R1), Covariational Reasoning (R2), and Computational Abilities (R3).

Confirmatory Factor Analysis (CFA) was conducted using a total of 3,018 pretest administrations of the PCA. These data were collected in select College Algebra and Precalculus sections at a large private university in the Mountain West and one public university in the Phoenix metropolitan area. Results showed that the three hypothesized reasoning factors were highly correlated. Rival statistical models were evaluated to explain the relationship between the three reasoning constructs. The bifactor model was the best-fitting model and successfully partitioned the variance between a general reasoning ability factor and two specific reasoning ability factors. The general factor was dominant, accounting for 76% of the variance and 91% of the reliability. The omegaHS values were low, indicating that this model does not serve as a reliable measure of the two specific factors.

PCA response data were retrofitted to diagnostic classification models (DCMs) to evaluate the extent to which individual mastery profiles could be generated to classify individuals as masters or non-masters of the three reasoning constructs. The retrofitting of PCA data to DCMs was unsuccessful. High attribute correlations and other model deficiencies limit the confidence with which these particular models could estimate student mastery.

The results of this study have several key implications for future researchers and practitioners using the PCA. Researchers interested in using PCA scores in predictive models should use the General Reasoning Ability factor from the respecified bifactor model or the single-factor model in conjunction with structural equation modeling techniques. Practitioners using the PCA should avoid using PCA subscores for reasoning abilities and continue to follow the recommended practice of reporting a simple sum score (i.e., unit-weighted composite score).
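As context for the omega statistics discussed above, the following is a minimal sketch of how omega-hierarchical (omegaH, reliability attributable to the general factor) and omega-hierarchical-subscale (omegaHS, reliability attributable to a specific factor) are computed from standardized bifactor loadings. The loadings below are purely hypothetical illustrations, not the study's estimates.

```python
# Hypothetical standardized bifactor loadings for six items: each item loads
# on a general factor g and on exactly one of two specific factors.
g = [0.70, 0.60, 0.65, 0.70, 0.60, 0.55]   # general-factor loadings
s1 = [0.30, 0.25, 0.20]                    # specific factor 1 (items 1-3)
s2 = [0.20, 0.25, 0.30]                    # specific factor 2 (items 4-6)
# Uniquenesses: variance not explained by the general or specific factor.
u = [1 - gi**2 - si**2 for gi, si in zip(g, s1 + s2)]

def omega_h(g, specifics, u):
    """omegaH: share of total-score variance due to the general factor."""
    num = sum(g) ** 2
    return num / (num + sum(sum(s) ** 2 for s in specifics) + sum(u))

def omega_hs(g_sub, s_sub, u_sub):
    """omegaHS: share of a subscale score's variance due to its specific factor."""
    num = sum(s_sub) ** 2
    return num / (sum(g_sub) ** 2 + num + sum(u_sub))

print(round(omega_h(g, [s1, s2], u), 2))    # ≈ 0.77
print(round(omega_hs(g[:3], s1, u[:3]), 2))  # ≈ 0.10
```

With loadings of this shape, omegaH is high while omegaHS is very low: subscale scores mostly reflect the general factor, which mirrors the abstract's recommendation against reporting PCA subscores for the specific reasoning abilities.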



College and Department

David O. McKay School of Education; Educational Inquiry, Measurement, and Evaluation




Keywords

factor analysis, Diagnostic Classification Models (DCMs), calculus, mathematics education


