Factor analysis methods and validity evidence: a review of instrument development across the medical education continuum.

2012 
PURPOSE: Instrument development consistent with best practices is necessary for effective assessment and evaluation of learners and programs across the medical education continuum. The author explored the extent to which current factor analytic methods and other techniques for establishing validity are consistent with best practices.

METHOD: The author conducted electronic and hand searches of the English-language medical education literature published January 2006 through December 2010. To describe and assess current practices, she systematically abstracted reliability and validity evidence, factor analysis methods, data analysis procedures, and reported evidence from instrument development articles that reported applying exploratory factor analysis or principal component analysis.

RESULTS: Sixty-two articles met the eligibility criteria; they described 64 instruments and 95 factor analyses. Most studies provided at least one source of evidence based on test content. Almost all reported internal consistency, providing evidence based on internal structure. Evidence based on response process and on relationships with other variables was reported less often, and evidence based on consequences of testing was not identified. The factor analysis findings suggest common method selection errors and critical omissions in reporting.

CONCLUSIONS: Given the limited reliability and validity evidence provided for the reviewed instruments, educators should carefully consider the available supporting evidence before adopting and applying published instruments. Researchers should design for, test, and report additional evidence to strengthen the argument for the reliability and validity of these measures for research and practice.
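The "method selection errors" noted in the results commonly amount to treating principal component analysis (a total-variance data-reduction technique) as if it were exploratory factor analysis (a common-variance latent-variable model). The sketch below contrasts the two on simulated item responses; scikit-learn, the simulated two-factor structure, and all variable names are illustrative assumptions, not drawn from the reviewed studies.

```python
# Minimal illustrative sketch (assumes NumPy and scikit-learn; the reviewed
# studies used a variety of statistical packages, so this is not their method).
import numpy as np
from sklearn.decomposition import FactorAnalysis, PCA

rng = np.random.default_rng(0)

# Simulate 200 respondents answering 6 items driven by 2 latent factors plus noise.
latent = rng.normal(size=(200, 2))
loadings = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],
                     [0.0, 0.8], [0.1, 0.7], [0.0, 0.9]])
items = latent @ loadings.T + rng.normal(scale=0.4, size=(200, 6))

# Exploratory factor analysis models only the shared (common) variance,
# so its loadings estimate the latent-factor structure.
efa = FactorAnalysis(n_components=2).fit(items)

# Principal component analysis summarizes total variance (data reduction);
# its components are not estimates of latent factors.
pca = PCA(n_components=2).fit(items)

print("EFA loadings:\n", efa.components_.T.round(2))
print("PCA loadings:\n", pca.components_.T.round(2))
```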