Deborah Schnipke, Ph.D.
Senior Psychometrician
Deborah has over 25 years of experience in measurement, providing psychometric expertise for all aspects of the test development process in a variety of fields. She earned her B.S. in psychology and statistics at Bowling Green State University in Ohio, and her M.A. and Ph.D. in quantitative psychology from Johns Hopkins University in Baltimore, Maryland.
Her work encompasses the full spectrum of psychometric activities: conducting job task analysis studies, developing test specifications, training item writers and reviewers, and performing classical and IRT item and test analyses. Deborah's work also includes assembling balanced test forms, conducting standard setting studies, and scaling and equating test forms. She has investigated test security breaches, provided guidance and psychometric services for third-party accreditation, audited testing programs for adherence to psychometric standards, and performed differential item functioning and timing analyses.
She has conducted and published research on a broad range of topics, including job task analyses, item selection algorithms, adaptive testing, response time analyses, differential item functioning, test security, and test design. For a selection of Deborah's research and publications, please see the bottom of the page.
She is invested in ensuring that exams are reliable, valid, fair, and in compliance with industry standards, such as the AERA/APA/NCME standards and the NCCA accreditation standards. She has experience as a speaker, reviewer, contributor, and author for major psychometric journals and conferences.
Deborah lives in Wooster, OH with one human, two dogs, three cats, four rabbits, five chickens, and six more chickens. When she is not feeding animals or cleaning up after them, she enjoys ballroom dancing and watching her son play hockey. An avid reader, she is usually in the middle of several non-fiction books at any given time. She has a large collection of cookbooks but almost never cooks. She is able to meditate for five whole minutes at a time.
Dr. Schnipke’s Selected Scholarly Research
Roussos, L.A., Schnipke, D.L., & Pashley, P.J. (1999). A generalized formula for the Mantel-Haenszel differential item functioning parameter. Journal of Educational and Behavioral Statistics, 24, 293-322.
Schnipke, D.L. (1996). Assessing speededness in computer-based tests using item response times. Dissertation Abstracts International, 57(01), 759B.
Schnipke, D.L., & Becker, K. (2007). Making the test development process more efficient using web-based virtual meetings. CLEAR Exam Review, 18, 13-17.
Schnipke, D.L., Becker, K., & Masters, J.S. (2006). Evaluating content-management systems for online learning programs. In D.D. Williams, S.L. Howell, & M. Hricko (Eds.), Online Assessment, Measurement and Evaluation: Emerging Practices (Chapter 17). Hershey, PA: Information Science Publishing.
Schnipke, D.L., & Green, B.F. (1995). A comparison of item selection routines in linear and adaptive tests. Journal of Educational Measurement, 32, 227-242.
Schnipke, D.L., & Scrams, D.J. (1997). Modeling item response times with a two-state mixture model: A new method of measuring speededness. Journal of Educational Measurement, 34, 213-232.
Schnipke, D.L., & Scrams, D.J. (2002). Exploring issues of examinee behavior: Insights gained from response-time analyses. In C.N. Mills, M.T. Potenza, J.J. Fremer, & W.C. Ward (Eds.), Computer-based testing: Building the foundation for future assessments. Mahwah, NJ: Lawrence Erlbaum Associates.
Schnipke, D.L., & Wiley, A. (2019). Selection and use of item types. In J. Henderson (Ed.), Certification: The ICE Handbook, Third Edition. Washington, DC: Institute for Credentialing Excellence.
van der Linden, W.J., Scrams, D.J., & Schnipke, D.L. (1999). Using response-time constraints in item selection to control for differential speededness in computerized adaptive testing. Applied Psychological Measurement, 23, 195-210.
Wang, N., Schnipke, D., & Witt, E. (2005). Use of knowledge, skill and ability statements in developing licensure and certification examinations. Educational Measurement: Issues and Practice, 24(1), 15-22.