Identifying Statistically Actionable Collusion in Remote Proctored Exams

Authors

  • Kirk Becker, Pearson VUE, 4627 N Leclaire, Chicago, IL 60630
  • H. Meng, Senior Psychometrician, Amazon Web Services (AWS), 9963 Cyrandall Drive, Oakton, VA 22124

Keywords:

Test Security, Online Proctored Exams, Cheating Detection

Abstract

The rise of online proctoring potentially provides more opportunities for item harvesting and the consequent brain dumping and shared “study guides” based on stolen content. This has increased the need for rapid approaches to evaluating and acting on suspicious test responses in every delivery modality. Both hiring proxy test takers and studying unauthorized test content (e.g., “study guides” or brain dumps) result in characteristic patterns of responses, many of which are detectable through collusion analysis. The ability to identify and rapidly revoke test results is one component of stopping test takers from engaging in these behaviors, both in online proctored and test center testing. Existing collusion analyses have typically evaluated all response pairs sequentially, potentially requiring several days to process a set of test results. This paper demonstrates matrix-based methods for quickly calculating exact overlap counts for large data sets, as well as approaches for determining criteria for flagging suspicious results or invalidating results. We compare results from simulations and probability calculations and discuss the operational implications of these decisions.
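
The matrix-based overlap computation referenced in the abstract can be illustrated with a short sketch. The Python/NumPy code below is an illustration under assumed response coding, not the authors' implementation: it one-hot encodes each examinee's responses and uses a single matrix product to obtain exact same-response counts for every pair of examinees at once, replacing a sequential pair-by-pair loop. The function name overlap_counts and the flagging threshold are hypothetical.

    import numpy as np

    def overlap_counts(responses, n_options):
        """Return an N x N matrix of exact same-response counts.

        responses : (N examinees x I items) integer array of option indices,
                    with -1 marking a missing response.
        n_options : number of response options per item.
        """
        n_examinees, n_items = responses.shape
        # One-hot encode each (item, option) pair; missing responses stay all-zero.
        onehot = np.zeros((n_examinees, n_items * n_options), dtype=np.int32)
        rows, cols = np.nonzero(responses >= 0)
        onehot[rows, cols * n_options + responses[rows, cols]] = 1
        # One matrix product yields every pairwise overlap count simultaneously.
        return onehot @ onehot.T

    # Illustrative use: flag examinee pairs whose overlap meets a chosen threshold.
    rng = np.random.default_rng(0)
    resp = rng.integers(0, 4, size=(1000, 60))
    counts = overlap_counts(resp, n_options=4)
    i, j = np.triu_indices_from(counts, k=1)
    flagged = [(a, b, counts[a, b]) for a, b in zip(i, j) if counts[a, b] >= 55]

The threshold used here (55 of 60 items) is purely illustrative; as the abstract notes, operational criteria for flagging or invalidating results would be set from simulations or probability calculations.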

Published

2022-03-20

How to Cite

Becker, K., & Meng, H. (2022). Identifying Statistically Actionable Collusion in Remote Proctored Exams. Journal of Applied Testing Technology, 23, 54–61. Retrieved from http://jattjournal.net/index.php/atp/article/view/167810


