The Impact of Technology-Enhanced Items on Test-Taker Disengagement

Authors

  • Steven L. Wise, NWEA, 121 NW Everett St, Portland, OR 97209
  • James Soland, Curry School of Education, University of Virginia, 417 Emmet St. S, Charlottesville, VA 22903
  • L. M. Dupray, NWEA, 121 NW Everett St, Portland, OR 97209

Keywords:

Technology-Enhanced Items, Test-Taking Engagement

Abstract

Technology-Enhanced Items (TEIs) have been purported to be more motivating and engaging to test takers than traditional multiple-choice items. This claim of enhanced engagement, however, has thus far received limited research attention. This study examined the rates of rapid-guessing behavior received by three item types (multiple choice, multiple select, and TEIs) on a commonly used K-12 adaptive achievement test. Across three subject areas, the TEIs consistently showed the lowest rapid-guessing rates, suggesting that their use may help mitigate the problem of disengaged test taking.
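The rapid-guessing rates compared in the study are, at their core, proportions of responses whose times fall below an item-level response-time threshold. The following is a minimal sketch of that computation, not the authors' analysis code; the data, column names, and threshold values are all illustrative assumptions:

```python
import pandas as pd

# Minimal sketch: estimating rapid-guessing rates by item type from
# response times. A response is flagged as a rapid guess when its time
# falls below the item's response-time threshold. All values are
# illustrative, not from the study.
responses = pd.DataFrame({
    "item_type": ["multiple_choice", "multiple_choice",
                  "multiple_select", "tei", "tei"],
    "response_time_sec": [1.8, 12.4, 14.0, 1.5, 22.3],
    "threshold_sec": [3.0, 3.0, 5.0, 4.0, 4.0],  # per-item rapid-guess threshold
})

# Flag rapid guesses: responses faster than the item's threshold.
responses["rapid_guess"] = (
    responses["response_time_sec"] < responses["threshold_sec"]
)

# Rapid-guessing rate per item type: the proportion of flagged responses.
rates = responses.groupby("item_type")["rapid_guess"].mean()
print(rates)
```

In practice, the per-item thresholds must themselves be set from the observed response-time distributions, and that choice of threshold-setting method is a substantive methodological decision in its own right.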

 


Published

2021-02-15

How to Cite

Wise, S. L., Soland, J., & Dupray, L. M. (2021). The Impact of Technology-Enhanced Items on Test-Taker Disengagement. Journal of Applied Testing Technology, 22(1), 28–36. Retrieved from http://jattjournal.net/index.php/atp/article/view/155941


