Computer-based assessment in mathematics

Issues about validity


  • Anneli Dyrvold Department of Education, Uppsala University, Sweden
  • Ida Bergvall Department of Education, Uppsala University, Sweden



Keywords: computer-based assessment, dynamic, interactive, validity, transfer


Computer-based assessment is becoming increasingly common in mathematics education, and because digital media place different demands on students than paper-based tests, potential threats to validity must be considered. In this study we investigate how preparatory instructions and digital familiarity may be of importance for test validity. Seventy-seven lower secondary students participated in the study and were divided into two groups that received different instructions about five types of dynamic and/or interactive functions in digital mathematics items. One group received a verbal and visual instruction, whereas the other group was also given the opportunity to try using the functions themselves. The students were monitored using eye-tracking equipment during their work with mathematics items containing the five types of functions. The results revealed differences in how the students engaged with the dynamic functions, depending on the preparatory instructions they had received. One conclusion is that students need to be very familiar with dynamic and interactive functions in tests if validity is to be ensured. Validity also depends on the type of dynamic function used.


Aldon, G. & Panero, M. (2020). Can digital technology change the way mathematics skills are assessed? ZDM, 52(7), 1333–1348. DOI:

Baccaglini-Frank, A. (2021). To tell a story, you need a protagonist: How dynamic interactive mediators can fulfil this role and foster explorative participation to mathematical discourse. Educational Studies in Mathematics, 106(2), 291–312. DOI:

Barana, A., Marchisio, M., & Sacchet, M. (2021). Interactive feedback for learning mathematics in a digital learning environment. Education Sciences, 11(6), 279–290. DOI:

Bennett, R. E. (2015). The changing nature of educational assessment. Review of Research in Education, 39(1), 370–407. DOI:

Bennett, R. E., Braswell, J., Oranje, A., Sandene, B., Kaplan, B., & Yan, F. (2008b). Does it matter if I take my mathematics test on computer? A second empirical study of mode effects in NAEP. Journal of Technology, Learning, and Assessment, 6(9), 1–38.

Bennett, S., Maton, K., & Kervin, L. (2008a). The “digital natives” debate: A critical review of the evidence. British Journal of Educational Technology, 39(5), 775–786. DOI:

Cohen, J. W. (1983). Statistical power analysis for the behavioral sciences (2nd ed.). Lawrence Erlbaum Associates.

College Board. (2022). Assessment framework for the digital SAT suite, version 1.0 (June 2022). College Board.

Dadey, N., Lyons, S., & DePascale, C. (2018). The comparability of scores from different digital devices: A literature review and synthesis with recommendations for practice. Applied Measurement in Education, 31(1), 30–50. DOI:

Davis, L. L., Morrison, K., Zhou-Yile Schnieders, J., & Marsh, B. (2021). Developing authentic digital math assessments. Journal of Applied Testing Technology, 22(1), 1–11. Retrieved from

Dyrvold, A. (2022). Missed opportunities in digital teaching platforms: Under-use of interactive and dynamic elements. Journal of Computers in Mathematics and Science Teaching, 41(2), 135–161.

Gass, S. M., & Selinker, L. (1983). Language transfer in language learning. Issues in second language research. Newbury House Publishers, Inc.

Geraniou, E., & Jankvist, U. T. (2019). Towards a definition of “mathematical digital competency”. Educational Studies in Mathematics, 102, 29–45. DOI:

Goldstone, R., Son, J. Y., & Landy, D. (2008). A well grounded education: The role of perception in science and mathematics. Symbols and embodiment (pp. 327–356). Oxford University Press. DOI:

Hamhuis, E., Glas, C., & Meelissen, M. (2020). Tablet assessment in primary education: Are there performance differences between TIMSS’ paper-and-pencil test and tablet test among Dutch grade-four students? British Journal of Educational Technology, 51(6), 2340–2358. DOI:

Hanho, J. (2014). A comparative study of scores on computer-based tests and paper-based tests. Behaviour & Information Technology, 33(4), 410–422. DOI:

Harris, D., Logan, T., & Lowrie, T. (2021). Unpacking mathematical-spatial relations: Problem-solving in static and interactive tasks. Mathematics Education Research Journal, 33(3), 495–511. DOI:

Helsper, E. J., & Eynon, R. (2010). Digital natives: where is the evidence? British Educational Research Journal, 36(3), 503–520. DOI:

Hoch, S., Reinhold, F., Werner, B., Richter-Gebert, J., & Reiss, K. (2018). Design and research potential of interactive textbooks: the case of fractions. ZDM Mathematics Education, 50(5), 839–848. DOI:

Ilovan, O.-R., Buzila, S.-R., Dulama, M. E., & Buzila, L. (2018). Study on the features of geography/sciences interactive multimedia learning activities (IMLA) in a digital textbook. Romanian Review of Geographical Education, 7(1), 20–30. DOI:

Junpeng, P., Krotha, J., Chanayota, K., Tang, K., & Wilson, M. (2019). Constructing progress maps of digital technology for diagnosing mathematical proficiency. Journal of Education and Learning, 8(6), 90–102. DOI:

Kaminski, J. A., Sloutsky, V. M., & Heckler, A. F. (2013). The cost of concreteness: The effect of nonessential information on analogical transfer. Journal of Experimental Psychology. Applied, 19(1), 14–29. DOI:

Lemmo, A. (2021). A tool for comparing mathematics tasks from paper-based and digital environments. International Journal of Science and Mathematics Education, 19(8), 1655–1675. DOI:

Lobato, J. (2012). The actor-oriented transfer perspective and its contributions to educational research and practice. Educational Psychologist, 47(3), 232–247. DOI:

Lobato, J., & Hohensee, C. (2021). Current conceptualisations of the transfer of learning and their use in STEM education research. In C. Hohensee & J. Lobato (Eds.), Transfer of learning: Progressive perspectives for mathematics education and related fields (pp. 3–26). Springer. DOI:

Lobato, J., & Siebert, D. (2002). Quantitative reasoning in a reconceived view of transfer. The Journal of Mathematical Behavior, 21(1), 87–116. DOI:

Messick, S. (1995). Validity of psychological assessment: Validation of inferences from persons’ responses and performances as scientific inquiry into score meaning. American Psychologist, 50(9), 741–749. DOI:

Mullis, I. V. S., & Martin, M. O. (Eds.). (2017). TIMSS 2019 Assessment Frameworks. Retrieved from Boston College, TIMSS & PIRLS International Study Center website:

Nathan, M. J. & Alibali, M. W. (2021). An embodied theory of transfer of mathematical learning. In C. Hohensee & J. Lobato (Eds.), Transfer of learning: Progressive perspectives for mathematics education and related fields (pp. 27–58). Springer. DOI:

OECD (2013). PISA 2012 assessment and analytical framework: Mathematics, reading, science, problem solving and financial literacy. OECD Publishing.

OECD (2021). 21st-Century Readers: Developing Literacy Skills in a Digital World. PISA, OECD Publishing. DOI:

O’Halloran, K. L., Beezer, R. A., & Farmer, D. W. (2018). A new generation of mathematics textbook research and development. ZDM Mathematics Education, 50(5), 863–879. DOI:

Prensky, M. (2001). Digital Natives, Digital Immigrants. On the Horizon, 9(5), 1–6. DOI:

Regeringen (2017). Uppdrag att digitalisera de nationella proven [Assignment to digitise the national tests].

Ripley, M. (2009). Transformational computer-based testing. In F. Scheuermann & J. Björnsson (Eds.), The transition to computer-based assessment (pp. 92–98). Office for Official Publications of the European Communities.

Smolinsky, L., Marx, B. D., Olafsson, G., & Ma, Y. A. (2020). Computer-based and paper-and-pencil tests: A study in calculus for STEM majors. Journal of Educational Computing Research, 58(7), 1256–1278. DOI:

Thorndike, E. L., & Woodworth, R. S. (1901). The influence of improvement in one mental function upon the efficiency of other functions. Psychological Review, 8, 247–261. DOI:

Usiskin, Z. (2018). Electronic vs. paper textbook presentations of the various aspects of mathematics. ZDM Mathematics Education, 50(5), 849–861. DOI:

Yerushalmy, M., & Olsher, S. (2020). Online assessment of students’ reasoning when solving example-eliciting tasks: Using conjunction and disjunction to increase the power of examples. ZDM Mathematics Education, 52(5), 1033–1049. DOI:





How to Cite

Dyrvold, A., & Bergvall, I. (2023). Computer-based assessment in mathematics: Issues about validity. LUMAT: International Journal on Math, Science and Technology Education, 11(3), 49–76.