Digital Literacy, Test Anxiety, and Achievement in STEM Examinations: A Comparative Study of Computer-Based Testing and Traditional Paper-Based Assessment

Authors

  • Abdulrazaq Shehu Department of Science Education, University of Ilorin, Nigeria
  • Rihanat Aduke Ahmed Department of Science Education, University of Ilorin, Nigeria
  • Basira Jibril Abdulrahim Department of Life Science Education, Kwara State University of Education, Ilorin, Nigeria
  • Zakariyau Adebayo Bello Department of Life Science Education, Kwara State University of Education, Ilorin, Nigeria
  • Beatrice Yetunde Olanrewaju Department of Physical Science Education, Kwara State University of Education, Ilorin, Nigeria
  • Hassan Bisiriyu Ibrahim Department of Integrated Science, Kwara State College of Education, Ilorin, Nigeria
  • Q Yahaya Department of Physical Science Education, Kwara State University of Education, Ilorin, Nigeria
  • Semiu Ayinla Salawu Department of Physical Science Education, Kwara State University of Education, Ilorin, Nigeria
  • Zainab Bolajoko Atotileto Department of Integrated Science, Kwara State College of Education, Ilorin, Nigeria
  • Gloria Ibidun Adeniyi Department of Science Education, University of Ilorin, Nigeria

DOI:

https://doi.org/10.63561/fnas-jmse.v7i2.1091

Keywords:

Computer-Based Testing, STEM Education, Digital Literacy, Assessment Format, Mixed-Method Research

Abstract

Assessment plays a pivotal role in evaluating learning outcomes in science, technology, engineering, and mathematics (STEM) education, where conceptual understanding and problem-solving skills are critical. With the growing adoption of digital technologies, computer-based testing (CBT) has emerged alongside traditional written (paper-based) examinations, prompting debate about their comparative effectiveness. This study investigates the impact of both formats on academic performance, cognitive engagement, test anxiety, and student perceptions in Nigerian tertiary institutions. The study adopted a mixed-methods design in which quantitative performance and perception data were complemented by qualitative insights from focus groups. Findings revealed that written-examination participants achieved marginally higher scores (M = 71.3) than their CBT counterparts (M = 68.5), a difference that reached statistical significance (p = 0.006). While CBT offered efficiency and rapid feedback, it was associated with higher anxiety levels and limitations in assessing complex reasoning. Written examinations, by contrast, better supported extended responses, graphical representation, and multi-step problem solving. The study recommends a balanced, context-sensitive assessment strategy that integrates both formats to harness their respective strengths, improve equity, and optimize learning outcomes in STEM disciplines.
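
The abstract reports the central quantitative result as a comparison of group means (M = 71.3 for written examinations vs. M = 68.5 for CBT, p = 0.006) but does not state on this page which statistical test was used. The following is a minimal sketch only, assuming an independent-samples (Welch's) t-test on simulated score data; the sample sizes, standard deviations, and the use of scipy are illustrative assumptions, not the authors' actual analysis.

```python
# Illustrative sketch only: the abstract gives mean scores of 71.3 (written) and
# 68.5 (CBT) with p = 0.006 but does not specify the test statistic. This example
# assumes an independent-samples (Welch's) t-test on hypothetical score samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical samples whose means approximate the values reported in the abstract;
# group sizes and spread are assumptions for demonstration purposes.
written_scores = rng.normal(loc=71.3, scale=8.0, size=120)
cbt_scores = rng.normal(loc=68.5, scale=8.0, size=120)

t_stat, p_value = stats.ttest_ind(written_scores, cbt_scores, equal_var=False)

print(f"Written examination mean: {written_scores.mean():.1f}")
print(f"CBT mean:                 {cbt_scores.mean():.1f}")
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.4f}")
```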



Published

12/30/2025

How to Cite

Shehu, A., Ahmed, R. A., Abdulrahim, B. J., Bello, Z. A., Olanrewaju, B. Y., Ibrahim, H. B., … Adeniyi, G. I. (2025). Digital Literacy, Test Anxiety, and Achievement in STEM Examinations: A Comparative Study of Computer-Based Testing and Traditional Paper-Based Assessment. Faculty of Natural and Applied Sciences Journal of Mathematics and Science Education, 7(2), 62–71. https://doi.org/10.63561/fnas-jmse.v7i2.1091
