Effect of Diverse Assessment Methods on Undergraduate Engineering Students’ Performance in Mathematics Courses
DOI: https://doi.org/10.70232/jrmste.v3i1.53

Keywords: Assessment Diversity, Formative Assessment, Mathematical Software, Mixed-Method Research, Online Quizzes, Poster Presentations, Student Engagement, Viva Voce

Abstract
Assessment plays a crucial role in shaping how students engage with mathematics in engineering programs. Traditional summative examinations often emphasize rote learning rather than problem-solving or conceptual understanding. This study investigates the impact of diverse assessment methods—including online quizzes, assignments, poster presentations, viva voce, mathematical software applications, note checking, and class participation—on undergraduate engineering students’ performance in mathematics courses. The objectives were to evaluate which assessment strategies most effectively enhance learning outcomes and to provide actionable insights for improving assessment practices in higher education. A quasi-experimental design was employed with a sample of second-year undergraduate engineering students enrolled in core mathematics courses. Data were collected using structured quizzes, software-based tasks, poster presentations, viva assessments, and student surveys, and analyzed using descriptive and inferential statistics. Findings revealed that interactive and technology-driven methods, such as online quizzes and mathematical software tasks, had the strongest positive influence on student performance, while passive methods like note checking showed minimal impact. Assignments and viva voce were perceived as valuable for developing analytical reasoning and verbal articulation, although they sometimes generated stress. Poster presentations were effective in enhancing communication and creativity, but received mixed responses regarding mathematical depth. These results align with educational psychology frameworks, suggesting that active engagement reduces cognitive load and enhances intrinsic motivation. The study concludes that integrating diverse and interactive assessments fosters deeper conceptual understanding, problem-solving ability, and sustained engagement among engineering students. Implications for faculty, curriculum developers, policy makers, and educational technologists are discussed, particularly in terms of redesigning assessment frameworks and supporting professional development for educators.
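The abstract states that performance data were analyzed with descriptive and inferential statistics. As an illustration only — the scores below are invented for the sketch and are not the study's data — a typical inferential comparison of two assessment groups can be done with Welch's two-sample t-test, here implemented from scratch in Python so no external statistics package is assumed:

```python
import math
import statistics

# Hypothetical final-exam percentages (illustrative, not the study's data):
# one group assessed mainly via online quizzes, the other via note checking.
quiz_group = [78, 85, 91, 74, 88, 82, 95, 79, 86, 90]
note_group = [65, 72, 58, 70, 61, 69, 75, 63, 68, 71]

def welch_t(a, b):
    """Welch's two-sample t statistic and approximate degrees of freedom.

    Welch's variant does not assume equal variances in the two groups,
    which is usually the safer default for quasi-experimental designs.
    """
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)  # sample variances
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb            # squared standard error of the mean difference
    t = (ma - mb) / math.sqrt(se2)
    # Welch–Satterthwaite approximation for the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

t, df = welch_t(quiz_group, note_group)
print(f"mean difference = {statistics.mean(quiz_group) - statistics.mean(note_group):.1f}")
print(f"Welch t = {t:.2f}, df = {df:.1f}")
```

The resulting t statistic would then be compared against the t distribution with df degrees of freedom to obtain a p-value; descriptive statistics (means, variances) fall out of the same computation.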
License
Copyright (c) 2026 Nandini Rai, Yogesh Thakkar, Ravindra Salvi

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
