The article presents a comparative analysis of the 23 most popular automated competency assessment platforms registered in the Russian Register of Software. The purpose of the study is to identify the predominant characteristics of functioning Russian automated competency assessment systems, based on analysis and generalization of the available data, and to determine the modernization potential of the Innopolis University automated platform.
The article is the first in the scientific literature to present a classification of automated competency assessment platforms.
The presented results indicate that more than half of the studied automated platforms are unable to assess the overall level of competency formation, since they focus solely on identifying knowledge and cognitive skills without addressing professional skills. The analysis of automated competency assessment platforms identifies a number of their predominant characteristics, which may be significant in the development or modernization of such platforms. These characteristics include virtualization and simulation of real production processes; a step-by-step evaluation process in which the theoretical part of the tests is completed first, followed by the practical part; and the combined use of a variety of assessment methods and tools.
The identified characteristics serve as the basis for the modernization of the Innopolis University automated platform and are recommended as a crucial element in the development of automated competency assessment systems.
Keywords: automated system, assessment, competency, assessment methods and tools, certification.
References
- D. V. Irtegov, T. V. Nesterenko, T. G. Churina. Automated assessment systems for programming tasks: development, use and prospects//Bulletin of the NSU. Series: «Information Technology». 2019. Vol. 17. № 2. P. 61-73.
- K. Ali, N. Barhom, M. Duggal. Online Assessment Platforms: What is on Offer?//European Journal of Dental Education. Vol. 27. 2022. doi: 10.1111/eje.12807.
- A. Jurane-Bremane. Digital Assessment in Technology-Enriched Education: Thematic Review//Education Sciences. 2023. 13 (5): 522. doi: 10.3390/educsci13050522.
- M. Bandtel, M. Baume, E. Brinkmann et al. (eds.). Digital Assessments in Higher Education — White paper of a Community Working Group from Germany, Austria and Switzerland, Berlin: Hochschulforum Digitalisierung, 2021.
- B. Hass, C. Yuan, Z. Li. On the Automatic Assessment of Learning Outcome in Programming Techniques. 2019. P. 274-278. doi: 10.1109/ISKE47853.2019.9170370.
- C. N. Blundell. Teacher use of digital technologies for schoolbased assessment: a scoping review, Assessment in Education: Principles, Policy & Practice, 2021. doi: 10.1080/0969594X.2021.1929828.
- B. Cipriano, N. Fachada, P. Alves. Drop Project: An automatic assessment tool for programming assignments//SoftwareX. Vol. 18. 2022. 101079. doi: 10.1016/j.softx.2022.101079.
- C. Sauerwein, S. Oppl, G. Iris et al. Towards a Success Model for Automated Programming Assessment Systems Used as a Formative Assessment Tool//ITiCSE, Turku, Finland, June, 2023. doi: 10.1145/3587102.3588848.
- S. Combefis. Automated Code Assessment for Education: Review, Classification and Perspectives on Techniques and Tools//Software. 2022. 1 (1):3-30. https://doi.org/10.3390/software1010002.
- L. Freise, U. Bretschneider. Automized Assessment for Professional Skills — A Systematic Literature Review and Future Research Avenues//18th Internationale Tagung der Wirtschaftsinformatik, Paderborn, Germany. September 2023.
- J. C. Paiva, J. P. Leal, A. Figueira. Automated Assessment in Computer Science Education: A State-of-the-Art Review//ACM Trans. Comput. Educ. 22, 3, Article 34 (September 2022), 2022. 40 p. https://doi.org/10.1145/3513140.
- Kim Man Dol. Manless on-line auto group discussion system for competency assessment. 2020.
- L.-C. Cheng, W. Li, J. C. R. Tseng. Effects of an automated programming assessment system on the learning performances of experienced and novice learners//Interactive Learning Environments. 2023. 31 (8). P. 5347-5363. doi: 10.1080/10494820.2021.2006237.
- I. Mekterovic, L. Brkic, B. Milasinovic, M. Baranovic. Building a Comprehensive Automated Programming Assessment System//IEEE Access. 2020. doi: 10.1109/ACCESS.2020.2990980.
- M. Tarek, A. Ashraf, M. Heidar, E. Eliwa. Review of Programming Assignments Automated Assessment Systems//2022 2nd International Mobile, Intelligent, and Ubiquitous Computing Conference (MIUCC), Cairo, Egypt, 2022. P. 230-237. doi: 10.1109/MIUCC55081.2022.9781736.
- J. C. G. Ocampo, E. Panadero. Web-Based Peer Assessment Platforms: What Educational Features Influence Learning, Feedback and Social Interaction?/In: O. Noroozi, B. De Wever (eds). The Power of Peer Learning. Social Interaction in Learning and Development. Springer, Cham, 2023. https://doi.org/10.1007/978-3-031-29411-2_8.
- P. Isaias, P. Miranda, S. Pifano. Framework for the analysis and comparison of e-assessment systems. 2019. P. 276-283.
- M. Shavrovskaya, A. Pesha. The Analysis of the Online Platforms for Evaluating the Students’ Supra-Professional Competencies. 2021. doi: 10.1007/978-981-16-0953-4_94.
- S. Geiss, T. Jentzsch, N. Wild, C. Plewnia. Automatic Programming Assessment System for a Computer Science Bridge Course — An Experience Report//2022 29th Asia-Pacific Software Engineering Conference (APSEC), Japan, 2022. P. 527-536. doi: 10.1109/APSEC57359.2022.00074.
- T. Staubitz, D. Petrick, M. Bauer et al. Improving the Peer Assessment Experience on MOOC Platforms. 2016. P. 389-398. doi: 10.1145/2876034.2876043.
- S. Zhang. Review of automated writing evaluation systems//Journal of China Computer-Assisted Language Learning, Vol. 1 (Issue 1). 2021. P. 170-176. https://doi.org/10.1515/jccall-2021-2007.