Interactive surveys during online lectures for IT students

Olena S. Holovnia, Natalia O. Shchur, Iryna A. Sverchevska, Yelyzaveta M. Bailiuk and Oleksandra A. Pokotylo

Zhytomyr Polytechnic State University, 103, Chudnivska str., Zhytomyr, 10005, Ukraine

CTE 2022: 10th Workshop on Cloud Technologies in Education, May 23, 2022, Kryvyi Rih, Ukraine
Email: olenaholovnia@gmail.com (O. S. Holovnia); thalitana@gmail.com (N. O. Shchur); sverchevska.ia@gmail.com (I. A. Sverchevska); liza.bailiuk@gmail.com (Y. M. Bailiuk); a.a.polish4uk@gmail.com (O. A. Pokotylo)
ORCID: 0000-0003-0095-7585 (O. S. Holovnia); 0000-0002-1182-4799 (N. O. Shchur); 0000-0001-7306-3836 (I. A. Sverchevska); 0000-0002-4961-7816 (Y. M. Bailiuk); 0000-0002-1587-235X (O. A. Pokotylo)
© 2023 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0). CEUR Workshop Proceedings (CEUR-WS.org), ISSN 1613-0073.

Abstract
The article investigates student response systems (SRS) and how to apply them to facilitate students' engagement and to improve the overall students' experience during online lectures. The authors give an overview of different student response systems (Mentimeter, AhaSlides, Kahoot!, Wooclap, Socrative, Poll Everywhere, and Slido) and compare their features. The work describes the experience of using the Mentimeter student response system at online lectures in the Operating Systems course for second-year IT students of Zhytomyr Polytechnic State University (Software Engineering, Computer Science, Computer Engineering, and Cybersecurity specializations). The data was collected through observation, surveys, and existing records. The data analysis methods include visual analysis (box plots, Q-Q plots, histograms) and statistical analysis (descriptive statistics, the Shapiro-Wilk normality test, the F-test, and the Kruskal-Wallis rank sum test). The study provides experimental results showing an increase in the number of students' answers within the lectures. It also highlights IT students' problems and preferences during online lectures. The authors give recommendations on using SRS during online lectures, aimed at improving the lecturer's interaction with the audience.

Keywords
student response systems, Mentimeter, online lectures, blended learning, online learning

1. Introduction

Since the spring of 2020, due to the COVID-19 pandemic, students of Zhytomyr Polytechnic State University (Zhytomyr, Ukraine) have been attending lectures online. The importance of online learning only increased in 2022, during the war [1]. Although giving lectures through online conference applications like Google Meet, Zoom or BigBlueButton was a novel experience for most of the teaching staff, we have gradually adapted to working in the new conditions [2, 3]. However, certain difficulties remain. In particular, most lecturers experience a lack of communication with students (no eye contact, no confirmation that students are listening, and difficulty estimating students' understanding and engagement). The lecturer may demand that students keep their web cameras on during the lecture. However, this approach may reduce connection stability and cause low-quality video and audio when the bandwidth is not high enough. Furthermore, some students do not have web cameras on devices with a stable network connection. Students, in turn, face another kind of challenge when attending online lectures.
Given less control, they may be distracted more often and, thus, listen less carefully. For the same reason, asking students questions online could be less effective. Similarly, students have limited ability to show their misunderstanding or disengagement without speaking up or writing a message directly to the lecturer, and the latter may be uncomfortable for some of them.

Student response systems (SRS) are often used to capture and hold students' attention, facilitate students' experience during classes, promote their engagement, and make online lectures more person-oriented. SRSs were used in universities before the COVID-19 pandemic, but the new restrictions have forced educators to pay more attention to these tools. While numerous researchers report a positive effect of using SRSs, especially when first introduced to students, it is also crucial to investigate ways of using the above-mentioned tools thoughtfully and efficiently.

The purpose of the article is to investigate student response systems and their application to facilitating students' engagement as well as the overall students' experience during online lectures, and to formulate recommendations for more effective usage of these systems.

2. Related work

SRS are known by various names, including audience response systems (ARS), personal response systems (PRS), electronic voting systems (EVS), polling systems, and clicker systems. Many researchers explore the benefits, challenges and implications of using SRS as a learning tool. A review of the publications demonstrates that the most popular systems are Kahoot! [4], Socrative [5] and Mentimeter [6].

Wang [7] noted that the game-based student response system Kahoot! managed to boost students' engagement, motivation and learning even after five months of repeated use. Roman et al. [8] propose Socrative as an effective instrument for minimizing the learning disruptions caused by the recent COVID-19 outbreak. Quiroz Canlas et al. [9] describe their experience of using the Mentimeter online platform in Computer Science lecture classes. Furthermore, recent reports [10, 11, 12, 13, 14, 15] suggest that personal response systems help to engage students in active and self-regulated learning, and enhance their collective efficacy, satisfaction and learning achievements.

Despite the wide range of demonstrated benefits, many authors note that student response systems have some disadvantages. For example, Barnett [16] found that SRS use faces financial, pedagogical, and technical limitations. Kay and Lesage [17] group the SRS limitations into student-based, teacher-based, and technology-based categories. Ault and Horn [18] provide guidelines for teachers on planning, implementing and monitoring the use of student response systems. Tkachuk et al. [19, 20, 21] developed methods of applying mobile technologies to university students' training during the COVID-19 lockdown and showed how to use the Plickers audience response system for that purpose.

3. Methodology

3.1. Research settings and methods

The study investigated different SRS and, particularly, Mentimeter in terms of their application to facilitating students' engagement and the overall students' experience during online lectures. The research used quantitative analysis (for most of the data) and, partially, qualitative analysis (for the literature review).
The experimental part of the study was conducted at Zhytomyr Polytechnic State University during one semester in 2021 and involved second-year IT students. We investigated their participation in the lectures on Operating Systems. The research also included the further implementation of the Mentimeter SRS at the lectures, subsequent analysis, and the formulation of recommendations for using SRS for this purpose.

3.2. Participants

At the time the research began (February 2021), there were four academic groups in the Software Engineering specialization and two academic groups in the Computer Science specialization (126 students), as well as one academic group in Computer Engineering and two academic groups in Cybersecurity (62 students). All the students mentioned above had very similar training during the previous three semesters. Meanwhile, the Software Engineering and Computer Science students had lectures in Operating Systems separately from the Computer Engineering and Cybersecurity students. Considering this, the Software Engineering and Computer Science students comprised the control group (CG), whereas the Computer Engineering and Cybersecurity students comprised the experimental group (EG).

3.3. Data collection tools

The data collection involved existing data, observation, and surveys. We used several tools for data collection, including the following.

• Paper and electronic document management systems of Zhytomyr Polytechnic State University for the scores gained by CG and EG students on their previous exams (existing data).
• Rating lists of the Operating Systems course for the data about attending the lectures (observation).
• Mentimeter automatic answer counters for the data about students' answers during the lectures (observation).
• Google Meet video recordings of the Operating Systems lectures for the data about students' answers during the lectures (observation).
• The text versions of Google Meet chats of the Operating Systems lectures for the data about students' answers during the lectures (observation).
• Google Forms for the data about students' experience during online lectures (survey).

3.4. Data analysis methods

The methods used to define the homogeneity of the sample included visual and statistical analysis of students' average scores in previous sessions. The visual analysis involved box plots, Q-Q plots, and histograms. The statistical analysis included descriptive statistics (median, mean, standard deviation) and inferential statistics (the Shapiro-Wilk normality test and the Kruskal-Wallis rank sum test). General patterns shown by the survey were explored through visual analysis (histograms). Statistical differences between CG and EG were tested using visual analysis (histograms, Q-Q plots) and statistical analysis (the Shapiro-Wilk test, the F-test, and the Kruskal-Wallis test).
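The sketch below shows, in R, one way the pipeline just described could be assembled. The data frame scores and its columns group and avg_score are hypothetical names of ours, and the values are placeholders rather than the real exam records.

```r
# A minimal sketch of the analysis pipeline described above.
# `scores`, `group` and `avg_score` are hypothetical names; the values
# below are synthetic placeholders, not the real exam records.
set.seed(1)
scores <- data.frame(
  group     = factor(rep(c("CG", "EG"), times = c(126, 62))),
  avg_score = c(rnorm(126, mean = 76, sd = 14), rnorm(62, mean = 75, sd = 15))
)

# Visual analysis: box plots, Q-Q plots, histograms
boxplot(avg_score ~ group, data = scores)
cg <- scores$avg_score[scores$group == "CG"]
eg <- scores$avg_score[scores$group == "EG"]
qqnorm(cg); qqline(cg)
hist(eg)

# Descriptive statistics (median, mean, standard deviation) per group
aggregate(avg_score ~ group, data = scores,
          FUN = function(x) c(median = median(x), mean = mean(x), sd = sd(x)))

# Inferential statistics: Shapiro-Wilk normality test per group,
# Kruskal-Wallis rank sum test between the groups
shapiro.test(cg)
shapiro.test(eg)
kruskal.test(avg_score ~ group, data = scores)
```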
3.5. Implementation

First, a comparison of different SRS was carried out, and the Mentimeter SRS was chosen. The CG and EG were formed, and the homogeneity of the sample was verified through a statistical analysis of the students' average scores in previous sessions. We gathered the experimental data during one semester (from February 2021 until May 2021) of online lectures in the Operating Systems course for CG and EG. Operating Systems is a normative discipline in the curricula of the Software Engineering, Computer Science, Computer Engineering and Cybersecurity specializations at Zhytomyr Polytechnic State University and includes 32 academic hours of lectures.

In 2021 the lectures were organized separately for Software Engineering and Computer Science students (CG) and for students of the Computer Engineering and Cybersecurity specializations (EG). The lectures were given by the same lecturer (Olena Holovnia), on the same topics and almost simultaneously (within a few hours or one day of one another, depending on the university timetable). The other tools used to present the material to students included Google Meet (within Google Workspace for Education) and electronic presentations (WPS Presentations) with similar content. Mentimeter was introduced to the EG students during their online lectures in Operating Systems, whereas the CG students attended regular online lectures with questions asked through the text chat or using microphones.

Figure 1 presents an example of the Mentimeter slides used for lectures in EG. It demonstrates the Mentimeter leaderboard as the results of a graded quiz are summed up. The student at the top of the diagram (nicknamed "Bahogabaguguwongas") is about to win, having given the most accurate and fastest answers. Students may use either nicknames or their real names within leaderboard quizzes.

Figure 1: Students' results on the Mentimeter leaderboard during the online lecture about scheduling and dispatching.

After 13 lectures (26 academic hours), an anonymous online survey was conducted for both CG and EG. The survey contained questions about students' experiences during the lectures on Operating Systems. The general patterns shown by the survey were analysed. To investigate the differences between the CG and EG, we compared self-reported levels of students' engagement (based on the data from the survey) and conducted a statistical analysis of the number of students' answers during the lectures (based on the data collected within the lectures). A statistically significant difference in the number of answers per student in CG and EG was found. We also analysed the EG students' experience with Mentimeter (also gained from the anonymous survey).

The further implementation of Mentimeter (February 2022, April 2022 – June 2022, September 2022 – November 2022) allowed us to continue accumulating experience and to formulate practical recommendations for more efficient usage of SRS at online lectures. The details about the research along with the results obtained are covered in section 4.

4. Results

4.1. An overview of student response systems available for free

The main functions of such services as Mentimeter, AhaSlides, Kahoot!, Poll Everywhere, Slido, Wooclap and Socrative, and their ability to involve the student audience in the educational process through polling, were considered (table 1). A more detailed comparison of the products' features available for free is shown in table 2.

Mentimeter is a simple and convenient online service for creating polls and voting in real time. The basic features of the platform are provided for free. However, the free plan has several limitations: the number of questions is no greater than two, and it is impossible to customize the appearance of the questionnaire or to export it to other services.

Most of the features of AhaSlides are immediately available for free, and there is no limit on the number of questions that can be used in the presentation. However, a significant drawback is the maximum number of participants – only 7 people can simultaneously join the presentation. Paid access to the platform provides much wider opportunities.
These include, in particular, a larger number of students who can be involved in the survey and the ability to export an extended report on the survey results.

Table 1: Student response systems (product comparison).

Product | Website | Founded | Located | Starting price details | Platforms supported
--- | --- | --- | --- | --- | ---
Mentimeter | mentimeter.com | 2014 | Sweden | $9.99 per month, free version available, free trial available | cloud, SaaS, web-based, Android, iOS
AhaSlides | ahaslides.com | 2019 | Australia | $4.95 per month, free version available, free trial available | cloud, SaaS, web-based
Kahoot! | kahoot.com | 2013 | Norway | €5.00 per month, free version available, free trial available | cloud, SaaS, web-based, Android, iOS
Wooclap | wooclap.com | 2014 | Belgium | $6.99 per month, free version available, free trial available | cloud, SaaS, web-based
Socrative | socrative.com | 2010 | Canada | $59.99 per year, free version available | cloud, SaaS, web-based, Android, iOS
Poll Everywhere | polleverywhere.com | 2007 | USA | $13.99 per month, free version available | cloud, SaaS, web-based, Android, iOS
Slido | slido.com | 2012 | Slovakia | €10.00 per month, free version available | cloud, SaaS, web-based, Android, iOS

Kahoot! is primarily a game-based learning platform for creating educational tests, games and quizzes. The gameplay is simple – all players simultaneously answer questions on their devices and gain points for each correct answer. At the end of the competition, the number of points of all participants is displayed on the screen. The free version allows creating only two types of questions: multiple-choice quizzes and true/false questions.

Another survey tool that is widely used in western schools (particularly in the United States) is the Socrative educational platform. The Socrative service is designed to organize a voting system using any gadgets – computers, tablets, or mobile devices that can display surveys. However, the number of participants should not exceed 50 people.

Poll Everywhere is an online service for creating polls with different types of questions. A distinctive feature of this tool is the ability to create polls that stay open for answers over a long period. Another interesting feature is the graphical display of users' answers to open questions (in the form of a text wall, word cloud, quotes or a moving line). In the free version of Poll Everywhere, the maximum audience size is 40 users, but there are no restrictions on the number of questions in the survey.

Slido is an easy-to-use tool for audience engagement. This tool is often used at large events.
With its help, participants of conferences, trainings, seminars, and public lectures can ask questions to the speaker, as well as vote for the best questions, so that speakers can answer exactly those that interest the majority. The number of events in the free version is unlimited, but there is a limit on the number of participants – up to 100 per event. Also, the free version allows conducting 3 polls during 1 event, including 1 quiz and 1 brainstorming session.

Unlike the platforms discussed above, the Wooclap service provides the ability to create a survey with different types of questions and allows users to attract a large audience (up to 1000 people). However, the maximum number of questions in the free version is two.

Table 2: Student response systems (comparison of the features available in the free versions).

Poll features | Mentimeter | AhaSlides | Kahoot! | Wooclap | Socrative | Poll Everywhere | Slido
--- | --- | --- | --- | --- | --- | --- | ---
Maximum participants | unlimited | 7 | 10 | 1000 | 50 | 40 | 100
Maximum number of questions per event | 2 | unlimited | unlimited | 2 | 1000 | unlimited | 3
Question types | multiple-choice, word cloud, open-ended, scales, ranking, Q&A | multiple-choice, image choice, word cloud, open-ended, scales, ranking, quiz, Q&A | quiz (multiple-choice, true/false) | multiple-choice, poll, find on image, rating, word cloud, open-ended, find a number, matching, sorting, fill in the blank, brainstorming | multiple-choice, true/false, short answer, competitions | multiple-choice, word cloud, open-ended, clickable images, ranking, Q&A, survey | multiple-choice, word cloud, open text, rating, ranking, quiz, survey
Q&A | Yes | Yes | Yes | Yes | No | Yes | Yes
Quiz | Yes | Yes | Yes | Yes | Yes | Yes | Yes
Real-time vote | Yes | Yes | Yes | Yes | Yes | Yes | Yes
Anonymous voting | Yes | Yes | Yes | Yes | Yes | Yes | Yes
PowerPoint integration | Yes | Yes | No | Yes | No | Yes | Yes
Google Slides integration | No | Yes | No | Yes | No | Yes | Yes
Microsoft Teams integration | Yes | No | No | Yes | No | Yes | Yes
Screen sharing | Yes | No | Yes | Yes | Yes | Yes | Yes
Data analysis tools | Yes | Yes | Yes | Yes | Yes | Yes | Yes
Reporting, export results | Yes | No | Yes | No | Yes | No | No
Design | Yes | Yes | Yes | No | Yes | Yes | Yes
Security and privacy | Yes | Yes | Yes | Yes | Yes | Yes | Yes
Support | Yes | Yes | Yes | Yes | Yes | Yes | Yes

In general, all the services under consideration have similar functionality. For further research, the Mentimeter platform was chosen. It has a convenient and intuitive interface and supports multiple-choice, word cloud, and open-ended types of questions, along with questions with scales and ranking. The free Mentimeter plan allows an unlimited number of students to participate, so it can be used at lectures for a large audience, which is not unusual at Zhytomyr Polytechnic State University. The service has a limited number of questions per event, but this can be enough when combined with traditional questions through the web meeting chat or students' microphones.

4.2. The homogeneity of the sample

To research the homogeneity of the sample, the average scores in previous sessions for CG and EG students were compared. The descriptive statistics for the students' average scores are shown in table 3.

Table 3: The descriptive statistics parameters for the average score (control and experimental groups).

Group | Median | Mean | Standard deviation
--- | --- | --- | ---
Control group | 75.25 | 76.18 | 14.42
Experimental group | 75.35 | 74.74 | 15.12

The box plot for the students' average scores in CG and EG is presented in figure 2. It shows that these two samples are visually similar, and there are no visible outliers.

Figure 2: The box plot diagram for the students' average scores in CG and EG.
The results of the visual analysis and the Shapiro-Wilk normality test (the shapiro.test() function in R) for the students' average scores showed visible differences between the normal distribution and the distributions in CG and EG (figure 3), along with p-values of 0.0001616 and 0.02458 respectively, both below the significance level of 0.05. So, we can conclude that the data significantly deviate from a normal distribution.

Figure 3: The normal Q-Q plots for the students' average score distributions in CG and EG.

The residuals analysis also showed a significant deviation from a normal distribution, both visually (figure 4) and through the Shapiro-Wilk normality test (p-value = 0.00004022, which is considerably less than the significance level of 0.05).

Figure 4: The histogram and the normal Q-Q plot for the residuals of the students' average score distribution.

Given the mutual independence of the samples and the deviation from normality in both samples, the Kruskal-Wallis rank sum test was used to compare the two samples. The Kruskal-Wallis test is a nonparametric test for determining whether two or more samples come from the same population, and it does not require the assumption of normality [22, p. 115-120]. The kruskal.test() function in R was used. The test showed a p-value of 0.5589, which is greater than the significance level of 0.05, indicating no statistically significant difference between the medians of CG and EG. Taking all of the above into account, we can consider the samples homogeneous.
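For reference, the statistic computed by kruskal.test() is based on the ranks of the pooled observations. In the standard formulation (ignoring the tie correction, which the R function applies automatically), for k samples of sizes n_i with rank sums R_i and N observations in total [22]:

$$ H = \frac{12}{N(N+1)} \sum_{i=1}^{k} \frac{R_i^2}{n_i} - 3(N+1) $$

Here k = 2 (CG and EG), and under the null hypothesis H approximately follows a chi-squared distribution with k − 1 = 1 degree of freedom, which is how the reported p-value is obtained.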
4.3. General patterns shown by the survey

After 13 lectures, the students were given an online survey using Google Forms. The survey was anonymous and contained an identical set of questions, except that the Computer Engineering and Cybersecurity students (EG) were also asked questions about their experience with Mentimeter. Although the total number of Software Engineering and Computer Science students differs from the total number of Computer Engineering and Cybersecurity students, the number of students taking the survey was 31 in each case. Given the significantly different sizes of CG and EG, this observation is of particular interest, which, however, goes beyond the scope of this research.

As expected, students in both the CG and EG reported some degree of difficulty during online lectures. When asked about holding their attention within a lecture ("It's more difficult for me to hold my attention during online lectures than during regular lectures"), more than 61% in each group chose answers 2-5 on the scale from 1 ("No, it doesn't seem like me at all") to 5 ("Yes, it's definitely about me"), of which almost a half chose answers 4 or 5 (figure 5).

Figure 5: The distribution of the survey answers. Question: "It's more difficult for me to hold my attention during online lectures than during regular lectures".

The majority of respondents admitted they may get distracted during online lectures. About 20% of students in both groups do so quite often or very often (figure 6).

Figure 6: The distribution of the survey answers. Question: "There is a temptation to do other things when attending online lectures. It happens that I do that".

Approximately a quarter of respondents stated that it is important to them (4 or 5 on a 1-5 scale) that the lecturer knows that it was they who answered the question or made a comment within the class (figure 7).

Figure 7: The distribution of the survey answers. Question: "It is important to me if the lecturer knows that it was me the one who answered the question or made a comment within the class".

Also, a quarter of students reported that they agree or mostly agree (4-5 on a 1-5 scale) with the statement "I rarely participate in discussion during lectures, because I'm not quite sure if I would look smart and competent enough", as shown in figure 8. This information should be taken into consideration when choosing the types of questions during the lecture. It may indicate that some students need anonymous quizzes during the lectures, whereas other students may want to be identified by the lecturer when they answer. Therefore, online quizzes should contain both anonymous and non-anonymous tasks.

Figure 8: The distribution of the survey answers. Question: "I rarely participate in discussion during lectures, because I'm not quite sure if I would look smart and competent enough".

Likewise, students in both groups are predictably interested in gaining extra points for correct answers within lectures (figure 9). However, about half of the respondents showed less interest in this way of getting extra points and, therefore, may be better motivated by other factors.

Figure 9: The distribution of the survey answers. Question: "I am more eager to participate in discussions during lectures if I can gain some extra points for that".

The EG students were also asked questions about their Mentimeter experience. Most of the respondents enjoyed the Mentimeter online surveys. When asked to estimate how much they liked the surveys, 83.9% of the students chose 4 or 5 points out of 5. The rest of the students (16.1%) reported a more neutral attitude, choosing 3 points out of 5 (figure 10).

Figure 10: The distribution of the survey answers. Question: "How much did you like the Mentimeter experience during the lectures in Operating Systems?".

When asked to choose the Mentimeter features they enjoyed the most (a multi-choice question, figure 11), students reported that they liked the new experience (83.9%); the lectures becoming more diverse and containing less of the teacher's monologue (71%); the opportunity to interact more with fellow students and teachers (58.1%); the opportunity to check yourself and your understanding of the material (48.4%); the fun pictograms and animation (48.4%); the opportunity to compete with the others in answering the questions where correctness and speed were assessed (25.8%); and the support of access from different devices (22.6%). Only one respondent (3.2%) reported liking nothing specific about the Mentimeter quizzes. Also, no one chose the open-ended option ("Other: ").

Figure 11: The distribution of the survey answers. Question: "What Mentimeter features did you enjoy the most?" (multi-choice).

4.4. Differences between the control and experimental group

A comparison of self-reported levels of students' engagement was performed. During the anonymous survey, the students were asked to choose how engaged they had been at the lectures in Operating Systems. The possible answers included five levels, followed by explanations.

• "My level of engagement is high. It is important for me to understand the lecture and not miss any discussion no matter if I am participating or just listening"
• "My level of engagement is above average. I'm trying to understand all the material. However, if I miss something, I would read about it elsewhere"
• "My level of engagement is average. I'm trying to understand the part of the material which seems important to me. If I later need something I've missed, then I will dig into it"
• "My level of engagement is below average. I'm trying to understand the course in general. It looks like I won't understand some parts of the material, but it is impossible to know everything"
• "My level of engagement is quite low. I'm trying to understand some parts of the material, mainly those which seem interesting to me or those which are easy to understand"

The anonymity of the survey and the neutral formulations without noticeable judgement give reasons to assume that the respondents tried to answer fairly. The results of the survey are given in table 4. The histogram in figure 12 shows that the students of EG reported higher levels of engagement noticeably more often than the students of CG.

Table 4: The comparison of the levels of students' engagement (self-reported, anonymous).

Level of engagement | CG, number of students | CG, percentage of students | EG, number of students | EG, percentage of students
--- | --- | --- | --- | ---
High engagement | 2 | 6.5% | 7 | 22.6%
Engagement above average | 9 | 29.0% | 13 | 41.9%
Average engagement | 18 | 58.1% | 7 | 22.6%
Engagement below average | 1 | 3.2% | 1 | 3.2%
Low engagement | 1 | 3.2% | 3 | 9.7%
Number of answers | 31 | | 31 |

Figure 12: Levels of students' engagement in CG and EG (self-reported).

To research the differences in CG and EG students' engagement during online lectures, an analysis of the number of students' answers during quizzes and discussions was performed. The total numbers of students who participated in quizzes and discussions during the lectures in Operating Systems in CG and EG are given in table 5.

Table 5: The comparison of students' answer counts during 8 modules of the Operating Systems course in the experimental and control groups.

Module | EG (Menti) | EG (Chat+microph.) | EG (Total) | CG (Chat+microph.)
--- | --- | --- | --- | ---
Module 1 | 26 | 23 | 49 | 26
Module 2 | 11 | 22 | 33 | 29
Module 3 | 22 | 3 | 25 | 13
Module 4 | 12 | 0 | 12 | 9
Module 5* | 23 | 0 | 23 | 14
Module 6* | 3 | 4 | 7 | 2
Module 7 | 0 | 24 | 24 | 22
Module 8* | 5 | 2 | 7 | 6
Number of students | 62 (EG) | | | 126 (CG)

Table 5 needs a few important notes. The modules marked with an asterisk (*) took two lectures to discuss (4 hours instead of 2). The lectures held within the experiment were as follows.

• Module 1. Operating systems overview.
• Module 2. The main principles of the operating systems (part 1).
• Module 3. The main principles of the operating systems (part 2).
• Module 4. Concurrency.
• Module 5. Scheduling and dispatching.
• Module 6. Memory management.
• Module 7. File systems.
• Module 8. Security.

The above-mentioned modules do not cover all the course material. The list contains only the modules presented to the students exactly within the period of the experiment, which involved, in particular, observation and counting the number of answers. The count of students' chat messages does not include organizational questions and answers. Most lectures in the experimental group involved two Mentimeter questions (the limit of the free plan). The lecture on module 6 contained one Mentimeter question, and only the lecture on module 7 contained no Mentimeter questions. Most lectures in the experimental group also included questions answered in the chat (otherwise there are zero chat answers). Besides, some students tend to send one answer in several chat messages; such answers are counted as one. Table 6 contains the answer counts from table 5 divided by the number of students in the respective group (CG or EG), as illustrated by the sketch below.
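As a cross-check, this short R sketch reproduces the per-student values of table 6 from the raw counts of table 5. The variable names are ours, and the group sizes are taken from the tables.

```r
# Deriving the per-student counts of table 6 from the raw counts of
# table 5 (variable names are illustrative).
eg_menti <- c(26, 11, 22, 12, 23, 3, 0, 5)  # EG answers via Mentimeter
eg_chat  <- c(23, 22, 3, 0, 0, 4, 24, 2)    # EG answers via chat/microphone
cg_chat  <- c(26, 29, 13, 9, 14, 2, 22, 6)  # CG answers via chat/microphone
n_eg <- 62                                  # students in EG
n_cg <- 126                                 # students in CG

# Each raw count divided by the group size, rounded to two digits,
# matches the corresponding row of table 6.
data.frame(
  module   = paste("Module", 1:8),
  eg_menti = round(eg_menti / n_eg, 2),
  eg_chat  = round(eg_chat / n_eg, 2),
  eg_total = round((eg_menti + eg_chat) / n_eg, 2),
  cg_total = round(cg_chat / n_cg, 2)
)
```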
Table 6: The comparison of students' answer counts per student during 8 modules of the Operating Systems course in the experimental and control groups.

Module | EG (Menti) | EG (Chat+microph.) | EG (Total) | CG (Chat+microph.)
--- | --- | --- | --- | ---
Module 1 | 0.42 | 0.37 | 0.79 | 0.21
Module 2 | 0.18 | 0.35 | 0.53 | 0.23
Module 3 | 0.35 | 0.05 | 0.40 | 0.10
Module 4 | 0.19 | 0.00 | 0.19 | 0.07
Module 5* | 0.37 | 0.00 | 0.37 | 0.11
Module 6* | 0.05 | 0.06 | 0.11 | 0.02
Module 7 | 0.00 | 0.39 | 0.39 | 0.17
Module 8* | 0.08 | 0.03 | 0.11 | 0.05
Number of students | 62 (EG) | | | 126 (CG)

In both tables (table 5 and table 6), the answer count in EG exceeds the corresponding answer count in CG. The histogram in figure 13 presents the data from table 6 visually. However, it is important to note that the number of students participating in discussions among the students who used Mentimeter decreased towards the end of the semester. We assume this may be partially caused by Mentimeter becoming more routine for the students.

Figure 13: The histogram showing the comparison of answer counts per student in CG and EG.

In order to investigate the existence of statistical differences between CG and EG, we analysed the distributions of answer counts per student in both groups. Both distributions are visually close to the normal distribution (figure 14), although this assumption might be inaccurate due to the small size of the samples. According to the results of the Shapiro-Wilk normality test, the p-values for the distributions are 0.6485 for CG and 0.3934 for EG, both greater than the significance level of 0.05. So, we can conclude that the data do not significantly deviate from a normal distribution.

The homogeneity of variance of the given distributions can be investigated through the F-test (the var.test() function in R). The ratio of variances is 0.1096445, which is less than 1, and the p-value for the F-test is 0.009256, which is less than 0.05. Therefore, the homogeneity-of-variance assumption is not met. However, the samples are mutually independent, so the Kruskal-Wallis rank sum test can be applied. The null hypothesis and the alternative hypothesis were as follows. H0: both CG and EG have been drawn from identical populations with the same median. H1: CG and EG have different medians. The test showed a p-value of 0.01541, which is less than the significance level of 0.05. Therefore, we reject the null hypothesis and accept the alternative hypothesis: CG and EG have different medians. We found a statistically significant difference in the number of answers per student in CG and EG. Also, according to the visual analysis, the number of answers per student in EG is greater than the number of answers per student in CG.

Figure 14: Normal Q-Q plots for the distributions of answers per student in CG (left) and EG (right).
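The between-group tests of this subsection can be reproduced with base R functions applied to the module-level per-student counts from table 6. The sketch below is our reconstruction; the p-values reported above are quoted in the comments.

```r
# Reconstruction of the between-group tests on the per-student counts
# from table 6 (module-level values, CG and EG totals).
cg <- c(0.21, 0.23, 0.10, 0.07, 0.11, 0.02, 0.17, 0.05)
eg <- c(0.79, 0.53, 0.40, 0.19, 0.37, 0.11, 0.39, 0.11)

shapiro.test(cg)  # reported p-value: 0.6485 (no deviation from normality)
shapiro.test(eg)  # reported p-value: 0.3934

var.test(cg, eg)  # F-test: reported variance ratio 0.1096445,
                  # p-value 0.009256, so equal variances cannot be assumed

# Independent samples with unequal variances: compare the groups with
# the Kruskal-Wallis rank sum test instead of a parametric test.
kruskal.test(list(CG = cg, EG = eg))  # reported p-value: 0.01541
```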
4.5. The recommendations for using student response systems within online lectures

The analysis of the experience of using Mentimeter during the experiment and its further implementation within the following semesters allowed us to formulate recommendations on the efficient usage of SRS at online lectures. The recommendations are as follows.

• Test online quizzes before the lecture. This is especially recommended when new question types are used.
• Clarify quiz questions. Some quizzes engage fewer students than expected not because the question is hard, but because the question has an unclear formulation.
• Select the relevant question types in each case.
• Combine anonymous and non-anonymous quizzes. Anonymous quizzes are recommended to engage students who are less confident or answer less quickly, whereas non-anonymous quizzes attract students who are more active when given the ability to compete.
• Add some extra points for students being active during online lectures. The anonymous survey shows that such an approach could additionally motivate some of the students.
• Combine SRS with more traditional ways of interacting with students within the online lecture. Students may write the answers in the meeting chat or turn on the microphone and answer orally. This may prevent interactive surveys from becoming routine, so that students still consider interactive quizzes novel and entertaining.
• Combine various types of questions to keep students interested.

We see online lectures as a challenge that leads to new opportunities. Taking into account the experience of lecturing during online and mixed learning also gives educators a promising option for facilitating students' experience in regular lectures. Interactive surveys also help the lecturer to see fuller feedback and could be used for self-analysis.

4.6. Discussion

The article investigated SRSs and their application to facilitating students' engagement and the overall students' experience during online lectures. We also formulated recommendations for more effective usage of SRS.

After a comparison of the SRS available for free, the Mentimeter SRS was chosen for this research. Although all the analysed SRS provide similar functionality, Mentimeter supports questions of multiple types and has no limitations on the number of participants.

Our experience of using Mentimeter within online lectures in Operating Systems for IT students showed the effectiveness of the SRS for facilitating students' engagement, as the number of answers per student during the lectures with Mentimeter was greater than the corresponding value without Mentimeter. The difference in the number of answers per student proved to be statistically significant. Furthermore, we formulated recommendations for the efficient usage of SRS within online lectures. The recommendations summarize our experimental findings, as well as the experience of giving lectures with the use of SRS, and may be applied in online and mixed learning.

However, there are some limitations of the study, including the following.

• Part of the Mentimeter quizzes are anonymous, and the same student may answer more than once. Therefore, it is difficult to take highly active students into account in this case.
• The research does not take into account students with disabilities, who may experience difficulties answering quickly through SRS and prefer other tools (like a microphone).

Nevertheless, we believe that the findings presented in this article are generalizable and could be applied to lectures for IT students at large. Future studies should focus on the analysis of using different SRS within other course activities (namely, practice and lab classes), as well as on choosing the SRS for lectures on other courses for IT students (Cryptology, Cybersecurity, Networking, Higher Mathematics and others).

5. Conclusions

SRS are widely used during online and offline student activities. The related work overview shows various ways of applying SRS in education.
The comparison of the free features of the Mentimeter, AhaSlides, Kahoot!, Wooclap, Socrative, Poll Everywhere, and Slido SRS showed that they have similar functionality, with differences in some features like the maximum number of quizzes, the maximum number of participants, or the types of questions supported. The Mentimeter SRS was chosen for this research because it allows an unlimited number of participants and supports questions of multiple types.

The experimental part of the study focuses on IT students of Zhytomyr Polytechnic State University (Software Engineering, Computer Science, Computer Engineering, and Cybersecurity specializations). The work provides experimental results on using the Mentimeter student response system at online lectures in the Operating Systems course. During one semester, the lectures for Computer Engineering and Cybersecurity second-year students (the experimental group) included Mentimeter quizzes, while the second-year students of Software Engineering and Computer Science (the control group) were questioned during the lectures only via the online meeting chat and microphones. The number of students' answers in both cases was analysed, showing a statistically significant difference between the groups. The authors also analyse the data collected from the anonymous survey, which includes the self-reported level of students' engagement; students' problems, preferences and suggestions; and students' answers about their Mentimeter experience during the lectures.

The results of the study showed an increased number of students' answers during the lectures in the experimental group. Most of the students from the experimental group who took part in the survey reported a higher level of engagement and noted that they liked their Mentimeter experience. The analysis of the survey also showed that students in the control and experimental groups experienced similar difficulties when attending online lectures.

The recommendations on using SRS during online lectures for improving the lecturer's interaction with the audience include testing online quizzes before the lecture, clarifying quiz questions, selecting the relevant question types, combining anonymous and non-anonymous quizzes, adding extra points for active students, combining SRS with traditional ways of interacting, and combining various types of questions. Future studies should focus on the analysis of using different SRS during other course activities and on choosing the SRS for lectures on other courses for IT students.

References

[1] V. Kovalchuk, S. Maslich, L. Movchan, Digitalization of vocational education under crisis conditions, Educational Technology Quarterly (2023). doi:10.55056/etq.49.
[2] N. Pinchuk, O. Pinchuk, O. Bondarchuk, V. Balakhtar, K. Balakhtar, N. Onopriienko-Kapustina, M. Shyshkina, O. Kuzminska, Personal indicators of occupational stress of employees working remotely in a pandemic quarantine, Educational Technology Quarterly 2022 (2022) 129–142. doi:10.55056/etq.8.
[3] A. L. Miller, Adapting to teaching restrictions during the COVID-19 pandemic in Japanese universities, Educational Technology Quarterly 2022 (2022) 251–262. doi:10.55056/etq.21.
[4] S. Koppitsch, J. Meyer, Do points matter? The effects of gamification activities with and without points on student learning and engagement, Marketing Education Review 32 (2022) 45–53. doi:10.1080/10528008.2021.1887745.
[5] S. Muir, L. L. Tirlea, B. Elphinstone, M. Huynh, Promoting classroom engagement through the use of an online student response system: A mixed methods analysis, Journal of Statistics Education 28 (2020) 1–13. doi:10.1080/10691898.2020.1730733.
[6] A. Wood, Utilizing technology-enhanced learning in geography: testing student response systems in large lectures, Journal of Geography in Higher Education 44 (2020) 160–170. doi:10.1080/03098265.2019.1697653.
[7] A. I. Wang, The wear out effect of a game-based student response system, Computers & Education 82 (2015) 217–227. doi:10.1016/j.compedu.2014.11.004.
[8] C. Roman, M. Delgado, M. García-Morales, Socrative, a powerful digital tool for enriching the teaching–learning process and promoting interactive learning in chemistry and chemical engineering studies, Computer Applications in Engineering Education 29 (2021). doi:10.1002/cae.22408.
[9] F. Quiroz Canlas, S. Nair, A. Nirmal Doss, Mentimeter app in Computer Science courses: Integration model and students' reception, in: 2020 12th International Conference on Education Technology and Computers, ICETC'20, Association for Computing Machinery, New York, NY, USA, 2020, pp. 1–5. doi:10.1145/3436756.3436757.
[10] T. Tetep, A. Suherman, E. Dimyati, H. Hermansyah, P. Melati, A. Darojat, The use of Mentimeter applications in online learning during the Covid-19 pandemic at the MGMP PPKn Garut Regency, Journal Pekemas 3 (2018) 51–56. URL: http://web.archive.org/web/20220804025339/https://ejournals.institutpendidikan.ac.id/index.php/PEKEMAS/article/view/42.
[11] K. K. Patterson, P. Ritwik, C. A. Kerins, A. Adewumi, Real-time measurement for effectiveness of novel educational endeavors during the COVID-19 pandemic, Journal of Dental Education 85 (Suppl 1) (2020). doi:10.1002/jdd.12363.
[12] J. D. Benson, K. A. Szucs, M. Taylor, Student response systems and learning: Perceptions of the student, Occupational Therapy In Health Care 30 (2016) 406–414. doi:10.1080/07380577.2016.1222644.
[13] D. Lamb, L. Knowles, P. Rattadilok, D. Towey, J. Walker, Can clicker technology and the latest online response systems enhance student engagement? A comparative study of two approaches, in: C. Hong, W. W. K. Ma (Eds.), Applied Degree Education and the Future of Work: Education 4.0, Lecture Notes in Educational Technology, Springer Singapore, Singapore, 2020, pp. 287–301. doi:10.1007/978-981-15-3142-2_22.
[14] G. Katsioudi, E. Kostareli, A sandwich-model experiment with personal response systems on epigenetics: insights into learning gain, student engagement and satisfaction, The FEBS Journal 11 (2021) 1282–1298. doi:10.1002/2211-5463.13135.
[15] C. Habel, M. Stubbs, Mobile phone voting for participation and engagement in a large compulsory law course, Research in Learning Technology 22 (2014). doi:10.3402/rlt.v22.19537.
[16] J. Barnett, Implementation of personal response units in very large lecture classes: Student perceptions, Australasian Journal of Educational Technology 22 (2006) 474–494. doi:10.14742/ajet.1281.
[17] R. H. Kay, A. Lesage, Examining the benefits and challenges of using audience response systems: A review of the literature, Computers & Education 53 (2009) 819–828. doi:10.1016/j.compedu.2009.05.001.
[18] M. J. Ault, C. K. Horn, Increasing active engagement: Guidelines for using student response systems, Journal of Special Education Technology 33 (2018) 207–216. doi:10.1177/0162643418775745.
[19] V. V. Tkachuk, Y. V. Yechkalo, S. O. Semerikov, The research of process of applying mobile ICT by university students: mobile testing systems and mobile means of multimedia development, Educational Dimension 1 (2019) 125–146. doi:10.31812/educdim.v53i1.3839.
[20] V. Tkachuk, Y. V. Yechkalo, S. Semerikov, M. Kislova, Y. Hladyr, Using mobile ICT for online learning during COVID-19 lockdown, in: A. Bollin, V. Ermolayev, H. C. Mayr, M. Nikitchenko, A. Spivakovsky, M. V. Tkachuk, V. Yakovyna, G. Zholtkevych (Eds.), Information and Communication Technologies in Education, Research, and Industrial Applications – 16th International Conference, ICTERI 2020, Kharkiv, Ukraine, October 6-10, 2020, Revised Selected Papers, volume 1308 of Communications in Computer and Information Science, Springer, 2020, pp. 46–67. doi:10.1007/978-3-030-77592-6_3.
[21] V. V. Tkachuk, Y. V. Yechkalo, S. O. Semerikov, S. M. Khotskina, O. M. Markova, A. S. Taraduda, Distance learning during COVID-19 pandemic: mobile information and communications technology overview, Educational Dimension 7 (2022) 282–291. doi:10.31812/educdim.7612.
[22] M. Hollander, D. A. Wolfe, Nonparametric Statistical Methods, John Wiley & Sons, 1973.