Open access peer-reviewed chapter - ONLINE FIRST

Innovative Methods in Computer Science Education

Written By

Kinga Pusztai

Submitted: 26 November 2022 Reviewed: 23 December 2022 Published: 07 February 2023

DOI: 10.5772/intechopen.109708


From the Edited Volume

STEM Education - Recent Developments and Emerging Trends [Working Title]

Dr. Muhammad Azeem Ashraf and Dr. Samson Maekele Tsegay


Abstract

With the appearance of rapid technological changes, the culture of receiving and communicating information, as well as attention, has also changed. Some pedagogical methods that were well proven a few decades ago have now become obsolete. It is important to replace them with new methods suited to the views of Generations Z and Alpha. This chapter deals with the innovation of the ELTE IK Computer Science B.Sc. subject. My innovations are related to the methods of digital storytelling and gamification. Besides a brief description of the methods, I present the changes implemented in my courses, such as making videos and the use of edutainment applications. My innovations have been studied over five semesters: I researched students’ feedback on the changes and modified some elements based on it. Furthermore, I was also curious about the changes’ impact on students’ results. From my research, it is clear that the introduced tools motivated the students to spend more time learning the subject, leading to deeper mastery.

Keywords

  • digital storytelling
  • making video
  • multimedia
  • gamification
  • edutainment
  • computational thinking

1. Introduction

Based on their relationship with the information society, we distinguish between Generations X, Y, Z, and Alpha [1]. Members of Generation X, today’s 36–50-year-olds, are referred to as “digital immigrants”: they first encountered the Internet in adulthood. Members of Generation Y were born in the 1980s and 1990s and are called “digital natives” because they were exposed to the Internet as young children. The members of Generation Z are mostly high school students, but their first representatives have already started their university studies. They are the “Facebook generation,” who no longer know what life without the Internet is like; their primary communication interface is no longer e-mail but social networking. Primary school and younger children make up Generation Alpha. The use of personal communication devices is a matter of course for members of the Alpha and Z generations: they wake up and go to bed with smartphones, are always available, and are constantly connected to each other online. They can easily manage the rapid flow of information and often switch activities while multitasking. Thus, it is difficult to capture their attention with traditional, frontal tools. They prefer visual presentation over long, unstructured texts.

Of course, there have also been many criticisms of generational theories (e.g., [2]). It is true that the differences between individuals are more diverse than the differences between generations; however, the learning habits of today’s students have changed. In one of my surveys, I asked students about their learning habits using an online questionnaire. Over several semesters, I was able to survey a total of 128 students: 69% preferred watching the tutorial videos, only 16% chose the text notes, and 15% chose the presentation (Figure 1).

Figure 1.

Student feedback on the use of video.

All these results show that it would be important for us as teachers to introduce new methods in our classes that better support the communication and motivational structure of today’s net generation.

A possible solution to this problem is the application of digital storytelling and gamification in education. These methods also fit the philosophy of computational thinking, so their application supports the development of computational thinking skills.


2. Methods used in the courses

2.1 Digital storytelling

2.1.1 History of digital storytelling

The storytelling method was first complemented by visuality. In the 1970s and 1980s, several visual storytelling works were born in the United States that combined traditional, personal narration with visual elements. With its personal tone and the stories of everyday people, visual storytelling was able to reach a wider audience, so it became popular.

The public could easily identify with the personal messages, “video postcards,” that were introduced first in traveling theater performances, and later even on television. The stories made the audience recall their own experiences and the video postcards soon became very popular. At the same time, the method became more widely used in various fields of education [3].

With the explosive spread of digital technology, it has become easier to create personal messages and short films that could previously be produced only by mobilizing large technical equipment. The first workshop on making “personal narrative short films” was held in 1993 at the American Film Institute in San Francisco. The Center for Digital Storytelling (known as StoryCenter since 2015), led by Joe Lambert, was established at the same place a year later, and its members developed a digital storytelling methodology that combines classical storytelling with the multimedia possibilities of digital technology. The nonprofit organization’s objectives included empowering people who are not professionals to present their life stories in a self-contained, individual, cinematic way. They saw themselves as a movement: the focus of their workshops was on capturing individual life paths, and they wanted to create a community with the participation of people in similar situations. The StoryCenter has collaborated with nearly 100 US universities and colleges to incorporate digital storytelling into a range of courses [4].

In Europe, the method was formally introduced in 2003, when the regional service BBC Wales, based on the experiences of other digital storytelling courses held in various locations, started its own program and center, called Capture Wales, under the leadership of Daniel Meadows. At the same time, the Australian Centre for the Moving Image in Melbourne also began to work extensively with the Centre for Digital Storytelling.

BBC Wales reached a wide audience with its grand project, in which every night the audience could see a personal digital story related to the region. Due to the success of the show, similar projects have been initiated in many educational, health development, and community areas. As a result, Wales became the European centre of digital storytelling.

Digital storytelling is interpreted and used differently in different countries and institutions. To present and discuss these, festivals are held annually in Wales and international conferences in different countries every few years.

2.1.2 Definitions of digital storytelling

There are many different definitions of “digital storytelling” (DST). Daniel Meadows defines digital stories as “short, personal multimedia tales told from the heart” or “multimedia sonnets from the people” [5].

Leslie Rule defines digital storytelling as “the modern expression of the ancient art of storytelling. Digital stories derive their power by weaving images, music, narrative, and voice together, thereby giving deep dimension and vivid colour to characters, situations, experiences, and insights” [6].

According to the most common definitions, DST is a new learning organization process that combines traditional storytelling with the use of digital tools [7]. The idea is that students do not use digital tools for their own sake but create unique narratives: specific multimedia creations that capture the attention and enthusiasm of their peers and generate communication on the topic within the learning community.

Some researchers (e.g., Lanszki [4] and Ohler [8]) move away from a tool-using approach to focus on the process, that is, the emphasis is not on digital but on storytelling. In their opinion, good digital storytelling is based on well-written text, and the use of technology is only for expressing content. “Medium and technology provide a supportive background for the creation of a fragmented world based on individual perception, summarized in a narrative framework, and thus based on causality” [9].

In the case of science subjects, it can be assumed that—due to the age of the students—few digital stories exploring individual life situations are produced. It is, therefore, necessary to interpret the definition of digital storytelling widely: not only individual life stories are included in the class of digital stories but also narrated audio-visual presentations created using the steps of the method. This is how DST has been interpreted by the professors at the University of Houston [10] and this is the interpretation I have used in my classes.

2.1.3 Introducing the digital storytelling method

A digital story is a 2–3-minute individual multimedia narrative, usually written in the first person singular and told in the student’s own voice [4]. The narrator emphasizes the most important elements of the story in keeping with the tight time frame, which strengthens the logical priority of the events [11]. Too long a film degrades the quality of the digital story.

The steps of the method are defined in the following five stages [12]:

  1. Students first write the text that serves as the basis of the story. The ideal digital story is to the point and concise (formally, approx. 250 words, about one A4 page).

  2. In the second stage, students recite their written text, that is, create an audio file.

  3. In the third stage, students design and create the visual elements needed for the text. It is important not to use more than 15–20 images.

  4. The fourth stage is editing, where students use a video editing software of their choice (e.g., Microsoft Movie Maker, Sony Vegas, or online editing apps that can be used with smart devices such as WeVideo and Power Director) to create their digital story.

  5. The final stage is to screen, discuss and evaluate the work.

Digital storytelling is also very popular with students because it uses activities that students already do in their peer groups: sharing pictures, videos, and stories. However, during the application of the method, these activities are supplemented by a phase of independent learning and thinking, as well as a critical and constructive interpretation of each other’s work.

Using this method, learners are active participants in learning. They feel responsible for the quality of their work and therefore pay sustained attention to the content they are working on.

DST also promotes the development of internal motivation, as students create for the outcome and for the experience itself, while being pervaded with a sense of surpassing themselves and their abilities [13].

In addition to its positive effect on students [14] and performance [15], it improves students’ problem-solving skills, the ability to learn independently, and the development of critical thinking.

The use of DST promotes students’ active and creative participation in the teaching-learning process, in which independent and cooperative forms of work are combined.

One of the most important features of DST is that during the process, students go from the abstract to the concrete level step by step [16].

2.2 Gamification and edutainment

Nowadays, the most common definition comes from Deterding (2011): “Gamification is the use of game design elements in non-game contexts” [17]. The concept of gamification appeared in the public mind in the early 2000s and has been popular since 2010, but its origins can be traced far back.

Fuchs [18] found examples of the use of gamification in the army dating back to early Roman times.

Zichermann and Linder [19] give an example of how Napoleon used gamification in 1795, when he organized a nationwide puzzle with a reward of 12,000 francs to solve the problem of storing food supplies for war. Thanks to the game, master chef and pastry chef Nicolas Appert, even before Pasteur, invented a heat-preservation process that allowed food to remain fresh for more than 4 months.

The ancestor of today’s widespread loyalty programs is the Green Stamps scheme, implemented by S&H in 1896 [20], which allowed customers to collect green stamps and points in a point-collecting book. When the book was full, customers could redeem it for various interesting products.

Two very important concepts appear in the definitions of gamification: game elements and game mechanisms, which together are often called game design techniques. Game elements are tools taken from traditional and video games, while game mechanisms refer to the application of games’ operating principles.

Of course, the tools will only work effectively if the mechanisms of a game are in place: a game is voluntary, promises success, is transparent, and is properly delimited (providing an appropriate time frame).

The “non-game context” part of the definition implies that the purpose of a game is different from that of gamification. The biggest difference is that a game is an activity providing entertainment or amusement, whereas the purpose of gamification is to achieve a predefined goal in real life.

One interesting subset of gamification is edutainment, which is “a portmanteau combining the words ‘Education’ and ‘Entertainment’. It is delivering information and knowledge by means of entertainment for making it enjoyable” [21].

This term was first used by The Walt Disney Company in 1948 to talk about the “True Life Adventure” series [22].

Edutainment technology comes in many forms. A streaming video platform or a prepackaged learning product can be categorized as edutainment if it has both entertainment and educational value. Edutainment is very much an issue in developing modern digital and hybrid curricula for the classroom, and for supplementary educational use [23].


3. Application of the methods in my courses

“Algorithms and Data Structures” is a two-semester course that students can take in the second semester of the first year and the first semester of the second year. In both semesters, the courses consist of a two-credit lecture and a three-credit practice, which does not include a lab where they can program the learned algorithms. Students must take an exam on the lecture material and write mid-term and end-term tests on the practical material, which forms the basis of their practical grade. Many other courses are based on these courses, so it is important that students complete them on time.

Due to the theoretical nature and early placement of the subject Algorithms and Data Structures, I found it necessary to apply the above-mentioned innovative methods in the practical lessons. I also asked the students to comment on the changes in anonymous questionnaires.

As a first step, I introduced a point system for the semester [24], that is, it is not only the results of the tests that matter in the evaluation. Students must write two tests, each worth 60 points. The compulsory minimum score for both papers is 20 points. (If someone does not reach the mandatory minimum on a test, they must retake it; failed tests cannot be compensated for by any other scores.) If the combined result of the two tests reaches 40 points, students can improve their practical grade by a further 20 points. I gave seven different options for this, which are as follows:

  • One of my biggest and most successful innovations was that I created a—usually playful—edutainment application for each class, with the deadline for submission at the end of the study period. If they solved the assignment completely correctly, students were given one point and, initially, 5 minutes of “opportunity.” (I doubled that reward for more difficult assignments.) A 5-minute “opportunity” can be used to cover lateness to class, can accumulate toward missing an entire class, or can buy more time on a test.

  • “Opportunities” were later also given for completing the quizzes correctly. Each practical lesson included a quiz, which students had to complete with a score of at least 70%. I rewarded solutions between 70% and 100% with “opportunities,” measured in minutes.

  • Students could also make tutorial videos, which I rewarded with a maximum of 20 points. This assignment could also be solved in groups.

  • They could cut my recordings, thereby creating several shorter videos. For this, they could get a maximum of 10 points.

  • Students are given homework in every class, which is not compulsory. They had to solve this task by the beginning of the next class at the latest. It was worth two points.

  • They were given programming problems for some topics, which were accepted for submission until the end of the study period, and rewarded with a maximum of 10 points depending on the difficulty of the assignment.

  • During the semester, students were given the opportunity to carry out more serious research on a topic related to the course and to share it with us in some form, for a maximum of 20 points.
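As an illustration, the scoring rules above can be sketched as a small grading function. This is a hypothetical sketch: the function and variable names are mine, while the thresholds and caps follow the description in the text.

```python
def practical_result(test1, test2, extra_points):
    """Sketch of the course's point system as described in the text.

    - Two tests, 60 points each; each must reach a minimum of 20 points.
    - Failed tests cannot be compensated for by any other scores.
    - If the combined test score reaches 40 points, up to 20 extra
      points (from videos, apps, homework, etc.) count toward the grade.
    """
    if test1 < 20 or test2 < 20:
        return None  # the failed test must be retaken; no grade yet
    total = test1 + test2
    if total >= 40:
        total += min(extra_points, 20)  # extra points are capped at 20
    return total

print(practical_result(30, 25, extra_points=25))  # 55 + 20 (capped) -> 75
```

Note that since both tests must reach 20 points anyway, the 40-point threshold for unlocking extra points is automatically met by anyone who passes both tests; the explicit check simply mirrors the rule as stated.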

I also tried to change the course of the lessons: we started each lesson with a Kahoot quiz, which was excellent for revising, activating the students, and creating a relaxed atmosphere.

3.1 Making videos

For my practical lessons, I have made available to the students supplementary materials on my website that explain or illustrate the operation of each algorithm and data structure. Some of these are YouTube videos, others are animations and visualizations from the website www.algoanim.ide.sk [25], and the most important elements are videos and presentations made by the students or by me.

When preparing the students’ digital stories, I required some of the formal features of DST, but not all. For example, I did not require the text to be in the first person singular, but I did draw their attention to the brevity and conciseness of the video. Furthermore, I did not accept the use of still images only; I required that all works use animation. In terms of form, I preferred video, but I also accepted presentations.

I used the method for five semesters. My goal was for the students to create a video of their own based on the shared resources—either as group work or as an independent creation. As support, I provided consultation opportunities, which were used by about a third of the students. The initial enthusiasm was great, and several people even came to the consultation, but every semester some of the commitments were not realized. It also happened that students undertook a topic as a group, but in the end only one of them made the promised video. In total, 12 videos (8 by students and 4 of my own) and 4 presentations were completed, which are available on my website.1

I graded the videos differently each semester. In general, the maximum number of points per video was 20. (That is, this single assignment could earn the maximum available extra points.) If the students made a video together, they shared the 20 points. One semester, however, I offered only 10 points for making a video, but this proved insufficient: no video was made that semester, the reason being that a demanding video takes a lot of time.

I preferred the group work method for making a video, as group work has many advantages. Initially, I did not even want to accept a video made by individual work, but due to the COVID-19 epidemic, personal classes were replaced by online classes, making collaborative work much more difficult. For the first few experimental semesters, only individual videos were made. Later, when education returned to traditional personal classes, more and more groups—usually consisting of two people—were formed. The videos created in groups were also worth 20 points, which the students divided among themselves. In addition, I asked the groups to keep a “project diary,” which included who had completed each subtask. This showed that the students were excellent at the organizational assignments, generally undertaking and solving assignments in equal measure.

I was generally satisfied with the quality of the videos. Most of the videos were made with some kind of video editing software (Figure 2a), and the smaller part was a video version of an animated PowerPoint presentation (Figure 2b). I gave some aspects related to the content, which the students followed. I found a small mistake in only one video, which was later corrected.

Figure 2.

Screenshot from a student video.

I asked the students to make short videos of a maximum of 5 minutes. They tried to comply with this; only three videos exceeded the 5-minute limit, and those came close to 10 minutes. Overall, the average video length was 4.95 minutes, the standard deviation was 3.22, and the median was 4.04. I think I will have to emphasize this limit even more prominently in the future.

I was also curious about the students’ feedback, which I investigated using online questionnaires.

In my first questionnaire, I researched their study habits. It was completed by 128 students. On a five-point Likert scale, students rated the availability of tutorial videos on my website at an average of 4.77, and they thought it was a good idea to make a video for extra points. According to them, video production “involves the student in the learning, they feel it is important to prepare in detail, while making the video they understand perfectly how the algorithm works.”

In another questionnaire, I asked how students rate their peers’ videos. The questionnaire was completed by 22 students, with an average rating of 4.55 for all videos (Figure 3), and all but one student (93%) rated them as helpful to their learning.

Figure 3.

Student feedback on the student videos.

After making the videos, I asked the students about the process of video making. First, I was curious about their motivation. Many of them emphasized helping their peers: they had understood the material and found it easier to learn with the help of video. Some were inspired by the lack of easily understandable Hungarian literature on the course material. And there were those who chose this task for the extra points, because they had been making videos for a long time and liked doing it.

Secondly, I was curious about the difference between design and implementation. Most of the students did not encounter unexpected difficulties or spend more time than expected making the video, because they either felt free to improvise or had already planned the video in advance. However, one student who did not plan ahead took more time to make the video. According to him, he had to start over and redo it several times, because he kept coming up with new ideas that would make the assignment more understandable or better illustrated.

Finally, I wondered whether it helped them learn the course material, which they rated an average of four on a five-point scale. They were also asked to write their personal opinions on the assignment, one of which was: “I think it is very useful because the person who does it is more immersed in the subject and someone who might be behind in the subject can understand it more easily than from literature.”

This semester, I am doing further research. It would be nice if high-quality videos could be made for all the material of the two-semester subject, but that would take a lot of time. Therefore, I thought of using the lessons I recorded during online teaching, cutting and possibly re-editing them to create short tutorial videos. These videos will not only help students learn the material but may also be used by students to create their own videos. I think this will reduce the time needed for video making and possibly simplify the process, which may help me engage more students, so hopefully more student videos will be made in a semester.

While cutting my recordings, another idea occurred to me: what if I gave this assignment to the students? Editing one lesson sounds easy, but the recording needs to be watched several times to do the assignment well. That way, the student who undertakes the assignment unwittingly learns the material. Of course, the maximum score that can be obtained this way is only half of the score for making an original video.

My plan is working: the students are very happy to apply for the task, and I am confident that at least one video will be produced for each lesson of the semester. As a result of the editing, the videos are shorter and of better quality, so I think they are more useful than if I were to share the raw footage with the students.

I was also curious about the students’ feedback, which I researched through an anonymous online questionnaire completed by 22 students. On a five-point Likert scale, the question “How much do you like the idea of making tutorial videos for each lesson by editing last year’s recordings?” was rated 4.55 (with a standard deviation of 0.67 and a mode and median of 5). 81.8% of students think that making shorter videos of the recordings, possibly more of them, is more useful than sharing uncut recordings, and only 18.2% chose uncut recordings (Figure 4). Students could also comment on my idea; one of them wrote: “It is easier to process the material in several smaller parts, videos are helpful.”

Figure 4.

Feedback from students on which type of video is more useful.

3.2 Edutainment apps

During the two semesters, I created a total of 46 edutainment applications for a total of 34 topics. I used the following four frameworks to create these—usually playful—“apps”: LearningApps, Kahoot, Quizlet, and HotPotatoes. These were either assigned as optional homework or were part of the class.

I have used the optional homework apps for several semesters and have modified them several times based on constructive feedback from students. This is one of the reasons why 38 apps were created for the 24 topics. Some of them are presented in Gamification in Higher Education [26], but all of them are available on my website.2

These tasks were characterized by high student participation, with a total of more than 4800 views of the apps, which means that, on average, an app had more than 141 views. (Of course, the number of correct solutions is lower.) Student activity is also evidenced by the fact that there was an optional task that 18 out of 20 students dealt with, which means 90% participation. Student feedback was surveyed using anonymous online questionnaires. A total of 343 students rated the apps (over five semesters) with an average rating of 4.56 (with a standard deviation of 0.56), and 96.2% said that the apps helped them with their learning. Although there was no obligation to write a review of the apps, there were more than I expected (90). Some of these were: “Useful and easy to learn with,” “Very good, it made me realize that I had some shortcomings,” “It summarizes what you need to know in a concise way, I find it very useful and cool,” “It’s imaginative and helps a lot to put into practice the knowledge you’ve acquired,” “I found the apps very useful, they added a lot to writing successful tests,” “They helped me a lot in understanding algorithms/data structures as I prefer visual learning. It is a playful way to understand the material, reducing the energy involved,” “I think these assignments have really, really boosted my performance in the subject. Playful and interactive learning can move university students in the same way. Especially in the midst of such a dry online education, these easier, yet very helpful and enjoyable tasks were refreshing. It was nice to be able to do them until they were 100%, it was also positive feedback and we always had to guide ourselves to the right solutions. The extra credit was also a great motivation, but anyone who did it just for that reason must have learned along the way, wittingly or unwittingly. I think it’s a very good idea and I’m grateful that we had the opportunity to do it from this subject.”

In the second semester, I started the lessons with Kahoot quizzes. I find this very useful: it is easy to use and does not take too much time, yet the students really like it; it brightens up the lessons, briefly summarizes the knowledge of the previous lesson, and confronts the students with their shortcomings, and after using it the students pay more attention. In addition, it also sends me feedback on the success rate of each question and on the students’ performance.

To use a Kahoot quiz in class, it is not enough to have a teacher’s computer and a projector or a smart board; each student needs a phone or laptop. To start, I launch the app on the teacher’s computer from Kahoot.com, which projects a game pin. Students join the game by entering the projected pin and a nickname on Kahoot.it. After that, the game starts. Each question is projected to the audience (Figure 5a), and students answer using their phones, where they see only the colors and shapes of the answers (Figure 5b). A given amount of time (usually 20 seconds) is allowed for each question. The time can be varied; I increased it to 30 seconds for several questions. There are several types of questions to choose from, the best known being simple quizzes with up to four choices, then true/false questions, type-answer questions, and puzzles. Questions can have multiple correct answers. Students are awarded points for correct answers, which also take speed into account. After each question, it shows the podium positions (1–3) based on the points scored, and we have the opportunity to project the image again and discuss the answer to the question.
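The speed-weighted scoring mentioned above can be approximated with a short sketch. The formula below is a commonly cited approximation of Kahoot's point calculation, not taken from the chapter or from Kahoot's official implementation, so treat it as an assumption:

```python
def kahoot_points(response_time, time_limit, base_points=1000, correct=True):
    """Speed-weighted scoring in the style of Kahoot quizzes.

    Assumed formula: a correct answer earns the base points scaled down
    by up to half, depending on how much of the time limit elapsed
    before answering. An incorrect answer earns nothing.
    """
    if not correct:
        return 0
    ratio = min(response_time / time_limit, 1.0)  # fraction of time used
    return round((1 - ratio / 2) * base_points)

print(kahoot_points(0, 20))   # instant correct answer -> 1000
print(kahoot_points(20, 20))  # correct answer at the time limit -> 500
```

This captures the competitive element described in the text: the faster a student answers correctly, the more points they receive, which is also why generous time limits reduce the competitive spirit.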

Figure 5.

(a) Kahoot question, (b) Kahoot question on student’s phone.

In compiling a Kahoot quiz, I considered the following aspects.

  • First of all, I consider it important that a Kahoot quiz is not too long and does not try to cover too much material. (For me, a quiz consists of 5–7 questions.) A Kahoot that is too long becomes boring, takes a lot of time out of the lesson, and contains more information than the students can remember, so it is not effective.

  • It is important to pay attention to the time limit. If a question is long, more time should be allocated; but if you overestimate the question and allocate more time than necessary, the competitive spirit is reduced, and with it the students’ motivation.

  • One of Kahoot’s drawbacks is that it displays images in a small size. An image appears at its largest when it is in landscape orientation, with an aspect ratio close to 350:220.

I researched the students’ feedback using an anonymous questionnaire, which was filled out by 21 students. All of them liked that we revise with a Kahoot at the beginning of the lesson, and on a five-point scale they rated its usefulness at an average of 4.65 (standard deviation of 0.81, mode and median of 5). Some opinions about it: “It creates a good mood for the entire lesson,” “Absolutely good, playful but also useful,” “Useful for interactive revision and checking existing knowledge.”

3.3 Canvas quizzes

The success of the edutainment applications gave me the next idea for better engaging students in online lessons. For each class of the two semesters, my colleagues and I, at my suggestion, prepared a 10-point Canvas quiz that students had to complete with a minimum of 7 points. I rewarded results above 7 points with “opportunities,” which allowed students to get more time on a test or miss one more class. The quizzes were to be completed as homework; their purpose is to deepen the knowledge learned in class.

When editing the quizzes, we paid attention to the following principles:

  • It should be solved in about 10 minutes.

  • It should include a theoretical question, but consist mostly of practical tasks that deepen knowledge of the algorithms and data structures learned in class.

  • It should be possible to complete the quiz several times.

  • The correct solutions should be visible after a certain time.

A total of 54 students filled out the questionnaire about the quizzes. The reliability index of the questionnaire is 0.88.

On a scale of one to five, the question “How much did you like the quizzes used in the practice?” was rated 4.19 (standard deviation 0.89), and the question “How much did the quizzes help you learn?” was rated 4.54 (standard deviation 0.91) (Figure 6). While the most common rating for the first question was a 4, for the second question more than half of the answers were 5. Students were also allowed to comment on the quizzes; this was not compulsory, but I still received more comments than expected. Some of them were: “The quizzes help a lot in preparation, I think it’s an excellent idea,” “They are very funny and creative. I like that they playfully provide knowledge. It worked for me,” “I find both the Canvas quizzes and the apps very useful. Many times, the student may think he understands the material, but during the quizzes he encounters different examples than in class, and it can be easy to find out that he hasn’t understood everything well,” “They were very motivating,” “The quizzes helped me a lot in learning; I especially liked that it was not necessary to solve it the first time, but several attempts were possible,” “Sometimes you have to ‘hunt’ for the information to fill it in correctly, but they are absolutely good for practice. Some specific cases did not occur in practice, so it absolutely supported the understanding of the material.”

Figure 6.

Student feedback on the use of quizzes.

4. Impact of using innovative methods

4.1 Comparison of practical grades

In the chapters above, I presented the changes made in Algorithms and Data Structures I and II and the positive feedback of the students, but I think it is more important whether the positive impact of the changes is reflected in students’ grades, so I researched this as well.

In my research, I compared the results of my experimental groups with the results of groups whose courses did not include the innovative elements I introduced. Completing the course is a two-step process: students must first obtain a practical grade and then take an exam. Although the primary objective of the course I taught was the practical grade, I compared the results of both.

For the practical grade, I used my previous groups as a control group; their students attended the course in the traditional way. During the semester, students are required to write two tests. The tests of the “Test” groups consisted of five tasks, for which 1.5 hours were allotted, but I allowed an extension of time to those who requested it. The tests of the “Innovative” groups consisted of six tasks, also with 1.5 hours, but the extension of time was not automatic; it was conditional. (Students can earn “opportunities” during the lessons, which can also be redeemed for extra time.) Five tasks of the test were similar to those of the “Test” groups, and the sixth task was to write an algorithm using a technique learned in class. (The students found this task the most difficult.) The “Innovative” groups therefore had to solve a more difficult test in the same amount of time as the “Test” groups.

Table 1 shows the results of my “Test” and “Innovative” groups. The “Innovative” groups scored 8.464 points better on average on the mid-term test (39.77 vs. 31.306), 12.125 points better on the end-term test (42.355 vs. 30.23), and 23.2915 points better for the semester as a whole (87.6075 vs. 64.316), which corresponds to an average grade improvement of 1.138 (4.07 vs. 2.932). The changes in the standard deviations of the “Innovative” groups (2.013 lower for the mid-term test, 1.3015 higher for the end-term test, 0.4085 lower for the total score, and 0.0715 lower for the grades) indicate that the students performed more consistently. The mode of the grades in the “Innovative” groups varied between 3 and 5 (average 3.75), while in the “Test” groups it ranged between 1 and 3 (average 2.2). The median of the grades in the “Innovative” groups also ranged between 3 and 5 (average 3.75), while in the “Test” groups it ranged between 2 and 3 (average 2.6).

Code | Mid-term avg (STD) | End-term avg (STD) | Total score avg (STD) | Grade avg (STD) | Mode | Median

Results of innovative groups
19_4 | 42.2 (8.23) | 47.43 (7.38) | 89.63 (16.47) | 4.54 (0.94) | 3 | 3
19_11 | 35.41 (7.71) | 38.44 (10.79) | 79.5 (21.66) | 3.6 (0.84) | 3 | 3
20_6 | 42.47 (9.99) | 43.79 (10.98) | 94.98 (24.79) | 4.24 (1.03) | 5 | 5
20_13 | 39 (9.93) | 39.76 (10.36) | 86.32 (24.03) | 3.9 (1.04) | 4 | 4
Average | 39.77 (8.965) | 42.355 (9.8775) | 87.6075 (21.7375) | 4.07 (0.9625) | 3.75 | 3.75

Results of test groups
15_12 | 29.27 (10.74) | 24.33 (8.24) | 56.8 (20.2) | 2.67 (1.1) | 3 | 2
15_13 | 24.82 (9.27) | 21.27 (6.53) | 47.65 (22.68) | 2.08 (1.12) | 1 | 2
16_13 | 25.55 (12.8) | 32.19 (5.9) | 58.62 (21.59) | 2.8 (1) | 2 | 3
18_8 | 38.52 (13.23) | 33.81 (12.53) | 75.73 (25.88) | 3.38 (0.95) | 2 | 3
18_11 | 38.37 (8.85) | 39.55 (9.68) | 82.78 (20.38) | 3.73 (1) | 3 | 3
Average | 31.306 (10.978) | 30.23 (8.576) | 64.316 (22.146) | 2.932 (1.034) | 2.2 | 2.6

Difference | 8.464 (−2.013) | 12.125 (+1.3015) | 23.2915 (−0.4085) | 1.138 (−0.0715) | 1.55 | 1.15

Table 1.

Comparison of practical grades. Values are given as average with standard deviation (STD) in parentheses.
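The group averages and the differences quoted in the text can be reproduced directly from the per-group rows of Table 1. As a minimal check, the mid-term comparison in Python (scores taken from the table):

```python
from statistics import mean

# Mid-term test averages per group, copied from Table 1
innovative = {"19_4": 42.2, "19_11": 35.41, "20_6": 42.47, "20_13": 39.0}
test = {"15_12": 29.27, "15_13": 24.82, "16_13": 25.55, "18_8": 38.52, "18_11": 38.37}

innovative_avg = mean(innovative.values())        # 39.77
test_avg = mean(test.values())                    # 31.306
difference = round(innovative_avg - test_avg, 3)
print(difference)  # 8.464
```

The same computation over the end-term, total-score, and grade columns yields the remaining differences (12.125, 23.2915, and 1.138) reported above.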


The results also show that the students took advantage of the opportunities to get extra points, so they invested more work in the course during the semester.

4.2 Comparison of exam grades

I chose a different control group when examining the exam grades. I thought my survey would better reflect reality if I compared the performance of the examined groups on the same assignments, so I took all students from the entire year as the control group. Choosing the control group this way also solved a comparability problem: the results of students who took the online exam were compared only with those of fellow students who also took the online exam, while the results of students who took the face-to-face exam were compared only with those of other face-to-face examinees. This allowed me to conduct my observations over several semesters, which led to more reliable results.

Figure 7 shows the results of my “Test” and “Innovative” groups. Similar to the practical grades, the improvements were observed here as well (0.63, 0.44, 0.24, and 0.32), with an average improvement of 0.4 compared to their year-group peers.

Figure 7.

Comparison of exam grades.
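The average exam-grade improvement quoted above can be checked directly from the four per-group values reported for Figure 7:

```python
from statistics import mean

# Per-group exam grade improvements over year-group peers (from Figure 7)
improvements = [0.63, 0.44, 0.24, 0.32]
avg_improvement = round(mean(improvements), 1)
print(avg_improvement)  # 0.4
```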

Based on my statistical tests, it can be said that my innovations positively impacted not only the students’ motivation but also their performance.

5. Conclusion

In a rapidly changing world, education needs to keep pace with developments. As today’s growing generation is born into an online world, education needs to open up to smart devices. The application of digital storytelling or gamification in education shows us a possible path to this.

My article deals with the innovation of a specific university course and its effects. First, I briefly introduced the methods I used, then I showed how I innovated in my Algorithms and Data Structures I and II courses, and finally, I examined how my changes impacted my students. In addition to the students’ feedback, I also investigated the impact on their performance.

In summary, the high participation of students in the surveys, the content of their answers, and their achievement clearly proved that the use of these computational thinking teaching activities increases the efficiency and quality of the knowledge transfer process, helps in understanding and deepening the course material, and adds color to the lesson. In all the surveys, students clearly expressed their satisfaction and became more interested and motivated.

Based on all of this, I recommend using the innovative methods presented in the article in higher education, which can be used to improve computational thinking skills even more.

References

  1. Nagy Á, Kölcsey A. Mit takar az alfa-generáció? Metszetek. 2017;6(3):20-30
  2. Ollé J. A spekulatív jellegű digitális nemzedékelméletek kritikai áttekintése. 2014. Available from: https://www.slideshare.net/ollejanos/a-spekulativ-jellegu-digitalis-nemzedekelmeletek-kritikai-attekintese [Accessed: October 10, 2021]
  3. Anthropolis E. A digitális történetmesélés története. 2012. Available from: https://digitalistortenetmeseles.hu/tortenete/ [Accessed: April 3, 2022]
  4. Lanszki A. Digitális történetmesélés és tanulói tartalom(re)konstrukció. Új Pedagógiai Szemle. 2016;66(3-4):82-88
  5. EduTech Wiki. Digital storytelling. 2019. Available from: http://edutechwiki.unige.ch/en/Digital_storytelling [Accessed: April 11, 2022]
  6. Sadik A. Digital storytelling: a meaningful technology-integrated approach for engaged student learning. Educational Technology Research and Development. 2008;56(4):487-506
  7. Michalski P, Hodges D, Banister S. Digital storytelling in the middle childhood, special education classroom: A teacher’s story of adaptations. Teaching Exceptional Children Plus. 2015;8(4):3
  8. Ohler J. Digital Storytelling in the Classroom: New Media Pathways to Literacy, Learning and Creativity. Thousand Oaks, CA: Corwin Press; 2013
  9. Lanszki A. A tanulói aktivitás szerepe a digitális történetmesélésben. In: Lévai, Papp-Danka, editors. Interaktív oktatásinformatika. Budapest: ELTE Eötvös Kiadó; 2015. pp. 79-91
  10. University of Houston. What is digital storytelling? 2021. Available from: https://digitalstorytelling.coe.uh.edu/page.cfm?id=27&cid=27&sublinkid=29 [Accessed: October 26, 2021]
  11. Meadows D. Digital storytelling: Research-based practice in new media. Visual Communication. 2003;2(2):189-193
  12. Barrett HC. How to create simple digital stories. 2009. Available from: http://electronicportfolios.com/digistory/howto.html [Accessed: April 02, 2021]
  13. Lanszki A. A digitális történetmesélés mint komplex tanulásszervezési eljárás. In: Lanszki A, editor. Digitális történetmesélés a nevelési-oktatási folyamatban. Eger: Líceum Kiadó; 2017. pp. 22-43
  14. Abdolmanafi-Rokni SJ, Qarajeh M. Digital storytelling in EFL classrooms: The effect on the oral performance. International Journal of Language and Linguistics. 2014;4:252-257
  15. Smeda N, Dakich E, Sharda N. The effectiveness of digital storytelling in the classrooms: A comprehensive study. Smart Learning Environments. 2014;1(6):21
  16. Lanszki A. A digitális történetmesélés relevanciája a tanárképzésben. In: F. Iván, editor. Felkészülés a pályára, felkészülés az életre. Eger: EKF Líceum Kiadó; 2015. pp. 89-97
  17. Deterding S, Sicart M, Nacke L, O’Hara K, Dixon D. Gamification: Using game-design elements in non-gaming contexts. In: Proceedings of the 2011 Annual Conference Extended Abstracts on Human Factors in Computing Systems; 2011. pp. 2425-2428
  18. Németh T. English Knight: Gamifying the EFL Classroom. Piliscsaba: Pázmány Péter Katolikus Egyetem Bölcsészet- és Társadalomtudományi Kar; 2015
  19. Zichermann G, Linder J. The Gamification Revolution: How Leaders Leverage Game Mechanics to Crush the Competition. New York: McGraw-Hill; 2013
  20. Nagy JR. A gamification megjelenése a magyar középiskolákban. Budapest: BMGE; 2017
  21. Sidrah S. This is how edutainment can be a creative solution to all your problems! Available from: https://leverageedu.com/blog/edutainment/ [Accessed: January 15, 2021]
  22. Belyh A. Edutainment. Available from: https://www.cleverism.com/lexicon/edutainment/ [Accessed: November 1, 2022]
  23. Techopedia. Edutainment. Available from: https://www.techopedia.com/definition/5506/edutainment [Accessed: February 23, 2021]
  24. Nádori G, Prievara T. IKT módszertan: Kézikönyv az info-kommunikációs eszközök tanórai használatához. Budapest: Tanárblog.hu; 2012
  25. Vegh L, Udvaros J. Possibilities of creating interactive 2D animations for education using HTML5 canvas javascript libraries. eLearning and Software for Education. 2020;2:269-274
  26. Kovácsné Pusztai K. Gamification in higher education. Teaching Mathematics and Computer Science. 2021;18(2):87-106

Notes

  • https://people.inf.elte.hu/kinga/algoritmusok1/video.htm
  • https://people.inf.elte.hu/kinga/algoritmusok2/seged_kitoltott.htm
