Adam Sprague, Bellin College, adam.sprague @ bellincollege.edu
Over the past twenty years, online formative assessment has emerged as a valid pedagogical strategy from the combination of research in both formative assessment and computer-assisted assessment. Numerous scholars have synthesized knowledge in these two fields of research (Clark, 2012; Conole & Warburton, 2005; Nicol, 2009). One common thread found within these syntheses is that technology can be used successfully by instructors for evaluative purposes (Brown, 1997; Skorczynska, del Saz Rubio, & Carrió-Pastor, 2016). This realization may mean that writing instructors could use student response systems (SRS) to help evaluate how their writers are progressing toward various writing skills, as SRS have been used successfully to evaluate a wide range of other skill sets in courses ranging from Sports Management to English as a Second Language. Furthermore, students consistently report that SRS are easy to use and improve their engagement in these environments (Dervan, 2014; Sprague, 2016; Steed, 2013; Williamson-Leadley & Ingram, 2013). Because SRS provide immediate, targeted feedback that improves overall learning (Angus & Watson, 2009; Kibble, 2007; Wang, 2007), I wanted to test how my students would respond to the use of Kahoot, a mobile SRS, as a way to evaluate their progress with paragraphing within the second unit of a college-level writing course.
Kahoot is an Internet-based SRS that enables students to practice skills in a fun and inviting atmosphere. Teachers can create quizzes, puzzles, surveys, and polls, and students can respond during class time by using a smartphone or computer. By mimicking a game show, Kahoot encourages students to compete with each other, which, research suggests, both increases motivation to learn and increases engagement with class material (Iaremenko, 2017; Wang, 2015; Zarzycka-Piskorz, 2016).
During the Fall 2018 semester, I tested Kahoot’s newest mode, Kahoot Jumble (KJ), with forty of my own students across two sections of a required, first-year Composition & Professional Writing course to see if the software could serve as an effective modality for demonstrating paragraph writing knowledge after a series of lectures, readings, and activities on paragraph writing conventions. I was particularly interested in KJ because it offered a different experience from other SRS, such as Socrative, in that the mode encourages even more focus and critical thinking. That is, KJ’s questions challenge students to quickly place answers in the correct order rather than only select a single correct answer from a list of possibilities (see Figure 1).
Figure 1. Kahoot Jumble projector layout (on the left) with students’ smartphone layout (on the right). KJ drastically differs from the two other modes in Kahoot. For example, Kahoot Quiz simply asks for the correct answer in multiple-choice fashion, and Kahoot Survey allows teachers to gather only students’ opinions about the prompt they create.
Using KJ as a teacher is easy and straightforward. To create a quiz, log into your account and select from the quiz, jumble, or survey options displayed under “Create new Kahoot!” Once you select the jumble option, you will be asked to enter a name for the KJ, select “Go!,” and write the first prompt or question. A variety of options is available when writing questions, including uploading videos, pictures, and music to encourage thinking. A drag-and-drop option is also provided for adding pictures. You can also play a YouTube video during a specific prompt/question by pasting the video’s URL into the box requesting a website ID.
Once you have written the prompt or question (e.g., “Correctly organize the following sentences to make a paragraph”) and added any multimedia features, you can include up to four “answers” for students to drag and drop into the correct order. The answers can be single words or short phrases, but both questions and answers have character limits: prompts and questions are limited to 80 characters, while answers are limited to 60 characters.
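These limits are easy to trip over when drafting jumbles offline, so they can be checked before anything is typed into the web form. The sketch below is my own illustration, not part of Kahoot (which offers no public API for authoring); the function name and the minimum-answer assumption (at least two items, or there is nothing to jumble) are hypothetical.

```python
# Hypothetical pre-flight check for a Kahoot Jumble item, based on the
# limits described above: prompts capped at 80 characters, answers at 60,
# and up to four answers per item. The >= 2 floor is an assumption.
PROMPT_LIMIT = 80
ANSWER_LIMIT = 60
MAX_ANSWERS = 4

def validate_jumble_item(prompt: str, answers: list[str]) -> list[str]:
    """Return a list of problems with this item (an empty list means OK)."""
    problems = []
    if len(prompt) > PROMPT_LIMIT:
        problems.append(f"Prompt is {len(prompt)} chars (limit {PROMPT_LIMIT}).")
    if not 2 <= len(answers) <= MAX_ANSWERS:
        problems.append(f"Need 2-{MAX_ANSWERS} answers, got {len(answers)}.")
    for i, ans in enumerate(answers, start=1):
        if len(ans) > ANSWER_LIMIT:
            problems.append(f"Answer {i} is {len(ans)} chars (limit {ANSWER_LIMIT}).")
    return problems
```

Running a question bank through a check like this before a class session avoids discovering a too-long sentence mid-entry.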
You can also adjust the amount of time allowed for each question and the number of points each question is worth. Once you have completed a prompt/question, select “+ Add question” at the bottom of the page, and repeat until the quiz is complete. After adding the last question, select “Save & Continue” to set the language, privacy settings, and primary audience. There is also an option to include a description of the jumble and its difficulty level.
When you are ready to present your KJ, log in and choose the previously created KJ, which is then displayed on the screen. Students visit www.kahoot.it in their browsers, enter the PIN displayed on the main projector screen, and type a name or nickname. All names entered are shown to the class, so both students and teachers can see who has joined the session. Once everyone is accounted for, simply click to start the KJ.
During the KJ, the three top-scoring students are displayed after each question. This is a useful way of introducing a competitive element, particularly if there is a reward for the winner. An especially useful feature is that each time you deliver a KJ, the data from all participants’ responses are saved. After the session, you can download these data as a Microsoft Excel file or import them directly into Google Drive.
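For larger sections, a short script can tally each student’s accuracy from that report. The sketch below is only an illustration: it assumes the Excel export has first been saved as a CSV, and the column names (“Player” and “Correct”) are hypothetical stand-ins, since the headers in Kahoot’s actual export may differ.

```python
# Sketch: summarize a KJ results export to spot struggling students.
# Assumes the Excel download was saved as CSV with one row per player
# per question; "Player" and "Correct" are hypothetical column names.
import csv
from collections import defaultdict

def summarize_results(csv_path: str) -> dict[str, float]:
    """Map each player to the share of jumbles they ordered correctly."""
    correct = defaultdict(int)
    total = defaultdict(int)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            player = row["Player"]
            total[player] += 1
            if row["Correct"].strip().lower() == "yes":
                correct[player] += 1
    return {p: correct[p] / total[p] for p in total}
```

Sorting the resulting dictionary by value surfaces the students who most need a follow-up conversation.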
In my class, I introduced paragraph organization and transitional phrases, and my students learned how to place sentences in the appropriate location of a paragraph by looking only for key phrases. They then learned a number of basic transitions presented in the textbook They Say/I Say. For example, we analyzed basic one-word, two-word, and three-word transitional phrases like “for example,” “next,” “the primary reason,” “another point,” “in conclusion,” “this means,” and “moreover.” We then discussed how such phrases usually appear in a very specific part of a paragraph, and we analyzed previously published articles and essays to understand this point more fully.
Rather than rely on my usual approach, cutting up paper copies of paragraphs I had written and distributing the randomized sentences to groups to re-order, during the KJ I presented students with the key phrases we had covered and assigned them to drag the sentences into the correct order on their smartphones and laptops. I immediately noticed improved engagement and enjoyment compared to the non-digital alternative. Additionally, the results report (a downloadable spreadsheet) allowed me to see who was struggling. I learned much more about each individual learner this way than by the far more difficult approach of walking around the room and checking each student’s work, as one section of my course had an enrollment of more than thirty.
Although it is natural for students to improve over the course of the semester, average essay grades rose from unit 1 to unit 2, when KJ was implemented for paragraph organization and transition use. Ten points of each essay grade were linked to paragraph organization and transitions, and the average in this category rose from 8.3 (83%) in unit 1 to 9.1 (91%) in unit 2. While this positive change in academic performance is encouraging, 14 students also commented on the use of KJ in a short, anonymous survey emailed to them three months after the class concluded and final grades were released. Key written responses included:
I [can’t] believe you made all of those [prompts/questions] for us. They really helped me understand how to do a good paragraph.
The [KJ] games were fun. It was better than reading. It made me really want to win too.
My favorite part was that you gave us cool little prizes for winning. I wish all my teachers at [the college] used [KJ] for review especially for [course title] because [the teacher] is never around and [he/she] doesn’t explain anything and [he/she] doesn’t review anything either.
[KJ] helped me with [transitions]. Words like moreover I don’t even get. I honestly hated the book but [the KJ] told me which ones to use.
I really liked the games. They also showed us exactly what you wanted [in] the essays. It helped me get [an] A.
They were good. I just liked that I could play it after class.
Despite this positive feedback, we know that technology can fail and has several downsides. First, students can be bumped from the game if their WiFi connection drops, which did occasionally happen. Another concern may be the level of noise KJ creates and encourages in the classroom. In true game-show fashion, KJ plays music in the background and uses sound effects to mark when time to respond is running low for a particular prompt. While the music and sound effects can encourage engagement with the software, they can also be stressful and cause the classroom to become quite noisy as students yell in excitement or agony over gaining and losing points. Additionally, every student needs either a phone or a laptop to participate fully and may feel singled out without such technology; and it is harder to measure and evaluate individual learning if students are paired in groups rather than tackling the KJ independently. Finally, I would not recommend using KJ in every unit: its ease of use and functionality might lead teachers to become too reliant upon it rather than varying pedagogical approaches to appeal to a variety of learning styles.
Even with these concerns in mind, the advantages of KJ vastly outweigh the disadvantages. Those kicked out of the game by poor WiFi can easily be partnered with a peer, and most group activities, digital or non-digital, bring with them a certain expectation of noise. After my experimentation with KJ, I can confidently recommend this modality as an effective way to create intrinsic motivation among writers because its collaborative nature allows them to engage more deeply with their instructor and peers. I feel strongly that such engagement and intrinsic motivation are key to encouraging long-term retention. KJ provides an enjoyable and meaningful learning environment that, if implemented carefully, may further increase the likelihood that students will end the course with higher writing proficiency than they would have otherwise, as evidenced by the rise in essay grades from unit 1 to unit 2 mentioned above.
While there are certainly numerous ways to teach paragraphing, it can safely be argued that KJ positively impacted my students’ grades and afforded them a more collaborative, engaging, and less confusing unit compared to when KJ was not used. Certainly, they could have scored just as highly or perhaps even more highly on the unit 2 essays without the use of KJ; however, KJ provided an enjoyable environment that differed from a textbook, PowerPoint, or traditional lecture. Though more research is needed to fully understand KJ’s efficacy in the writing classroom, and this experiment was conducted with a relatively small sample size, these results are nonetheless a promising addition to the ongoing conversation regarding SRS. I strongly believe that KJ could be an excellent platform for grammar instruction and evaluation in relation to a number of topics.
I will definitely use this tool in future semesters to check my students’ comprehension of paragraphing. As an assessment tool, I think it has some benefits as a way of getting a general sense of knowledge or skill in the room because the nature of the activity demands full class participation and provides a lens through which to view individual results. More importantly, KJ is a useful way of breaking up class sessions and re-energizing students who display signs of boredom. Most importantly, students reported loving the activity and requested to do more throughout the remainder of the semester. Given the easy interface and low learning curve, why not give it a try and share your own results?
Angus, S. D., & Watson, J. (2009). Does regular online testing enhance student learning in the numerical sciences? Robust evidence from a large data set. British Journal of Educational Technology, 40(2), 255-272. https://doi.org/10.1111/j.1467-8535.2008.00916.x
Brown, J. D. (1997). Computers in language testing: Present research and some future directions. Language Learning and Technology, 1(1), 44-59. Retrieved from MLA International Bibliography database. (Accession No. 2002650154)
Clark, I. (2012). Formative assessment: Assessment is for self-regulated learning. Educational Psychology Review, 24(2), 205-249. https://doi.org/10.1007/s10648-011-9191-6
Conole, G., & Warburton, B. (2005). A review of computer-assisted assessment. ALT-J, 13(1), 17-31. https://doi.org/10.1080/0968776042000339772
Dervan, P. (2014). Increasing in-class student engagement using Socrative (an online Student Response System). AISHE-J, 6(3), 1801-1813. http://ojs.aishe.org/index.php/aishe-j/article/view/180/283
Kibble, J. (2007). Use of unsupervised online quizzes as formative assessment in a medical physiology course: Effects of incentives on student participation and performance. Advances in Physiology Education, 31(3), 253-260. https://doi.org/10.1152/advan.00027.2007
Nicol, D. (2009). Assessment for learner self‐regulation: Enhancing achievement in the first year using learning technologies. Assessment & Evaluation in Higher Education, 34(3), 335-352. https://doi.org/10.1080/02602930802255139
Skorczynska, H., del Saz Rubio, M., & Carrió-Pastor, M. L. (2016). Second language teaching and technology: An overview. In Technology implementation in second language teaching and translation studies: New tools, new approaches (pp. 13-32). Singapore: Springer.
Sprague, A. (2016). Improving the ESL graduate writing classroom using Socrative: (Re)considering exit tickets. TESOL Journal, 7(4), 989-998. https://doi.org/10.1002/tesj.295
Steed, A. (2013). Technology in the classroom. Teaching Business & Economics, 17(3), 7-9. Retrieved from Education Research Complete database. (Accession No. 93286186)
Wang, A. I. (2015). The wear out effect of a game-based student response system. Computers and Education, 82, 217–227. https://doi.org/10.1016/j.compedu.2014.11.004
Wang, T-H. (2007). What strategies are effective for formative assessment in an e‐learning environment? Journal of Computer Assisted Learning, 23(3), 171-186. Retrieved from ERIC database. (Accession No. EJ762695)
Williamson-Leadley, S., & Ingram, N. (2013). Show and tell: Using iPads for assessment in mathematics. Computers in New Zealand Schools: Learning, Teaching, Technology, 25(1-3), 117-137. https://www.otago.ac.nz/cdelt/otago065360.pdf
Zarzycka-Piskorz, E. (2016). Kahoot it or not? Can games be motivating in learning grammar? Teaching English with Technology, 16(3), 17–36. Retrieved from MLA International Bibliography database. (Accession No. 2016651621)