NSF Awards: 1657160
Programming is increasingly becoming a necessary 21st-century skill. As more people turn to online resources to learn programming, it is necessary to provide resources that both measurably engage learners and teach the necessary concepts. Unfortunately, there are not enough teachers to help guide learners through difficult concepts or provide encouragement when learners become frustrated or disengaged. This work begins to address these gaps by providing learners a full introductory programming curriculum in the form of an online puzzle game. Users learn programming concepts by solving puzzles (debugging existing code). The game provides various forms of help, including customized help messages, encouragement, and adaptive levels for additional practice.
Michael Lee
Assistant Professor
Welcome to our 2018 STEM for All Video Showcase submission! We are excited to see all the other great work posted for this event. Our work, Gidget (helpgidget.org), continues to engage thousands of people all over the world with programming. We hope this video leads to lots of great engagement, discussion, and possibly collaboration! We are looking forward to your comments and questions!
Scot Osterweil
Research Scientist
Gidget looks like a very worthwhile effort. I would love to understand more about the measurable outcomes that you mention in the video. I'd also be interested in hearing more about evidence that Gidget is working for kids who otherwise wouldn't respond to CS education (the only study you mention involved honors students). It seems plausible that it might be effective for others, so it's worth sharing whatever evidence you've surfaced.
Michael Lee
Assistant Professor
Whoops, I did not initially see the "reply" button. Response below!
Michael Lee
Assistant Professor
Thank you for the excellent questions Scot! I'll refer to my papers in my response, below (also available at my website: pixel42.com/cv).
We primarily measured engagement (operationalized primarily as time on task, and also as tasks completed) and learning outcomes (using a language-agnostic pre-post test of knowledge based on Tew & Guzdial's FCS1 work from 2011). Engagement papers include: Lee & Ko 2011, Lee & Ko 2012, Lee et al. 2013; learning papers include: Lee & Ko 2015.
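For readers curious how these measures can be computed in practice, here is a minimal, purely illustrative Python sketch that derives time on task and tasks completed from hypothetical gameplay log events (the event format and field names are made up for this example, not Gidget's actual telemetry):

from datetime import datetime

# Hypothetical gameplay log: each event has a timestamp, a level id,
# and whether the learner's code passed that level's goal checks.
events = [
    {"time": "2018-05-01 10:00:00", "level": 1, "passed": False},
    {"time": "2018-05-01 10:04:30", "level": 1, "passed": True},
    {"time": "2018-05-01 10:05:10", "level": 2, "passed": True},
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")

# Time on task: elapsed time between the first and last logged event.
times = [parse(e["time"]) for e in events]
time_on_task = (max(times) - min(times)).total_seconds()

# Tasks completed: number of distinct levels the learner has passed.
tasks_completed = len({e["level"] for e in events if e["passed"]})

print(f"Time on task: {time_on_task:.0f} s, tasks completed: {tasks_completed}")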
Besides working with minority youth, we also used Gidget in several summer camps for teenage girls and teenagers from rural areas (Lee et al. 2014, Jernigan et al. 2015). In addition to our work with youth, there is evidence that we can change adults' negative preconceptions of coding with a short introduction to CS using Gidget (Charters et al. 2013). We are continuing to learn more about our users and iteratively improving Gidget!
Please let me know if you have more questions!
Jessica Hammer
Assistant Professor
Thanks for sharing this video! I'd like to hear more about the design of the game, and in particular how it's related to other games in the "program a robot" space. That's a very popular genre for teaching computer science concepts. Why did you decide to focus on debugging? What advantages does that have over other approaches to robot-path-building games? What choices did a debugging approach imply for your design?
Michael Lee
Assistant Professor
Thank you for the visit and question Jessica! You brought up many great things that I will try to address in order:
We started with the idea that we had to keep online users engaged, so our first several studies were sets of controlled experiments with different game interactions (Lee & Ko 2011), visuals (Lee & Ko 2012), or features (Lee, Ko, & Kwan 2013). As for the game design, we tried to think of a plausible story (what could you program to help do tasks, and why should you help with these tasks?) that might engage and be meaningful for our users. We decided to have a robot that is tasked with cleaning up a chemical spill and saving animals in the process. Unfortunately, the robot, Gidget, was damaged during transport and needs your help to fix its programs!
Debugging fit naturally with the story (Gidget is broken so you have to help) and medium (debugging == puzzles). In fact, many of the people who played the game did not realize they were (learning) programming and were playing for the sake of playing a game. Throughout our testing, we heard regularly from our learners that the "puzzles" were fun and satisfyingly challenging – we rarely, if ever, heard debugging described using these positive terms outside the context of the game. We published a paper describing these design decisions in more detail (Lee et al. 2014).
Although path-finding and moving objects are the main objectives of the levels, the levels get progressively harder and introduce several programming concepts. The first few levels use simple commands (e.g., up 2, down, left 4), but the language is quite expressive, and later levels require learners to use conditionals, loops, functions, and objects. If you are interested in learning more, the curriculum is described in more detail in our paper examining the use of assessments in the game (Lee, Ko, & Kwan 2013), and the language is described in detail in my dissertation (Lee 2015).
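To give a flavor of that progression, here is a toy sketch written in plain Python rather than the actual Gidget language (the syntax and item names below are invented for illustration): early levels only need straight-line commands, while later ones reward loops and functions.

# Toy stand-in for a Gidget-like robot: a grid position plus an inventory.
# Illustrative Python only; the real Gidget language has its own syntax.
robot = {"x": 0, "y": 0, "holding": []}

def up(n=1):    robot["y"] += n      # early levels: e.g., "up 2"
def left(n=1):  robot["x"] -= n
def grab(item): robot["holding"].append(item)

# An early level might need only simple, straight-line commands.
up(2)
left(4)
grab("kitten")

# Later levels reward loops and functions for repeated work.
def rescue_row(width):
    for _ in range(width):           # a loop replaces many repeated commands
        left()
        grab("kitten")

rescue_row(3)
print(robot)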
Eli Meir
I've been playing a bit with Gidget after watching your video. It's nicely done with the pop-up hints. I wasn't able to trigger the "frustration" setting that you mention in your video, though - how would I see what happens when I get frustrated?
I'm also curious why you decided to make up your own language rather than modeling after an existing language?
Also, this is a bit more trivial, but why is it that putting a number after a movement command uses up more "energy" than the same thing using commands without numbers? For example:
up
up
uses less energy than
up 2
Michael Lee
Assistant Professor
Thank you for your question and trying Gidget!
We are learning new things (e.g., we identified machine learning classifiers that correlate with task abandonment, Yan et al. 2017) and continuing to test and add new features. Currently, we have two of our help systems enabled: the dictionary (pop-ups) and the Idea Garden help (Jernigan et al. 2017). We used our frustration helper in a controlled experiment; it is currently disabled while we implement improvements.
We modeled the Gidget language after Python, though it is definitely harder to see in the first set of levels, which use movement commands such as up, left, grab, and drop (these were required for the game dynamics). As you progress through the game, you may notice more similarities with Python syntax.
Great catch with the energy use! Initially, energy was going to be a major factor in the game, but we found that it was largely unnecessary for adding challenge to the game levels and more often frustrated users. To answer your question: if I remember correctly, "up up" uses 2 energy units, while "up 2" uses 3 energy units. This is due to the way Gidget tokenizes and evaluates the code. "up" is evaluated as one step, while "up 2" evaluates the literal as an extra step. This explanation can get quite confusing for the learner, which is why we ultimately decided to lessen the importance of energy in the game.
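To make that cost difference concrete, here is a small Python sketch of one plausible reading of the accounting described above (one energy unit per square moved, plus one extra unit to evaluate a numeric literal). It is a guess at the model for illustration, not Gidget's actual interpreter.

def energy_cost(program):
    # Toy cost model: 1 unit per square moved, plus 1 unit for evaluating
    # each explicit numeric literal. Not the real Gidget engine.
    total = 0
    for line in program.strip().splitlines():
        tokens = line.split()            # e.g., ["up"] or ["up", "2"]
        args = tokens[1:]
        steps = int(args[0]) if args else 1
        total += steps                   # moving costs 1 per square
        total += 1 if args else 0        # evaluating the literal costs 1 extra
    return total

print(energy_cost("up\nup"))   # 2 units, as described above
print(energy_cost("up 2"))     # 3 units: two squares plus one literal evaluation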
Robert Zisk
Graduate Student
Thanks for the video! My first question is similar to Jessica's. I see that you focused on debugging, but have you thought of using the same context but having the students develop the code themselves to get Gidget to the goal? It seems like it would be a good complement to the current game.
Also, I see that you measured engagement as time on task and the number of tasks completed. Have you collected any data on students' perceptions of CS and coding after using the game?
Michael Lee
Assistant Professor
Thank you for visiting Robert!
You are right that students should be able to develop their own code! The game is designed with specific learning objectives for each level, covering many of the concepts a student would encounter in an introductory computer programming (CS1) course. The idea was to provide some scaffolding for each level, where the learner has to look at all the existing code holistically to determine what is correct or wrong and necessary or unnecessary, while also being introduced to a new concept, statement, or command. We found that learners used many different strategies for debugging their code, including deleting all the existing code to try completely on their own (they do have the option to restore the starting code in case they need to refer back to it).
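As a purely hypothetical illustration of that scaffolding idea (again in Python, not the actual Gidget language or any real level), a level might hand the learner mostly-working starting code with a couple of deliberate flaws to find:

# Hypothetical "starting code" for a debugging puzzle: the goal is to end
# at position (0, 3) holding the kitten, but the provided code has two bugs.
position = [0, 0]
holding = []

def up(n=1):
    position[1] += n

def grab(item):
    holding.append(item)

up(2)              # BUG 1: moves only 2 squares; the goal needs 3
# grab("kitten")   # BUG 2: a needed statement was commented out

goal_met = (position == [0, 3]) and ("kitten" in holding)
print("Goal met?", goal_met)   # False until the learner repairs the code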
Gidget also includes a "level designer" that allows learners to create their own goals, starting code, and (intentionally) broken code. This mode is only available after a learner completes the game – we wanted them to have one complete, directed experience with all the available features of the language before giving them a blank slate.
We definitely collected users' perceptions of CS and coding after using the game. We found that adults' initial, negative preconceived notions of programming could change to be more positive after a short experience with Gidget (Charters et al. 2014). We also found that high school teens viewed computer science more favorably after playing Gidget (Lee et al. 2014), but it did not necessarily change their minds about their career aspirations (though most said they could see how they could apply computer science towards their intended careers). Some of our camp and Saturday program participants continued to play Gidget outside of the classroom from their homes.
Susanne Steiger-Escobar
Hi Michael, I enjoyed playing Gidget. Are you tracking the students taking the after-school programs? I understand they may not be changing their minds in regard to career aspirations, but are they more inclined to take a CS course in high school?
Michael Lee
Assistant Professor
Dear Susanne, Thank you for playing Gidget and for asking your question! Yes, in our most recent study, we asked the middle school students to take a post-survey before the end of their Saturday program. Every single student responded 'yes' to wanting to take more CS-related courses outside of their regular school day if available. Almost every student (86%) responded 'yes', they would want to take more CS-related courses inside (i.e., as part of) their regular school day if available. Unfortunately, many of these students do not have CS-specific resources or courses available to them inside or outside of school (especially at the middle school level), so we are working on efforts to improve this. We have plans to keep working with the same students (in addition to new students) over a longer period of time, so we hope to see them continue developing and improving their skills in CS!
Shamsi Moussavi
Very interesting work!
Is there a plan to add more advanced levels after the designer level?
Thank you for sharing your work and video.
Michael Lee
Assistant Professor
Thank you! Yes, we are always looking to improve the experience and content of the game. More specifically to your question, we are reviewing our curriculum's learning objectives and adding both researcher/educator-created content and procedurally generated content to give learners more interesting and customized learning experiences.