We’ll be presenting two papers at EDM this year. The first, “Toward Data-Driven Example Feedback for Novice Programming,” explores generating adaptive example-based feedback, which presents a partial solution to a programming problem when a student is stuck. The results suggest that by leveraging student data, we can generate higher-quality, more adaptive examples than by using an expert solution alone, but the results may depend on a student’s ability to select which feature they want to see completed. The second, “One minute is enough: Early Prediction of Student Success and Event-level Difficulty during a Novice Programming Task,” is a collaboration with the D3 and Game2Learn labs at NCSU, and presents a model for predicting whether a student will succeed at a given programming problem. The model predicts with impressively high accuracy using only 1 minute of data on a programming task that takes 20+ minutes.
If you’ll be at LAK this year, check out our papers at the Educational Data Mining in Computer Science (CSEDM) Workshop. We’ll be presenting “A Comparison of Two Designs for Automated Programming Hints,” work with Joseph Jay Williams at the University of Toronto comparing users’ perceptions of different hint types. We’ll also be presenting “ProgSnap2: A Flexible Format for Programming Process Data,” work completed as part of the CS-SPLICE Program Snapshot working group.
We will be at SIGCSE this year, where Rui will present our paper “Exploring the Impact of Worked Examples in a Novice Programming Environment.” The paper evaluates the Peer Code Helper system, which presents partial worked example code with scaffolded self-explanation prompts. Our results show that these worked examples help students start the programming task more quickly, but this advantage fades as students continue to work on the assignment.
Check out the abstract:
Research in a variety of domains has shown that viewing worked examples (WEs) can be a more efficient way to learn than solving equivalent problems. However, only a few studies have explored the effect of WEs in the domain of programming and even fewer in block-based programming environments. We designed a system to display WEs, along with scaffolded self-explanation prompts, in a block-based, novice programming environment called Snap!. We evaluated our system and the impact of programming WEs during a high school summer camp with 22 students. Participants completed three programming problems and had access to WEs on either the first or second problem. We found that access to WEs did not significantly impact students’ learning, but they may have lowered students’ intrinsic cognitive load. Students who had WEs completed more objectives in the programming problems, but this difference was not significant. Our results suggest that WEs save students time initially, compared to writing code, but afterwards students need time to process the WE. We find that WEs have the potential to improve students’ learning efficiency when programming, but that these effects are nuanced and merit further study.
Also, keep an eye out for our paper co-authored with the Game2Learn lab, “Defining Tinkering Behavior in Open-ended Block-based Programming Assignments.”