Projects

Update – Our Website Has Moved: go.ncsu.edu/hintslab

Please visit us at: https://go.ncsu.edu/hintslab


Overview

For an overview of some of our most recent work, check out this video from the AAAI Spring Symposium: AI in K-12 Education.

iSnap: Intelligent Programming Support

Website | Code | Datasets

iSnap is a programming environment designed to lower the barriers that novices face when first learning to program. It combines two effective support features: block-based programming and adaptive hints and feedback. iSnap is based on Snap!, an online programming environment where students construct programs from drag-and-drop blocks, which reduce the initial challenges of programming syntax. Snap! makes programming more interactive, visual and creative by letting students build games, stories and simulations.

iSnap augments this environment with detailed logging and intelligent support features, including on-demand programming hints and feedback. These hints are generated from student data, allowing them to scale easily to new classrooms and problems (see below for more). Even in a block-based environment, students can still get stuck, and instructors are not always available to help. When this happens, iSnap can check students’ code for errors and offer suggestions for how to move forward. For more information and a demo of iSnap, visit: go.ncsu.edu/isnap.
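As a concrete illustration, a single logged hint request might be recorded roughly like the sketch below. This is hypothetical: the field names are illustrative, not iSnap’s actual log schema.

    # Hypothetical sketch of a logged hint-request event; field names are
    # illustrative, not iSnap's actual log schema.
    hint_request_event = {
        "timestamp": "2017-03-02T14:31:07Z",
        "assignment": "polygonMaker",  # illustrative assignment ID
        "event": "hint.request",
        # Snapshot of the student's program when help was requested, so a
        # hint can be generated from (and research done on) real context.
        "code_snapshot": "<serialized Snap! program>",
        "hint_shown": "Consider adding a 'pen down' block before your loop.",
        "hint_followed": True,  # inferred from the student's later edits
    }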

Key Findings:

  • A pilot study showed qualitative evidence that students are capable of using iSnap to overcome difficulties and complete assignments, but it also revealed important challenges, such as the risk of students abusing the help features [SIGCSE’17].
  • A later study showed that many students perform poorly on homework assignments, but almost all students who followed iSnap’s hints performed adequately, even though weaker students may be more likely to request hints [AIED’17].
  • Despite these successes, other work shows that many students do not ask for help when they need it, suggesting that interface and hint-quality issues may be preventing iSnap from having its desired impact on student learning [ICER’17].

Future Directions:

  • Measure the impact of iSnap’s hints and feedback on student learning.
  • Explore new types of intelligent programming support, such as worked examples, principle-based hints, and self-explanation prompts.
  • Support more creative, open-ended programming projects, such as game making and data analysis, where the learner helps to define the goal of the program.

SourceCheck: Data-driven Hint Generation

SourceCheck is the algorithm that powers iSnap’s data-driven hints and feedback. These hints are generated automatically, with no need for an instructor or expert model. SourceCheck uses the solutions of previous students to build a model of correct programming behavior for a given problem. It can use this model to provide new students with adaptive hints that point toward a similar, correct solution. While SourceCheck’s primary application is the iSnap environment, the algorithm is language-agnostic, and it has also been applied to Python code.
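To make the idea concrete, here is a minimal, hypothetical sketch of this style of data-driven hint generation. It is not the actual SourceCheck implementation, which matches abstract syntax trees; for brevity, this sketch reduces programs to flat block sequences, finds the closest prior correct solution, and suggests one edit toward it.

    # Minimal sketch of data-driven next-step hints (hypothetical code, not
    # the published SourceCheck algorithm, which operates on syntax trees).
    import difflib

    def closest_solution(student, solutions):
        """Pick the prior correct solution most similar to the student's code."""
        return max(solutions, key=lambda sol: difflib.SequenceMatcher(
            None, student, sol).ratio())

    def next_step_hint(student, solutions):
        """Suggest one edit that moves the student toward the matched solution."""
        target = closest_solution(student, solutions)
        matcher = difflib.SequenceMatcher(None, student, target)
        for op, i1, i2, j1, j2 in matcher.get_opcodes():
            if op == "insert":
                return f"Try adding {target[j1:j2]} at position {i1}."
            if op == "replace":
                return f"Consider replacing {student[i1:i2]} with {target[j1:j2]}."
            if op == "delete":
                return f"You may not need {student[i1:i2]}."
        return "Your code already matches a correct solution."

    # Example: a student partway through a simple square-drawing script.
    solutions = [
        ["when_flag_clicked", "pen_down", "repeat_4", "move_100", "turn_90"],
    ]
    student = ["when_flag_clicked", "repeat_4", "move_100", "turn_90"]
    print(next_step_hint(student, solutions))
    # -> Try adding ['pen_down'] at position 1.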

A major component of the SourceCheck research project is establishing methods for evaluating the quality of data-driven hints, using experts to rate and generate hints. This allows researchers to benchmark and compare the growing collection of data-driven programming hint generation algorithms (over 25 since 2015!). Additionally, an objective measure of hint quality allows researchers to test hypotheses about what changes will improve the quality of the hints a given algorithm produces.
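A rough sketch of one such objective measure (hypothetical code; the cited papers define the actual expert-rating protocol, which this simplifies considerably): score each algorithm by the fraction of its generated hints that experts also endorsed.

    # Hypothetical benchmarking sketch: compare generated hints against
    # expert-endorsed "gold" hints for the same student hint requests.

    def hint_precision(generated, gold):
        """Fraction of an algorithm's hints that experts also endorsed.

        generated: {request_id: set of hints the algorithm produced}
        gold:      {request_id: set of expert-endorsed hints}
        """
        endorsed = total = 0
        for request_id, hints in generated.items():
            total += len(hints)
            endorsed += len(hints & gold.get(request_id, set()))
        return endorsed / total if total else 0.0

    # Two hypothetical algorithms scored on the same three hint requests.
    gold   = {1: {"add pen_down"}, 2: {"remove extra loop"}, 3: {"set x to 0"}}
    algo_a = {1: {"add pen_down"}, 2: {"remove extra loop"}, 3: {"move 10 steps"}}
    algo_b = {1: {"add pen_down", "add pen_up"},
              2: {"remove extra loop", "rename sprite"},
              3: {"set x to 0"}}
    print(hint_precision(algo_a, gold))  # 2 of 3 hints endorsed: ~0.67
    print(hint_precision(algo_b, gold))  # 3 of 5 endorsed: 0.60

Note how the second algorithm covers every request yet scores lower because of its extra, unendorsed hints, echoing the finding below that quality hinges on filtering out unhelpful hints.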

Key Findings:

  • SourceCheck reproduced the feedback of human tutors 76-88% as well as another tutor did, for real student hint requests on two programming assignments. However, it also generated up to 50% more hints, suggesting that hint quality hinges on filtering out unhelpful hints [EDM’17].
  • SourceCheck requires only 10-20 student solutions to generate its highest-quality hints. However, data-driven hint quality does not always increase with more data – it can even decrease as more correct student solutions are added to the training dataset [AIED’18].
  • Generating hints with expert solutions, rather than student data, generally yields higher-quality hints. With some hint generation algorithms, a single expert solution outperforms a whole dataset of student solutions [AIED’18].

Future Directions:

  • Create hints that tell a student not just what to do, but also why, by combining data-driven approaches with lightweight expert modeling.
  • Make hints more adaptive by leveraging models of student knowledge and detecting student progress through assignments.
  • Extend SourceCheck to additional programming languages and environments.

Understanding Students’ Help-seeking Behavior

Many programming help features, such as iSnap’s hints, require the student to initiate a request for help. However, knowing when and how to seek help is a metacognitive skill that many students are still developing. Students may not always ask for help when they need it (help avoidance), or they may ask for too much help, using it to avoid engaging with the assignment (help abuse). Computer-based help may provide unique opportunities to overcome barriers to productive help-seeking, for example by providing students help confidentially, without any social cost. This project studies how students seek and use help while programming, with the goal of using better user interface design to encourage productive help-seeking.

Key Findings:

  • In one study, the quality of the very first data-driven hints that a student received in iSnap significantly correlated with how many future hint requests that student made, suggesting that low-quality hints may deter students from seeking help when they need it [AIED’17].
  • However, many students who needed help never asked for it in the first place, despite performing poorly on the assignment. Even if hints were perfect, many students would never see them [AIED’17].
  • Many factors impact students’ decisions to seek help from both instructors and automated help like iSnap, including students’ desire for independence, their trust in the help system, their previous experiences with computer-based help, and the accessibility and salience of the help [ICER’17].

Future Directions:

  • Explore how to reframe help features as tools that savvy programmers use, rather than as sources of help that may threaten a student’s sense of independence.
  • Develop a model of how students seek and use help when programming and test the model through empirical studies.

Evaluating Programming Interfaces

Block-based, visual programming languages, like Scratch, Alice, Blockly and Snap!, are becoming more popular for teaching novices to code for the first time. These languages replace the textual syntax of programming languages with drag-and-drop blocks or menus that eliminate syntax errors. Intuitively, this should make programming easier for novices, but more empirical research is needed to explore the advantages and limitations of these languages. This project has compared students’ performance when using block-based, textual and frame-based programming environments. The goal of this work is to better understand how best to use these alternative programming interfaces to improve student learning.

Key Findings:

  • Middle school students using a block-based programming interface completed a programming assignment more quickly and more completely, and spent more time on task, than those using an otherwise identical textual programming environment [ICER’15].
  • Middle school students using Greenfoot’s Stride frame-based programming interface also progressed through an assignment more quickly than their textual counterparts, but the effects were less dramatic. Students using Stride also spent less time correcting syntax errors [ICER’16].

Future Directions:

  • Continue to evaluate cutting-edge programming interfaces as they are developed.
  • Evaluate other elements of novice programming environments, such as their focus on visual, interactive output and creative, open-ended projects, to determine their effect on student learning and engagement.
  • Explore how block-based interfaces interact with programming help features (perhaps differently than in textual environments).