
Tool Talk | April 4, 2019

Using Gradescope at Hunter

The following blog post was written by Katherine St. John, Professor of Computer Science at Hunter College.


In Fall 2017, a newly revamped introductory computer science course, CSci 127, launched. The course is a survey of computer science with a strong focus on analytic reasoning and programming. With the new version, we wanted to increase the number of programming assignments as well as give more timely feedback. In the past, instructors and graduate student graders graded manually, usually with a 2-3 week lag before students saw the results of their work. This time lag limited the usefulness of the feedback and reduced the number of programming assignments that could be graded.

In the last five years, several online platforms have emerged for automating the grading of programs. Gradescope was highly recommended for this task, and it has excelled at it, allowing immediate feedback on student work that would not be possible otherwise. But it is a second feature, managing and organizing paper and PDFs, that has been equally valuable. I'll start by describing that second feature, since it's beneficial for anyone who gives paper-based exams or has students submit PDFs of their work, and end this post with automatic program grading.

Organizing Piles of Paper

The most useful feature of Gradescope is in organizing the massive amount of paper for our large lecture courses. For our introductory course, we had 650 students in the fall (and 350 in the spring lecture). We scan all the paper submitted and upload the scans to Gradescope (the standard photocopy machines on campus have a scan-to-PDF option). Using optical character recognition software, Gradescope identifies about 95% of the students automatically from their handwritten name and EmpID (the remaining 5% wrote outside the designated region or have serious legibility issues). Matching submissions to the roster automatically removes the tedium of entering scores into a spreadsheet (saving about 10-15 minutes for every 100 students).

For larger lectures, it makes possible "exit slips," where students answer challenges and give feedback on what they wish we had spent more time on. Scanning the slips for 650 students takes about 25 minutes, and aligning them with the course roster takes about 5 minutes. We do the same for the weekly recitation section quizzes and the final exam. For each, we give a template for each question, and the optical character recognition does an excellent job at grouping together short answers and simple drawings, and a reasonable job with mathematical equations. It does not do well at automatically grouping essays, but it does have an option for organizing student submissions by self-designed categories (e.g., "Correct", "Correct but has minor typos/grammar issues", etc.). Then, each folder can be graded en masse, applying the grading rubric points (and comments) to each student's work.

Instead of returning work on paper, the platform has an option to email students their graded work and accept regrade requests online. This removes the concerns about lost or altered exams, and allows grading mistakes to be fixed for all students in the same folder, instead of just those who spoke up.

Consistent Rubrics and Feedback

Since grading rubrics can be set up in advance and can include detailed messages, we have found that regrade requests have decreased: students see a detailed explanation of why they earned the points, instead of a hurried, scrawled comment. The rubric gives the grader a checkbox for each item, with the option to add individual feedback, and the full rubric is shown to students when they view their graded exam. We use rubrics for open-ended essay and design questions as well as for short-answer questions.

Grading Programs

This is the original reason we looked into Gradescope. For CSci 127, we give 60 short programming assignments over the semester, in a variety of languages (primarily Python & C++). To submit a program, a student drops the file on the course webpage. Within 30-60 seconds, the screen refreshes with a detailed list of the points earned and the option to resubmit. Students can resubmit as often as they would like up to the deadline.

Behind the scenes, for each program, we write a grading script that is uploaded in advance to Gradescope. When a student submits a program, Gradescope "rents" a virtual machine from the cloud (from the logs, it's often Amazon Web Services), uploads the grading script and student program to this container, and runs the script. The results are computed (and stored in JSON format) and displayed to the student. The containers run a standard Linux distribution, and any openly available software can be installed and run in the grading scripts.
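To give a flavor of what such a grading script looks like, here is a minimal sketch in Python. It runs a student's program against input/expected-output pairs and builds a dictionary matching Gradescope's documented results.json format; the helper name, test cases, and ten-second timeout are our illustrative choices, not part of Gradescope's API.

```python
import json
import subprocess

def grade_submission(program, test_cases):
    """Run a student's Python program against (name, stdin, expected stdout)
    triples and build a dictionary in Gradescope's results.json format."""
    tests = []
    for name, stdin_text, expected in test_cases:
        try:
            proc = subprocess.run(
                ["python3", program], input=stdin_text,
                capture_output=True, text=True, timeout=10)
            passed = proc.stdout.strip() == expected.strip()
        except subprocess.TimeoutExpired:
            passed = False  # hung programs earn no credit
        tests.append({"name": name,
                      "score": 1 if passed else 0,
                      "max_score": 1})
    return {"score": sum(t["score"] for t in tests), "tests": tests}

# Inside the container, the script writes the dictionary to the file
# Gradescope reads the results from:
# with open("/autograder/results/results.json", "w") as f:
#     json.dump(grade_submission("submission.py", cases), f)
```

In practice, our scripts add per-test feedback messages (an "output" field on each test) so students see not just the points earned but why.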

As our first cohort of students has progressed through the programming sequence, we have added Gradescope to more courses. It is now incorporated into CSci 127, 135/136, and 235, as well as some sections of CSci 150 and 160.


To learn more about Gradescope, check out these resources below:

Gradescope website (Go ahead and request a demo!)

Getting started

Grading a question

AI-Assisted grading
