Generate individualised feedback on writing in larger student cohorts

Dan Hasan | December 23, 2021
Domain: STEM
Class size: 250

Context

Dr. Adam Cardilini's second-year bachelor's course on science and society brought together around 250 students from diverse international backgrounds for one trimester, exploring the interplay between scientific knowledge and societal issues. Students were encouraged to be innovative in their choice of medium and approach when completing their assignments, with some submitting infographics or podcasts. A summative written report and portfolio formed the final assessment of the course, asking students to communicate their developed understanding of a controversial issue.

Dr. Cardilini ultimately aimed to provide detailed, real-time, and actionable feedback on each student's written assignment, as an essential part of developing their writing and argumentation skills. In practice, however, larger student cohorts make this aim increasingly unrealistic, resulting in a lack of individually tailored guidance. Wondering whether at least a partial solution was possible, the instructor decided to use the AI-powered Automated Feedback tool to give students the chance to receive personalised feedback on their written work. Throughout the course, students could opt to use the tool inside their D2L environment to generate feedback suggestions on lower-order writing skills such as grammar and style. Using these suggestions, they could iterate towards an improved final submission while increasing their autonomy and self-guidance throughout the learning experience.


Learning objectives

  • Unit learning outcomes (ULOs) included, for example, articulating views and constructing critical arguments relating to scientific knowledge and societal issues.
  • Global learning outcomes (GLOs) related to the development of students' communication and critical thinking skills.

Learning activities

In this course, students practiced their essay writing, critical thinking, and argumentation skills in various written assignments, deepening their understanding of the relationship between scientific and non-scientific knowledge. Students could choose whether to use the Automated Feedback tool to receive instantaneous feedback on their writing against criteria determined by the instructor. The tool parsed each uploaded document, highlighted areas of potential conflict with the established criteria, gave the user an explanation and a suggestion, and let them rate whether the feedback was helpful or incorrect. The chosen criteria included, for example, sentence length, correct use of abbreviations, use of the passive voice, and use of the present tense.
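To make the mechanism concrete, below is a minimal sketch of how rule-based checks on lower-order writing criteria can be implemented. This is an illustrative toy example only, not the FeedbackFruits Automated Feedback tool or its API; the 30-word sentence-length threshold and the passive-voice heuristic are assumptions chosen to mirror the kinds of criteria mentioned above.

```python
import re

# Illustrative sketch only; not the actual Automated Feedback tool.
# The threshold and heuristics below are assumptions for demonstration.

MAX_SENTENCE_WORDS = 30  # assumed threshold for the "sentence length" criterion


def check_text(text):
    """Return a list of (criterion, sentence, suggestion) tuples."""
    suggestions = []
    # Naive sentence split on ., ! or ? followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    for sentence in sentences:
        words = sentence.split()
        if len(words) > MAX_SENTENCE_WORDS:
            suggestions.append((
                "sentence length",
                sentence,
                f"This sentence has {len(words)} words; consider splitting it.",
            ))
        # Rough passive-voice heuristic: a form of "to be" followed by a word
        # ending in -ed or -en. It misses irregular participles and can flag
        # false positives, so suggestions are prompts rather than rules.
        if re.search(r"\b(is|are|was|were|be|been|being)\s+\w+(?:ed|en)\b",
                     sentence, flags=re.IGNORECASE):
            suggestions.append((
                "passive voice",
                sentence,
                "This sentence may use the passive voice; consider an active rewrite.",
            ))
    return suggestions


if __name__ == "__main__":
    sample = "The experiment was conducted by the students. We then report the results."
    for criterion, _, suggestion in check_text(sample):
        print(f"[{criterion}] {suggestion}")
```

In a full system, each flagged passage would also carry an explanation and let the student rate whether the suggestion was helpful, as described above.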


The list of criteria chosen by the instructor also displays overall performance with each criterion

In total, there were two assignments for which students could make use of the tool, and it could be used multiple times per assignment, for example to re-check work after making the suggested edits. Comments from the Automated Feedback tool could be included in students' final portfolios as evidence of how they responded to feedback and improved their work.

The document viewer highlights actionable suggestions in the sidebar

Learning activities, based on Bloom's taxonomy, are mainly at the level of:

  • Evaluating one's own written work against instructor-determined criteria

Assessment of learning outcomes

The written report made up 30% of the overall grade, and the portfolio was also mandatory to hand in at the end of the course. The rest of the grade came from other assignments, such as quizzes.

Notable outcomes

  • Automated Feedback allowed students to receive comprehensive, personalised feedback on their writing without having to rely on the instructor. Skills could therefore be improved without creating extra work for the instructor.
  • Despite the tool not being mandatory, a significant proportion of students used it to check their writing, with some using it several times. This suggests that students perceived value in generating and implementing this feedback.
  • Certain criteria were found to be more useful than others. Word count, for example, was checked by the tool but can also be checked in most modern word processors, so students often defaulted to those instead. Passive voice, on the other hand, proved to be an important stylistic element for which Automated Feedback was useful in checking and providing suggestions.
  • Students were reportedly happy to take the AI-generated feedback 'with a grain of salt'. Even where the generated comments were not accurate, they still encouraged students to think critically about their writing.
"Ultimately I'd like to provide detailed feedback for every single assignment but that's unrealistic. Automated Feedback did something I couldn't provide for students." - Dr. Adam Cardilini, Lecturer, Deakin University

The role of the instructor

  • At the beginning of the course, the instructor mentioned that the tool was available to check a limited number of elements in students' writing.
  • For students who had checked their work with the tool, analytics were generated, allowing the instructor to get a sense of general student activity and performance on the assignment.
  • The instructor remarked that they would like to make use of the tool an explicit part of the assignment process, asking students to show the feedback they received and how it was used to make improvements. This is already in place in other courses delivered by the instructor.

Added value of technology

The instructor noted that the tool was more accessible than similar writing-check tools such as Grammarly, pointing to the consistency it brought to grading and the assignment-marking process, as well as heightened visibility into student activity. Giving students the opportunity to take responsibility for their own learning process with this tool heightens student autonomy and allows writing to improve without requiring the instructor to individually address each student's work.

An overview of student analytics provides a snapshot of who had handed in draft assignments



