Dr. Adam Cardilini’s 2nd-year bachelor's course in science and society brought together around 250 students of diverse international backgrounds for one trimester, exploring the interplay between scientific knowledge and societal issues. Students were encouraged to be innovative in their choice of medium and approach in completing their assignments, with some students submitting infographics or podcasts. A summative written report and portfolio marked the final assessment of the course, asking students to communicate their developed understanding of a controversial issue.
Dr. Cardilini ultimately aimed to provide detailed, real-time, and actionable feedback on each student's written assignments, as an essential part of developing their writing and argumentation skills. In practice, however, larger student cohorts made this aim increasingly unattainable, resulting in a lack of individually tailored guidance. Wondering whether at least a partial solution was possible, the instructor decided to use the AI-powered Automated Feedback tool to give students the chance to receive personalised feedback on their written work. Throughout the course, students could opt to use the tool inside their D2L environment to generate feedback suggestions on lower-order writing skills such as grammar and style. Using these suggestions, they could iterate toward an improved final submission while increasing their autonomy and self-guidance throughout the learning experience.
In this course, students practiced their essay writing, critical thinking, and argumentation skills in various written assignments, deepening their understanding of the relationship between scientific and non-scientific knowledge. Students could choose whether to use the Automated Feedback tool to receive instantaneous feedback on their writing, according to criteria determined by the instructor. The tool parsed each uploaded document and highlighted areas of potential conflict with the established criteria, giving the user an explanation and a suggestion, and letting them rate whether the feedback was helpful or incorrect. The chosen criteria included, for example, sentence length, correct use of abbreviations, use of the passive voice, and use of the present tense.
In total, there were two assignments for which students could make use of the tool, and it was possible to use the tool multiple times per assignment, for example to re-check work after making suggested edits. Comments from the Automated Feedback tool could be included in students' final portfolios, as evidence showing how they responded to feedback and improved their work.
The written report made up 30% of the overall grade, while the portfolio was a mandatory hand-in at the end of the course. The remainder of the grade came from other assignments, such as quizzes.
"Ultimately I'd like to provide detailed feedback for every single assignment but that's unrealistic. Automated Feedback did something I couldn't provide for students." - Dr. Adam Cardilini, Lecturer, Deakin University
The instructor noted that the tool was more accessible than similar writing-check tools such as Grammarly, pointing to the consistency of grading and the assignment-marking process, as well as greater visibility into student activity. Giving students the opportunity to take responsibility for their own learning process with this tool heightens student autonomy and allows writing to improve without requiring the instructor to individually address each student's work.