How Automated Feedback helped optimise students' academic writing at the University of Groningen

Dan Hasan | March 19, 2021
Domain: Language
Class size: 350–400

Context

This first-year program, English for International Business, ran for 11 weeks and comprised 24 groups of around 15 students. As a language course in the economics and business faculty, the focus was on improving skills such as academic writing and speaking, as well as collaboration and teamwork. These skills were addressed and assessed with reference to the Common European Framework of Reference for Languages (CEFR), in the form of individual and group written projects.

Several factors motivated the use of Automated Feedback in this course: the instructor wanted to spend less time correcting common errors in students’ work, and also wanted students to take the initiative to proofread their own writing before submitting their final written projects.


Learning objectives

  • Students are able to demonstrate a sufficient range of language to give clear descriptions and express viewpoints and to clearly formulate and construct argumentation in English speaking and writing.
  • Students are able to write an academic report in English that is coherent, is grammatically accurate and contains a wide range of subject-appropriate and academic vocabulary.

Learning activities

Automated Feedback is used several times over the course to help students review their own writing before submitting final projects. On the first day of class, students complete a benchmark essay, which gives the instructor an idea of their average level of academic writing.

In the next assignment, students choose from a range of topics and write a 500-word report, which is assessed according to CEFR levels. For this assignment, a rubric consisting of 22 criteria (the most common mistakes in students’ academic writing) is used to assess the reports. These documents could be partially checked with Automated Feedback, which in this iteration covered only one of the criteria: the use of the first person. The other criteria were checked manually by the instructor and the students themselves.

A second paper, written on a specific topic, may also be partially checked with Automated Feedback before the final version is handed in. The final project is a group writing project, where some groups submitted separate parts individually and others submitted the whole paper as a group. Again, this could be checked with Automated Feedback to generate suggestions for improvement before the final version was handed in.

Learning activities based on Bloom's taxonomy are mainly at the levels of:

  • Understanding and applying academic writing skills with particular focus on range, coherence, and accuracy of language.

  • Analyzing their own and each other’s work according to CEFR level criteria.

Assessment of learning outcomes

Feedback is given on areas that need improvement with regard to the learning objectives. Students are marked according to the 22 most common errors in academic writing. Automated Feedback could be used to check one of these criteria (the use of the first person) prior to final delivery, with the instructor then carrying out a final review. Based on the marked errors, a decision is made as to which CEFR level the student has achieved, according to range, coherence, and accuracy. Each project counts for 25% of the total grade.

Notable outcomes

  • With Automated Feedback, all students receive feedback on submitted work almost instantly, rather than waiting for the teacher or peers to review it one by one.
  • Given the limited lesson time (9 × 1.5 hours) over the course, it was beneficial to automate the work of correcting writing and formatting errors. This left more time in class for higher-level content, rather than simple grammatical and stylistic mistakes.
  • The instructor could see an overview of student performance and progress in the activity, which provided insight into, for example, recurring issues that proved problematic for numerous students.

  • As well as being helpful for the instructor, the knowledge that this overview was available also stimulated students to keep on top of their work and stick to their deadlines.
  • In general, students who chose to check their work with Automated Feedback performed better in their final projects than those who declined to use it.
"My enthusiasm over it is that it’s fantastic! We will definitely keep using Automated Feedback in our online classes." - Jane Mahoney, Teacher of English & Program Coordinator, University of Groningen

The role of the instructor

  • The instructor made sure that each assignment had clear instructions with regard to the (optional) use of Automated Feedback. Students were informed that they could use the tool to check their work and could choose whether or not to make use of it.
  • Periodically, the instructor checked each group's use of the tool inside the platform, reviewing their activity and performance.

Added value of technology

  • This tool is intended for formative use, helping students check their own work before final delivery. By handling the most frequent writing errors, it leaves instructors more time to give students feedback on the content of their reports.

Possible variation

The instructor can encourage more elaborate feedback by requiring students to answer extra questions when giving feedback ratings to peers. In a summative assessment, this additional element of feedback can also contribute to the overall grade of the assignment.
