The Peer Review tool was used several times in the Minor Data Engineering course, which consisted of roughly 40 third-year students.
The aim was to improve the quality of work during the course and give students more insight into how they are assessed.
Various feedback moments were planned within the course, during which students would receive feedback from both peers and the expert. Self-evaluation was also used to encourage reflection.
Students provided and received peer feedback on two occasions, both organised through the FeedbackFruits ‘Peer Review’ tool. The first round covered the design of their group project; the second covered the draft version of the project report. Students used the same rubric the teacher used for the final assessment, which helped them give each other more focused feedback and think actively about how they would be assessed. All feedback moments took place during the course rather than only at the end, so that students had enough time to improve their work several times.
Moreover, students had to assess their own individual work using the same rubric. This added an extra step in which students actively considered whether they met the requirements. After making any resulting improvements, they then received feedback from peers.
The teacher indicated that this improved the quality of the work. In addition, there were fewer questions about the assessment just before the final report was submitted, as these questions could already be answered in class while the students were giving each other feedback. After processing the peer feedback, the students received further feedback during a classroom session with experts.
FeedbackFruits added value especially on the practical side: matching students and allowing them to apply the teacher's rubric. Its use therefore saved the teacher time, both in organising the peer review and in giving in-line expert feedback.
Students noticed that after they had already assessed themselves, they were still presented with the “read and reflect” step. This caused some confusion, as the step felt redundant.
Based on this insight, FeedbackFruits is currently optimising the flow in all its feedback tools. Overall, the teacher would like the quality of the written reflections to be higher; this could be addressed in the instruction space that FeedbackFruits provides.
The proper processing of the received feedback was taken into account in the final grade, which proved to be a clear incentive for students to participate. The teacher sometimes lacked an overview and mainly ran into problems because HR is unable to integrate the peer review tool into the LMS.
- Context: Data Engineering class of 40 students at Rotterdam University of Applied Sciences.
- Learning activities: 2× peer feedback (on design and on the draft report) + self-evaluation
- Didactic goal: Give students a better understanding of how they are assessed + improve quality of work
- Incentives (student): The proper processing of received feedback is taken into account in the final grade.
- Incentives (teacher): Clear time-saver in setting up peer feedback
- Outcomes (benefits):
- Quality of work has greatly improved after peer feedback
- Teachers indicated time-saving in the process
- Outcomes (lessons):
- Students gave each other overly positive feedback (socially desirable behaviour)
- Potential to improve the quality of written reflections
- Completion: The majority of groups (>80%) finished the hand-in & review part
- Ratings: Students rate each other's work very positively; the rubric is also used for the final assessment
- Comments: A significant number of comments were placed under the rubric criteria