Group work and collaboration skills are essential in every profession, yet many students struggle to perform effectively in group projects throughout their education. Common causes include unclear role division, poor communication, and free-riding by members who are not held accountable, creating a stigma around the very idea of group work. Furthermore, reliance on manual data entry, review allocation, and grade distribution can make large-scale feedback activities arduous and time-consuming.
In this integrated physics course at the University of Delaware, pedagogical technology has been used to gain transparency into group dynamics and build accountability among students, while saving time and workload by automating aspects of the group evaluation process.
The use of Group Member Evaluation:
Students were automatically assigned to review everyone within their 4-to-6-member group anonymously. For these reviews, students gave quantitative feedback according to a collaboration rubric containing seven criteria across two categories: 'contributions to teamwork' and 'interaction with team members'.
Four levels ('needs development' to 'exemplary') were used, and thorough descriptions of each level and criterion were provided to guide students towards leaving accurate ratings.
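The rubric model described above can be sketched in code. Note this is an illustrative sketch only: the source specifies seven criteria across two categories and four levels, but the individual criterion names, the intermediate level names, and the 1-4 point mapping below are assumptions, not FeedbackFruits' actual implementation.

```python
# Illustrative sketch of the collaboration rubric described above.
# Criterion names and intermediate level names are hypothetical;
# the source only states "seven criteria across two categories"
# and four levels from 'needs development' to 'exemplary'.

LEVELS = ["needs development", "developing", "proficient", "exemplary"]

RUBRIC = {
    "contributions to teamwork": [
        "completes assigned tasks",
        "quality of contributions",
        "shares workload fairly",
        "meets deadlines",
    ],
    "interaction with team members": [
        "communicates clearly",
        "listens to others",
        "gives constructive feedback",
    ],
}

def score_review(ratings: dict) -> float:
    """Convert per-criterion level ratings into an average 1-4 score."""
    all_criteria = [c for crits in RUBRIC.values() for c in crits]
    missing = set(all_criteria) - set(ratings)
    if missing:
        raise ValueError(f"missing ratings for: {sorted(missing)}")
    # Map each chosen level to points 1..4, then average evenly.
    points = [LEVELS.index(ratings[c]) + 1 for c in all_criteria]
    return sum(points) / len(points)
```

An even weighting across criteria is assumed here; in practice the activity's grade weighting is configured by the instructor.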
Students were also able to leave comments as part of their review, but this feedback was held by the teacher rather than being automatically sent to the recipient. These comments were then brought to class discussion by the teacher so the feedback could be processed in a safe and constructive environment.
While the review scores students gave contributed to overall individual grades in the course, the teacher was able to use manual grade adjustments to correct any discrepancies, making use of the 'detect outliers' feature to quickly check whether any students had given or received abnormally high or low scores.
The instructor manually adjusted students' scores as necessary to make sure all students received a fair and representative final grade. Students could view their grading breakdown in Peer Grader in Canvas.
For the instructors:
Before using FeedbackFruits, the instructor relied on a 'conglomeration of Google Forms and spreadsheets' to keep track of the feedback of hundreds of students, requiring manual data entry and sharing of grades afterwards. With Group Member Evaluation, the feedback process was centralised, and allocations and grading could be automated, making it easier to scale up the process.
Besides motivating students to complete the activity, the instructor benefitted from being able to see each student's progress, "so there could be no tricks". For example, it was now easy to identify if certain students had not started the activity or made any progress with reviews, so appropriate measures could be taken.
For the students:
By holding back feedback comments, the instructor ensured that students gave reviews which were 'honest and precise', without feeling they had to dilute their feedback in case it was negatively perceived. As feedback comments were discussed in class, students still benefited from them, while the instructor could safeguard the process and ensure an amiable environment for this to take place.
While progressing through the Group Member Evaluation activities, students could always see their progress percentage. This helped both to motivate completion by highlighting any remaining tasks, and to provide transparency on how grading was weighted.
"Group work is not always easy - but in the end it's proven that group work... works: no matter where you go in life you'll have group work. That's how we learn - we communicate with each other." - Christina Wesolek, University of Delaware
Clear instructions were given to students in various places throughout the course. As well as specific instructions within the Group Member Evaluation activity, additional announcements were made at key points to remind students of remaining tasks and deadlines.
In the activity instructions, for example, the instructor detailed how feedback should be given according to the rubric, how anonymity would be applied, that feedback comments would be held back, and how the grade weighting worked. This gave transparency to students and set clear expectations.
To ensure fair grading, the instructor used 'detect outliers' to get a quick overview of the scores students had given themselves and each other, highlighting any cases where, for example, students had scored themselves very highly and others poorly. With manual grade adjustments, the instructor maintained the final say in how students' overall grades were distributed.
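The source does not document how the 'detect outliers' feature works internally. As an illustration only, a simple leave-one-out check can flag reviewers whose scores deviate strongly from the rest of their group; the function name, the margin parameter, and the approach below are assumptions, not FeedbackFruits' actual algorithm.

```python
# Illustrative sketch of outlier flagging on peer review scores.
# This is NOT FeedbackFruits' documented algorithm; it is a simple
# leave-one-out comparison chosen for clarity.

from statistics import mean

def flag_outliers(scores: dict, margin: float = 1.0) -> list:
    """Flag reviewers whose score differs from the mean of the
    other reviewers' scores by more than `margin` points (on a
    1-4 rubric scale)."""
    flagged = []
    for name, score in scores.items():
        others = [s for n, s in scores.items() if n != name]
        if others and abs(score - mean(others)) > margin:
            flagged.append(name)
    return flagged
```

For example, in a group where three reviewers give scores around 3.5 and one gives 1.0, the low score is flagged for the instructor to inspect before applying any manual grade adjustment.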
Even with the large number of comments produced by the 100 students, the instructor was able to review everything in the 'read received feedback' step within the activity. While not necessarily scrutinising each and every comment, this overview proved effective in highlighting any outstanding remarks or recurring themes, which could then be addressed later in class.