
Promote collaborative learning at the University of Delaware

Dan Hasan | October 3, 2022
Domain: Education

Context

Group work and collaboration skills are essential in every profession, yet throughout their education, many students struggle to perform effectively in group projects. Common reasons include unclear role division, poor communication, and a lack of individual accountability that invites free-riding, all of which feed a stigma around the very idea of group work. Furthermore, reliance on manual data entry, review allocation, and grade distribution can make large-scale feedback activities an arduous and time-consuming task.

In this integrated physics course at the University of Delaware, pedagogical technology has been used to gain transparency into group dynamics and grow accountability among students, all while saving time and workload through automation of aspects of the group evaluation process.


Learning objectives

  • Students develop skills necessary for peer evaluation such as interpersonal skills, efficient communication, and evidence-based practices.
  • Students can demonstrate listening skills, punctuality, and dependability in group work.

Learning activities

Throughout the course, students: 

  • worked together in groups of 4-6 for problem-based learning (PBL) work and semester projects, and in groups of 2-3 for lab work
  • completed evaluations of their own and their peers' performance. These anonymous peer evaluations were a space to give feedback on all group dynamics related to PBL, semester projects, lab work, and any other group topics.

The use of Group Member Evaluation: 

Students were automatically assigned to review everyone within their 4-to-6-member group anonymously. For these reviews, students gave quantitative feedback according to a collaboration rubric containing seven criteria across two categories: 'contributions to teamwork' and 'interaction with team members'.

Four levels ('needs development' to 'exemplary') were used, and thorough descriptions for each level and criterion were provided to guide students towards leaving accurate ratings.
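For illustration only, here is a minimal sketch of how such a rubric could be represented. The two categories and the four-level scale come from the case study; the individual criterion names and the two middle level labels are placeholders, since the source does not list them.

```python
# Hypothetical sketch of the collaboration rubric's shape: two categories and
# seven criteria in total, each rated on the same four-level scale.
# Criterion names and the two middle level labels are placeholders.

LEVELS = ["Needs development", "Developing", "Proficient", "Exemplary"]

RUBRIC = {
    "Contributions to teamwork": [      # category 1 (from the case study)
        "Completes assigned tasks",     # placeholder criterion
        "Contributes ideas to the group",
        "Meets agreed deadlines",
        "Quality of contributed work",
    ],
    "Interaction with team members": [  # category 2 (from the case study)
        "Listens to teammates",
        "Communicates clearly",
        "Treats team members respectfully",
    ],
}

def describe(rubric: dict[str, list[str]]) -> None:
    """Print each criterion with the shared four-level rating scale."""
    for category, criteria in rubric.items():
        print(category)
        for criterion in criteria:
            print(f"  - {criterion}: {LEVELS[0]} ... {LEVELS[-1]}")

if __name__ == "__main__":
    describe(RUBRIC)
```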

Students were also able to leave comments as part of their review, but this feedback was held by the teacher rather than being automatically sent to the recipient. These comments were then brought to class discussion by the teacher so the feedback could be processed in a safe and constructive environment. 



While students' given review scores contributed to overall individual grades in the course, the teacher was able to use manual grade adjustments to correct any discrepancies in these grades, making use of the 'detect outliers' feature to quickly review whether any students had given or received abnormally high or low scores.
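The case study does not describe how 'detect outliers' works internally, so the following is only a rough sketch of the kind of flagging such a feature might perform, here as a simple z-score check over each student's average given scores (the same check could be run over received scores). The function name and threshold are hypothetical.

```python
from statistics import mean, stdev

def flag_outliers(avg_by_student: dict[str, float], z_threshold: float = 1.0) -> list[str]:
    """Flag students whose average score (given or received) deviates strongly
    from the class-wide mean. A plain z-score rule for illustration; the actual
    'detect outliers' logic in the tool is not documented here and may differ."""
    overall = mean(avg_by_student.values())
    spread = stdev(avg_by_student.values())
    if spread == 0:
        return []
    return [s for s, avg in avg_by_student.items() if abs(avg - overall) / spread > z_threshold]

# Average rating (1-4 rubric scale) that each group member *gave* to teammates:
given_averages = {
    "student_a": 4.0,   # rated every teammate 'exemplary'
    "student_b": 2.75,
    "student_c": 2.75,
    "student_d": 1.0,   # rated every teammate 'needs development'
}
print(flag_outliers(given_averages))  # ['student_a', 'student_d']
```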

Learning activities in terms of Bloom’s Taxonomy were at the levels of:

  • Evaluating: peers' performance in a group project according to a given rubric

Assessment of learning outcomes

  • FeedbackFruits activities made up 25% of the total grade.
  • 60% of this came from completing the evaluations.
  • The remaining 40% was derived from the average score from self and group evaluations.

The instructor manually adjusted students' scores as necessary to make sure all students received a fair and representative final grade. Students could view their grading breakdown in Peer Grader in Canvas.
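As a worked illustration of the weighting above (completion is therefore worth 15% of the total course grade and the average score 10%), here is a small sketch, assuming completion is tracked as a 0-1 fraction and rubric scores run from 1 to 4; the function and variable names are hypothetical.

```python
# Sketch of the grade weighting described above: FeedbackFruits activities are
# worth 25% of the course grade, of which 60% rewards completing the evaluations
# and 40% reflects the average self/peer evaluation score.
# Assumptions: completion is a 0-1 fraction; rubric scores run from 1 to 4.

FF_WEIGHT = 0.25          # share of the total course grade
COMPLETION_SHARE = 0.60   # portion of FF_WEIGHT earned by completing evaluations
SCORE_SHARE = 0.40        # portion of FF_WEIGHT from the average self/peer score

def feedbackfruits_points(completion: float, avg_score: float, max_score: float = 4.0) -> float:
    """Return the contribution to the total course grade on a 0-100 scale."""
    completion_part = COMPLETION_SHARE * completion
    score_part = SCORE_SHARE * (avg_score / max_score)
    return 100 * FF_WEIGHT * (completion_part + score_part)

# A student who completed all evaluations and averaged 3.5/4 across self and peer reviews:
print(round(feedbackfruits_points(1.0, 3.5), 2))  # 23.75 of the 25 available points
```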

Notable outcomes

For the instructors: 

Before using FeedbackFruits, the instructor relied on a 'conglomeration of Google Forms and spreadsheets' to keep track of feedback from hundreds of students, requiring manual data entry and sharing grades afterwards. With Group Member Evaluation, the feedback process was centralised, and review allocation and grading could be automated, making it easier to scale up the process.

As well as motivating completion for the students, the instructor benefitted from being able to see each student's completion of the activity, "so there could be no tricks". For example, it was now easy to identify if certain students had not started the activity or made any progress with their reviews, so appropriate measures could be taken.

For the students: 

Holding back the feedback comments ensured that students gave reviews which were 'honest and precise', without feeling they had to dilute their feedback for fear of it being negatively perceived. As the comments were discussed in class, students still benefited from them, while the instructor could safeguard the process and ensure an amiable environment for this to take place.

While progressing through the Group Member Evaluation activities, students could always see their progress percentage. This helped both to motivate completion by highlighting any remaining tasks and to provide transparency on how grading was weighted.

"Group work is not always easy - but in the end it's proven that group work... works: no matter where you go in life you'll have group work. That's how we learn - we communicate with each other." - Christina Wesolek, University of Delaware

The role of the instructor

Clear instructions were given to students in several places. In addition to the specific instructions within the Group Member Evaluation activity, reminders and announcements were posted at points throughout the course about remaining tasks and deadlines.

In the activity instructions, for example, the instructor detailed how feedback should be given according to the rubric, how anonymity would be applied, that feedback comments would be held back, and how the grade weighting worked. This gave students transparency and ensured clear expectations.

To ensure fair grading, the instructor used 'detect outliers' to quickly review the scores students had given themselves and each other, highlighting any cases where, for example, students had scored themselves very highly and others poorly. With manual grade adjustments, the instructor maintained the final say in how students' overall grades were distributed.

Even with the large number of comments produced by the 100 students, the instructor was able to review everything in the 'read received feedback' step within the activity. While not necessarily scrutinising each and every comment, this overview proved effective in highlighting any outstanding remarks or recurring themes, which could then be addressed later in class.



