Institution-wide scalability and transparency in peer assessments at Deakin University

Dan Hasan | October 14, 2021
Domain: STEM
Class size: 7 - 700

Context

The problem: The need for scalable, flexible tooling to support varied and widespread course designs

Instructors at Deakin University needed a suitable tool to facilitate group feedback in roughly 40 units across multiple STEM schools, reaching around 6,000 students per year. Classes ranged from small project groups of fewer than 10 students to large courses of 700, spanning first- to third-year bachelor's programs as well as master's programs.

The solution: FeedbackFruits Group Member Evaluation

With such varied and widespread course designs and pedagogical approaches came the need for scalable, flexible tooling that supported elements such as reusable rubrics and group- and grade-synchronised LMS integration. Instructors had previously reported that group projects, and team assessments in particular, had not been fair and had suffered from free-riding. Group Member Evaluation provided a transparent, accessible platform for both students and instructors, streamlining peer assessments and bringing visibility to each stage of the process.

Constructive alignment

Learning objectives

Learning objectives varied between subjects, but every unit shared Deakin University's Graduate Learning Outcome "Teamwork" (GLO7):

  • working and learning with others from different disciplines and backgrounds

Learning activities

A variety of different learning methodologies such as team-based learning, project-based learning, and design-based education, were employed across the different schools and units.

The use of Group Member Evaluation: 

The common usage of Group Member Evaluation centred around the evaluation of peers' skills in group work settings, with certain features being utilised:

  • The inclusion of the Group Contribution Grading feature, whereby each student's grade reflects their individual contribution to the group deliverable. The instructor could verify this group contribution factor before it was factored into the final grade for the activity.
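The contribution-factor mechanism described above can be sketched in code. FeedbackFruits does not publish its exact formula, so the function names, the rating scale, and the 1.1 cap below are assumptions for illustration; the general pattern (scaling a shared group grade by a peer-rating-derived factor, with instructor verification before release) is what the feature describes.

```python
# Hypothetical sketch of group-contribution grading.
# Assumption: peers rate each other on a shared numeric scale,
# and the instructor reviews the factors before grades are final.

def contribution_factors(peer_ratings):
    """peer_ratings maps each student to the ratings they received.
    Returns each student's average rating divided by the team mean,
    so a factor of 1.0 means 'average contribution to the group'."""
    averages = {s: sum(r) / len(r) for s, r in peer_ratings.items()}
    team_mean = sum(averages.values()) / len(averages)
    return {s: avg / team_mean for s, avg in averages.items()}

def individual_grade(group_grade, factor, cap=1.1):
    """Scale the shared group grade by the contribution factor,
    capped (an assumed policy) so no grade exceeds the maximum."""
    return min(group_grade * factor, group_grade * cap, 100)

# Example: a three-person team with one strong and one weak contributor.
ratings = {"alice": [5, 4, 5], "bob": [3, 3, 4], "cara": [4, 4, 4]}
factors = contribution_factors(ratings)
```

In this sketch a free-rider's low peer ratings pull their factor below 1.0, lowering their share of the group grade, which mirrors how the feature surfaces unequal contributions for the instructor to address.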

Example activities: 

Team-based learning activity: 

  • Students from the engineering school worked in a team-based learning setup to design and build a bridge, with the guidance of an industry expert.
  • After the design and implementation phases, student teams used Group Member Evaluation to reflect on what went well and what could have been improved, both as a self-evaluation and in the form of feedback comments to group members.

Before-and-after task: 

  • Teachers assigned 27 self-assessment questions at the start and end of each course asking students to reflect on aspects such as teamwork-readiness, contribution, and confidence.
  • These before-and-after tasks, along with the formative and summative evaluations and a final self-reflection, formed a learning trajectory which encouraged self-awareness and self-regulation throughout all stages of each course.

Learning activities based on Bloom's taxonomy are mainly at the level of:

Evaluating

one's own and others' written work according to given criteria

Assessment of learning outcomes

Group Member Evaluation was used formatively for self and peer review of teamwork skills at the mid-point of the team task, as a means of ascertaining students' individual performance and group contributions. A second check took place as a summative assignment at the end of the course, allowing a holistic reflection comparing the start and end of each learning experience. Following the university's assessment policy, group work accounted for no more than 50% of the final grade in any unit.

Notable outcomes

  • The tools provided a means of measuring student engagement through data on activity completion. Across all units, between 95% and 97% of students successfully completed their peer feedback assignments.
  • With increased visibility for teachers, and between students, on performance on and contribution to group work, students had more opportunity for autonomy and self-regulation: "around 10% of students were held accountable by their peers for not meeting the standards".
  • Students' career-readiness skills were supported through these collaborative activities, particularly where received feedback substantiated skill acquisition and development: "the feedback you get from your team - this is evidence".
  • The inclusion of Group Contribution Grading gave instructors insight into group dynamics and individual performance, allowing for visibility into potential conflicts or cases of free-riding which could then be addressed before becoming more problematic.
“What I've learned over the few years being involved - you shouldn't be limited to the current technology, you should always be looking to push that boundary. If you have a need, that need can be met. You just have to voice that to someone like FeedbackFruits so they understand what it is.” - Tiffany Gunning, Team Leader Teaching and Learning Special Projects SEBE, Deakin University

The role of the instructor

  • Explanation videos detailing both the usage of each tool and the motivation for its inclusion in the course were produced by the university and made available to instructors and students at any point during the programme.
  • Data on student performance and engagement was downloaded and compiled for each student involved in Group Member Evaluation activities, “looking particularly at the Group Skill factor”.
  • The instructional design team were responsible for designing activities particular to the needs of each instructor and maintaining a centralised library of rubric criteria and templates.
  • The team at Deakin also utilised a ‘community of practice’ where course design and learning technology insights could be shared, as well as assistance offered for specific activity setup requests.

Added value of technology

Features such as synchronisation of groups and grades with the native LMS, copying existing course designs for new units, and accessing activity templates from centralised libraries made scaling and integrating FeedbackFruits tools simple, even across faculties and schools with distinct needs. It was reported that the Group Member Evaluation tool is being considered for the university's 'Authentic Assessment Project', whereby it would serve as a 'touchpoint' in every program offered in the faculty.

Possible variation

During the initial setups of Group Member Evaluation activities, it was decided that every peer feedback comment would be held and checked by instructors before being released to the intended recipient, due to concerns about the quality and appropriateness of comments. However, after a considerable period no inappropriate remarks had been encountered, so instructors switched to releasing feedback automatically once the deadline had passed, letting students process it sooner.
Providing students with resources about giving and receiving constructive criticism can further help to ensure the smoothest possible feedback practices across learning setups.

