In this semester-long project course on data science and artificial intelligence, students worked in groups on research projects spanning a range of subjects within these fields. Following a problem-based learning approach, the groups chose projects from topics provided by third-party clients, and at the end of the course each group delivered a report and a final presentation. The course took place in the third year of the degree program and was split into three phases across the semester.
The instructor incorporated Group Member Evaluation to address the problem of free-riding in group work, to give students a platform for exchanging feedback with one another, and to offer them a chance to improve based on the feedback they received. With a strong focus on group project management, planning, and evaluation throughout the course, it was essential to provide students with a space to complete these feedback tasks while allowing the instructor to maintain oversight of the process.
Students can practice giving and receiving constructive feedback, collaborating and communicating in a group setting, and reflecting on the group process.
Other learning objectives are part of the program but are not related to the peer feedback component.
The course consisted of three phases, the first of which covered the setting-up of the project. At the end of this first phase, lasting seven weeks, groups handed in their project plan and gave a presentation to both the clients who provided the topics and the internal examiners. The second phase, also lasting seven weeks, followed a similar structure, and the third and final phase, lasting three weeks, consisted of a final presentation and evaluation in which students were asked to defend their research directions and the choices they had made. Group Member Evaluation was used twice, in the second and third phases, and a final feedback discussion took place afterwards, where students reflected on what they had learned from the feedback. Group consultations also took place, providing a chance to personally discuss feedback on group processes and performance. The feedback given within the tool was based on three questions: what did the reviewee do well; where could they improve; and what did you especially appreciate in this person? Additional comments were allowed but not mandated. Reviewers could see who they were reviewing, but the ratings were anonymous to the recipient.
Since both the course and the feedback activity were pilots, Group Member Evaluation was a formative component and not graded. However, to meet the attendance and participation requirements, both instances of the activity needed to be completed.
“What made me happy is that some students really had some good ideas about what they could improve on.” - Instructor
Group dynamics can be harder to monitor in online courses than in face-to-face sessions. In addition to providing a digital overview of peers’ ratings and performance, Group Member Evaluation simplifies class management for the instructor, as every student’s self-assessment and feedback comments are available in a single interface.