Foster collaborative learning and student participation in a blended course at Maastricht University

Dan Hasan | April 23, 2021
Domain: STEM
Class size: 50

Context

In this semester-long project on data science and artificial intelligence, students worked in groups on research projects spanning various subjects within these fields. Following a problem-based learning approach, the groups chose projects based on topics provided by third-party clients, and at the end of the course they delivered a report and a final presentation. The course took place in the third year of the degree program and was split into three phases across the semester.

The instructor chose to incorporate Group Member Evaluation in order to address the problem of free-riding in group work, give students a platform where they could give feedback to each other, and also give them a chance to improve based on the feedback they received. With a strong focus on group project management, planning, and evaluation throughout the course, it was imperative to provide students with a space to complete these feedback tasks, while simultaneously allowing the instructor to maintain oversight of the process.

Constructive alignment

Learning objectives

Students can practice giving and receiving constructive feedback, collaborating and communicating in a group setting, and reflecting on the group process.

Other learning objectives are part of the program but are not related to the peer feedback component.

Learning activities

This course consisted of three phases, the first of which covered the setting-up of the project. At the end of this first phase, lasting seven weeks, groups handed in their project plan and gave a presentation to both the clients who provided the topics and the internal examiners. The second phase, also lasting seven weeks, followed a similar structure. The third and final phase, lasting three weeks, consisted of a final presentation and evaluation in which students were asked to defend their research directions and the choices they had made. Group Member Evaluation was used twice, in the second and third phases, and a final feedback discussion took place afterwards, in which students reflected on what they had learned from the feedback. Group consultations also took place, providing a chance to personally discuss feedback on group processes and performance. The feedback given within the tool was based on three questions: what did the reviewee do well; where could they improve; and what did you especially appreciate in this person? Additional comments were allowed but not mandated. Reviewers could see who they were reviewing, but the ratings were anonymous to the recipient.



Learning activities, according to Bloom’s Taxonomy, were mainly at the level of:

  • Analyzing: a research plan and determining a course of action to address the client’s topic
  • Evaluating: the collaboration and communication process throughout the group project
  • Creating: a research project which incorporates skills and knowledge gained throughout the course

Assessment of learning outcomes

Because both the course and the feedback activity were pilots, the Group Member Evaluation was a formative component and was not graded. However, both instances of the activity had to be completed in order to meet the attendance and participation requirements.

Notable outcomes

  • The instructor found the tool clear to use, was pleased with the quality of feedback exchanged in the Group Member Evaluation activity, and was surprised by the openness of some of the groups.
  • It was noted that the first round of feedback, about halfway through the course, saw higher general engagement than the second, which took place at the end.
  • Through the activity, students were able to generate concrete ideas for improvement.
“What made me happy is that some students really had some good ideas about what they could improve on.” - Instructor

The role of the instructor

  • The instructor left detailed instructions inside the tool about how, when, and what sort of feedback should be given, with particular emphasis on the benefit of considerate, constructive feedback.
  • Announcements to all students signalled that the Group Member Evaluation tool would be used and explained its purpose.
  • The instructor checked student activity inside the tool before project meetings, as well as at the end of period 2, to see whether anything in particular needed to be discussed with the co-examiners. In period 3, the data from the second feedback moment was compared with the first to identify any trend.

Added value of technology

Group dynamics can be harder to keep an eye on in online courses than in face-to-face sessions. In addition to providing a digital overview of peers' ratings and performance, Group Member Evaluation simplifies class management for the instructor, since every student's self-assessment and feedback comments are available in one interface.

