Last week, on November 24, FeedbackFruits hosted its third webinar with Microsoft.
During the webinar, Kaja Sinead Selvik (Enterprise Channel Manager at Microsoft), John David Baird (Educational Developer at Reykjavik University), and Helena Coll Sanchez (International Partner Success Manager at FeedbackFruits) discussed how FeedbackFruits tools enhance learning in Canvas and Microsoft Teams, drawing on data insights and a case study. You can find the complete recordings and Q&A here.
For those who couldn’t attend, this article will go through the key takeaways that you, as an education expert, can’t afford to miss!
You can also rewatch the webinar, now available on our website and YouTube.
Since 2020, FeedbackFruits has been trusted by Microsoft to provide innovative teaching tools to enhance engagement and interaction within the Teams environment.
According to Emily Glastra, Director Public Sector, Microsoft Netherlands:
“By connecting and integrating FeedbackFruits with Microsoft Teams for Education, we ensure a holistic approach within higher education and prepare students for tomorrow’s job market."
For those who’ve never heard of FeedbackFruits, you might wonder who we are and what exactly we provide. FeedbackFruits was started by Ewoud de Kok with a mission: helping educators make every course engaging. To fulfil this goal, we developed and introduced a complete, LMS-integrated pedagogical tool suite that supports varied course designs and learning activities. Furthermore, we collaborate with top educational institutions worldwide to constantly upgrade our tools with new features, and we add new tools to support more pedagogical approaches.
Check out our Product updates column to discover our latest tools and features.
Since its beginning in 2013, FeedbackFruits has been proud to support over 100 institutions worldwide in realising their pedagogical visions. Especially in the Nordics, our tools have been trusted to help instructors foster authentic assessment for learning, peer feedback, group work and lifelong skills.
For institutions in the Nordic region, the pedagogical model has mostly centred on:
This pedagogical model is something that FeedbackFruits “have been greatly inspired by and looked up to”, remarked Helena, our International Partner Success Manager.
So how do our tools play a part in fostering these pedagogical values? To measure the impact of FeedbackFruits, we conducted student surveys during our pilots. In the Nordic region, a group of 65 students rated our tools 4.6/5 for usefulness. Furthermore, they expressed great enthusiasm for the opportunities for self-paced learning, for continuous interaction and collaboration with peers and instructors, and for interactivity with the course materials.
“This tool provided a better overview of students’ performance and progress over the duration of the course” - acknowledged an instructor when commenting on the students’ survey results.
At the international level, the evaluation results shared by the 7 participating institutions point to a major impact. Up to 82% of students said their final work was critically enhanced by participating in FeedbackFruits activities. Furthermore, 80% of students reported that taking part in FeedbackFruits group activities helped them become better collaborators, increased their participation, and enhanced group dynamics. FeedbackFruits peer feedback activities also allowed students to develop their critical thinking as they practiced giving constructive feedback, and to maintain a growth mindset when processing peer comments, with 71% of respondents noting an improvement in their feedback skills.
The data provides quantitative evidence of FeedbackFruits’ impact on student performance. However, to what extent have the tools influenced the teaching and learning process?
John Baird, Educational Developer at Reykjavik University, attempted to answer this question by sharing his experience of using FeedbackFruits Group Member Evaluation (GME) to facilitate peer assessment in group work, and how he measured the impact of the peer assessment experience on student engagement and learning outcomes.
“When we look at the literature, we see a lot of value in peer and self assessment in a group work context, both from the learning and practical perspectives,” emphasized John, explaining that self and peer assessment have long been utilized at Reykjavik.
From the learning point of view, peer assessment is considered an effective strategy for addressing free-riding, ensuring fair and accurate assessment of individual performance, and identifying potential conflicts within groups.
During the Spring 2021 semester, the Reykjavik team piloted the FeedbackFruits Group Member Evaluation tool across three courses. Each course covered a different subject (Law, Psychology, and Business), with a different set-up and usage of Group Member Evaluation. A summary of the course setups can be found below:
As the table shows, in course A the tool helped instructors evaluate students’ group performance when participating in in-class discussions. The evaluation rubric focused on a few items, namely “Why are you there?”, “What did you prepare for?”, and “Participation”. For course B, the instructor used Group Member Evaluation for the group evaluation component of a semester-long project, with a more extensive rubric. Course C, unlike the other two, was a shorter course, with Group Member Evaluation being used in four group assignments to let group members evaluate each other at the end of the course.
Not only did the tool allow for setting rubrics and criteria for peer and group member evaluation, it also offered instructors several other options to modify and enhance the assessment activity.
The first functionality John mentioned was the seamless Canvas integration, allowing for automatic synchronisation of course groups from the native LMS. This, according to John, “is obviously extremely useful”.
Other features that John and his team utilized were:
For further information on these features, check out this article.
Once the three courses ended, the instructors gathered feedback from students on their experience with self/peer assessment and Group Member Evaluation. The survey items were adapted from Elliott and Higgins (2005). Overall, the responses revealed that the implementation in course B was more successful than in courses A and C.
This result led John to ask: “What is the reason for such a difference among the courses?”
To answer this, John and his team decided to look into the relationships among the survey items using factor analysis. Factor analysis is based on the idea that behind a collection of observed variables there is a smaller set of underlying variables, called factors, which can explain the interrelationships among the observed variables - in this case, the survey items.
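To make this concrete, here is a minimal sketch of what such an analysis could look like in Python, using scikit-learn’s FactorAnalysis on a hypothetical table of Likert-scale survey responses. The file name, the assumption that every column is a numeric survey item, and the choice of scikit-learn are all illustrative; this is not the Reykjavik team’s actual code.

```python
# A minimal, illustrative factor analysis on hypothetical survey data.
# Assumes a CSV where each row is a student and each column is a survey
# item scored on a Likert scale; the file name is made up for this sketch.
import pandas as pd
from sklearn.decomposition import FactorAnalysis

responses = pd.read_csv("survey_responses.csv")  # hypothetical file

# Fit a 5-factor model, mirroring the five factors described below.
fa = FactorAnalysis(n_components=5, random_state=0)
fa.fit(responses)

# Loadings show how strongly each survey item relates to each factor;
# items that load heavily on the same factor are grouped together.
loadings = pd.DataFrame(
    fa.components_.T,
    index=responses.columns,
    columns=[f"Factor {i + 1}" for i in range(5)],
)
print(loadings.round(2))
```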
The first thing the team did was pool all the survey data from the three courses. From this, the instructors were able to identify the items that were related and group them into five distinct factors:
The correlation analysis among these five factors revealed notable findings:
First of all, there was a positive relationship between perceived learning value, satisfaction with the practical aspects, and perception of fairness. This means that as perceived learning value increased, “so does motivation, satisfaction with practical aspects, and perception of fairness”. John then elaborated on this finding:
“The reason why students are motivated to engage in self and peer assessment, even in the future, is that they see an added learning value in terms of developing group work skills and effective team work behavior.”
Second, satisfaction with the practical aspects had a significant influence on motivation and perception of fairness. In other words, if students are positive about the practical factors of the course (the set-up, instructions, support, etc.), they tend to be more motivated to engage in the learning process. This also explains why course B received higher satisfaction levels: according to John, this course had a teaching assistant who helped monitor the implementation and results of the peer and self assessment activity. Also, the rubric of course B could be used as a template for courses A and C, and for future courses.
On the other hand, previous experience didn’t show any correlation with the other factors. Quite often, a lack of prior knowledge or training is considered a key barrier to students’ performance in self/peer assessment. The situation was different at Reykjavik, probably because the instructors offered students “multiple opportunities to engage in self and peer assessment”, as John commented. Furthermore, they also “provided students with a dry-run, a practice opportunity when introducing the group work assignment.”
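For readers who want to see the mechanics, the correlation step is straightforward once factor scores are available. Continuing the hypothetical sketch above (the factor labels simply mirror the five factors discussed in this section; the code is illustrative, not the team’s actual analysis):

```python
# Continuing the illustrative sketch: score each student on the five
# factors, then compute the pairwise correlations between those scores.
scores = pd.DataFrame(
    fa.transform(responses),
    columns=["Learning value", "Motivation", "Practical aspects",
             "Fairness", "Previous experience"],
)

# A matrix like this is what surfaces patterns such as "perceived learning
# value rises together with motivation and fairness", or a factor like
# previous experience showing no correlation with the rest.
print(scores.corr().round(2))
```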
Besides the survey questions, John asked instructors and students of the three courses to reflect on their experience of using Group Member Evaluation for the self and peer assessment activities. Despite some technical issues, the comments received were positive, as both teachers and learners were satisfied with the tool.
“The tool makes it possible to implement the group member evaluation factor, and also allows me to follow all groups and provide grades.” - noted the instructor from course A. In addition, the instructor remarked that Group Member Evaluation contributed to an increase in engagement. John then emphasized: “Previously you’d encourage students to come prepared, say how important it was, but now there is actually a way of documenting the process and making students alert.”
What the instructor liked most about Group Member Evaluation is that it formalized the process around preparation for and participation in class, thus making it feasible to grade group contributions.
Similarly, the instructors of course B showed great satisfaction with the tool. “They found [the tool] really useful in terms of spotting group dynamics issues, presenting a clear process once explained properly, helping to lighten the grading load, and keeping everyone on their toes, both teachers and students.”
In terms of the tool features and interface, the fact that the entire group evaluation process happened in real time made it “super transparent, easy to flag issues and follow up on the learners.”
Most importantly, the teachers acknowledged that using the tool encouraged both the students and themselves to be more mindful about what good group work is and what goals they are working towards.
All instructors agreed that they would definitely use Group Member Evaluation for their future group assignments, and they also recommended the tool to other colleagues.
John then concluded his presentation with a number of lessons learned from facilitating the self and peer assessment activities:
1. Pay attention to the practical aspects in terms of:
a. What? (Whether the task design reflects or promotes teamwork);
b. How? (How instructors evaluate students’ performance and the tools needed for the evaluation);
c. When? (The deadlines of the assignment);
d. Who? (How the roles and responsibilities are distributed among teachers, TAs, and students);
e. Why? (The value of self and peer assessment in relation to the learning outcomes)
2. Videos demonstrating how to use the tools are extremely useful for helping students get familiar with Group Member Evaluation. Also, being able to observe the activity from the students’ perspective is invaluable for teachers.
3. When designing the rubric, instructors need to make sure students have opportunities to explore and apply the success criteria, which also opens the door to students co-designing the rubric.
4. Push course reminders at the start of the course to inform students of the requirements and timeline for the assignments.
5. Impose hard deadlines and avoid deadline extensions, since extensions make it troublesome for groups to complete their assignments.
6. Provide students with technical support and advice so they can better understand and use the tool.
7. Closely monitor the data coming from the group evaluation, both ratings and comments. You can do this in Group Member Evaluation, since the tool provides data insights into students’ progress.
8. Maintain regular check-ins with your students to get feedback on their experience and progress.
The story of John and the Reykjavik team certainly sparked a lot of interest, as we received quite a few queries and remarks from the audience. Below you can find a list of questions and answers from John.
Question 1: You showed us that a fair proportion of students (from course B) did not want to use self and peer assessment: were the students asked to elaborate on their scoring? I am curious as to whether it is about the feedback concept or the work required.
John’s answer: The short answer is no, they were not asked to elaborate on that rating. In terms of whether it is about the concept or the process, I imagine it is a combination of the two. In that course, they did experience technical issues. There is also the question that I flagged at the end of the course: the students used the tool within the in-class discussion. We have repeated exactly the same design this semester, and we’re going to gather feedback from students on that. It will prove an interesting point of comparison, to see whether students have different experiences.
Question 2: I have a question about the instructor feedback from course A: which elements were too difficult to do on your own in the set-up? Where was the support needed?
John’s answer: I suppose to a certain extent it is not necessarily specific to the Group Member Evaluation tool. I think it is simply about educational technology in general, and it is an issue we all encounter. It is often overwhelming to have to sit down and learn a new piece of technology on top of the LMS, Zoom, Teams, and so forth. It was really just about having somebody to work with who understood how the tool works and its pitfalls. Once it was set up, once they had a chance to explore, they became quite comfortable. Once the data starts coming in, it can certainly be quite overwhelming at first, especially if you’re dealing with a large number of students. So it all comes down to working with the teachers, just to orient them in relation to the data and how to find out what to focus on.
Question 3: Since motivation increases with the perceived learning value, could you elaborate on how to ensure that students understand the value of peer assessment and that their peer feedback is of high quality?
John’s answer: That is a correlation analysis, so the correlation sort of runs both ways. I think the simple answer to this question is to explicitly address it with the students. Take some time at the beginning of the course, when introducing the syllabus and the assessment activity, to talk to students about what their experiences of group work and peer evaluation have been. Get a sense within the room and allow students to share their experiences, whether positive, negative or mixed, regarding self and peer assessment. It is also important to step into the literature a little bit to share with students what we know about the value of peer evaluation in addressing certain challenges. Finally, make use of any opportunities within your institution to gather feedback, as we do here, and to transfer that process to the students.
Question 4: The factor analysis was especially interesting. I have a few questions... Did the evaluations impact students’ grades? If so, did you get feedback on how students felt about being graded by other students?
John’s answer: For the first question, yes it did. For the second question, we didn’t ask that specifically, but we did ask whether group members were the people best placed to evaluate the contributions of other group members. We also asked whether students felt that the process was fair. So we didn’t link it specifically to the summative assessment. I think it is reasonable to assume that students were clear on the fact that this was contributing to their final grade. They were clear on what the process-based component was; the bulk of that grade was based on the ratings of their peers.
We were extremely grateful for the great conversation with John and Kaja. It was powerful to see educational experts coming together to share their knowledge and experiences on self/peer assessment and pedagogical technology. We look forward to the next webinar!
You can now rewatch the webinar here. Also, visit other events and webinars from FeedbackFruits to pick up some valuable nuggets for your teaching.
Are you using Teams in your institution? If FeedbackFruits is not yet available within your Teams environment or if you have any other questions, you can contact us or click on the blue support button in the bottom left corner on our website.