On Friday the 10th of July, FeedbackFruits had the pleasure of hosting a webinar on Peer Learning for Business Schools. We want to thank everybody who attended for the exciting level of participation and the high-quality questions and exchange of ideas. Our thanks also go to Dylan Fenton and Linda Lee from The Wharton School for their amazing presentations and insights.
During the webinar, experts at The Wharton School presented how they engaged with students in their largest course to date using the Peer Review tool, while IE Business School featured a use case focused on evaluating individual student contributions in group work. This showed how Peer Learning can be applied in business school courses to improve students’ engagement while fostering both the hard and soft skills vital in today’s society.
This article sums up the webinar and its main takeaways in a single place, in case you were not able to attend or are interested in knowing more.
On this page you will find:
- The hands-on experience of The Wharton School with Peer Learning tools
- How IE Business School used the Group Member Evaluation tool to monitor group work and empower self-reflection among students
- Using AI to improve students’ academic writing skills: the new Automated Feedback tool from FeedbackFruits
- Major challenges to be aware of when improving your course design
Moreover, you can also find:
- A list of the Q&A that happened during the webinar
- A link to the videos of the entire webinar, divided by topic
- Ewoud de Kok, CEO of FeedbackFruits, talking about the future of education
The hands-on experience of The Wharton School with Peer Learning tools
The Wharton School’s courses are heavily focused on group work: for them, collaboration in class is crucial, as is being able to keep track of and evaluate what happens among students when they work together.
At first, Wharton used simple ways to do so, such as paper evaluations or third-party software (such as survey platforms). However, they were “searching for something that was up to date, easily integrated in Canvas and able to use pre-existing groups, with a better user interface, and that could allow for a seamless use by students,” said Linda Lee.
Wharton started piloting the Group Member Evaluation tool in Spring 2020, with hundreds of students and around 20 faculty members. “It was a big success,” says Linda Lee, “and it opened the door to explore other tools”.
The biggest test, however, arrived with COVID-19, this time with the Peer Review tool.
“Many field trip courses were canceled, and we wanted a replacement. We launched a new course in March. Fully remote.” Linda Lee was the lead instructional designer for the course, which saw over 2,000 students and 700 auditors register.
“It was the largest class ever taught at Wharton, with weekly 3-hour synchronous class meetings attended live by around 1,200 students on average”. The course was led by the Dean of Wharton, Geoffrey Garrett.
The course was focused on a team paper peer review. Wharton needed a peer review tool that allowed for multiple due dates, automatic assignment of reviewers, and the possibility to create milestones and rubrics customized to each faculty member’s needs. “FeedbackFruits’ tool allowed us to do all this while also making it possible to export the criteria used to facilitate grading, and we did it for as many as 500 groups!”
Wharton created 3 Peer Reviews with different questions and scales. “The most delightful thing,” adds Dylan Fenton, “was that we created these assignments once, and then reused the configuration for the second and third assignments.”
Setbacks? “Not so many,” says Linda Lee. “For the undergrad group (1,100 students) we noticed that loading times were longer than usual, but student support was kept to a minimum and the FeedbackFruits support team always came to the rescue very quickly!”
In the peer feedback activity, students had to evaluate the quality of a research paper and the argument and conclusion the group made. The Peer Review tool made it possible to set different grading systems, from scales to open-ended feedback, to create peer reviews tailored to the class size and the different instructors’ needs.
How IE Business School used the Group Member Evaluation tool to monitor group work and empower self-reflection among students
Speaker: Ananda Verheijen (Head of Teacher Relations, FeedbackFruits)
IE Business School decided to deploy the Group Member Evaluation (GME) tool in order to take a step from a past where there was no dedicated technology giving insights on group collaboration processes.
The goals of IE Business School in using the GME tool were to eliminate free-riding (where a student in a group lets the others do the work and then gets the same grade as everyone else) and to gain insight into the students’ learning process.
The school used the tool in different courses of their MBA, in a smaller setting than that of The Wharton School: a total of 27 students in 4 groups of 6-7 members each.
Students worked together on a group project and had to evaluate their group members at the end of the course. The scores they received affected their final grades.
IE Business School wanted their students to take more responsibility for their contributions while also increasing their engagement throughout the course.
The experience was an overall success. The students were able to use the tool with minimal instruction and without problems. No complaints about free-riding were recorded. At the same time, students reported feeling more in control of the group dynamics, while the ability to give feedback anonymously greatly reduced conflicts between them.
Student engagement with the tool was more than satisfactory: students engaged with their coursemates by writing comments about how others in the group behaved, but also self-reflected on their own behaviour and ratings.
Using AI to improve students’ academic writing skills: the new Automated Feedback tool from FeedbackFruits
Speaker: Jan Hein Gooszen (Technology Enhanced Learning, FeedbackFruits)
To create the tool, FeedbackFruits is “using data directly from research on which criteria are the best to assess quality of written work of students,” says Jan Hein Gooszen. “Some of these criteria, we noticed, could be fully automated, while others might work better after teachers’ inputs”.
The Automated Feedback tool thus uses criteria prioritized by scientific research to give students valuable feedback that would otherwise require a large amount of time, especially in big classes. Automated Feedback will take its place next to the two traditional feedback types: that of students (peer feedback) and that of teachers (expert feedback).
Automated Feedback sits at the point in the feedback process where students’ personal attention to feedback is highest (the beginning) but where the level of professionalism is lowest. The goal here is indeed to increase students’ academic writing skills while decreasing teachers’ review workload by avoiding repetitive feedback.
Indeed, Automated Feedback will work on “lower order skills”, checking things such as the paper structure (presence of sections and an index, word count), language issues (abbreviations, passive voice, etc.), layout settings, references, tables, and figures.
Easily integrated into the LMS as an external tool, Automated Feedback takes care of repetitive feedback with a just-in-time approach to students’ learning, right while they are writing their paper. In this way, precious time is saved for teachers, who can focus more on feedback on higher order skills, such as critical thinking or work cohesion.
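To give a feel for what such “lower order” checks can look like, here is a minimal, hypothetical sketch in Python. It is not FeedbackFruits’ actual implementation; the function name, thresholds, and regex heuristics are all our own illustrative assumptions.

```python
import re

def lower_order_checks(text, min_words=500, max_words=3000):
    """Run simple, illustrative 'lower order' checks on a draft paper.

    Hypothetical sketch of automated writing feedback; not
    FeedbackFruits' actual implementation.
    """
    feedback = []

    # Word count: flag drafts outside the expected range
    words = re.findall(r"\b\w+\b", text)
    if not (min_words <= len(words) <= max_words):
        feedback.append(
            f"Word count is {len(words)}; expected between "
            f"{min_words} and {max_words}."
        )

    # Abbreviations: all-caps tokens never introduced in parentheses
    for abbr in sorted(set(re.findall(r"\b[A-Z]{2,}\b", text))):
        if f"({abbr})" not in text:
            feedback.append(f"Abbreviation '{abbr}' is never written out in full.")

    # Naive passive-voice heuristic: form of 'to be' + past participle
    for match in re.finditer(r"\b(is|are|was|were|been|being)\s+\w+ed\b", text):
        feedback.append(f"Possible passive voice: '{match.group(0)}'.")

    return feedback
```

Real tools use far more robust linguistic analysis, but the principle is the same: repetitive, rule-like feedback is generated instantly for the student, leaving teachers free to comment on higher order qualities.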
Major challenges to be aware of when improving your course design
As pointed out during the Australian Inspiration Day webinar, students should always be told about the benefits of employing a specific tool; beyond that, other interesting points came up during this webinar.
Among them, instructional designers should keep in mind that deploying an online tool in a class with a large number of students - as Wharton did - may lead to some loading issues, both for students and for teachers downloading the data for grading. However, Wharton’s experience demonstrated that these issues are minimal even with thousands of users.
For any issue or problem, moreover, our support team is always there for you!
Below are some of the most interesting questions asked during the webinar, with answers. If you have any further questions, do not hesitate to reach out to us!
Q: Were the students surveyed after the course about their experience with FeedbackFruits Peer Review?
A: No, in these use cases students did not know a new tool was being piloted. The lack of issues, however, was already positive feedback in itself.
Q: Is Peer Review a tool meant only for student peer assessment within groups, as opposed, for example, to student peer formative feedback?
A: Peer Review can be used within groups or across the whole class or section.
Q: Is it possible for students to submit their own content in multiple kinds of media for peer review?
A: This is possible. Deliverables such as video material, documents, audio files, etc. can be delivered as content by students.
(Asked to Wharton School of Business’ speakers)
Q: How might the Peer Review integration in Canvas impact the overall engagement of students and faculty in the course?
A: We think this tool helped increase peer-to-peer engagement in one of Wharton’s first online classes, where it is sometimes harder for students to connect. It also allowed faculty to have insights into how students were working with one another, which is also sometimes difficult in a remote modality.
Q: Is Automated Feedback better suited to academic and soft skills, or can it also be used for hard skills (e.g. STEM subjects such as mathematics or physics)?
A: Automated Feedback can currently be used only for subjects that require academic writing. However, it may be developed more comprehensively in the future.
Schedule a demo today.