This open-enrollment course aims to provide “a new perspective for design and operational decision making at all levels of manufacturing” (1). Students may participate for free; however, to earn a MITx certificate, they must verify their identity and pay the course fee. The course uses blended learning and is mainly targeted towards graduate-level engineers. It runs for (up to) 12 months.
The instructors used open-ended learning activities to sharpen students’ thinking through debate (discussion), peer review, and self-reflection. The Discussion tool allows students to upload their work and comment on each other’s according to a rubric. This aligns with the course goals: inviting collaborative knowledge formation and the assessment of students’ understanding. By using a more student-centred approach, instructors also aimed to cultivate communication and analysis skills by triggering deeper thinking and metacognitive skills.
At the same time, they wished to stimulate more robust student interactions that mimic physical classroom exchanges, making use of the features within the Discussion tool interface.
• Students are able to use critical thinking skills to debate topics with peers, while sharpening argumentative and collaborative skills.
• Students gain experience with the design and operation of optimal supply chain systems.
Students produce a report based on material covered in face-to-face sessions that represents the major concepts they have learned within the topic. Using evaluation criteria provided by the teacher in the instructions, learners carry out a peer review, which provides input for online discussion.
Learners then participate in an open discussion, providing feedback on each other’s reports (in line with a rubric included by the instructor), either by commenting on the entire submission or by using in-line comments to highlight certain parts of it.
Once learners have contributed enough discussion points, the instructor moves to the guidance stage. The instructor reviews the comment activity, guides students to the most commented-on or upvoted discussion points, and invites them to elaborate further on these points, keeping the rubric in mind. In doing this, they initiate further debate and trigger deeper thinking.
Lastly, using the input from both the peer-feedback phase and the discussion phase, students reflect on what they have learned. All of this takes place within the FeedbackFruits tool environment.
• Analyze - key ideas and concepts are addressed using students’ own judgements.
• Evaluate - ideas presented by peers are judged against a rubric for a qualitative assessment.
• Create - procedural systems are designed and operated.
The students are evaluated on their participation in discussions, the quality of written feedback given to their peers, their final report submission, and their written reflection on the feedback they received from peers and the instructor.
• Students reflected on the experience, saying that participating in the activity helped them to understand others’ views on their work and raised their confidence. Peers pointed out design flaws in each other’s reports and also offered advice on how to improve their proposals.
• After multiple runs, there were no reports of destructive behaviour, despite initial concerns from instructors; they observed neither abusive nor excessively positive comments.
• Open discussion implies that everybody can see and read everything. As all submissions and comments were visible to everyone, the students of the program became a self-policing community and reported plagiarism if comments appeared to have been copied from others.
• The instructor incentivised high interaction with the commenting function of the Discussion tool and ended up with many comments, as well as occasional upvotes. In the setup, comments were made mandatory (without them the assignment is not considered complete), while upvotes were optional.
"With this tool, we've taken a big step towards using class interactions as a way of assessing the students." - Matthew Waterman, Instructor
"Initially I was thinking one-dimensionally but while interacting with my fellow colleagues and teacher my perspective changed dramatically." - Matthew Waterman, Instructor
• Firstly, the instructor explains the requirements for report submission and the purpose of interactivity between learners. They also explain the rubric parameters as well as the grading scheme.
• After students have progressed with the activity, the instructor moves on to the guidance stage. Here, the instructor shows students how they themselves would review the submissions.
• The instructor looks at the most upvoted comments and the longest discussion threads, points the students to relevant comments (guidance), and starts a discussion to trigger deeper thinking.
• Instructors are advised to provide clear deadlines for discussions, feedback and final submission.
• The tool provides a widely scalable platform for online discussion and automatic grading.
• In-depth learner engagement analytics are available for the instructor to analyse and gain greater insight into students’ experience with the content.
• Offering these feedback options broadens the assessment opportunities available to instructors and students.
Measuring the number of upvotes on particular comments provides an alternative way of evaluating students’ contributions. If they choose to do this, instructors should make clear to students that they intend to pay attention to upvotes (1). Instructors could also prompt learners to cite, in their reflection, the peers who most positively impacted their learning. This further enhances the collaborative learning process.