Falmouth’s Games Academy delivers studio-style, industry-mirrored teaching across Robotics, Game Development, Computing and AI for Games, Animation, Game Art, and Game Design. Between 200 and 300 students work in multidisciplinary teams of around 8–15 to create live games, apps, and robots using agile workflows and rubrics aligned with learning outcomes.
Matt Watkins, Senior Lecturer and Program Lead for Robotics, coordinates cross-disciplinary projects and assessment across the Academy.
At Falmouth University’s Games Academy, students don’t just study theory; they design, code, and create in teams that mirror real-world game studios. Across degrees in Robotics, Game Development, Computing and AI for Games, Animation, Game Art, and Game Design, more than 300 students collaborate in multidisciplinary teams of 8 to 15. Bringing together art, design, and technology, they produce live, playable, and interactive projects that showcase innovation and teamwork at every stage of development.
The Academy’s mission is to bridge the gap between education and industry. It emphasizes structured, transparent assessment practices that help students develop teamwork, reflective thinking, and real-world collaboration skills, while enabling instructors to manage complex, large-scale projects efficiently and fairly.
As the Academy grew, so did the complexity of its team-based projects. Instructors needed to accurately evaluate individual contributions across large, interdisciplinary teams while ensuring the process was fair, transparent, and scalable.
“We were struggling and looking for a product that would solve the problem we have of gathering students’ reviews of each other. How do they feel about their peers? How can we measure their contribution to teams? How can we understand what they’re all about?”
— Matt Watkins, Senior Lecturer, Falmouth University
Despite the strength of the collaboration, it became increasingly difficult to monitor each student’s commitment, whether coder, designer, or robot builder. Manual spreadsheets were time-consuming, feedback was inconsistent, and students were hesitant to offer honest peer evaluations out of concern for relationships or repercussions.
“We were looking for something to monitor their interactions and their contributions, to monitor their peer assessment.”
— Matt Watkins, Senior Lecturer, Falmouth University
Falmouth’s Games Academy was looking for a solution that would accurately capture individual contributions within large, interdisciplinary teams, make peer evaluation fair, transparent, and scalable, and replace time-consuming manual tracking with consistent, structured feedback.
These goals aligned perfectly with the Academy’s broader educational philosophy: create authentic, collaborative learning experiences that mirror the creative industries.
After exploring several solutions, the Games Academy chose FeedbackFruits’ Feedback & Assessment solution, citing its seamless Moodle integration, scalability, and flexibility in managing complex, team-based workflows.
“We looked and looked, and the only company we found who could facilitate this using Moodle as our portal was FeedbackFruits.”
— Matt Watkins, Senior Lecturer, Falmouth University
The Academy implemented four core activities to bring structure and sustainability to feedback and assessment.
Goal:
Encourage students to provide constructive, rubric-aligned feedback to peers ahead of submission, promoting reflection and improvement.
Learning activity design:
One week before each project deadline, students use Peer Review to evaluate peers’ deliverables using shared rubrics directly linked to course outcomes.
“It’s a checklist opportunity to ensure you haven’t missed anything that could improve your grade.”
— Matt Watkins, Senior Lecturer, Falmouth University

Impact:
Peer Review helped students internalize assessment criteria, refine their final work, and become more reflective about their contributions while providing educators with deeper insights into team collaboration.
Goal:
Guarantee accountability and fairness in large, interdisciplinary team projects.
Learning activity design:
Every 3–4 weeks, teams complete anonymous, rubric-guided evaluations of each member’s collaboration, effort, and professionalism. Staff can monitor engagement trends, spot issues early, and ensure balanced grading.
“It gives us the flexibility to gather what we need, when we need it, and it’s pretty reliable.”
— Matt Watkins, Senior Lecturer, Falmouth University
Impact:
With 200–300 students spread across more than 20 teams, Group Member Evaluation provided a clear, continuous view of team dynamics, replacing manual tracking with transparent, data-informed insights.

Goal:
Provide structured, multi-source evaluation during live project showcases.
Learning activity design:
At Falmouth, students presented their final projects during Demo Days. Instructors used Skill Review to evaluate presentations in real time, scoring each one with adjustable sliders and adding written comments. Feedback was aggregated and instantly available for students to review and reflect on.

Impact:
Students received instant, actionable feedback aligned with professional standards, while educators achieved greater consistency, speed, and confidence in their evaluations.
“We always tie activities to assessment criteria so it never feels arbitrary.”
— Matt Watkins, Senior Lecturer, Falmouth University
Goal:
Save time and ensure consistent, high-quality peer and team-based assessment across modules.
Learning activity design:
Falmouth University’s Games Academy created reusable FeedbackFruits templates for Peer Review, Group Member Evaluation, and Skill Review. These support feedback rounds every three to four weeks in large team projects, pre-submission peer reviews, and fast demo-day assessments. Each template mirrors the module rubric, using sliders for quick contribution ratings and grade-boundary rubrics for formal peer review. Embedded in Moodle, the templates are easily adapted across Robotics, Game Programming, and AI for Games.
Impact:
Assessment setup is now up to 4× faster, replacing spreadsheets and manual data collection. Educators can focus on mentoring and fostering continuous feedback, while students engage earlier with assessment criteria, teamwork, and industry-style collaboration.
The integration of FeedbackFruits’ tools has fundamentally transformed how the Games Academy manages teamwork and feedback:
“FeedbackFruits allows us to mirror industry practice while maintaining fairness and scalability. It’s reliable, flexible, and lets us gather exactly what we need.”
— Matt Watkins, Senior Lecturer, Falmouth University
By embedding structured feedback into every stage, from collaboration to presentation, the Games Academy has cultivated a culture of fairness, reflection, and transparency that scales across hundreds of students.
Initially, students were hesitant to critique their peers. Making peer review a mandatory, graded component encouraged participation and fostered a more honest feedback culture. Anonymous evaluations and instructor oversight further reduced bias.
Faculty noted a significant reduction in administrative workload and an increase in student engagement, as learners began to see feedback not as a requirement, but as a tool for growth.
The Games Academy at Falmouth University continues to enhance its feedback and teamwork practices with FeedbackFruits. These ongoing efforts reflect Falmouth’s commitment to sustainable, authentic assessment, where every feedback moment strengthens learning and prepares students for real-world success.
Through its partnership with FeedbackFruits, Falmouth University’s Games Academy has enhanced its ability to deliver fair, scalable, and reflective teamwork assessment, turning the challenge of large-scale collaboration into an opportunity for authentic learning and continuous improvement.
By combining structured peer feedback, automation, and industry-aligned evaluation, the Academy ensures every student’s voice is heard, every contribution recognized, and every project assessed with fairness and transparency.