From peer assessment to teaching excellence at Imperial College London

Dan Hasan | November 22, 2023

Domain: STEM
Class size: 130–150
ABOUT THE INSTITUTION

A university often ranked in the top ten both in Europe and worldwide, Imperial College London was established in 1907 and pursues “excellence in science, engineering, medicine and business.” It is attended by around 23,000 students and employs around 4,500 members of staff.

ABOUT THE INSTRUCTOR(S)

Demetrios Venetsanos is Head of Student Experience and a Principal Teaching Fellow in the Department of Aeronautics at Imperial College London and has extensive experience in teaching engineering at a number of UK universities.

Context

Peer- and self-assessment at Imperial have become focus areas for improving teaching practices and learning outcomes. A tool for peer assessment, Group Member Evaluation, has been used with undergraduate engineering students to improve the efficiency of peer- and self-reviews throughout several group projects. This use case explores how instructor Demetrios Venetsanos experienced the switch to using new technology to aid his course design and delivery.

The problem: Peer assessment and feedback require attention

Imperial College London (ICL) maintains its reputation for excellence in teaching by continually evaluating and improving its education standards. These standards are benchmarked by many UK universities using data from the National Student Survey (NSS), a student feedback census, and the Teaching Excellence Framework (TEF), which “recognise[s] and reward[s] excellent teaching”. When instructor Demetrios Venetsanos and his department reviewed this data, two areas stood out as requiring attention: assessment and feedback, and student experience. This prompted the search for a solution that would integrate assessment into the learning experience.

To this end, FeedbackFruits Group Member Evaluation was used in a number of courses throughout the Department of Aeronautics, including the first-year course “Engineering Practice 1”. A number of assessments within this course involved group work, peer feedback, and student-centered evaluation, and aimed to develop critical reflection skills as well as help students take responsibility for their learning. Both peer- and self-assessment were used as part of an ‘assessment as learning’ strategy. Using Group Member Evaluation supported ICL’s objectives by helping to set up a transparent and accessible peer assessment framework, with detailed and actionable analytics. This led to a very positive experience for Demetrios, who shared with us his data and experiences as part of a webinar on student satisfaction.

The homescreen view inside Group Member Evaluation. Student analytics show at a glance how many students have completed the activity, displaying metrics for the whole class, different groups, and individual students.
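
The figure’s metrics are essentially roll-ups of activity data at class, group, and individual level. Purely as an illustration of the idea, and not FeedbackFruits’ actual implementation, here is a minimal Python sketch assuming a hypothetical CSV export (activity_export.csv with student_id, group, and completed columns):

    import csv
    from collections import defaultdict

    # Hypothetical export: one row per student, with their group and
    # whether they completed the peer-evaluation activity.
    # Columns: student_id, group, completed ("yes"/"no")
    with open("activity_export.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    def completion_rate(subset):
        # Fraction of students marked as having completed the activity
        done = sum(1 for r in subset if r["completed"] == "yes")
        return done / len(subset) if subset else 0.0

    # Whole-class completion rate
    print(f"Class: {completion_rate(rows):.0%} completed")

    # Completion rate per project group
    by_group = defaultdict(list)
    for r in rows:
        by_group[r["group"]].append(r)
    for group, members in sorted(by_group.items()):
        print(f"Group {group}: {completion_rate(members):.0%} completed")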

The solution: Self- and peer assessment as learning strategies

Trial and error is an intrinsic part of learning. For Demetrios Venetsanos, cultivating this mindset among first-year engineering students, who are notoriously hesitant to give feedback, was a crucial challenge, and one that aligned with ICL’s wider institutional goals to improve teaching standards. While areas such as course-level teaching and academic support scored highly in TEF and NSS ratings, even when compared to the sector and the Russell Group (a group of the UK’s leading universities), the student survey identified ‘assessment and feedback’ as falling behind. At the same time, while the overall TEF score was rated ‘gold’, the student experience rating achieved only silver.

An overview of NSS and TEF scores highlighting the two areas (assessment and feedback, and student experience) for which the Department of Aeronautics at ICL sought an integrated solution.

For a highly-ranked university, maintaining an excellent score in every category is paramount, not just to validate reputation, but to ensure a certain standard of education for students.

Demetrios and his department sought a solution based on pedagogy and the tried-and-tested techniques of self-assessment and peer evaluation as ‘assessment as learning’ strategies, referencing Yan and Boud’s (2022) “Conceptualising assessment as learning”.

Benefits of peer and self-assessment

Peer assessment:

  • enables interaction and cooperation within the group
  • positions students as critical assessors
  • improves students’ ability to appraise peers’ work, as well as their autonomy.

Self-assessment:

  • encourages students to take responsibility for their learning
  • develops subject understanding
  • develops critical reflection

As this strategy had the potential to develop critical skills for student development and teamwork, the department saw it as its duty to research a number of technological solutions that could facilitate effective self- and peer assessment.

“Students are trusting their futures in our hands, and that is a huge responsibility… There [was] an indication that there are some things to improve with regard to assessment and feedback. There were a number of tools available, but we selected FeedbackFruits. And that is not a coincidence.”

Why Imperial College London chose FeedbackFruits

So why did Imperial College London choose FeedbackFruits Group Member Evaluation among the many options for peer assessment that were available? According to Demetrios, there were a number of factors that were essential to this decision. Accessibility, for instance, played a major role:

“We saw that it was extremely easy to access. Students prefer to have something that works in a very simple manner.”

Even though most students are digital natives, a streamlined user experience and a modern, intuitive interface make a big difference to the student learning experience, as well as making course design less cumbersome.

“It was the first time we used Group Member Evaluation for this module. From a student perspective, it was easy to access and easy to use. It allowed them to reach a stage where their judgments were well-justified.”

Something as simple as not needing extra programs or windows open to complete learning activities increases the efficiency of learning and reduces the extraneous cognitive load caused by unnecessary software and navigation.

“It sits in a seamless manner on the VLE - no extra hardware, no extra software, which is good news for the university.”

Besides user experience, another key factor reported in the decision to implement FeedbackFruits was the added value of transparency and oversight of class dynamics.

“This tool helped us filter out the students who were interested and were actually engaged and contributing. We did have indicators that there were a percentage of students who did not engage. But with this tool we found it was much easier to identify them, approach them, and find out what was going on.”

Taken from Demetrios’ webinar session, an overview of feedback on the Group Member Evaluation tool from students, teachers, and managers involved in the Engineering Practice 1 summer module.

The outcomes: Enhanced feedback skills and engagement

1) Students focus on quality over quantity

To evaluate the outcomes of using this new technology in Engineering Practice 1, Demetrios and his department conducted a survey that measured engagement and performance through a variety of metrics, such as the percentage of students who had read the assignment instructions and the average number of comments per week. One data point that stood out was the reduction in the number of comments left as the module progressed. This was interpreted as follows:

“This reduction was not because students did not like providing comments, but [instead] they learned how to become more efficient… so instead of providing five comments, it was possible for them to provide three, more focussed comments. This tool helped them improve their efficiency and their performance in a team.”
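
The metric behind this observation is simple to reproduce from raw comment data. As a hedged sketch only (FeedbackFruits’ own analytics pipeline is not documented here), assuming a hypothetical export comments_export.csv with week, author_id, and comment_text columns, the weekly averages could be computed like this:

    import csv
    from collections import defaultdict

    # Hypothetical export: one row per peer-feedback comment.
    # Columns: week, author_id, comment_text
    with open("comments_export.csv", newline="") as f:
        comments = list(csv.DictReader(f))

    # Count comments per student for each week of the module
    counts = defaultdict(lambda: defaultdict(int))  # week -> author -> count
    for c in comments:
        counts[int(c["week"])][c["author_id"]] += 1

    # Average over the students who commented in a given week
    for week in sorted(counts):
        per_student = counts[week].values()
        avg = sum(per_student) / len(per_student)
        print(f"Week {week}: {avg:.1f} comments per active student")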

2) Critical reflection

Another change in student performance was observed throughout the module in terms of self-reflection:

“For self-assessment, students, from a point onwards, became more honest towards themselves. Toward the end of the project, they were more modest and down-to-earth. By writing down their comments for them to read again and again, they were not forgotten. They were there on the system and they could see them at all times. That helped them form a strategy towards becoming more honest and more reflective.”

The number, and nature, of comments also changed over time, with students becoming more efficient in articulating their thoughts as the module progressed.

Weekly data on average number of comments showed a slight but consistent decrease with time as students improved the efficiency of their reflection skills and left “fewer, but more focussed comments”.

3) Create a safe space for voicing opinions

As well as gathering and analysing quantifiable data, Demetrios and his department made a number of observations on student and group interactions.

“It also helped with group communication, because it became easy for students to express themselves. Many times what we observe, especially with year one students, is that those who are very loud, they have no problem. But those who are not loud, do not. Through this tool it was possible for them to express an opinion and start a discussion. Gradually those who were not loud became more confident.”

A benefit for both teachers and students alike was the improvement in learners’ ability to self-assess and reflect on their skill development over time.

“The feedback comments were on the system - it was not possible for students to hide from themselves. It was very helpful in terms of students understanding what they’re doing right, what they’re doing wrong, and in this manner, becoming more independent learners.”

4) Save time in design and facilitation

Ultimately, ICL's integration of Group Member Evaluation aimed to elevate assessment and feedback practices and improve the student experience. The tool fulfilled these objectives without creating an unnecessary amount of extra work for the team, helped by seamless integration and access to a 24-hour support team.

“From a teacher’s perspective, it was very easy to set it up and get it running. It helped us identify individuals and provide support to those who needed it - a better use of our resources. [And] from a manager’s perspective, it was very easy to get technical support, and it was a smooth operation.”

Additional data collated in Group Member Evaluation summarises frequently used keywords in student reviews, as well as visualising student progress to make it easier to gauge students’ engagement with the learning activity.
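
Keyword summaries of this kind are typically plain frequency counts over the comment text. As a sketch only, reusing the hypothetical comments_export.csv from the earlier example (not the tool’s actual method):

    import csv
    import re
    from collections import Counter

    # Same hypothetical comment export as in the earlier sketch
    with open("comments_export.csv", newline="") as f:
        texts = [row["comment_text"] for row in csv.DictReader(f)]

    # A tiny stopword list keeps the sketch self-contained; a real
    # analysis would use a fuller list (e.g. from NLTK).
    STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "in", "is",
                 "it", "was", "for", "on", "with", "this", "that"}

    keywords = Counter()
    for text in texts:
        for token in re.findall(r"[a-z']+", text.lower()):
            if token not in STOPWORDS and len(token) > 2:
                keywords[token] += 1

    # The ten most frequent keywords across all student reviews
    for word, n in keywords.most_common(10):
        print(f"{word}: {n}")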

After analysing the course data, the original scope of implementing Group Member Evaluation expanded. Attention shifted away from the length of time needed to set up the course, and toward the quality of student interactions. Using technology to support a pedagogy of self- and peer-assessment helped enrich the teaching and learning, leading to a more student-centered, meaningful learning experience.

More resources

Watch the webinar, “Nurturing student satisfaction through personalised assessment and feedback” featuring Demetrios Venetsanos of Imperial College London, as well as other instructors from the University of Bath.

Documentation on the TEF, which details the framework and assessment criteria for teaching excellence, based among other things on results from student surveys.

Our Group Member Evaluation tool page overviews the features and functions of this peer assessment tool.
