From student input to institution-wide adoption at the University of Arizona

Circée Ferrer | December 6, 2024
ABOUT THE INSTITUTION

The University of Arizona, located in Tucson, Arizona, is a premier public research university founded in 1885. Renowned for its innovation and academic excellence, the U of A is home to more than 47,000 students and offers various undergraduate, graduate, and professional programs across diverse disciplines. It is a global leader in space science, environmental studies, and health sciences, consistently ranking among the top public universities in the United States. 

ABOUT THE INSTRUCTOR(S)

Michelle Vonie is an Instructional Designer and QM Coordinator with the University Center for Assessment, Teaching, and Technology. Her research interests include Quality Assurance, student success, retention, and learner experience in online courses. 

Samantha Maxwell is an Instructional Technologist with the University Center for Assessment, Teaching, and Technology at the University of Arizona. At the heart of her work lies a passion for utilizing technology to enhance educational access and the student learning experience.

Context

As online learning expanded, the University of Arizona looked for a peer review solution that could strengthen peer-to-peer interaction, support accessibility needs, and align with Quality Matters (QM) standards. The university evaluated peer review solutions through a student-led process that directly informed campus-wide adoption.

Since then, the partnership has continued to grow. In February 2026, we visited their campus to see the impact firsthand.


How the University of Arizona evaluated peer review solutions with student voice and Quality Matters

Rather than relying solely on faculty or administrative input, the University of Arizona places students at the centre of instructional technology decisions through the Student Advisory Board for Instructional Technology (SABIT).

The board consists of 10 elected undergraduate and graduate students, including student government representatives. The group meets four times per year and plays a formal advisory role to institutional leadership.

“The Student Advisory Board meets with our team regularly to provide invaluable feedback on the technologies being considered.”
Samantha Maxwell, Instructional Technologist

To evaluate peer review solutions, the instructional design team established a four-step evaluation process grounded in Quality Matters standards. Students assessed tools not only on usability, but also on whether they supported the learning design expectations required for high-quality online courses.

The process included:

  • students explored shortlisted tools independently
  • students completed surveys aligned with QM criteria
  • the instructional design team analysed qualitative and quantitative themes
  • findings informed course design guidance, training strategy, and tool selection

“The QM standards were used as the key framework to evaluate the tools that would enter the main evaluation by the Student Advisory Board.”
Michelle Vonie, Instructional Designer

Across multiple review cycles, one message was consistent: students valued peer interaction but felt it was not well supported by existing workflows.

The pilot: testing FeedbackFruits for Peer Review at scale

FeedbackFruits was evaluated as a feedback workflow that could connect Peer Review, reflection, and educator feedback within a structured learning flow.

FeedbackFruits Peer Review entered a nine-month pilot (August 2023 to May 2024), alongside another peer review solution under consideration (Kritik).

During the pilot, students engaged with FeedbackFruits through:

  • independent exploration of the platform
  • live sandbox activities during advisory board meetings
  • qualitative reflections and structured surveys

By the end of the pilot period, FeedbackFruits stood out in three areas.

What stood out during the pilot

A clearer and more connected student experience

Students highlighted how structured and intuitive the peer review flow felt, especially compared with more disconnected and asynchronous approaches.

“Students talked about how easy it was to get paired with a fellow student, know exactly what they were reading, know exactly what they were saying, and know where to find their feedback. That level of clarity had never been there before.”
Samantha Maxwell, Instructional Technologist

Students also reported that detailed rubric criteria helped them move beyond generic comments and provide more specific, constructive feedback.

Solutions used

  • Peer Review for peer allocation and structured steps
  • Rubrics and criteria to guide actionable feedback
Screenshot of peer review sandbox with detailed rubric


Alignment with Quality Matters and accessibility expectations

Accessibility was a major concern for students, particularly given past issues with the university’s previous peer review tool.

“We collected student data on accessibility after they explored FeedbackFruits and the support pages and then actually used the software itself.”
Michelle Vonie, Instructional Designer

Students responded positively to the clean interface, single sign-on (SSO), and LMS integration, which supported the clarity and consistency expected by Quality Matters.

Supporting authentic assessment and engagement

The instructional team needed technology that could support authentic peer learning rather than simple submission workflows. Alongside Peer Review, FeedbackFruits offers additional activity types, such as Interactive Study Materials and Discussions, to support learner engagement.

“We focused primarily on peer to peer interaction, but instructor to student interaction was also important, and FeedbackFruits helps facilitate both.”
Michelle Vonie, Instructional Designer

Implementation and adoption

Following the pilot, student endorsement and pedagogical alignment accelerated adoption across the institution.

“Every week, there are tens and tens of new people signing up.”
Samantha Maxwell, Instructional Technologist

Analytics dashboard in FeedbackFruits showcasing student progress throughout the activity

To support the transition, the university assigned a dedicated FeedbackFruits lead to help educators rebuild activities and ensure quality implementation. This was supported by workshops, targeted resources, and ongoing support from the FeedbackFruits team.

Impact

The student-led evaluation and phased implementation delivered value at multiple levels.


For students

  • clearer peer interaction workflows
  • improved accessibility compared to previous tools
  • more structured and actionable peer feedback

For educators

  • consistent, rubric-driven feedback design
  • scalable peer review workflows for large cohorts
  • reduced reliance on fragmented external tools

At an institutional level

  • a Quality Matters-aligned feedback and assessment approach
  • LMS-integrated workflows that scale across programmes
  • a repeatable, student-informed model for technology adoption

“FeedbackFruits rose to the top based on students’ comments to us, even before broader system decisions were made.”
Michelle Vonie, Instructional Designer

A long-term partnership

Today, the University of Arizona uses FeedbackFruits as a campus-wide solution to support peer learning and scalable feedback and assessment across online programmes.

Melody J. Buckner, Associate Vice Provost, highlighted the alignment between the platform and the university’s mission to empower faculty with innovative methods for interactive teaching, peer assessment, and personalised feedback.

By combining student voice, Quality Matters alignment, and a pedagogically grounded feedback and assessment workflow, the University of Arizona continues to strengthen a sustainable model for feedback and assessment at scale.


A partnership still going strong

This use case captures the early journey from student-led evaluation to institution-wide rollout. But the story didn't stop there.

In February 2026, we visited the University of Arizona campus to see how the partnership had evolved. We sat down with the team who first introduced FeedbackFruits, as well as institutional leaders, to hear about where things stand today: what's working at scale, how faculty adoption has grown, and what the experience means for students across the university.

It was a chance to move beyond the numbers and see the human side of a partnership that started with a pilot and grew into something much bigger.



Watch the video below to hear it in their own words.


