Designing AI policy with educational regulations in mind

Nhi Nguyen
Rebecca LeBoeuf
|
July 1, 2024

Context

Technology outpaces regulation, as the saying goes. But in the case of AI, government institutions are catching up. In October 2023, the White House released the Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence. In March of 2024, the European Union passed the Artificial Intelligence Act.

While there is some ambiguity in the executive order and the AI Act, both include provisions that will directly affect educational institutions and how they implement AI, inside and outside the classroom.


The new (and old) regulations affecting higher ed

The US Executive Order

The Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence is an impressive-sounding name for a document that doesn’t contain any legislation. Instead, it directs government agencies, such as the Department of Education, to develop “resources, policies, and guidance regarding AI.”

Higher ed administrators may find it difficult or even impossible to draft or revise an AI policy when they don’t know the specific regulations coming down the pike. But there is a strong hint in the executive order.

The order directs the Secretary of Education to develop policies addressing “safe, responsible, and non-discriminatory uses of AI in education, including the impact AI systems have on vulnerable and underserved communities.” So it’s a good idea to address implicit bias—how to recognize it and how to avoid it—when drafting your own AI policy. (This post from Chapman University summarizes how implicit bias can manifest in AI and what to look out for.)

A second issue is Regular Substantive Interaction (RSI), a criterion that distance-education courses must meet to remain eligible for federal financial aid. In short, RSI regulations stipulate that distance learning must include interaction with instructors that is substantive for the student and occurs at predictable, regular intervals. WCET, a nonprofit that focuses on digital learning, notes in a [recent report](https://wcet.wiche.edu/frontiers/2021/08/26/rsi-refresh-sharing-our-best-interpretation-guidance-requirements/) that interaction between students and AI tools alone “will not meet the statutory requirements for regular and substantive interaction.” Thus AI policies for your institution need to address how AI tools can supplement, rather than substitute for, substantive instructor interaction (more on this below).

The EU AI Act

Legislation from the European Union may not seem like a grave concern for American institutions. However, like the EU’s data privacy laws, the EU AI Act will have a wide-ranging effect, because it reaches organizations outside the EU whose AI systems are used within it. Since so many American universities have outposts on EU soil, or European students, or study abroad programs, it makes sense to pay close attention to EU requirements.

Unlike the vaguer US executive order, the EU AI Act takes an explicitly risk-based approach, assigning AI activities to one of four risk levels (minimal, limited, high, and unacceptable). Education figures prominently in the high-risk category: the Act classifies AI systems used to determine admission, evaluate learning outcomes, assess the appropriate level of education, or monitor students during exams as high-risk uses.

Of course, these uses are quite broad by design—since AI is relatively nascent in higher ed, legislators want to cover as many of the possibilities as they can. This may be frustrating to those who have to draft a policy that complies with the legislation. Remember, however, these uses of AI aren’t necessarily prohibited; rather they are subject to “conformity assessments”—meaning higher education institutions should take the necessary steps to demonstrate compliance.

Also, both the EU and the White House have a take on AI in education that could be described as cautious encouragement—the White House’s Fact Sheet on the executive order acknowledges that AI has the potential to “transform education.” Nevertheless, given the uncertainties, what should administrators keep in mind when drafting an AI policy for their own institution?

Developing AI policies for academic institutions

In universities and other institutions of higher education, policy is usually designed at a high level—for example, by Directors of Centers of Teaching and Learning, with input from legal advisors. But an AI policy stands a greater chance of success when it is informed both by a clear picture of how AI is actually used in the classroom and by an understanding of the ed tech products themselves.

Thus it’s a good idea to include all the stakeholders in your AI policy committee whenever possible: department heads, instructors, and students. Ask them how they’re using AI in the classroom—in curriculum and lesson plans, or to respond to assignments. They should be encouraged to voice their ethical concerns as well: when instructors feel students may be over-reliant on AI, or when students have noticed or experienced implicit bias.

These are big questions, but the idea is to foster an environment where stakeholders are aware of the ethical and regulatory issues and feel comfortable with the technology, so it all can be reflected in a comprehensive AI policy. And as we mentioned, the EU regulations are ambiguous in places, and in the US, they’re just getting started. Nevertheless, we know enough to begin the process of drafting an institutional policy. Here are some suggestions:

  1. Review your RSI policy to ensure AI tools are included. For example, make clear that using chat-based AI tools in classroom assignments does not, on its own, satisfy RSI without instructor feedback. (For more guidance on RSI, see this report from FeedbackFruits.)
  2. Provide AI training for students and instructors. Many instructors may be unfamiliar with the technology and need training in the basics. The training should include topics such as:
    • Recognizing implicit bias in prompts and responses
    • Best practices for maintaining cybersecurity and privacy
    • AI as a creativity aid
    • Crafting effective prompts
    • Designing assignments that discourage cheating
  3. Appoint appropriate staff. Educause, another nonprofit dedicated to higher education and digital technology, suggests appointing an AI Ethics and Compliance Officer if you don’t already have one. Additionally, consider an AI Facilitator who can be available to students both on- and off-campus.
  4. Document, document, document. Since it’s not yet clear what US compliance reviews or EU “conformity assessments” will look like, documenting each step of the process will make it easier to demonstrate compliance once the criteria are clarified. The Higher Education Compliance Alliance has useful resources on strategies for documenting and demonstrating compliance.