FACULTY AND STAFF AI GUIDELINES

Enhance your teaching and administrative capacity.

UW-Whitewater encourages all faculty and staff to thoughtfully consider their stances on artificial intelligence use, even if adopting others’ existing language and guidelines. Establishing an AI policy for your class allows you to have meaningful discussions with students on this topic. Being specific about how AI is or isn’t allowed makes the rules clear for students and faculty.

We strongly encourage faculty and staff to try at least one generative AI tool; spending roughly 10 hours with a tool is usually enough to form an initial sense of its capabilities. A good way to start exploring is to enter prompts you might give students for an assignment and review what the tool returns. Resources exist that can help you learn to write effective prompts — or explore existing prompt ideas — and we encourage you to keep an open, curious mindset when considering whether such tools could help some or all of your students, or colleagues in your unit or division.

AI trends in U.S. higher education

Source: EDUCAUSE

  • 61% of faculty and staff have work responsibilities related to AI
  • 57% of leaders view AI as a strategic priority
  • 63% of AI-related strategic planning efforts include “training for faculty”
  • 56% of AI-related strategic planning efforts include “training for staff”

Faculty and instructional staff guidelines

Your syllabus must include a policy statement on AI.

Examples of suggested statements »

As you draft your AI syllabus statement, we encourage you to clearly explain to students the reasoning behind your approach. Your rationale should be grounded in the intellectual goals of your course, the values of your discipline, and your broader learning objectives, particularly as they relate to critical thinking and skill development.

Generative AI is increasingly shaping how work gets done across industries and is likely to become more pervasive in the workplace. Students will benefit from understanding not only when it is appropriate to use AI tools, but why those decisions matter. We encourage you to consider what kinds of AI tools and uses employers in your field already expect from graduates, or may expect in the near future.

Some of the questions you might ask yourself are:

  • What will students lose (or gain) by using generative AI in your course?
  • What do you want students to understand about AI and their intellectual development?
  • What other competencies/skills are employers in your field beginning to expect?
  • Do students who already use generative AI understand the tool’s limitations and how to use it responsibly?


Faculty are encouraged to balance innovation with equity. Not all students have access to premium AI tools, and some platforms may not be accessible to students with disabilities. When possible, prioritize free and inclusive tools. As you consider generative AI use from an assessment perspective, some questions you may wish to ask are:

  • What do your existing assignments and assessments actually measure?
  • Are they providing meaningful evidence of whether students are achieving the course’s stated learning outcomes?
  • How could allowing — or restricting — AI support or hinder the way students demonstrate their knowledge, skills, and critical thinking?
  • Are you able to use GenAI to better assess the course’s learning outcomes?
  • What are some ways instructors can design assignments that gradually introduce AI tools?
  • How might this step-by-step approach support students in developing critical thinking about AI’s role in their work?

Generative AI can assist with:

  • Designing syllabi, discussion questions, quizzes, and rubrics
  • Drafting announcements or communications for large classes
  • Explaining complex concepts in multiple formats for diverse learners
  • Providing a more personalized learning experience in your class
  • Depicting scenarios or cases that can aid skill development (e.g., critical thinking and analytical skills)

When you use generative AI in your course design, we recommend being transparent with your students about that use.

To promote academic integrity and reduce over-reliance on generative AI tools, instructors are encouraged to adopt the following strategies:

  • Design assignments in progressive phases — starting with no AI use to build foundational skills, then allowing AI-assisted brainstorming or editing, and finally analyzing a fully AI-generated version — so students can critically engage with generative AI, compare outputs, and reflect on both the strengths and limitations of these tools in their learning process.
  • Design in-class learning experiences (e.g., discussions, labs, real-world problem-solving) that require spontaneous thinking and reduce opportunities for inappropriate AI use.
  • Where appropriate, encourage collaborative learning such as peer review, small group work, and scaffolded team projects, where interpersonal engagement supports deeper understanding.
  • Use process-based assessments that highlight student thinking over time — such as requiring drafts, revision histories, or self-reflections — making it easier to identify authentic learning and limit unauthorized AI-assisted work.

These approaches not only reinforce original work but also help students engage more intentionally with AI when it is allowed, emphasizing critical thinking and accountability.

Instructors should avoid relying solely on AI detection tools, as they are often unreliable. For example, research on AI detection software from MIT highlights the prevalence of high false positive and false negative rates, and a study from Stanford found that detection software may be biased against certain learners, such as non-native English speakers.

If AI misuse is suspected, speak directly with the student first, just as with any other academic integrity concern. Please review UWS Chapter 14 procedures or contact the UW-Whitewater Dean of Students Office for additional clarity.

Use supporting materials like writing samples, version histories, or revision drafts when evaluating suspected misconduct.

We strongly recommend that instructors not automate feedback on the entirety of a student’s work. AI can assist with generating ideas for praise, suggesting additional approaches, or offering stylistic and grammatical suggestions on a specific paragraph.

Generative AI use for non-instructional staff


Generative AI tools, such as ChatGPT, offer transformative potential to streamline repetitive and time-consuming administrative tasks within colleges and departments. By automating routine communications, document preparation, and information management, GenAI can help to free up valuable staff time, improve service delivery, and support data-informed decision-making.

In general, adhere to the campus guidelines for AI use and familiarize yourself with the allowable and prohibited uses of AI. Where questions arise, we recommend checking with your supervisor or unit/division head about any AI policies specific to your department, unit, or division. Below, we outline some potential areas where GenAI can be effectively deployed to enhance the day-to-day operations of university administrators.

  1. Automated drafting of routine communications
  2. Agenda generation for meetings and minutes summarization for open meetings
  3. Form generation and document preparation
  4. Trained GenAI Chatbots to serve as student and faculty FAQ assistants
  5. Calendar management and scheduling support
  6. Division/unit strategic planning
    • Collate information from a variety of sources; use GenAI to summarize feedback and to help develop short- and long-term goals for the unit.

Generative AI is not a replacement for human judgment and care in administration — but it is a powerful tool that has the potential to amplify human capability. Augmenting work with GenAI for targeted administrative use may allow staff to focus more on high-impact work while improving the overall efficiency of academic operations.

Recommendations for all staff

Protect personal information: Avoid inputting sensitive or personally identifiable information into AI platforms, as these tools may store and use your data beyond your control. Entering student data could lead to FERPA violations.

  • Example: Types of Personally Identifiable Information (PII) include names, Social Security numbers, addresses, phone numbers, email addresses, and financial or medical records.
  • Intellectual property: Your own IP could be utilized or saved by AI platforms, so share work cautiously.

Protect university information: Aside from personal information, refrain from inputting any university or Universities of Wisconsin information into any AI platform that may be deemed sensitive, confidential, or proprietary.

Respect others' privacy: Do not upload or share others' work, including student assignments or unpublished research, with AI tools without explicit consent.

  • Third-party AI detection services: Accurate AI detection will remain difficult. Instructors who choose to use a third-party detection service — one with which the university does not have an enterprise license — must get explicit student consent before submitting any graded student work. Doing so without student consent may constitute a FERPA violation.

Review privacy policies: Before using any AI tool, familiarize yourself with its data handling and privacy policies to make informed decisions about your data.

Recognize limitations: AI tools may produce biased, inaccurate, or exclusionary content.

Foster belonging: Be mindful of how AI-generated material affects diverse learners.

Promote equity: Students using free AI tools should not be penalized for lack of access to paid versions.

AI use should be clearly disclosed and cited when appropriate. Instructors and staff must communicate expectations around acceptable AI use, including what counts as misuse or misconduct. All users are responsible for the content they generate with AI, even if it is inaccurate or controversial.