Writing Your AI Syllabus Policy
A Faculty Guide for Setting Expectations Around AI Use in the Classroom
As generative AI tools like ChatGPT, Copilot, and Adobe Firefly become more accessible, faculty face new decisions about whether, when, and how students should use these tools. This page walks you through the process of crafting a clear, course-aligned AI policy statement for your syllabus.
🔍 Goal: Help students understand your expectations, uphold academic integrity, and support responsible innovation in your course.
Step 1: Understand AI’s Role in Higher Education
Before drafting a syllabus statement, it's important to understand how AI is influencing higher education:
- AI can assist with writing, brainstorming, coding, and visual design
- It also raises concerns around plagiarism, misinformation, and equity
- Students may arrive with different assumptions or familiarity with AI tools
Resources:
- "AI and Academic Integrity in Higher Education: What Faculty Should Know" (CIRT) – a 5-minute video overview of why establishing AI policies matters in academic settings, with key considerations for faculty.
- Guidelines for Responsible Use of Generative AI in the Classroom
- UNF Academic Integrity Policy
- What should a syllabus statement on AI look like? (Brown, 2023, Colorado State University)
Step 2: Determine Your Position on AI Use
Your AI policy should reflect your teaching philosophy, your course’s learning outcomes, and the types of assignments you use. Use the reflection prompts below to guide your stance.
Ask Yourself:
- Will AI help or hinder students’ development in this course?
- Am I comfortable with AI use in some assignments but not others?
- Do I expect students to cite or disclose AI-generated content?
Resources:
- Checklist – Key Questions for Choosing an AI Policy
- AI syllabus language (UNF Office of Faculty Excellence, n.d.)
- Course policies related to ChatGPT and other AI tools (Gladd, n.d.)
Step 3: Select or Draft Your AI Syllabus Statement
Once you’ve clarified your stance, you can select a model policy that fits your goals—or customize your own.
Option 1: AI Use Not Permitted
“This course assumes all submitted work will be generated solely by the student. The use of generative AI tools (e.g., ChatGPT) is not permitted and will be considered a violation of UNF’s Academic Integrity Policy.”
Option 2: AI Permitted in Limited Contexts
“Students may use generative AI tools for brainstorming or organizing ideas but must disclose how they were used. AI-generated content must be clearly attributed and aligned with assignment goals.”
Option 3: AI Encouraged with Responsible Use
“This course encourages students to explore generative AI tools to enhance their learning. Any AI-assisted work must be properly cited, and students should include a reflection on how AI shaped their final product.”
Resources:
- Developing an approach (Wake Forest University Center for the Advancement of Teaching, n.d.)
- AI Expectations Worksheet (Wake Forest University Center for the Advancement of Teaching, n.d.)
- AI Decision Tree (Wake Forest University Center for the Advancement of Teaching, n.d.)
- Syllabi Policies for AI Generative Tools Google Doc – a crowdsourced list of examples from faculty across the U.S.
- Templates: restrictive, neutral, and fully integrated AI policy options that faculty can adapt.
- Example Syllabi: real-world samples with well-crafted AI policies.
Step 4: Ensure Your Policy Is Clear, Ethical, and Equitable
Use this checklist to confirm that your policy is ready for students:
- Clearly states whether AI use is allowed, limited, or prohibited
- Aligns with UNF policies and discipline-specific expectations
- Uses student-friendly, transparent language
- Addresses citation, disclosure, and access equity
Step 5: Communicate Expectations with Your Students
Don't just add the policy to your syllabus—discuss it in class.
- Set the tone early with an open conversation
- Share examples of appropriate/inappropriate AI use
- Encourage students to ask questions if they’re unsure
- Clarify how you’ll enforce the policy throughout the term
Pro Tip: Consider including an “AI in This Course” section on your Canvas homepage or first-day slides.
Final Step: Apply and Reflect
Once you’ve finalized your policy:
- Add it to your official syllabus
- Consider uploading it as part of faculty development records (if applicable)
- Reflect on how it aligns with your teaching values and student learning goals
Need Help?
UNF’s Center for Instruction and Research Technology (CIRT) and the Office of Faculty Excellence (OFE) are here to support you. Contact us with questions or explore our AI Faculty Development Series to learn more.
- Examples from UNF Faculty (quotes, short videos, and vignettes)
Navigating Academic Honesty with AI
As generative AI tools like ChatGPT become more accessible, UNF remains committed to the core values of academic integrity: honesty, trust, fairness, respect, and responsibility. These values continue to guide how we adapt our teaching practices and communicate expectations in a changing technological landscape. Faculty play a vital role in helping students use AI tools ethically and responsibly, starting with clear policies, intentional course design, and open dialogue.
Setting Clear Expectations
To help students understand appropriate use of AI in your course:
- Include clear policies in your syllabus. Visit OFE’s Example AI Usage Syllabus Policies and Citation Information webpage and the Writing Your AI Syllabus Policy section of the Hub for examples, adaptable language, and policy frameworks to help you define whether AI use is:
- Prohibited entirely
- Permitted with proper attribution
- Allowed only for specific tasks or assignments
- Engage students in scenario-based conversations about what ethical AI uses look like in your discipline.
- Encourage reflection by asking students to explain when, why, and how they used AI in their assignments.
- Require AI-use statements as part of major assignments to foster accountability and responsible learning.
Rethinking Assessment in the Age of AI
Beyond setting expectations, faculty can also rethink how assessments are structured in light of evolving AI tools. Some educators are exploring assessment models that intentionally integrate AI, focusing on transparency, critical thinking, and ethical engagement.
- Updating the AI Assessment Scale – Leon Furze outlines how assignments can move from AI-restricted to AI-embedded, helping educators design assessments that foster agency and deeper learning.
Detection Tools & Limitations
UNF has chosen not to activate Turnitin’s AI-detection features due to concerns about false positives, algorithmic bias, and a lack of transparency regarding how detection scores are generated. Current guidance from university leadership emphasizes caution in relying on AI-detection tools, as these technologies are still in their infancy and can misidentify original student work as AI-generated or overlook AI-assisted submissions entirely.
While formal policies are under review, faculty are encouraged to prioritize assignment design as the main way to support academic integrity. Instead of relying on detection tools, instructors can implement intentional design strategies that promote genuine engagement and are less likely to be exploited by AI. We recommend avoiding punitive approaches and focusing instead on building trust and teaching students to critically navigate AI.
Explore the AI-Resistant Design Practices Guide (PDF)
This guide outlines specific, course-tested approaches to promoting original student work, including:
- Personalization and real-world reflection
- Course-specific applications
- Metacognitive and process-oriented tasks
- Multi-stage assignments with scaffolding
- Diverse formats and creative assessments
These practices encourage higher-order thinking and reflection while reducing reliance on unreliable detection tools—helping maintain academic rigor and integrity in the evolving AI landscape.
How AI Is Being Integrated Across Fields
AI is transforming not just computer science but nearly every academic discipline. From analyzing data in public health to generating creative prompts in literature, educators are finding new, discipline-specific ways to leverage generative AI for learning and discovery. Understanding how AI intersects with your field can spark fresh ideas for course design and student engagement.
Here’s a brief look at how AI is being used across various academic domains:
- Arts & Humanities: AI tools support creative writing, assist in analyzing historical texts, and generate visual or musical art. Faculty use AI to foster critical engagement—encouraging students to question authorship, originality, and the human–machine creative process.
- Sciences: AI is employed for data analysis, simulation modeling, and hypothesis generation. Students use tools like ChatGPT or Elicit to formulate research questions or explain complex scientific phenomena in accessible terms.
- Business: In marketing, finance, and management courses, AI is used for case study analysis, market trend predictions, and writing professional communication drafts. Students are also exploring ethical AI applications in the workplace.
- Health Professions: AI supports clinical decision-making simulations, patient scenario generation, and medical documentation practice. Faculty are emphasizing responsible use and the importance of human judgment in healthcare.
- Education: Teacher-preparation programs integrate AI for lesson planning, differentiated instruction, and classroom communication. Discussions around ethical use and AI literacy are also central to preparing future educators.
- Social Sciences: AI assists with coding qualitative data, simulating policy outcomes, or supporting argument construction in disciplines like sociology, psychology, or political science.
Tips for Adapting AI to Discipline-Specific Learning Goals
- Start with Your Outcomes: Align AI use with what you want students to learn. Will AI help develop critical thinking, creativity, or communication skills?
- Embed Reflection: Ask students to explain how and why they used AI. This builds awareness and prevents over-reliance.
- Use Real-World Tasks: Design assignments that mirror professional applications of AI in your discipline.
- Be Transparent: Let students know when you are using AI—e.g., to generate discussion prompts or provide feedback templates.
- Avoid One-Size-Fits-All: What works in a marketing course might not apply in a physics lab. Adapt prompts, tasks, and policies based on the cognitive demands of your discipline.