AI Working Group
The goal of the Wofford AI Working Group is to formulate an effective and innovative response to the widespread impact of generative AI in order to position Wofford as a liberal arts thought leader. Our first step will be to identify and understand the many ways that generative AI will present challenges and opportunities to our campus community in the present and future. The working group will then formulate a strategic framework for how Wofford can effectively respond to and prepare our students, faculty and staff for the impacts of AI on their careers and everyday lives.
Members
Events
Resources
AI Working Group FAQs

The following guidelines were developed from a draft provided by the AI Working Group.

Does Wofford require instructors to have an AI policy statement in their syllabus?
The Office of the Provost strongly advises each instructor to include an explicit policy statement regarding the use of Generative AI in their course syllabus. Instructors retain the freedom to shape their own approach to how Generative AI should be used or avoided in their courses; however, they must clearly communicate their expectations, standards, and any constraints regarding AI use in the syllabus. This guidance is essential because students rely on their instructors to define the application and boundaries of AI in their academic work.
Can instructors use software tools to detect if a student has used Generative AI on an assignment?

AI detection software is imperfect, and instructors should use it with caution. False positives can lead to students being wrongly accused of academic dishonesty, so instructors should not rely solely on detection software to determine whether an assignment was written by AI.

We offer the following guidance for instructors who suspect that a student improperly used AI to complete an assignment:

  • Review previous student work. Do you detect a difference in style, tone, grammar, development of argument, and/or use of sources and citations?
  • Talk with the student about their writing process. How did they research, prepare, and then write their paper? How long did it take them? How much effort did they dedicate to editing?
  • Check source credibility and quality. Currently, Generative AI is not effective at citing credible and relevant sources.
May instructors use AI tools to evaluate student work?

Instructors are generally discouraged from using Generative AI to evaluate student work. While AI technology is rapidly evolving and becoming more sophisticated, and faculty in various disciplines already employ online tools for the rote grading of certain assignments, the use of Generative AI in student assessment raises substantial concerns.

  • Student-Faculty Interaction: In a liberal arts context, the faculty's role in understanding and tracking each student's individual progress is particularly crucial. Faculty are tasked with knowing their students, recognizing their strengths, challenges, and the nuances of their intellectual growth. Relying heavily on AI for grading risks undermining this critical element of student-centric education, potentially diminishing the quality of both teaching and student-faculty interactions, which are central to the values and effectiveness of a liberal arts approach.
  • FERPA Regulations and Intellectual Property: Under the Family Educational Rights and Privacy Act (FERPA), there are stringent requirements to safeguard student information. When faculty input confidential or protected student data into most AI tools, there is no guaranteed expectation of privacy. Furthermore, students' original work is their intellectual property. Uploading this material to an AI tool, which may then incorporate it into its dataset, could potentially infringe upon a student’s intellectual property rights.
  • Limitations of Current AI Tools: Presently, most AI tools lack the sophistication needed to effectively grade a wide range of assignments without extensive training. Additionally, AI tools can generate 'hallucinations' — instances where the AI produces factually incorrect or nonsensical information. This limitation is especially problematic in academic settings, where accuracy and reliability are paramount.

In light of these considerations, while AI can be a valuable supplementary tool in certain scenarios, its use as a primary method for evaluating student work is not advisable. Faculty are encouraged to continue engaging directly with student submissions to ensure accurate and meaningful assessment.

May instructors require students to use AI for course assignments?

Yes, instructors are welcome to integrate AI tools into their courses, but it is vital that they understand the broader context and the issues that come with adopting these technologies. When considering AI tools for a course, instructors should take the following into account:

  • Alignment with Learning Outcomes: Instructors should evaluate how using AI tools connects to their learning outcomes for a course and whether using AI tools enhances student learning.
  • Access and Equity: Be mindful of the varying quality and cost of AI tools. Some are freely available, while others, often of higher quality, require payment. Ensure that all students have equal access to these resources, preventing a divide in learning opportunities.
  • Privacy Considerations: Instructors must respect and adhere to FERPA regulations. It is imperative not to compel students into activities that might breach their privacy rights.
  • Data Use and Transparency: It is advisable to educate students on how AI tools handle their data. This includes discussions around copyright, intellectual property rights, and personal data usage.
  • Ethical Implications: Reflect on the ethical dimensions of AI. This involves examining the development process of these tools, the potential for biased data, and any inherent discriminatory practices within various AI systems.
  • Quality and Critical Assessment: Encourage students to develop a critical approach to evaluating the information generated by AI. This includes scrutinizing the quality and reliability of the data and the technology itself.

For more information, please see the following resources:
https://teaching.pitt.edu/resources/teaching-with-generative-ai
https://teaching.cornell.edu/generative-artificial-intelligence/ethical-ai-teaching-and-learning

Students: Where should I go if I have questions about using AI for a course?

The use of AI in a course is at the instructor’s discretion and most instructors will include their policy on the syllabus. When in doubt, ask your instructor!

Who should I contact if I have questions about AI at Wofford?

For any questions about AI at Wofford, please reach out to Dr. Kimberly Hall at hallka@wofford.edu.