At the Center for Integrated Professional Development, we have curated this guide to help instructors navigate the presence of emerging artificial intelligence (AI) technologies in their classrooms. While we have provided a thorough exploration of AI-generated content below, here are our top-level conclusions:
What is generative AI? How might such content be problematic in higher education?
Generative AI encompasses a variety of services that use artificial intelligence to create content in natural human language. Often referred to as large language models (LLMs), these services are each “trained” on a different dataset, the body of information they draw on to create a wide variety of written or symbolic documents, sometimes citing real references and sometimes fabricating them (a phenomenon called a “hallucination”). Many services can now also produce multimedia content like images and videos. The content produced by these services is often indistinguishable from content produced by humans, especially for folks without deep knowledge of the subject matter being described.
The purpose of these services is to increase the amount of automation available to us, to reduce repetitive tasks, and to allow us to focus on higher-order critical thinking. Many corporations and some universities are developing internal LLMs to assist in day-to-day workflows. However, because these services can produce content so capably, there has been increasing concern in the broader higher education community that students might use AI content generation to complete assignments, essays, or other activities in ways that violate academic integrity.
Effectively, students could use AI content generators to avoid producing their own work, much as they might pay someone else to write a paper or use an exam/assignment database provided by their peers. This is problematic because it can be difficult for instructors to detect and easy for students to access. Most importantly, when AI-generated content is used uncritically and submitted as a final product, rather than as part of a drafting or brainstorming exercise, it removes all learning value from an assignment.
Specific AI content generation services include ChatGPT, Google Bard, Microsoft Bing and Copilot, Claude, Pi, DALL·E, Midjourney, and Adobe Firefly, among many others. Hundreds of mobile and desktop apps and services are also integrating AI, with more arriving weekly. (“ChatGPT” is often used as shorthand for all generative AI, as Kleenex is for facial tissues.)
For the purposes of this guide, we have opted not to provide examples of what generative AI can and can’t do, because the answer would be “almost anything.” The most mainstream services like Bard and ChatGPT can provide robust written responses to prompts. Adobe Firefly can generate images. Other services can replicate the human voice and likeness, even a specific human’s.
In general, in our testing, generative AI is good at finding knowledge and procedural information on the internet. It can also bring sources together to compare and contrast them. Where it struggles is with prompts that require it to analyze, create, and evaluate, i.e., to operate higher on Bloom’s Revised Taxonomy.
Resources with specific examples of what generative AI can do, and which services have which capabilities, are currently in production. We highly recommend following the ongoing conversations in your discipline to see which uses of AI might be most relevant to your classroom context.
How might I know if a student’s work is generated using artificial intelligence?
Generative AI can reliably produce text and visuals that approach human capabilities. There is not yet a detection service that can definitively determine whether content was produced by humans or by AI. While OpenAI has announced that it is exploring “digital watermarks” that would make its content easier for detectors to spot, and Adobe is building provenance data into files that its Firefly service creates, these safeguards are still in the early stages of development.
If you read something that doesn’t “feel” right, because of improper use of idioms or logic, you might be reading content created by generative AI, but these tools grow more fluent all the time. It’s also important to note that language that is not perfectly idiomatic for Standard Academic English may also be produced by humans, so this is a difficult road to walk. However, if there is a style mismatch between two sections of written work, where the tone, vocabulary, and/or syntax shift markedly, that may be a sign that some of the work you are reading was not written by the student.
One area that many AI content generators still struggle with is creating citations. They will sometimes find an author who has written something that would be appropriate to cite, but then make up a new title for that work, or fully invent both an author and a work. They may also fabricate quotes from real sources. These fabrications are referred to in discussions about AI as “hallucinations” and result from the system’s attempts to fill gaps in its knowledge. If you use a service like Crossref (discussed later) to verify a student’s sources and many of them cannot be found, this may be a sign of an AI-generated paper.
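For instructors comfortable with a little scripting, citation checks like this can be partially automated. The sketch below queries the public Crossref REST API (api.crossref.org/works) for the closest bibliographic matches to a cited work; the function names are our own illustrations, not part of any official tool, and the results still require human judgment:

```python
# Sketch: looking up a cited work in the Crossref registry.
# Assumes only the public Crossref REST API; function names are illustrative.
import json
import urllib.parse
import urllib.request


def crossref_lookup_url(citation_text: str, rows: int = 3) -> str:
    """Build a Crossref works-query URL for a free-text citation."""
    query = urllib.parse.urlencode(
        {"query.bibliographic": citation_text, "rows": rows}
    )
    return f"https://api.crossref.org/works?{query}"


def top_matches(citation_text: str) -> list[str]:
    """Return titles of the closest Crossref matches (requires network access)."""
    with urllib.request.urlopen(crossref_lookup_url(citation_text)) as resp:
        data = json.load(resp)
    return [
        item["title"][0]
        for item in data["message"]["items"]
        if item.get("title")
    ]
```

If none of the top matches resembles the citation a student provided, the source may be fabricated; a near match may simply indicate a typo, so follow up before drawing conclusions.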
Otherwise, we suggest strategies like the ones you may already use to validate the originality of student work:
Due to concerns about student privacy, intellectual property rights, and significant false positive rates, instructors should never put their students’ work into services that claim to detect content written through generative AI, or into generative AI services themselves, without their students’ informed consent. As of October 2023, CIPD is not aware of any AI detection service that has been vetted through a published, peer-reviewed study, so we cannot recommend their use.
How do I develop learning experiences that discourage the use of AI content generation services?
How do I develop learning experiences that integrate the use of AI-generated content into my course pedagogy?
How does student use of AI content generation interact with Illinois State's existing academic integrity policy? What do I do if I suspect a student has submitted work that is not theirs?
The Student Code of Conduct’s academic integrity policy has several provisions covering the use of any resource or service not authorized by, or acknowledged to, the instructor, and these provisions extend to AI content generation services. This is similar to how cases of students paying for essays or relying on test/paper archives maintained by peers would be handled. Using AI content generation services to solve equations or develop code would be an unauthorized use of assistance; using them to create text for a written assignment would be plagiarism if that content were not cited as AI-generated.
If you are concerned that a student may have used AI content generation services to submit work that is not theirs, you should follow the same process you would during any other academic integrity concern. That process is laid out by Student Conduct and Community Responsibilities.
What sort of syllabus language might be appropriate to address students’ use of artificial intelligence in my class?
The Center provides suggested syllabus language for a variety of topics, including academic integrity. This language can be added to your course syllabus and discussed with your students at appropriate times throughout the semester. The suggested language on academic integrity was updated in January 2023 to address content produced through artificial intelligence services.
Other institutions are also examining the impact of these new technologies on learning.
Need additional help incorporating these suggestions into your particular course? Email ProDev@ilstu.edu to set up a consultation with a member of the Center's Scholarly Teaching team.