Generative AI in the Classroom
At the Center for Integrated Professional Development, we have curated this guide to help instructors navigate the presence of emerging artificial intelligence (AI) technologies in their classrooms. While we have provided a thorough exploration of AI-generated content below, here are our top-level conclusions:
- Illinois State’s academic integrity policy already prohibits the use of unauthorized resources, including AI content generators, and the Center has updated the suggested academic integrity syllabus statement to note this fact specifically.
- Content produced by AI is often hard to detect, but there are strategies to create assignments and examine student work to mitigate this. In written assignments, invented or poorly cited references are often the easiest direct evidence of AI-generated content.
- Using AI content generation to spark classroom discussion could help students understand why they should avoid using it.
Defining Generative AI
What is generative AI? How might such content be problematic in higher education?
Generative AI encompasses a variety of services that use artificial intelligence to create content using natural human language. Also referred to as large language models (LLMs), each AI-powered service is “trained” on a different dataset: the information it uses to create a wide variety of written or symbolic documents, sometimes using real references and sometimes fabricating them (a phenomenon called a “hallucination”). Many services can now also produce multimedia content like images and videos. The content produced by these services is often indistinguishable from content produced by humans, especially for people without deep knowledge of the subject matter being described.
The purpose of these services is to increase the amount of automation available to us, reduce repetitive tasks, and allow us to focus on higher-order critical thinking. Many corporations and some universities are developing internal LLMs to assist in day-to-day workflows. However, because many of these services can produce content so robustly, there is growing concern in the broader higher education community that students might use AI content generation to complete assignments, essays, or other activities in a manner that undermines academic integrity.
Effectively, students could use AI content generators to avoid producing their own work, much as they might pay someone else to write a paper or use an exam/assignment database provided by their peers. This is problematic because it can be difficult for instructors to detect and easy for students to access. Most importantly, it removes all learning value from an assignment when AI-generated content is used uncritically and submitted as a final product rather than as part of a drafting or brainstorming exercise.
Specific AI content generation services include ChatGPT, Google Bard, Microsoft Bing and Copilot, Claude, Pi, DALL-E, Midjourney, and Adobe Firefly, among many others. Hundreds of mobile and desktop apps and services are also integrating AI, with more arriving weekly. (“ChatGPT” is often used as shorthand for all generative AI, as “Kleenex” is for facial tissues.)
The Capabilities of Generative AI
For the purposes of this guide, we have opted not to provide examples of what generative AI can and can’t do, because the answer to that question would be “almost anything.” The most mainstream services, like Bard and ChatGPT, can provide robust written responses to prompts. Adobe Firefly can generate images. Other services can replicate the human voice and likeness, even a specific human’s.
In general, in our testing, generative AI is good at finding factual and procedural information on the internet, and it can bring sources together to compare and contrast them. Where it struggles is with prompts that require it to analyze, create, and evaluate, that is, to go higher on Bloom’s Revised Taxonomy.
Resources with specific examples of what generative AI can do and which services have what capabilities are currently in production. We highly recommend looking at the conversations ongoing in your discipline to see what uses of AI might be most relevant to your classroom context.
Spotting AI-generated content in student work
How might I know if a student’s work is generated using artificial intelligence?
Generative AI can reliably produce text and visuals that approach human capabilities. There is not yet a detection service that can definitively determine whether content was produced by humans or produced by AI. While OpenAI has announced that they are exploring “digital watermarks” that would make their content easier to spot by detectors and Adobe is building in provenance to files that its Firefly service creates, these safeguards are still in the early stages of development.
If you read something that doesn’t “feel” right because of improper use of idioms or logic, you might be reading content created by generative AI, though these systems grow more fluent with each new release. It’s also important to note that language that is not perfectly idiomatic for Standard Academic English may well be produced by humans, so this is a difficult road to walk. However, if there is a style mismatch between two sections of written work, where the tone, vocabulary, and/or syntax shift markedly, that may be a sign that some of the work you are reading was not written by the student.
One area where many AI content generators still struggle is creating citations. They will sometimes find an author who has written something appropriate to cite but invent a new title for that work, or fully invent both the author and the work. They may also fabricate quotes from real sources. Both cases are referred to in discussions about AI as “hallucinations” and result from the system filling gaps in knowledge it does not actually have. If you use a service like Crossref (discussed later) to verify a student’s sources and many of them cannot be found, this may be a sign of an AI-generated paper.
Otherwise, we suggest using strategies like the ones you may already be using to validate the originality of student work:
- Long stretches without citations could indicate plagiarism.
- If a student can’t explain how they arrived at a certain answer or why they made a stylistic choice in their writing, it may indicate that they didn’t do their own work.
Due to concerns about student privacy and intellectual property rights, as well as significant false-positive rates, instructors should never put their students’ work into services that claim to detect AI-generated content, or into generative AI services themselves, without their students’ informed consent. As of October 2023, CIPD is not aware of any AI detection service that has been vetted through a published, peer-reviewed study, so we cannot recommend their use.
Discouraging the use of AI content generation services
How do I develop learning experiences that discourage the use of AI content generation services?
- Consider connecting your assignment prompts deeply to in-class discussions and activities. AI content generation services do not have access to your course content, so requiring mention of topics or ideas specific to your class negates the value of AI-created content.
- Require more than one draft of the same paper/assignment. Have students bring a first draft to class, then work during class to revise and improve it. Even if students use AI-generated content for their first draft, they must invest their own effort in refining the second draft.
- Encourage primary research, when possible, so that student work uses information not available on the internet (e.g., interviews, reviews of archival materials).
- Consider using citation practices that require DOI numbers or links to validate resources. Crossref is an excellent tool for validating resources, and you could have students include a report with their bibliography showing the Crossref results and explaining any mismatches/errors that came up.
- Include a reflective component in your paper/assignment that can only be created by the students themselves. AI content generation services cannot produce genuine personal reflection; they can only draw on reported and searchable information.
- Develop writing prompts and other assignments at levels of Bloom’s Taxonomy that require higher-level cognitive engagement. AI content generation services can construct content at the “remember” and “understand” levels rather effectively; however, their content at the “analyze,” “evaluate,” and “create” levels is often lacking in complexity and depth.
- Use small assignments/papers delivered through in-class technologies, such as a Canvas quiz or a Nearpod prompt, so that the writing happens during class time.
- Consider multi-modal project types, such as presentations, videos, or podcasts.
- Assign group and/or client-based projects, where students must collaborate with one another or with external entities.
- Use problem-based learning and design thinking; these processes are too complex for AI content generators to navigate effectively.
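As an illustration of the Crossref-based citation check suggested above, the sketch below screens a cited DOI for plausible shape and builds the corresponding Crossref REST API lookup URL (`https://api.crossref.org/works/{doi}`). The function names are our own illustration, not part of any course tooling, and a real workflow would still fetch each URL and compare the returned metadata against the student’s bibliography.

```python
import re

# Crossref's public REST API resolves metadata for a DOI at this endpoint.
CROSSREF_API = "https://api.crossref.org/works/"

# A loose pattern for modern DOIs: "10.", a registrant code, "/", a suffix.
DOI_PATTERN = re.compile(r"^10\.\d{4,9}/\S+$")

def is_plausible_doi(doi: str) -> bool:
    """Return True if the string is at least shaped like a DOI."""
    return bool(DOI_PATTERN.match(doi.strip()))

def crossref_lookup_url(doi: str) -> str:
    """Build the Crossref metadata URL for a cited DOI."""
    if not is_plausible_doi(doi):
        raise ValueError(f"Not a plausible DOI: {doi!r}")
    return CROSSREF_API + doi.strip()

# An invented citation often fails even this shape test; a well-formed
# DOI yields a URL whose metadata can be compared to the citation.
print(crossref_lookup_url("10.1000/xyz123"))
```

Fetching each generated URL returns the work’s registered title and authors; a not-found response or a mismatched title is a red flag worth a conversation with the student, not proof of misconduct on its own.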
Developing learning experiences which use AI-generated content
How do I develop learning experiences that integrate the use of AI-generated content into my course pedagogy?
- Use content generated by an AI content generation service for student critique and discussion. Provide students with content generated on a platform such as ChatGPT (or ask them to bring content to class) and ask them to dive into the strengths and weaknesses of the content that was generated.
- Use content generated by an AI content generation service as a “first draft” for students. Allow students to bring to class a draft of a paper or assignment that was generated by an AI content generation service. Ask them to correct, expand, and improve that draft.
- Encourage students to use an AI content generation service as part of active learning experiences in the classroom. Using strategies such as jigsaw or think-pair-share, students can first learn about a topic from AI-generated content, then fact-check that content, add to it, and explore other sources of information, creating an extended learning experience that is then shared with peers.
University policies
How does student use of AI content generation interact with Illinois State's existing academic integrity policy? What do I do if I suspect a student has submitted work that is not theirs?
The Student Code of Conduct’s academic integrity policy has several provisions that apply to the use of any resource or service not authorized by, or acknowledged to, the instructor, which covers the unauthorized use of AI content generation services. This is similar to how cases of students paying for essays or relying on test/paper archives maintained by peers would be handled. Using AI content generation services to solve equations or develop code would be an unauthorized use of assistance; using them to create text for a written assignment would be plagiarism if that content were not cited as AI-generated.
If you are concerned that a student may have used AI content generation services to submit work that is not theirs, you should follow the same process you would during any other academic integrity concern. That process is laid out by Student Conduct and Community Responsibilities.
Syllabus language
What sort of syllabus language might be appropriate to address students’ use of artificial intelligence in my class?
The Center provides suggested syllabus language for a variety of topics, including academic integrity. This language can be added to your course syllabus and discussed with your students at appropriate times throughout the semester. The suggested language on academic integrity was updated in January 2023 to include mention of the use of content produced through artificial intelligence services.
Additional resources
Other institutions are also examining the impact of these new technologies on learning.
- Update your course syllabus for ChatGPT
- Practical responses to ChatGPT
- AI Chatbots: Three Methods and Assignments
- AI-Based Text Generation and the Social Construction of "Fraudulent Authorship"
Questions?
Need additional help incorporating these suggestions into your particular course? Email ProDev@ilstu.edu to set up a consultation with a member of the Center's Scholarly Teaching team.