The use of artificial intelligence (AI) tools to produce convincingly human-sounding writing exploded in popularity in late 2022, prompting questions in higher education about how text generators will affect the academic integrity and authenticity of the work our students produce. While artificial intelligence is not new, the viral adoption of ChatGPT and similar tools has raised unprecedented awareness of AI’s capabilities in our community. In fact, a survey of 1,140 U.S. higher education administrators conducted in late 2023 found that 60% report their institution uses AI functionality and “about half of the respondents expect positive effects of AI to span the student experience, from better supporting the student journey to improving student outcomes” (Educause, 2023).
ChatGPT, the first AI text generator to make waves, is now in good company, as the market has expanded rapidly since its introduction. Google Gemini, Microsoft Copilot, and Claude are among the better-known generative AI text tools available for free, with optional paid tiers. Each of these products offers similar functionality with varying degrees of success, and some (ChatGPT Plus, Google Gemini, and Microsoft Copilot) can also leverage search engines to produce more accurate, current output. In addition, Google has woven Gemini into its productivity suite (Docs, Slides, etc.) and Microsoft has added Copilot to Office 365 (Word, PowerPoint, etc.) and Windows 11.
Text generators like ChatGPT are just one type of artificial intelligence. Images can be generated from a prompt with tools like Adobe Firefly, and videos can be created from a script with tools like Synthesia.
ITDS has compiled resources comparing different text, image, and video generators on the market to help you make an informed decision on which is right for you. We also offer a series of workshops designed for all experience levels to empower faculty with the knowledge needed to address artificial intelligence in their courses.
Navigate the sections below to familiarize yourself with the capabilities of generative AI and learn more about how to acknowledge, detect, and mitigate the use of AI in your classroom.
Writing Productive Prompts
Applying the principles and best practices of prompt engineering can help you get the most out of any AI tool. Generating useful and accurate output from Generative AI requires requests and questions that properly prompt the artificial intelligence to produce the content you want. These tools are only as smart as what you ask of them, so consider the following when writing your prompts:
- Assign roles: For example, instead of asking “Provide restaurant recommendations for a trip to Italy,” try asking “I’d like you to act as a travel agent and provide breakfast and dinner recommendations for a 4-night trip to Italy.” Understand that in the context of academia, students could leverage this to prompt “I’d like you to act as a student and write a 4-page paper on the Titanic.”
- Be specific and responsive: Generative AI not only produces output from your initial prompt – it also responds to follow-ups. For example, prompting it to write a 4-page paper on the Titanic does not include sources and citations by default. However, responding to the initial paper with “Same paper with scholarly sources” will enhance the output with in-text citations and an APA-formatted reference list. You could even respond with “Same paper in MLA format” to switch to a different citation style. This flexibility and adaptability are powerful for students who may misuse the tool to produce work on their behalf. (The brief sketch after this list shows how these same two principles look when a tool is used through a chat API rather than the chat window.)
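For readers who interact with these tools through code rather than the chat window, the same two principles carry over directly. The sketch below is a minimal illustration using OpenAI’s Python library; the model name, prompts, and follow-up request are illustrative assumptions rather than recommended settings, and any chat-capable tool with an API would work similarly.

```python
# Minimal sketch: "assign roles" and "be specific and responsive" via a chat API.
# The model name and prompts are assumptions for demonstration only.
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable is set

messages = [
    # Assign a role up front, just as you would in the chat window.
    {"role": "system", "content": "You are a travel agent planning a 4-night trip to Italy."},
    {"role": "user", "content": "Provide breakfast and dinner recommendations for each night."},
]

first = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(first.choices[0].message.content)

# Be responsive: append the reply and a follow-up request, the same way you would
# refine a draft in the chat window ("Same paper with scholarly sources").
messages.append({"role": "assistant", "content": first.choices[0].message.content})
messages.append({"role": "user", "content": "Same recommendations, but vegetarian-friendly."})

refined = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(refined.choices[0].message.content)
```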
Relative to teaching and learning, consider these categories of Generative AI prompts:
Research
AI tools can be extremely helpful when conducting research tasks thanks to their large datasets of information and, for some tools, their ability to utilize search engines to access current information. Text generators are particularly useful at the initial stages of research, helping to identify starting points when the process may seem overwhelming.
Example Prompts:
- What are the important facts / statistics / current issues related to the following topic? Provide references.
- Suggest references for the following topic and summarize each source with references.
Learning
AI tools can be leveraged as personal tutors or learning assistants. Because they are especially good at synthesizing information, consider using these tools to make complex articles more accessible, develop study guides that help digest and understand complex content, or explain difficult concepts in layman’s terms.
Example Prompts:
- What are the different types of paraphrasing? Provide examples for each type.
- Explain whether ‘the’ is necessary in the following sentence and why. Give me 10 more examples that demonstrate the same usage.
Interaction
AI tools are interactive, responsive, and can be assigned roles. Think about using a tool like ChatGPT as a debate or study partner, as an interviewer, or even as a conduit to converse with a person from history.
Example Prompts:
- I need help with brainstorming. Can you ask me questions on the following topic one at a time and edit my answers?
- Play the role of a hiring manager hiring for this particular role. Ask me questions and evaluate my responses.
Feedback
Consider using these tools to help enhance your writing. They are fairly effective at identifying mistakes, offering recommendations, and providing additional insight.
Example Prompts:
- Revise my essay and provide explanations for each change.
- What are the most common mistakes in my text? List them in order of frequency and explain each one.
Being familiar with these tools’ limitations also aids in recognizing content generated by them:
- Remember it is not a search engine: Some artificial intelligence tools (such as ChatGPT, Microsoft Copilot, and Google Gemini) can access search engines to incorporate more timely data into their output. Others (such as Claude) cannot, and are therefore dependent on their training data. It’s important to consider whether the output you seek would be affected by the LLM’s training-data cutoff, and therefore which tool is best to use.
- Know the characteristics and limits of AI writing: While AI-produced material may pass for human writing in style and tone on its surface, reviewing it through a more analytical lens quickly exposes its flaws. As an example, according to OpenAI’s documentation on ChatGPT, responses may be:
- Factually inaccurate: Data set limitations can result in “inaccurate, untruthful, and otherwise misleading [output] at times” (OpenAI, 2024).
- Broad and oversimplified: Responses are “often excessively verbose and overuse certain phrases” (OpenAI, 2023).
- Biased: ChatGPT “may also occasionally produce harmful instructions or biased content” (OpenAI, 2024).
If you’d like to learn more about Prompt Engineering, consider attending ITDS’ Using Generative AI: Prompt Engineering To Unlock AI’s Potential workshop.
Approaching Artificial Intelligence In Your Class
Empowered with an understanding of the tool’s capabilities and potential, decide how you will address generative AI in your course. Consider the subject matter and assignments in your course(s) and the ways artificial intelligence may be leveraged by your students. As with any new and developing technology, your response may change over time with experience and as the tool itself evolves.
You may consider one or more of the following:
- Discuss it openly with your students: You may already lead conversations at the beginning of each semester defining norms, expectations, and general class policies. Consider how Generative AI can be part of these discussions. Defining your perspective on the tool facilitates transparency and reduces ambiguity.
- Develop a formal syllabus policy: After defining your own stance on the use of Generative AI in your course, develop a syllabus policy to reflect your encouragement and/or discouragement of how AI is used. This is not necessarily an all-or-nothing approach; there may be ways students can leverage these tools (e.g., for inspiration and ideas) while still ultimately producing authentic work of their own. See the section below for specifics on crafting your syllabus policy.
- Test out your assessments: Try using Generative AI to see what sort of response it generates for an assignment in your course. Doing so may shed light on areas of the assignment you might adjust to promote more authentic work.
Instructional Design Strategies to Mitigate the Use of AI
While restricting access to generative AI tools may not be possible, there are strategies you can implement that will mitigate its use. Many methods we recommend for preventing the use of AI in your course relate to best practices for online assessment, but can be tailored for courses offered in any modality.
Strategies to mitigate the use of AI:
- Promote authentic assessment: When designing assessments, give preference to authentic activities (having students apply their learning to a real-world problem). Typically, authentic activities motivate students to succeed because the skills they practice will translate to their professional lives. Authentic prompts challenge chatbots to be current and hyperspecific, something they struggle with given their predefined data sets. As a result, an authentic prompt posed to a chatbot may produce an unsatisfactory, generic response.
- Integrate a variety of assessments: Developing different types of assessment for your course can empower students to illustrate their learning in multi-dimensional ways by providing more opportunities to express themselves and apply their thinking. Incorporating different types of assessment (e.g., e-portfolios, role playing, debate, case studies) can prevent chatbots from being effective.
- Foster student investment/volition via transparency: Have conversations with students about how the skills they learn in your course are important in the real world. Understanding the value of what is being learned can be an effective deterrent to taking a “shortcut” using AI technology.
- Use low-stakes, chunked assignments: Segmenting larger assessments into smaller, lower-stakes assignments provides students additional opportunities for feedback while emphasizing the importance of revision and progress. Higher-stakes assessments can motivate students to cheat, so creating smaller, chunked assignments at lower stakes can be effective in mitigating misuse of AI tools.
- Ask students to submit drafts: Drafting places emphasis on the writing process. Asking students to submit drafts provides an opportunity to analyze changes from one draft to the next. In Google Docs, it is also possible to track a document’s changes.
- Create assignments which promote higher-order thinking: Because chatbots are trained to generate predictions based on patterns in their data sets, they produce their best (and most convincing) work in response to requests for knowledge and recall. Consider whether your assessments and/or activities challenge students to engage in higher-order thinking such as analysis and interpretation. You’ll likely find Generative AI’s attempts at higher-order thinking to be verbose and factual, but lacking in critical analysis.
- Evaluate your assignment prompts: Consider the strategies provided in Turnitin’s AI Misuse Checklist to assess whether your course’s assignment prompts promote higher-order thinking, authentic assessment, and other practical steps to mitigate the misuse of artificial intelligence.
Leveraging Technology to Detect AI-Produced Works
No detection technology on the market at present can determine with absolute certainty whether text was written by artificial intelligence. Pedagogical adjustments, such as leveraging authentic assessment and/or providing students multiple means of representing their learning, are preferred. If you do use a detection tool, treat its results as a basis for opening a dialogue with students rather than as leverage for disciplinary action. It’s worth noting that these tools usually frame their findings as “most likely written by AI” or “highly confident” rather than declaring them with certainty.
If you still wish to use an AI detection tool with the above in mind, consider trying the following free and paid options:
- GPTZero: Free, 5,000-character limit per inquiry.
- ZeroGPT: Free, 15,000-character limit per inquiry.
- Originality.ai: Available only by subscription with a credit-based model (not unlimited use).
- Copyleaks: Available only by subscription with a credit-based model (not unlimited use).
Some questions to ponder when using these tools include (adapted from Bowen & Watson, 2024):
- Should AI detection be allowed in your academic integrity policies and procedures?
- Considering the student mental health crisis and the damage false accusations can cause, under what circumstances would it be ethical to utilize tools that have a record of false positives? What false-positive rate would you personally consider acceptable?
- At what threshold of AI detector probability score and in what context would you consider student work to be AI assisted?
- How will you use the data generated by an AI text detector?
Proctoring software offers an alternative way to mitigate the use of AI in an in-person classroom environment. Proctoring tools are less effective in online courses, where students take exams at home with access to other devices. The following tools are available in Canvas:
- Respondus LockDown Browser is a viable tool for proctored environments. It disables copying/pasting, screenshotting, and switching between applications while taking an exam.
- Respondus Monitor, an add-on component which can only be used in conjunction with LockDown Browser, requires students to use a webcam and can be optimal for remotely proctored exams.
Developing an AI Syllabus Policy
A first step all faculty should take, whether in favor of or opposed to artificial intelligence, is to develop a syllabus policy to define the use of artificial intelligence in your course. This policy should be shared at the beginning of the semester and clearly define if, when, and how students should or should not leverage artificial intelligence in your classroom.
Policies typically fall into one of three primary buckets:
- Prohibit the use of Generative AI: AI should not be used in any way in the course. You may consider asking students to sign a pledge declaring they will not use artificial intelligence.
- 40% of Montclair faculty reported this approach (OFE Survey, Nov. 2023)
- Example policy created using Pepperdine’s Generative AI Syllabus Statement Tool: You may not use generative AI tools on assignments in this course. If you use generative AI tools to complete assignments in this course, in ways that I have not explicitly authorized, I will apply the Montclair State University Code of Academic Integrity as appropriate to your specific case. In addition, you must be wary of unintentional plagiarism or fabrication of data (generative AI tools are prone to both). Depending on the specific circumstances, a first offense academic integrity violation related to misuse of generative AI could range from a request to resubmit your work to receiving zero credit on the applicable assignment.
- Allow the use of Generative AI: AI may be utilized in specific instances, such as basic brainstorming, example-finding, outlining, studying, or completing certain assignments as determined by the instructor, so long as appropriate attribution is provided.
- 52% of Montclair faculty reported this approach (OFE Survey, Nov. 2023)
- Example policy, created using Pepperdine’s Generative AI Syllabus Statement Tool: You are permitted to use Generative AI to complete tasks you would also leverage a search engine for (brainstorming, ideating, fact-checking, etc.). If you do use generative AI tools on assignments in this class, you must properly document and credit the tools themselves. Cite the tool you used, following the pattern as specified for APA guidelines; be sure to include the tool name and version. When submitting your work, please include a brief description of how you used the AI tool for the assignment. If you choose to use generative AI tools, please remember that they are typically trained on limited datasets that may be out of date. Additionally, generative AI datasets are trained on pre-existing material, including copyrighted material; therefore, relying on a generative AI tool may result in plagiarism or copyright violations.
- Encourage the use of Generative AI: AI can be used in a totally unrestricted fashion, for any purpose, at no penalty. In this scenario especially, students need to understand the strengths and weaknesses of artificial intelligence.
- 8% of Montclair faculty reported this approach (OFE Survey, Nov. 2023)
- Example policy (Ryan Baker, University of Pennsylvania): Within this class, you are welcome to use foundation models (ChatGPT, GPT, DALL-E, Stable Diffusion, Midjourney, GitHub Copilot, and anything after) in a totally unrestricted fashion, for any purpose, at no penalty. However, you should note that all large language models still have a tendency to make up incorrect facts and fake citations, code generation models have a tendency to produce inaccurate outputs, and image generation models can occasionally come up with highly offensive products. You will be responsible for any inaccurate, biased, offensive, or otherwise unethical content you submit regardless of whether it originally comes from you or a foundation model. If you use a foundation model, its contribution must be acknowledged in the handin; you will be penalized for using a foundation model without acknowledgement. Having said all these disclaimers, the use of foundation models is encouraged, as it may make it possible for you to submit assignments with higher quality, in less time. The university’s policy on plagiarism still applies to any uncited or improperly cited use of work by other human beings, or submission of work by other human beings as your own.
As you develop your AI syllabus policy, consider leveraging the following resources:
Incorporating AI into Your Course
As with many challenging technologies before it, ChatGPT has prompted a strong response from educators, ranging from cautious optimism to outright skepticism. It’s understandable to proceed with caution, since AI chatbots like ChatGPT are easily misused. However, there can be benefits to incorporating this technology to enhance teaching and learning; consider having open conversations with students about the misuse of AI, but also ponder the ways this technology can help them as learners.
Here are some ways ChatGPT can be leveraged as a teaching and learning tool:
- Feedback assistant: Students can ask ChatGPT for feedback on how to improve their writing. For example, it may suggest a student add more specificity by incorporating additional examples, add transitions, or include alternative perspectives. Consider having students keep a record of the feedback they receive to incorporate into their writing, or analyze the feedback and justify whether the recommendations are valid.
- Act as an English Professor and provide recommendations to enhance this paper: [copy and paste text]
- Debate partner: To help students exercise strong argumentation, they can prompt ChatGPT to act as a debate partner (Roose, 2023). By asking ChatGPT to take a particular stance on an issue, students can develop counterarguments and spar with the chatbot. This can help students develop deeper understandings of alternative perspectives, prepare for an actual debate in class, or enhance argumentation for a persuasive assignment.
- Act as my debate partner for the topic of “Banning Books in Public Schools.” Play the role of a parent who wants to censor texts they feel should not be read by high school students.
- Additional point-of-view: Incorporate ChatGPT into a class discussion by modifying the traditional “Think, Pair, Share” format to, “Think, Pair, ChatGPT, Share” (Miller, 2022). Inviting an AI perspective can provide another critical lens for students to collectively analyze.
- Ask ChatGPT the same question you’ve posed to students (or have students ask ChatGPT the question). After generating an initial response, try prompting ChatGPT with the following: Now, answer the same question from the perspective of a [insert persona].
- Prompt generator: As an educator, consider using ChatGPT to help create discussion prompts for your students on a topic or based on a particular (open access) text. You’ll need to provide the link to the text for ChatGPT to access and develop its questions.
- As a journalism professor, create a short prompt that will inspire college students to write an open-ended response about the media’s effect on society.
- Quiz creator: ChatGPT can create open-ended or multiple choice quiz questions based on a text you provide (Roose, 2023). Be sure to double-check that questions are accurate.
- Develop 5 open-ended and 5 multiple choice questions based on this article [insert link].
- Language tutor: Text generators offer students in language acquisition courses the opportunity to practice in the target language. Besides working through text chat, most text generators’ mobile apps offer conversation functionality in multiple languages. Below are some specific ways these tools could function as a language tutor:
- Conversational practice and listening comprehension (Gemini and ChatGPT apps): Students can have verbal conversations in the target language and can tailor the experience to any level, speed, or subject.
- Reading and writing practice: Students can read content written in the target language and prompt the tool to quiz them to enhance comprehension.