AI and Written Assessments

The final research essay has long served as a recurrent mode of summative assessment in many liberal arts courses, particularly in English. Many composition and literature courses feature essay assignments designed to demonstrate students’ mastery of course outcomes such as prewriting, revising, and incorporating research.

With the growing popularity and proliferation of AI technologies, it is not surprising that increasing numbers of instructors have reported cases of academic dishonesty tied to the use of AI tools such as ChatGPT, Bard, and Quillbot, though such use can be difficult to detect and prove. Examples range from the submission of entire essays created by chatbots to the use of AI-guided writing tools to revise wording and structure based on the technology’s recommendations. The use of AI in these capacities is unlikely to diminish any time soon, and this clearly affects how instructors create writing assignments in all courses.

What Instructors Can Do To Prevent the Misuse of AI Technologies Inside and Outside of Class

One starting point for preventing the misuse of AI is to incorporate unique prompts that require original responses or analyses that AI tools cannot easily generate. Assignment directions might refer to a specific facet of the Detroit Mercy mission or another element that invites a more personalized writing project. Likewise, instructors may opt to include more in-class writing assignments that are handwritten rather than submitted electronically. This process helps ensure that students produce their work without access to AI tools or other assistance.

Having students complete such in-class assignments allows instructors to gain an impression of each student writer’s distinct voice and skill set. These assignments can also be scaffolded toward the creation of a larger project. Employing a process-writing methodology can likewise alleviate the temptation to turn to AI: having students submit short writing assignments at each stage of the process lets instructors see the project come to fruition from beginning to end. Following this idea, instructors can also tailor assignments for individual students and/or allow them to choose topics that align with their personal interests or experiences. This increases the likelihood that students will invest their own time, effort, and creativity into the assignments, reducing the temptation to rely on AI-generated content.

As we move forward, we may wish to rethink what types of summative assignments beyond the essay will allow students to demonstrate their mastery of outcomes and objectives. Creative projects such as short films, podcasts, and other presentations include writing components (through planning and reflective elements) that may be more difficult for AI to create and allow for greater student investment and passion for the project. Instructors may also require presentations or defenses as part of the assessment process, in which students demonstrate their understanding, originality, and engagement with the assignment.

Finally, it is also important to discuss and teach the limitations of AI. Instructors can explore the shortcomings of AI tools, including their biases, lack of context, and potential errors. By understanding these limitations, students can make more informed decisions about using AI in their writing. Building on this step, instructors may assign specific tasks or assignments where students can utilize AI to support their process. This ensures that AI is used purposefully and in a controlled manner and may demonstrate the technology’s shortcomings firsthand.

Additional Resources Regarding AI, Assessment, and Academic Dishonesty

Eaton, S.E. “Are We Living in a Post-Plagiarism World?” University World News. 4 March 2023.

In this article, Eaton traces the development of ChatGPT, reminding us that the technology did not simply appear out of thin air. Eaton notes that students’ use of AI tools “does not automatically constitute academic dishonesty” and suggests that apps such as ChatGPT can help “reluctant writers generate a rough draft that they can then revise and update,” arguing that such use can support student learning. Moreover, Eaton argues, students will certainly encounter AI tools in the workplace, and it is our responsibility as educators to demonstrate effective and ethical uses of these tools.

May, J. “ChatGPT Is Great. You’re Just Using It Wrong.” Digital Society Press. 2023.

May debunks the common misconception that ChatGPT is a “super Google” or a sort of online librarian that searches for facts, explaining that large language models were not designed to do that. The article explores how commonly ChatGPT creates text with factual errors, since it “forms a probability for all of the words in its vocabulary given that conversation, and then chooses one of them as the likely next word.” The article offers a helpful definition and explanation of how ChatGPT actually works.

Terry, O. “I’m a Student. You Have No Idea How Much We’re Using ChatGPT. No Professor or Software Could Ever Pick Up on It.” The Chronicle of Higher Education. 12 May 2023.

In this column, the student author explains that students are using ChatGPT in ways professors do not expect and that its use is essentially undetectable. According to Terry, students do not use the app simply to write an essay; rather, they have the AI walk them through the writing process step by step. In Terry’s view, “the reasonable conclusion is that there needs to be a split between assignments on which using AI is encouraged and assignments on which using AI can’t possibly help.”

Surovell, E. “ChatGPT Has Everyone Freaking Out About Cheating. It’s Not the First Time.” The Chronicle of Higher Education. 8 Feb. 2023.

Surovell compares the recent panic about the use of AI like ChatGPT in college courses to a similar sense of anxiety about the use of calculators in decades past as well as more recent questions about the use of laptops and cellular phones in classes. Overall, Surovell suggests that change is a natural part of the field of higher education and notes that assignments, assessments, and academic integrity policies will all need to evolve to adjust to ChatGPT’s presence.