[Image: A white robot with large round eyes and a small lined mouth looking up at the camera, with blue light showing near its ear and shoulder and a screen attached to its chest.]

Who (or What) Wrote This? AI Content Generators in Higher Ed  

Since ChatGPT was launched in November 2022, higher education channels have lit up with discussions about the implications of artificial intelligence (AI) for teaching and learning. Here we look at what AI content generators are, the implications of access to them, and how post-secondary educators and institutions might respond.

What is it? 

AI content generators such as predictive text in Microsoft apps, AI Music Pro, DALL-E (art), and Codiga (code) are not new, but the buzz around ChatGPT in education results from its current free “research preview” and its ability to produce “movie scripts and comic strips, sonnets and grant proposals, graduate course syllabi and lessons. It can execute math problems, showing all its work with written explanations. Nearly any writing prompt one might assign to be completed outside of class (with a few notable exceptions) can be written pretty well, quickly, at no cost, and undetectable by our current plagiarism software by anyone who takes a little time to learn the nuances of ChatGPT” (Alby, 2023).

What are the implications?

Academics and others have weighed in on the implications for post-secondary teaching and learning. Some express concerns about academic integrity and rigor: “The potential for abuse in academic integrity is clear . . . it is likely that this is only the first of many such AI products we’ll see in the years to come, and we may need to permanently re-envision how we assign college writing” (Faculty Center, n.d.). Nor are the concerns limited to those who assess writing: worldwide, people have shared ChatGPT’s ability to write and debug computer code, build new apps, and create investment term sheets (Bailey, 2022).

Others, however, don’t think bots such as ChatGPT are cause for panic. Thomas Lancaster, a computer scientist and academic-integrity researcher at Imperial College London, states that “ChatGPT-generated essays will out themselves more readily than the products of essay mills, by including quotes that weren’t actually said, incorrect information created through false assumptions, and irrelevant references” (Stokel-Walker, 2022).

How do we respond? 

The reactions shared can be roughly organized into four categories: embrace the AI, neutralize it, ban it, or ignore it. 

Embrace it

As Kovanovic points out, “history has shown time and again that educational institutions can adapt to new technologies. In the 1970s the rise of portable calculators had maths educators concerned about the future of their subject – but it’s safe to say maths survived.” He argues that “rather than being a tool to prohibit, AI models should be meaningfully integrated into teaching and learning.” Because students will use AI in their future workplaces, they should be taught its possibilities and limitations now (2022).  For example, an Ontario math teacher posed word problems to ChatGPT and found the responses “wildly wrong” but plausible on the surface to a layperson, so he took them to his students and asked them to explain what the bot had done wrong mathematically. The students reported enjoying the challenge (Alfa, 2023). 

Gleason offers a sample class activity in which students discuss an assigned reading in small groups, then plug the reading’s research question into GPT-3.5 and let it generate an alternative essay. They then assess both the assigned reading and the AI-generated content, including evaluating evidence, identifying assumptions, and reviewing methodology or lack thereof (2022).

McKnight offers several ways to engage with AI for writing. She concludes that “pretending that they do not exist or banning them outright will not prevent or enhance their use” and offers key questions to ask (and have students ask) about any AI text generator, including examining the material the writing is based on, asking who and what was excluded from that material, and assessing what underlying assumptions, biases, and injustices are embedded in it (2022).

Neutralize it  

The University of Central Florida suggests making changes to assessments, such as crafting writing prompts so course-specific that no outsider (human or bot) could produce useful output. Or, since ChatGPT only has information up to 2021, provide prompts related to current events or publications. Another suggestion is to “reduce course-related workload to disincentivize cheating. Many instances of student cheating, including the use of AI-writing, is borne out of desperation and a lack of time. Consider how realistic the workload you expect of students is” (Faculty Center, n.d.).

An additional approach is to discuss the limitations of the chatbot’s output with your students. As Grobe points out, “the things ChatGPT cannot do (cite and analyze evidence, limit claims, create logical links between claims, arrange those claims into a hierarchy of significance) are the basic stuff of college-level writing”. In addition, he notes that “in hiding the seams of its own relentless pattern replication, ChatGPT uses many of the same tricks I teach my students to avoid as warning signs of insufficient argument” (2023).  Among other shortcomings, he highlights the chatbot’s tendency to start every paragraph with a transition word without providing “any logical connection between adjacent ideas.” 

I asked ChatGPT what types of questions it could not answer and got a pretty good response: 

ChatGPT is a language model that has been trained on a large dataset of text, so it has knowledge on a wide range of topics. However, it may not be able to answer certain questions for a few reasons: 

    1. The question may be too specific or niche, and the information may not be present in the training dataset.
    2. The question may be too ambiguous, making it difficult for ChatGPT to understand and provide a specific answer.
    3. The question may be outside of the scope of language modeling, such as a question about a mathematical calculation or a question that requires physical manipulation.
    4. The question may be too complex and may require a lot of background information to be able to answer.
    5. The question may be asking for an opinion or a subjective point of view which is not something a language model could provide. 

It’s also important to keep in mind that ChatGPT is a machine learning model and it’s not capable of understanding or experiencing like a human. So the answer may not be accurate or complete. 

Source: ChatGPT via chat.openai.com (2023) 

Ban it  

New York City’s public school district has already banned AI content generators, and post-secondary institutions are scrambling to update academic integrity statements and policies. The University of Central Florida recommends that if you choose to ban the use of AI writing, you state that explicitly in your syllabus; if you intend to allow its use but want it cited in some way, be clear and specific about your expectations (Faculty Center, n.d.). They also suggest

Discuss[ing] the ethical and career implications of AI-writing with your students. Early in the semester (or at least when assigning a writing prompt), have a frank discussion with your students about the existence of AI writing. Point out to them the surface-level ethical problem with mis-representing their work if they choose to attempt it, as well as the deeper problem of “cheating themselves” by entering the workforce without adequate preparation for writing skills, a quality that employers highly prize. (Faculty Center, n.d.) 

Ignore it  

As Kovanovic points out, “Just as Wikipedia and Google didn’t spell the end of assessments, neither will AI. In fact, new technologies lead to novel and innovative ways of doing work. The same will apply to learning and teaching with AI” (2022). Indeed, some instructors find their current methods of assessment are already sufficient to confound an AI generator for the very reasons ChatGPT offered me.

However educators and institutions respond, ChatGPT and other forms of AI are not going away; indeed, as Bailey notes, “the technology is undoubtedly going to improve, which is likely to happen much faster than many realize. Some forecasts estimate that GPT-4 will contain over 100 trillion parameters—nearly 600 times as many as the current GPT-3.” Our responses now may help us to “use AI to its full potential while also mitigating any possible negative effects” (2022).

D’Agostino quotes Ted Underwood, professor of information sciences and English and associate dean of academic affairs in the School of Information Sciences at the University of Illinois, who says:

Our approach to teaching should be guided not by one recent product but by reflection on the lives our students are likely to lead in the 2030s. What will the writing process look like for them? Will they use models as research assistants? As editors? 

No crystal ball can answer those questions yet. But the uncertainty itself is a reminder that our goal is not to train students for specific tasks but to give them resilience founded on broad understanding. (2023) 

References 

Alby, C. (2023, January 9). ChatGPT: A must-see before the semester begins. Faculty Focus. https://www.facultyfocus.com/articles/teaching-with-technology-articles/chatgpt-a-must-see-before-the-semester-begins/ 

Alfa, I. (2023, January 8). As concerns over ChatGPT grow, this teacher found a way to use it in his classroom [Radio broadcast]. In Metro Morning with Ismaila Alfa. CBC. https://www.cbc.ca/listen/live-radio/1-39-metro-morning 

Bailey, J. (2022, December 16). Meet ChatGPT: The AI chatbot that can write code, pass exams, and generate business ideas. AEIdeas. https://www.aei.org/technology-and-innovation/meet-chatgpt-the-ai-chatbot-that-can-write-code-pass-exams-and-generate-business-ideas/

D’Agostino, S. (2023, January 12). ChatGPT advice academics can use now. Inside Higher Ed. https://www.insidehighered.com/news/2023/01/12/academic-experts-offer-advice-chatgpt 

Faculty Center. (n.d.). Artificial intelligence writing. University of Central Florida. https://fctl.ucf.edu/teaching-resources/promoting-academic-integrity/artificial-intelligence-writing/

Gleason, N. (2022, December 9). ChatGPT and the rise of AI writers: how should higher education respond? THE Campus. https://www.timeshighereducation.com/campus/chatgpt-and-rise-ai-writers-how-should-higher-education-respond 

Grobe, C. (2023, January 18). Why I am not scared of ChatGPT. The Chronicle of Higher Education. https://www.chronicle.com/article/why-im-not-scared-of-chatgpt 

Kovanovic, V. (2022, December 14). The dawn of AI has come, and its implications for education couldn’t be more significant. The Conversation. https://theconversation.com/the-dawn-of-ai-has-come-and-its-implications-for-education-couldnt-be-more-significant-196383 

McKnight, L. (2022, October 14). Eight ways to engage with AI writers in higher education. THE Campus. https://www.timeshighereducation.com/campus/eight-ways-engage-ai-writers-higher-education

Stokel-Walker, C. (2022, December 9). AI bot ChatGPT writes smart essays – should professors worry? Nature News. https://www.nature.com/articles/d41586-022-04397-7

 

Further Reading/Viewing  

Tech Talks: AI is already disrupting your writing assignments, a video discussion among faculty at the Community Colleges of Spokane, including a demonstration of an AI writing generator

19 Ways to Use ChatGPT in your Classroom, a list of practical class activities adaptable to the college classroom

The Mechanical Professor, an article in which a professor explores ways to ease the teaching load using ChatGPT

ChatGPT both is and is not like a calculator, a thoughtfully written article refuting the popular comparison of AI content generators to calculators in disrupting education


Photo by Alex Knight on Unsplash