James Blakely & Jeanne Law
The general consensus among techies and luddites alike is that a paradigm shift occurred in November 2022, when the public became acquainted with ChatGPT, the first widely accessible iteration of generative artificial intelligence (AI). Bill Gates, in his blog GatesNotes, called GPT the most revolutionary technology he had seen since the graphical user interface. There is far less consensus, though, on the implications of this paradigm shift for higher education. While GPT is a technological revolution, its creators may not have imagined how it would reshape content creation and assessment in higher ed’s writing courses.
In response, researchers in the Department of English at Kennesaw State University, a large public institution in Georgia, are currently surveying thousands of incoming students enrolled in first-year writing courses to determine their attitudes towards generative AI use in the classroom as well as its potential for future use in academia and workplace situations. In this study, 135 student respondents have already provided vital insights into GPT/Bard use for potential cheating, what AI plagiarism detection means to them, and how their careers may be impacted by AI use in workplace writing.
As we study what our students actually think about generative AI, James Blakely, a graduate teaching assistant, and Jeanne Law, a professor and Director of First-Year Writing, each bring perspectives to this vital conversation.
James’ POV
As a graduate student and TA, I hold a unique positionality regarding AI- and LLM-assisted writing. I strongly believe that the proliferation of AI technology has changed my own and other students’ learning processes for the better. As a researcher, I do not write or research this topic from a place of mysticism, fear about job prospects (what are those, anyway?), or concern for the security of the “craft.” And, most importantly, I do not fault myself for establishing, or students for wanting, a pedagogy that equips them with the skills necessary to face the paradigm shift ahead of us. The ubiquity of AI in first-year writing means that our students arrive expecting to adopt it. Instructors, then, must determine how to walk this line, ethically integrating AI into our courses in ways that increase student success.
Jeanne’s POV
I am interested in investigating how first-year students are using generative AI, as well as what writing program administrators and faculty can do to meet students where they are and help them get where they want to be. To do this, I am partnering with colleagues, including James, to analyze what our students actually think about AI.
Our collective thoughts
Our program-wide survey of students has received 135 responses so far. To our surprise, we have found divergent perspectives on AI and more nuanced notions of AI as “cheating.” Only 15% of students viewed use of AI as cheating, while 27% did not; 58% believed it depended on context (a finding institutions might consider as they develop policies).
Additionally, over half of respondents indicated they believe AI is the future of writing. Students reported in open-ended feedback that[i]:
- As more technologies adopt AI programming, learning to use them will be key for those who want to adapt to a changing culture.
- It is a useful tool that is villainized by people’s preconceptions of AI. However, I do think that it should be strictly regulated and closely watched.
- With the continuous advancement of these technologies, the academic community can expect even more sophisticated applications, ultimately transforming the landscape of scholarly writing.
Our data also points to a need for pedagogies and policies that are mindful of the role AI will play in our students’ professional and personal lives. We must engage in care-work, meeting students where they are, not where we want them to be.
Our data further indicates that students are open to discussing AI’s complexity and ethical use. Their attitudes align with the MLA-CCCC joint report, which calls for an “ethic of transparency” surrounding the use of AI technology and for discussion and collaboration among instructors, students, and administrators. Our survey responses also indicate that students already view AI as an academic reality. Students seem ready for thoughtful integration of this technology in the classroom and thoughtful policy to govern it, understanding its potential to enhance, not undermine, their educational realities.
In the absence of such mindfulness, however, restrictive policies on AI may do more harm than good. Without guidance, students may come to over-rely on these tools, generating text instead of writing it, without grounding. Our research indicates that students facing restrictive policies may view AI as a replacement for many aspects of the writing process, using these models for research and conceptual output. Not only does this approach undermine expectations of academic sourcing and integrity, but it also diminishes discursive ability. Stay tuned for more as we use our data to write and revise low- and high-stakes assignments that integrate AI in ways that appeal to and benefit students’ rhetorical growth.
Collectively, care-work, transparency, and course-level interventions challenge assumptions about AI integration. As our study expands, we will provide data that allows individuals and institutions to move from fear to development, grounding policy and instruction in ethically principled, rhetorically sound pedagogies that accept AI as a reality for our students. We see opportunities to hear our students and to prepare them for radically changing futures.
Sources:
Gates, Bill. “The Age of AI Has Begun.” GatesNotes, 23 March 2023, https://www.gatesnotes.com/The-Age-of-AI-Has-Begun. Accessed 15 July 2023.
MLA-CCCC Joint Task Force on Writing and AI. Overview of the Issues, Statement of Principles, and Recommendations. Modern Language Association of America and Conference on College Composition and Communication, July 2023, https://hcommons.org/app/uploads/sites/1003160/2023/07/MLA-CCCC-Joint-Task-Force-on-Writing-and-AI-Working-Paper-1.pdf. Accessed 23 July 2023.
[i] Survey data taken from the July 1, 2023 report.