Chances are that you have encountered conversations about ChatGPT – software powered by artificial intelligence (AI) that can generate nuanced text mimicking natural human writing. Educator responses to ChatGPT seem to range from excitement to exasperation to exhaustion and even terror. As one might expect, there is a similarly wide range of integration at Brown. While some instructors may wish to prohibit use of the tool, Professor Steven Lubar (Center for Digital Scholarship, American Studies, History, and History of Art and Architecture) intentionally uses AI, noting,
“In my Methods in Public Humanities (PHUM2020) course, I am, with some trepidation, encouraging students to use ChatGPT and similar tools. These new tools will be useful to them in their work after Brown, and we should help them learn to use them wisely.”
The good news embedded in all this uncertainty is that good teaching practices (such as those that support critical thinking, knowledge acquisition, and skill proficiency) already position us to productively engage with ChatGPT. Additionally, Brown’s student population is well-positioned for this conversation: students have chosen an Open Curriculum in which they are intentional participants in their intellectual and personal growth.
However, because of quickly changing norms and the range of teaching practices around ChatGPT, there are real and practical considerations to take into account as we move forward. This resource addresses three areas we encourage you to consider as you think through your approach with students this term.
Designing writing assignments
Students’ submission of AI-generated texts is among educators’ top concerns. Like the calculator and Wikipedia before it, AI-assisted writing seems likely to shape teaching practices for years to come. Keep in mind, however, that we ask students to write in our classes because we believe there is something of value in that exercise that cannot be found in multiple-choice or other types of assessments. Writing assignment prompts and grading the results is hard work, so it is worth spending time reflecting on why you assign writing.
Thinking about that why of writing will help you develop and potentially scaffold better writing assignments, create effective rubrics, and set generative expectations for students, even in large classes. You might also reflect on authentic assignment options best suited to the learning outcomes.
We encourage you to communicate that pedagogical rationale to students, including the importance of writing in your own education, profession, and life. You might do this in a class conversation, a syllabus statement, or an extra reading; it does not need to be long, but it does need to help students understand that writing in your class actively contributes to their learning and development in our community. Without such explicit framing, students can easily come to see writing as just a task, when it is actually a way of learning, of thinking, and of doing.
ChatGPT offers specific benefits and challenges for English language learners. Dr. Frederick Poole (Michigan State University) writes about how language teachers might leverage ChatGPT to generate language-learning materials, and the tool also has potential for grammar practice. However, issues of text generation and authorship still apply, and they are perhaps made even more complicated by challenges of voice in translation.
AI-generated text has not historically been part of how educators think about plagiarism. Moreover, teaching practices around ChatGPT now vary widely, with some instructors encouraging student experimentation and others prohibiting its use. Because of these changing norms and the range of instructional practice, it is helpful for instructors to be explicit with students about their own expectations.
For both undergraduates and graduate students, Brown’s academic code states,
A student’s name on any exercise (e.g., a theme, report, notebook, performance, computer program, course paper, quiz, or examination) is regarded as assurance that the exercise is the result of the student’s own thoughts and study, stated in his or her own words, and produced without assistance, except as quotation marks, references, and footnotes acknowledge the use of printed sources or other outside help.
Within this framing, consider how you might update your classroom plagiarism guidelines or develop a new AI writing policy for students that details how they should, might, or cannot engage with this or similar software platforms.
For example, Professor Monica Linden (Neuroscience and Provost’s Faculty Teaching Fellow) created a new syllabus statement about the ethical and effective use of ChatGPT in her course. (See additional commentary about Prof. Linden's statement in this blog post.) Another example, from Professor Steven Lubar (Center for Digital Scholarship, American Studies, History, and the History of Art and Architecture), is provided below.
AI Writing tools: I encourage you to explore the possibilities of tools like ChatGPT for writing papers, labels, summaries, and the like. If you use one, think carefully about prompts and the formats you’re requesting. Be sure to give the tool proper credit, and include a description of the way you used it and what was useful or not. Note: ChatGPT makes things up. Be sure to fact-check.
-Syllabus statement from Prof. Steven Lubar for Methods in Public Humanities (PHUM2020)
Even though it is past the first day of class, you can still publish an official update of your syllabus with revised language to reflect these changes, and ongoing classroom discussions around specific assignments are also useful. A Sheridan Center newsletter on Inclusive Practices for Addressing Academic Integrity offers additional ideas.
Ethics of use and cost
Any classroom conversations about ChatGPT should include issues of authorship and ethics. Data privacy and content ownership are important issues to raise with students. Dr. Autumm Caines’s (University of Michigan-Dearborn) blog post on the larger implications of using tools like ChatGPT addresses privacy concerns with data acquisition, labor issues, student privacy, and performance stability, walking through recommendations for how to talk about ChatGPT.
Additionally, while ChatGPT has been free in preview mode, subscription costs are likely in the future, with some users already reporting access to a “pro tier.” A tiered pricing scheme, which could include faster responses, more stable access, and longer content length, would starkly divide those with and without resources. Even if free options continue, the option of enhanced service for a fee threatens to exacerbate existing inequities that Brown has committed to addressing in our community.
Consider how you might address issues of use, cost, authorship, and ethics with your students. For example, Professor Shriram Krishnamurthi (Computer Science) encourages use of GPT-3 in CSCI 1710 (Logic for Systems) but requires students to work through a series of guided questions first, some of which address ethical and privacy-related issues. (Professor Krishnamurthi notes that others are welcome to use and adapt this prose, with attribution.) Such prompted reflection can productively address these issues and potential implications for students as individuals and as members of a learning community dedicated to advancing diversity, equity, and inclusion.
The Sheridan Center is planning a spring forum for instructors to discuss views and practices on AI in the classroom. Please contact [email protected] if you are interested in participating or would like a confidential individual consultation on implications for your own teaching.
This resource was authored by Dr. Jenna Morton-Aiken, Senior Associate Director for Writing and English Language Support; and Lecturer, Department of English.
Thank you to Mary Wright, Kristi Kaeppel, and Kris Nolte for feedback on drafts and initial discussions of the topic.