ChatGPT in Teaching and Learning
Many artificial intelligence (AI) tools can generate “human-like” responses to a wide range of questions and prompts. Among the most popular generative AI (or GenAI) tools is ChatGPT, a text-based tool that can produce essays, reports, lesson plans, and more. Boston College students will likely use content from these tools in a variety of ways, including, in some cases, as a substitute for their own thinking and writing. Like other technologies that have created new opportunities for academic dishonesty (e.g., Wikipedia, calculators), ChatGPT invites instructional responses that promote academic integrity and authentic student learning without sacrificing trust in instructor-student or student-student relationships.
ChatGPT’s parent company, OpenAI, has provided some responses to the most commonly asked questions about the tool on its website.
Limitations of ChatGPT
While ChatGPT can produce text that passes for human-created work, it has significant limitations that vary by version. As of this writing, OpenAI provides a brief comparison of versions 3.5 (free) and 4.0 (paid). For those using the free version, key limitations include:
- ChatGPT does not use any information after January 2022. Current events and recent trends are not part of its training data.
- ChatGPT does not access the internet. It relies entirely on the information already present in its training data.
- ChatGPT is not “unbiased.” It is limited by the culturally specific and language-specific information that was used to train it.
- ChatGPT will often make up facts, produce misleading information, and include content unrelated to a user’s question. It generates statistically plausible text rather than verified information, and it has no native way to differentiate fact from fiction.
Faculty Concerns
Faculty have raised several questions and concerns since ChatGPT was released in November 2022:
- How will we know if students are submitting original work?
- Should we ban the use of ChatGPT and other AI tools?
- If we assume students will use ChatGPT, how should the content it produces be cited?
- Are there ways that ChatGPT can be used to help students learn?
- Can we use ChatGPT for our own course design?
Tools for Detecting AI
Several tools claim to detect AI-generated content. For-profit services like Originality.AI claim 94% accuracy in identifying text produced by ChatGPT, and a number of free detection tools exist as well.
While these tools may be helpful in some cases, they also come with significant downsides:
- Research shows that AI detectors are unreliable and biased against certain groups of writers, particularly non-native English speakers.
- It is unsustainable for faculty to run every student submission through these services: the process is enormously time-consuming, and the results are likely to be inaccurate.
- Using these tools creates an adversarial relationship with students, one in which the lesson becomes avoiding detection rather than choosing not to use the service in the first place.
- As of January 2023, it is unclear whether submitting student work to these detectors violates FERPA.
Below are alternative instructional responses to ChatGPT, including short- and long-term interventions.
Instructional Responses to ChatGPT
Instructional responses to the use of ChatGPT and similar tools vary depending on how much time and energy faculty have to make course-level or assignment-level changes. The short-term solutions provided below are meant to serve as an immediate response that should be revisited when time permits. The long-term approaches, on the other hand, are meant to prompt students to think more critically about the technology and their own intellectual formation.
While the advent of ChatGPT poses new questions to address, strategies that limit academic dishonesty in general remain as relevant and effective as they were before AI.
Course Design: Short-Term Interventions
Update Your Syllabus
The following sample statements are starting points for crafting your own policy. As of January 23, 2023, the Provost’s Office at BC has not issued a policy regarding the use of AI in coursework. When adding an AI-specific policy to a syllabus, consider how to tailor it to the norms of your department and course, along with BC’s institutional policies and protocols.
Note that while you may tell students that using GenAI to complete assignments will be considered plagiarism, such use is very difficult to prove, since GenAI produces a unique response to every prompt.
Syllabus Statement 1 (Discourage Use of AI)
Artificial Intelligence (AI) Tool Usage: AI tools can generate text, images, and other media very quickly. Since a central goal of this course is to help you become independent and critical thinkers, you are discouraged from using AI tools to create text, video, audio, or images that end up in your work (assignments, activities, responses, etc).
If any part of this is confusing or uncertain, please reach out to me for a conversation before submitting your work.
Note: This statement assumes the syllabus has an academic integrity policy and/or statement about how plagiarized work will be treated.
Syllabus Statement 2 (Treat AI-generated text as a source)
Artificial Intelligence (AI) Tool Usage: AI tools can generate text, images, and other media very quickly. Since a central goal of this course is to help you become independent and critical thinkers, you are discouraged from using AI tools to create text, video, audio, or images that end up in your work (assignments, activities, responses, etc).
If any AI-generated content is used in your assignments, you must clearly indicate which work is yours and which was generated by the AI. In such cases, no more than 10% of the submitted work should be generated by AI. AI-generated content that is not cited, or that makes up more than 10% of your assignment, will receive ____.
If any part of this is confusing or uncertain, please reach out to me for a conversation before submitting your work.
Note: This statement assumes that students are told which citation styles to use for secondary sources. The instructor would indicate the penalty for not following the policy.
As of January 15, 2024, a regularly updated list of existing policies for the use of AI tools at various institutions is available online. BC faculty choosing to adapt these policies for their own syllabi may need to edit them in light of BC institutional policies and protocols, as well as departmental and course context.
Discuss ChatGPT With Students
Set aside some class time to discuss ChatGPT. The conversation can be a way to address ethics in education, your expectations, and student perspectives. Discussion prompts and questions can include:
- What are some reasons students would want to use ChatGPT?
- In what ways does ChatGPT affect the course goals? In what ways can AI hinder or help students in meeting these goals?
- What are the differences between plagiarism, remixing, influencing, and originality?
- How does the use of Wikipedia compare to the use of ChatGPT?
- What is Academic Integrity and why is it so valued at Boston College?
CTE’s resource on how to make Academic Integrity transparent can be a useful guide for these conversations. Additionally, the BC Library has a helpful guide for orienting students to Generative AI.
Collect a Writing Sample
Toward the beginning of the semester, ask students to write, in class, a short version of what you might ask them to write later in the semester. This “diagnostic” can serve as a point of comparison if a student later submits work that seems significantly different.
Additional Short-Term Course Design Interventions
Additional short-term approaches to course design can be found on Ryan Watkins’ educational technology blog.
Course Design: Long-Term Approaches
Cultivating a learner-centered course climate is fundamental to designing a course that de-incentivizes the use of tools like ChatGPT to cheat (see the CTE’s resource on Underlying Reasons for Academic Dishonesty). In general, the more an assessment makes thinking visible, the less likely it is that a tool like ChatGPT can replace a student’s own problem-solving.
Break Up Major Assignments
Divide major assignments into smaller graded components that build on each other. This “scaffolding” approach requires students to incorporate feedback on earlier assignments to improve their later assignments.
The CTE’s guidance on assignment design explains how making such changes can positively harness student motivation and deter academic dishonesty.
Distribute the Grading
When a semester’s grade rests on only one or two major assignments, students are often pressured to focus on the product rather than the process of learning. By creating more lower-stakes graded assignments, instructors encourage students to demonstrate learning as a continuous activity. Faculty can then follow their students’ progress and more easily see how their thinking on a subject evolves.
Assess Student Workload
As the resource Underlying Reasons for Academic Dishonesty indicates, one common reason students might use a tool like ChatGPT to cheat is that the overall workload expected of them seems excessive. What counts as excessive is, of course, relative. One good rule of thumb is to assume that a student will take at least three times longer to complete a task than a faculty member would. Rice University has also produced a workload estimator that can help gauge the time certain kinds of tasks might take students.
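As an illustration of that rule of thumb, here is a minimal sketch (in Python, with hypothetical task names and hours) that converts an instructor’s own time estimates into a rough weekly student workload. It is not the Rice estimator, only a quick way to apply the 3x multiplier described above.

```python
# Rough weekly workload estimate using the "3x" rule of thumb:
# assume a student needs about three times as long as the instructor.
# Task names and hours below are illustrative, not recommendations.

STUDENT_MULTIPLIER = 3

faculty_hours_per_week = {
    "read two journal articles": 1.5,
    "problem set": 1.0,
    "discussion-board post": 0.25,
}

# Scale each instructor estimate up to an estimated student time.
student_hours = {task: hours * STUDENT_MULTIPLIER
                 for task, hours in faculty_hours_per_week.items()}

for task, hours in student_hours.items():
    print(f"{task}: ~{hours:.1f} student hours")

total = sum(student_hours.values())
print(f"Estimated out-of-class workload: ~{total:.1f} hours per week")
```

Comparing the resulting total against the out-of-class hours expected for the course’s credit load can flag weeks that are likely to feel excessive to students.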
Assignment Design: Short-Term Interventions
Try running your assignment prompts through ChatGPT itself. If the responses are on par with what you would expect from students, make small changes that make it harder for the tool to be used inappropriately, and plan longer-term adjustments to how the technology should be engaged, if at all.
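For faculty comfortable with light scripting, the same check can be run in batch. The sketch below is an assumption on our part (the guidance above refers to the ChatGPT web interface): it uses OpenAI’s Python library to send a handful of assignment prompts to a model and print the responses for review. The prompts and model name are placeholders.

```python
# A minimal sketch for batch-testing assignment prompts against an OpenAI model.
# Assumes the openai Python package (v1+) is installed and OPENAI_API_KEY is set.
# Prompts and the model name below are placeholders, not recommendations.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

assignment_prompts = [
    "Summarize the main argument of the assigned reading in 300 words.",
    "Compare two primary sources from this week's module.",
]

for prompt in assignment_prompts:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder; use whatever model you have access to
        messages=[{"role": "user", "content": prompt}],
    )
    print("PROMPT:", prompt)
    print("RESPONSE:", response.choices[0].message.content)
    print("-" * 60)
```

If the generated answers would earn a passing grade, that is a signal the assignment may benefit from the short- and long-term adjustments described in the rest of this section.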
Require Recent References
ChatGPT can only draw on information from before its training cutoff (January 2022 for the free version, as noted above). By asking students to engage with current events, recent newspaper sources, or very recent academic articles, faculty can more reliably assume that the analysis will be the student’s own.
Add Reflection to Assignment
Ask students to submit a short reflective essay in addition to the paper or exam itself. The objective is to let students show their thinking and explain the content and stylistic choices they made. Sometimes called “exam wrappers,” these reflections personalize each submission and give students a chance to explain in detail how their thinking led to the final product.
Show Students How to Cite
If you allow students to use ChatGPT to some degree, or assume they will do so whether or not it is allowed, demonstrate how to attribute content produced by the AI. Refer to the BC Library’s LibGuide for guidance.
Assignment Design: Long-Term Approaches
Teach with ChatGPT
Ask students to analyze the AI’s output for a question that could easily be asked as part of an assignment. In the analysis, see if students can differentiate the output from something that a human would produce. For example:
- Does the output have a “style” of writing that makes it distinct?
- Does the AI rely on clichés or casual speech in a manner that is inappropriate for the topic?
- How would a student rewrite the AI output to be more accurate or more distinct?
For further reading, Glenn Kleiman offers a thoughtful framework on ways to teach students to write with AI.
If choosing to teach with ChatGPT, privacy concerns should be discussed openly. See Georgetown’s resource on the topic under the section, Privacy and Data Collection.
Review ChatGPT’s privacy policy before asking students to sign up for the service. Students should be made aware that signing up authorizes OpenAI to share “Personal Information” with third parties without notice and that they must provide a cell phone number to the service. In addition, OpenAI discloses that:
- It can access any information fed into or created by its technology.
- It uses log-in data, tracking, and other analytics.
- The technology does not respond to “Do Not Track” signals.
Plan for Social Annotation of Text
Tools like Perusall can be enormously useful, not only in ensuring students read a given text, but also in prompting critical engagement with the readings.
If you are interested in using Perusall, you can review the CTE’s resources on Perusall and, if needed, set up a consultation with us by emailing centerforteaching@bc.edu.
Create Assignments that Require Multiple Modes
Since ChatGPT is text-based, its output is limited to written prose. In place of written essays, faculty can ask students to create multimodal submissions: podcasts, posters, annotated mind maps, short videos, etc.
A comprehensive, if somewhat overwhelming, resource on alternative assessments can be a useful guide when rethinking assignment types. The “Guide to Alternative Assessments 2.0” PDF is particularly relevant when thinking about multimodality.
Future Impact of AI on Teaching and Learning
The strategies suggested above respond to what we know about AI tools today; it is clear, however, that these technologies will continue to improve, and our strategies will have to evolve with them.
In the long run, instructional responses that engage the technology and its limits, rather than simply seeking to ban it, promise to be more effective ways to meet learning goals across disciplines. Such strategies may also help faculty find new ways to respond to other persistent challenges in higher education, such as:
- How can AI tools be used ethically and strategically for our curriculum?
- How can they be used to teach students about information literacy, data privacy, and intellectual property?
- Can AI be used to promote more equitable learning experiences for students, especially those who have faced structural barriers to resources? See Equity and Academic Integrity for a reference on how accusations of cheating are disproportionately aimed at underrepresented learners.
- Should we find more ways to assess the process of learning and not just the product?
The CTE will continue to update this resource with suggestions, strategies, and perspectives that can inform faculty decisions on these questions.