This story was originally published by Chalkbeat. Sign up for their newsletters at ckbe.at/newsletters.
Much of the education narrative around ChatGPT has focused on the potential downsides of the AI tool (think Skynet from “The Terminator”). But as a high school teacher who has experimented with the sophisticated word predictor in my economics, U.S. History, and speech and debate classes, I’ve found that ChatGPT offers opportunities to engage students in class activities. Here’s how I’ve harnessed a tool that is here to stay (no matter how many school districts try to block it).
I welcomed our first “speaker,” ChatGPT.
Student learning in my classes is anchored in Harkness discussions, an educational approach in which students discuss class readings, course material, and other texts. Students learn how to prepare for a discussion, have difficult conversations, understand different perspectives, and participate meaningfully. While many students grasp the notion of voicing their ideas, teaching them to build upon another student’s point is more difficult.
In Economics, we spent one class analyzing the dot-com bubble at the turn of the millennium. To prepare, I asked ChatGPT what caused the bubble, and here was the response:
“The dot-com bubble was caused by a combination of factors, including:
- Irrational exuberance: Investors were overly optimistic about the potential of internet-based companies, leading to a huge influx of investment capital.
- Overvaluation of tech startups: Startups with no revenue or profits were valued at extremely high prices, leading to a bubble.
- Lack of regulation: The SEC failed to adjust its regulations to the new realities of the internet and the stock market.
- Hype around internet technology: There was a lot of hype around the potential of the internet, which fueled the bubble.
These factors combined to create the Dot-com Bubble, which burst in 2000-2001, causing many internet-based companies to fail, leading to a significant loss of investment capital.”
In class, I asked students to respond to or build upon the ChatGPT answer in two to four sentences. Next, they considered what facts or context would make the response more accurate or complete. Then they created three Harkness questions based on it: a clarifying question, a curiosity question, and a probing question.
Student questions included: How could the government help prevent a future dot-com bubble? What are the measurements that determine whether the stock price of a company matches its value? How would an investor utilize bubbles to increase the value of their investment before leaving when the going is good?
Finally, I input the student questions into ChatGPT, projected the app’s responses live on the board, and asked the students to build upon the app’s response in a live discussion.
After going back and forth with ChatGPT, the class reflected on how the AI tool produced its content. We discussed, for example, whether the word predictor technology keyed on certain terms to produce generic information about a financial crisis, much of which could apply to any financial crisis, including the 1929 stock market crash. Through this exercise, students learned to build on a prior discussion point, think on their feet, and better develop arguments.
I asked teens to fact-check ChatGPT’s answers.
Right now in my U.S. History class, we are studying Reconstruction. During a recent lesson, I asked ChatGPT, “What is Radical Reconstruction?” The app replied:
“‘Radical Reconstruction’ refers to the period of American history from 1867 to 1877 in which the federal government, under the leadership of the Republican Party, actively sought to rebuild the Southern states that had seceded during the Civil War and to provide civil rights and political representation for the newly freed slaves. This period was marked by the passage of several important pieces of legislation, including the 14th and 15th Amendments to the Constitution, which granted citizenship and voting rights to African Americans. The period was also marked by conflict and violence, as white Southerners resisted the changes and the federal government’s efforts to enforce them.”
I then asked the students, working in pairs, to critically analyze the ChatGPT response. What is accurate about this response? What is inaccurate? What facts or context could be added to the response to make it more complete?
Some students observed that though the start and end dates were accurate, the response was vague and would benefit from referencing notable people, quotes, amendments, and laws. Other students noted that because Radical Reconstruction was more of a movement than a single event, it is difficult to pin down exact dates; many online sources report that it ended in 1872 rather than 1877. Why the discrepancy? That prompted a discussion about historical accuracy, historiography, and how we choose to report and sequence events. Some students also noted that the 15th Amendment extended voting rights only to African American men.
This lesson reinforced that online content must always be viewed with a critical eye. One of the most important skills to teach students today is how to discern which online sources are credible and which are not.
I had students debate the app.
In Speech and Debate, one of the hardest skills to teach students is to think on their feet and deliver rebuttals. In Parliamentary debate, students are given a topic and a short time to conduct quick research before the debate begins. This format requires participants to construct arguments on the spot and respond immediately to their opponents.
To practice for the upcoming debate tournament, I fed sample debate resolutions to ChatGPT, projected the app’s responses, and had the students respond in real time.
To begin, I gave ChatGPT the resolution, “Should 16-year-olds have the right to vote?” Students were quick to identify weaknesses in the app’s argument, and they delivered their rebuttals. For example, one of my students argued that 16-year-olds should not be able to vote because their prefrontal cortex is not fully developed (a keen observation from a fellow 16-year-old). Other students argued that 18 is an arbitrary age that confers the maturity of an adult (if not yet the ability to purchase alcohol) — so why not 16?
Students reflected that using ChatGPT in debate prep was valuable because the app helped launch the debate, outline arguments, provide background on the issue, and surface useful points that they could later adapt and refine.
AI isn’t going anywhere, which means it’s up to us educators to find creative ways to engage with it in our classrooms. We are still smarter than the machines, for now!
Chalkbeat (chalkbeat.org) is a nonprofit news organization covering public education.