A robot writing "AI" on blackboard

Let’s Take a Look at the Risks and Benefits of ChatGPT for Children’s Learning

In recent years, there has been a significant increase in the use of technology in education, and artificial intelligence (AI) has become a hot and controversial topic. In particular, ChatGPT has attracted a great deal of attention since its rollout in late 2022. Many educators are concerned about generative AI tools such as ChatGPT because students can misuse them in their studies. What kind of impact does generative AI have on learning and education? Is it all bad? This article discusses the risks and benefits of children using ChatGPT for their studies, the concerns of educators, and how to navigate both good and bad uses of this AI tool.

What is ChatGPT and why are educators concerned?

OpenAI and ChatGPT logos

ChatGPT is a language model developed by OpenAI that can engage in conversation and answer users’ questions. The model has been trained on an extensive range of material from the internet, including books, articles, blogs, and other online resources. Thanks to this vast training data, ChatGPT can respond to user queries relatively accurately. When you talk to ChatGPT, it can feel like you are talking to a real person.

Accessing ChatGPT is simple: you can use it on OpenAI’s website, or reach the same underlying GPT technology through Bing.com in the Microsoft Edge browser. Additionally, the ChatGPT app is available as a free download on both Android and iPhone devices.

If ChatGPT is so convenient, why are many educators concerned about it when it comes to education or teaching? There are several reasons.

ChatGPT gives answers effortlessly

Students learn most effectively when working on tasks independently. They gain valuable learning experience when the task is challenging and if they dedicate time and effort to tackle it. Whether it involves writing an essay, completing a math project, or researching a historical figure for a history presentation, independent work helps cultivate students’ tenacity, problem-solving skills, critical thinking, and sense of accomplishment when they finish the task.

ChatGPT provides effortless answers, comparable to handing a child a calculator while they practice the 9×9 multiplication chart. However, ChatGPT’s capabilities extend far beyond simple calculations: it can write an entire essay for the child, create an outline for their math project, and develop a presentation about a historical figure, complete with citations.

This level of automation effectively removes a child’s opportunity to think, attempt, and execute tasks on their own. Consequently, they lose the chance to apply creativity, enjoy the process, and learn through trial-and-error.

ChatGPT can be used to do homework or assignments for the student

ChatGPT is capable of generating accurate, or at least accurate-sounding, answers to a wide variety of questions that students might ask. This capacity enables students to potentially use it as a tool for completing their homework or assignments.

For instance, if students are given three challenging math problems to solve as homework for grading, they may choose to have ChatGPT solve the questions and tell them the steps. All they need to do then is copy the steps and solutions, and submit them to their teacher. Alternatively, students might receive a worksheet on fraction multiplication to practice their skills, but instead, they could let ChatGPT finish the entire sheet without making any effort to improve their own calculation skills.

Compounding the problem, differentiating between AI-generated work and student-generated work can be difficult, if not impossible. Since ChatGPT can provide the steps along with the answers, teachers may find it hard to determine whether the work was produced by the AI. Even when there are signs pointing to ChatGPT, it is often impossible to conclude definitively that the AI generated the work.

ChatGPT can give inaccurate, hallucinatory answers

Since ChatGPT’s language model is based on data gathered from the internet, there is no guarantee that its responses are accurate or factual. In fact, when presented with a topic it lacks knowledge of, ChatGPT may generate a fabricated response as though it were legitimate. This well-known occurrence is commonly referred to as hallucination.

For instance, when I asked ChatGPT, “Why is it hard for dogs to run after licking people’s faces?” it provided me with a seemingly plausible answer:

Dogs may find it difficult to run after licking a person’s face for a few reasons:

  1. Disrupted senses: When dogs lick a person’s face, they can leave behind traces of saliva. Since a dog’s primary sense of smell is highly developed, the scent of their saliva on a person’s face can distract or confuse them. This can make it challenging for them to focus on other scents or tasks, such as running.
  2. Physical discomfort: Dogs may experience physical discomfort while running after licking a person’s face due to the structure of their tongues. A dog’s tongue is covered in papillae, which are small, backward-facing projections that help with licking and consuming food. When a dog licks a person’s face, these papillae can become entangled in facial hair or even irritated by the chemicals found in some personal care products. This discomfort can cause a dog to be less inclined to run or be less focused on the activity.
  3. Social interaction: Licking is a common social behavior for dogs, and when they lick a person’s face, it can be a sign of affection or bonding. After engaging in such an interaction, dogs may feel content and satisfied, leading them to be less motivated to engage in vigorous physical activities like running.

Of course, there is no factual basis for the claim that “dogs have difficulty running after licking a human’s face.” It is entirely fabricated; my dog is perfectly happy to run after licking my whole face. This example demonstrates that ChatGPT can provide false information, which means users need to fact-check its output. Children, who are often not used to fact-checking, may accept ChatGPT’s answers as factual and end up learning wrong information.

Teachers cannot tell whether student work was produced by ChatGPT or whether it was plagiarized

ChatGPT can provide plausible and mostly accurate answers, making it difficult for many educators to determine whether a student’s response was generated by the AI or the student themselves. Although there are several tools that can help differentiate AI-generated content from human-generated content, they are not perfect.

One area in which this issue becomes particularly problematic is plagiarism. Plagiarism occurs when a student presents someone else’s ideas as their own without proper attribution. In the past, teachers typically relied on plagiarism checkers to identify instances of copying. These checkers compare a student’s work with various online resources, detecting and highlighting content that is identical or substantially similar. Educators can then use these highlighted similarities as evidence to prove that a student has plagiarized their work.

However, current plagiarism checkers are not capable of determining whether AI has been used to paraphrase existing content. For instance, I used GPT-4 (the engine behind ChatGPT Plus) to paraphrase a paragraph from the Humane Society’s web page about puppy mills:

In puppy mills, mother dogs spend their entire lives in cramped cages with little to no personal attention. When the mother and father dogs can no longer breed, they are abandoned or killed. Due to poor sanitation, overbreeding and a lack of preventive veterinary care, the puppies frequently suffer from a variety of health issues, creating heartbreaking challenges for families who should be enjoying the delights of adopting a new family member.

The paraphrased content was this:

In puppy mills, female dogs are confined to small cages for their entire lives and receive minimal human interaction. When these dogs and their male counterparts are no longer able to reproduce, they are either discarded or euthanized. The unsanitary conditions, excessive breeding, and insufficient medical care often result in the puppies experiencing numerous health problems, causing distress for families who should be relishing the joy of welcoming a new addition to their household.

(Note: the original GPT-4 output used the phrase “dog breeding facilities” in the first sentence, but I substituted “puppy mills” to make it resemble the original paragraph more closely.)

Putting the paraphrased content into an online plagiarism checker gave me a score of 100% unique, 0% plagiarized. If plagiarism means using someone else’s work without citation and without contributing your own thoughts, then paraphrasing with AI ought to be considered a form of plagiarism. Currently, however, teachers have no reliable way to detect it.
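To see why checkers miss this, consider how similarity scoring works at its core: the checker looks for stretches of text that match existing sources verbatim or nearly verbatim. The sketch below is not any real checker’s algorithm, just a toy comparison of shared three-word sequences between the first sentences of the original and paraphrased passages above; it shows how thoroughly a sentence-by-sentence AI rewrite erases that kind of overlap.

```python
# Toy illustration only (not a real plagiarism checker): count how many
# three-word sequences (trigrams) two texts share. Real checkers are far more
# sophisticated, but they rely on the same idea of verbatim overlap.

def trigrams(text):
    words = text.lower().split()
    return {tuple(words[i:i + 3]) for i in range(len(words) - 2)}

original = ("In puppy mills, mother dogs spend their entire lives in cramped "
            "cages with little to no personal attention.")
paraphrased = ("In puppy mills, female dogs are confined to small cages for "
               "their entire lives and receive minimal human interaction.")

shared = trigrams(original) & trigrams(paraphrased)
overlap = len(shared) / max(len(trigrams(original)), 1)
print(f"shared trigrams: {len(shared)}  overlap: {overlap:.0%}")
# A verbatim copy would score near 100% overlap; the AI paraphrase keeps only
# a small fraction of it, so the checker reports the passage as original.
```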

Children may become dependent on AI

In many East Asian schools, such as those in Korea, Japan, and China, students are generally not allowed to use calculators, whether in school or at home. This applies across all grade levels, from elementary school to high school, and regardless of whether students are self-studying or taking a calculus test—even a simple calculator is forbidden. In Canadian schools, on the other hand, students are allowed to use both simple and scientific calculators in almost any situation: for most math work (unless they are specifically practicing paper calculations) and during tests.

This disparity in calculator usage has resulted in a notable difference in children’s ability to perform mental math and paper calculations. Many elementary students in East Asian schools can handle multiplication, division, and fractions by the time they reach Grade 6, while many Canadian elementary students still struggle with these skills. In the 2018 PISA (Programme for International Student Assessment) mathematics rankings for 15-year-olds, the top seven spots were all held by East Asian nations and regions.

As obvious as it may seem, consistent practice helps students improve. It enables them to understand the concepts and achieve automaticity, which frees up working memory during challenging tasks so they can focus on problem-solving rather than on simple calculations. If a ubiquitous “calculator” such as ChatGPT were available for students to generate any answer they need, would they still choose to work independently? Would they still see the value in practice or independent work?

Concerns surrounding privacy and children’s personal information

One area of concern involves the handling of children’s data when they type information into ChatGPT. Children may unintentionally enter sensitive information, such as personal details, medical history, or school information, and ChatGPT might incorporate this data into its training. It is unclear how such information may be used or who it could ultimately be shared with. It is even possible that a child’s information could reappear in a future user’s chat.

Significant uncertainty surrounds data privacy and AI. Due to privacy concerns, Italy temporarily banned ChatGPT from operating within its borders.

How can educators prevent the misuse of ChatGPT by students?

Most educators want their students to succeed and reach their full potential. They don’t want them to rely on quick answers from ChatGPT or get into trouble by plagiarizing, whether intentionally or unintentionally. So, how can educators reduce the chances of students misusing ChatGPT? While there’s no foolproof solution due to limitations in detecting AI-generated work, I’ve listed some ideas that might be helpful.

  • Establish clear guidelines with the students: Educators should create specific rules and expectations regarding the use of AI tools like ChatGPT at the beginning of each term or before giving out an assignment. Put the rules in writing, and make sure students understand both the rules surrounding AI usage and the penalties for misuse.
  • Teach the dos and don’ts of ChatGPT: Create a brief lesson discussing the effective use of ChatGPT for learning and the types of usage to avoid. Introduce the concept of a growth mindset – the belief that intelligence can be developed through hard work and persistence – and explain that students can only build it if they don’t rely solely on tools like ChatGPT. Encourage students to participate in class discussions about the impact of using AI tools, and help them understand the significance of using technologies like ChatGPT for the right purposes.
  • Teach fact-checking and how to avoid plagiarism when using ChatGPT: Create a brief lesson on how to fact-check when students do use ChatGPT. Provide examples of “hallucinations” generated by ChatGPT and demonstrate methods students can use to verify facts on their own; for instance, they can look up a claim on reputable websites or ask ChatGPT to provide its sources. Additionally, teach students how to avoid plagiarism by properly citing authoritative sources in their assignments.
  • Limit access: Restrict access to AI tools like ChatGPT during specific periods in school, such as designated research time or self-study time. Work with your school’s technical staff to block access to the ChatGPT website or Bing.com and to prevent students from installing the ChatGPT app on their iPads.
  • Use AI-detector tools: GPTZero is a tool designed to detect content created with GPT technology, including ChatGPT. When you input a piece of content into its engine, it classifies the content as either entirely written by AI, moderately likely to be written by AI, moderately likely to be written by a human, or entirely written by a human. Keep in mind that the tool is not foolproof or perfect. If you input a student’s work into the tool and receive a result indicating that the work was entirely written by AI, it does not guarantee that it was actually written by an AI. However, it suggests that the student may have used AI, which allows you to take the next step, whether it is a conversation with the student or a disciplinary action.

GPTZero.me screenshot

(Screenshot of GPTZero, a tool that can determine if student work was created by AI)

Does ChatGPT provide any benefits for learning and education?

Technology has always had a tumultuous relationship with education. Every time a new technology emerges that makes learning “easier” for students, there is concern that it will lower the quality of learning. Calculators are one example: they save students time when solving math problems, but they take away the opportunity to practice calculation skills.

Despite this, calculators offer significant educational benefits. Students can easily visualize complex graphs with a graphing calculator. Teachers can use online tools like Desmos to demonstrate math concepts using graphs. High school students can rely on calculators for tedious calculations, enabling them to focus on practicing more relevant skills that they learned in school.

Similarly, I believe that ChatGPT, when used correctly, can provide substantial benefits to students and educators. Although there is potential for misuse, ChatGPT can open doors to a new way of learning for students. It can enable educators to teach in ways that were previously impossible. Let’s examine some ways that ChatGPT can be advantageous for learning and teaching.

ChatGPT can be an at-home tutor for students who cannot afford a tutor

I believe ChatGPT has the potential to revolutionize education despite its vulnerability to misuse. ChatGPT can offer at-home assistance for students whose families cannot afford private tutors or tutoring classes. Since ChatGPT is free and the more advanced ChatGPT Plus costs relatively little, nearly every student can access additional educational support when needed.

For example, suppose a student doesn’t understand how evolution works. She can ask ChatGPT for an overview of evolution, natural selection, and so on, and when she needs more clarification, she can ask follow-up questions. That kind of interactive back-and-forth is not an experience that searching and browsing web pages can give her.

This kind of at-home educational assistance may help reduce the achievement gap between children from wealthy families and those from financially struggling families. Traditionally, well-off parents hire tutors to help their children succeed in school. While not many parents can afford $200 or more per month for a private tutor, most can manage the $20 per month cost of ChatGPT Plus.

The effectiveness of such AI-powered tutors for children’s learning is yet to be determined. However, it gives us educators a reason to be hopeful.

ChatGPT can help students get over writer’s block

When a child struggles to begin an assignment or an essay, they often experience writer’s block – the initial feeling of not knowing where to start or being afraid to begin. ChatGPT can be a good tool for brainstorming, helping the child overcome this initial “freezing” stage.

If a child is unsure how to start a project at home, they can ask ChatGPT for initial ideas or a starting point, and ChatGPT may provide a list of ideas to explore. ChatGPT can also be used in a classroom setting to help students brainstorm. For example, if students are tasked with creating a presentation about diseases caused by viruses, ChatGPT can generate a list of possible diseases for students to choose from.

Similarly, if a student experiences writer’s block, they can ask ChatGPT to suggest some potential outlines for an essay on their chosen topic. ChatGPT might even generate a topic sentence for a paragraph if the student shares their writing interests. The student should not copy the sentence directly, but they can use it as a reference to begin their own writing.

ChatGPT can help students with editing or proofreading tasks

ChatGPT can be used to check for errors in writing. A student can paste their essay into ChatGPT and ask it to find any grammar mistakes, for instance with the prompt “Please correct any grammar mistakes in my paragraph.” Some may say that this is cheating, and that students should proofread their own work without help. That is a reasonable perspective, and teachers should weigh the advantages and disadvantages of permitting students to use ChatGPT for proofreading before deciding whether to allow it.
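Some students (or teachers preparing feedback) may prefer to script this step rather than paste text into the chat window. Below is a minimal sketch of that idea using the OpenAI Python package (the pre-1.0 `openai` interface) and its chat API; the model name, the sample paragraph, and the prompt wording are placeholders for illustration, and an API key is assumed to be set in the environment.

```python
# Minimal sketch: sending the proofreading prompt to the ChatGPT API.
# Assumes the pre-1.0 "openai" Python package and an OPENAI_API_KEY
# environment variable; model choice and sample text are illustrative.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

paragraph = "Me and my freind goes to the libary every weekends."

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{
        "role": "user",
        "content": "Please correct any grammar mistakes in my paragraph:\n\n" + paragraph,
    }],
)

# The corrected paragraph comes back as the assistant's reply.
print(response.choices[0].message.content)
```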

Educators can use ChatGPT to create lesson materials

ChatGPT provides a remarkable opportunity for educators to create content independently. By using a bit of creativity, teachers can employ ChatGPT to develop various materials for their students. For instance, they can create homework questions, test questions, and lesson outlines using ChatGPT. Teachers can also brainstorm ideas for their students’ next project or ask ChatGPT for suggestions for science demonstrations.

Another area where ChatGPT proves beneficial is coding. If a teacher wants to demonstrate a concept, they can ask ChatGPT to build a simple app to illustrate it. As an example, in this article I explained how I used GPT-4 (the engine behind ChatGPT Plus) to create a simple browser game that demonstrates Newton’s laws, without entering any code myself. ChatGPT has the potential to let educators personalize their lessons to suit their students’ needs.

Screen recording of game made by GPT-4

(A simple game created using GPT-4)
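The author’s game itself is not reproduced here; it was a browser game written entirely by GPT-4. As a rough, hypothetical illustration of the kind of demo a teacher might request, the short Python sketch below moves an object forward under a constant force using Newton’s second law (F = ma) and simple Euler time steps; every number in it is made up.

```python
# Rough illustration (not the author's GPT-4 game): a text-only demo of
# Newton's second law, F = m * a, advanced with simple Euler time steps.
mass = 2.0       # kg (illustrative value)
force = 4.0      # N, constant push to the right
dt = 0.5         # time step in seconds
velocity = 0.0   # m/s
position = 0.0   # m

for step in range(1, 7):
    acceleration = force / mass      # a = F / m
    velocity += acceleration * dt    # velocity grows linearly under constant force
    position += velocity * dt        # position grows roughly quadratically
    print(f"t={step * dt:.1f}s  v={velocity:.1f} m/s  x={position:.2f} m")
```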

Bottom Line

ChatGPT can be a helpful or harmful tool for children’s education. It can improve learning, but if misused, it can also lead to problems like difficult-to-detect plagiarism and children who no longer think and work independently. However it is used, it is essential to protect children’s opportunities for learning and growth. Even when ChatGPT is part of the learning process, children must still be given the opportunity and responsibility to think and act on their own. When children take ownership of their learning, ChatGPT can be a helpful aid in their education.