
Information Your Child Shouldn’t Share While Chatting with ChatGPT


ChatGPT, like many other AI chatbots, can greatly enrich a child’s learning and entertainment. However, to safeguard a child’s privacy and security, certain types of information should never be shared with AI chatbots like ChatGPT and Google’s Bard. This article provides guidance on the key pieces of information that should remain private to keep your child safe online.

Why Sharing Certain Information with AI Chatbots Is Risky

ChatGPT is an advanced artificial intelligence model developed by OpenAI. It is designed to understand and respond to users’ input in natural language, mimicking human conversation patterns. Although it is only an AI model, talking with ChatGPT can feel very realistic, which is why many people, children and adults alike, use it to look up information, brainstorm, and handle certain writing tasks.

Sharing certain information with ChatGPT, or with any AI chatbot built on a large language model, can pose real security risks. This is especially true for sensitive personal information such as your full name, address, contact details, financial information, and any other data that could enable identity theft or fraud.

Even though a conversation with ChatGPT feels private, it is not. According to OpenAI’s privacy policy, data that users share with its applications is collected and stored, and may be used to “improve our Services and conduct research”. In other words, the information you provide to ChatGPT can be used to train future versions of the model, and the chatbot could later inadvertently reveal that sensitive information to other users.

When children use ChatGPT, monitoring what they share becomes even more important. Most children don’t fully understand the privacy risks of sharing personal information online. They might inadvertently reveal personal details and other sensitive information, exposing themselves to cyber threats and compromising their privacy.

(Screenshot of OpenAI’s privacy policy page. Credit: OpenAI)

The Kinds of Information Your Child Should Not Share With AI Chatbots

Personally Identifiable Information

While interacting with ChatGPT, children might unknowingly provide personal information such as their name, school, or home address during a conversation. That data risks being used to train the AI and subsequently being revealed to other ChatGPT users.

The following information should not be shared with ChatGPT:

  • Real name
  • Birthdate
  • Physical address
  • Email address
  • School information
  • Phone number
  • Social Security Number (SSN) or Social Insurance Number (SIN)
  • A parent’s name, phone number, or email address

Credential Information

While interacting with ChatGPT, children might innocently disclose sensitive credentials such as passwords. This could happen in various contexts, such as describing a video game scenario, divulging secrets, or asking ChatGPT to help them log into a school site.

When talking with AI chatbots like ChatGPT, children should always avoid sharing:

  • Passwords to websites or apps
  • Passwords to their social media accounts
  • Answers to security questions
  • PIN codes and security codes

Payment Information

While most children may not have access to payment information such as credit and debit card numbers, it is still important to teach them never to share such information. Payment information includes:

  • Credit or debit card numbers
  • Credit card expiry date and CVV
  • Bank account numbers
  • Bank information such as bank name and home branch
  • Any other financial information

Personal Health Information

Some children may use an AI as a sort of personal therapist. Programs like ChatGPT can offer support in the form of understanding and empathy, and that may even benefit children in some situations. However, it is generally not advisable for children to share personal health information or to seek medical or mental health help from ChatGPT, for several reasons.

  1. Lack of Professional Training: AIs are not trained psychiatrists or therapists with the expertise to handle serious health or mental health problems. Unlike health professionals, they cannot provide diagnoses or treatment plans.
  2. Privacy and Confidentiality: Information shared with ChatGPT can be stored and potentially used to train the model, which may in turn reveal it to future users.
  3. Misinterpretation: ChatGPT is still learning and improving. There is a risk of the AI misunderstanding what a child tells it, leading to incorrect advice or support.
  4. Potential Inadequacy: ChatGPT might not detect urgency or risk in certain situations. For example, in situations involving attempted suicide, ChatGPT may not be equipped to create an immediate response plan to ensure safety or escalate the situation to a human helper.
  5. Misinterpretation by Children: There is a good reason we are told never to self-diagnose: we tend to fixate on the worst-case scenario. Children facing a mental health crisis may not be able to properly use the information ChatGPT provides.

As a parent, you should discourage your child from sharing the following information with a chatbot and encourage them to instead talk to you, a school counselor, or another trusted adult.

  • Personal health problems
  • Mental health problems
  • Personal health data and health history
  • The content of private conversations between the child and their medical providers

Adding ChatGPT to Digital Literacy


Digital literacy encompasses the skills and knowledge that allow children to participate critically, responsibly, and confidently in digital media and communication. It covers areas such as operating digital devices, interacting positively and safely on social media platforms, finding and evaluating online information, and following the ethical and legal rules for accessing and using digital resources. Digitally literate children are less likely to be deceived by false information online or to fall victim to online fraud.

I believe that digital literacy education should include teaching children what they should not share with AI chatbots such as ChatGPT. It is crucial that children understand why they must refrain from sharing personally identifiable information, credentials, and any other data that could compromise their identity. Educators and parents can establish clear rules about the kinds of information that may be shared with AI platforms, and they can teach children to treat AI chatbots as they would strangers. To deepen their understanding, educators and parents can also engage children in discussions about how AI systems work.

When children understand how to use AI wisely, tools like ChatGPT can become a powerful aid in their education.