This is a heartbreaking story out of Florida. Megan Garcia thought her 14-year-old son spent all his time playing video games. She had no idea he was having abusive, in-depth and sexual conversations with a chatbot powered by the app Character AI.
Sewell Setzer III stopped sleeping and his grades dropped. He eventually took his own life. Just seconds before his death, Megan says in a lawsuit, the bot told him, “Please come home as soon as possible, my love.” The boy asked, “What if I told you I could come home now?” His Character AI bot responded, “Please do so, my dear king.”
You have to be smart
AI bots are owned by tech companies known for exploiting our trusting human nature, and they are designed using proprietary algorithms that drive their profits. There are no guardrails or laws governing what they can and cannot do with the information they collect.

A photo illustration of an AI chatbot. (iStock)
If you use a chatbot, it knows a lot about you the moment you fire up the app or website. From your IP address, it gathers a general idea of where you live; it keeps track of things you’ve searched for online; and it has access to any other permissions you granted when you accepted the chatbot’s terms and conditions.
The best way to protect yourself is to be careful with the information you provide.
Be careful: ChatGPT likes it when you get personal
10 Things Not to Say to AI
- Passwords or login details: A big privacy mistake. If someone gains access, he or she can take over your accounts in seconds.
- Your name, address or telephone number: Chatbots are not designed to process personally identifiable information. Once shared, you no longer have control over where it ends up or who sees it. Enter a fake name if you want!
- Sensitive financial information: Never include bank account numbers, credit card details or other financial matters in any document or text you upload. AI tools are not secure vaults; treat them like a crowded room.
- Medical or health data: AI is not HIPAA compliant, so make sure your name and other identifying information are obscured when asking AI for health advice. Your privacy is worth more than quick answers.
- Asking for illegal advice: That’s against any bot’s terms of service. You’ll probably get flagged. Plus, you may end up with more problems than you expected.
- Hate speech or harmful content: This can also result in a ban. No chatbot is a license to spread negativity or harm others.
- Confidential work or company information: Proprietary data, customer data and trade secrets are all no-nos.
- Answers to security questions: Sharing them is like opening the front door to all your accounts at once.
- Explicit content: Keep it PG. Most chatbots filter out this kind of thing, so anything inappropriate could get you banned too.
- Personal data of others: Uploading this is not only a breach of trust; it is also a breach of data protection law. If you share private information without permission, you could get into legal trouble.
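If you’re the hands-on type, you can take the list above one step further and strip obvious identifiers from text before you ever paste it into a chatbot. Here’s a minimal sketch in Python; the patterns are illustrative only (they catch common email and U.S.-style phone formats, not every variation), and the function name `scrub` is just my choice, not part of any chatbot’s tools.

```python
import re

# Illustrative patterns only -- they catch common email addresses
# and US-style phone numbers, not every possible format.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b")

def scrub(text: str) -> str:
    """Replace emails and phone numbers with placeholders."""
    text = EMAIL.sub("[EMAIL]", text)
    text = PHONE.sub("[PHONE]", text)
    return text

print(scrub("Reach me at jane.doe@example.com or 555-867-5309."))
# prints: Reach me at [EMAIL] or [PHONE].
```

Think of this as a seatbelt, not armor: it won’t catch names, addresses or account numbers, so the real rule is still the list above.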

A person is seen using ChatGPT. (Frank Rumpenhorst/photo alliance via Getty Images)
Claim a (little) bit of privacy
Most chatbots require you to create an account. If you create one, do not use login options such as “Sign in with Google” or “Connect with Facebook.” Instead, use your email address to create a truly unique login.
FYI, with a free ChatGPT or Perplexity account, you can disable the memory features that remember everything you type; look in the app settings. For Google Gemini, you need a paid account to do this.

Google is pictured here. (AP Photo/Don Ryan)
No matter what, follow this rule
Don’t tell a chatbot anything you wouldn’t want made public. Trust me, I know it’s hard.
Even I find myself talking to ChatGPT as if it were a person. I say things like, “You can do better with that answer” or “Thanks for the help!” It’s easy to think that your bot is a trusted ally, but that’s absolutely not the case. It’s a data collection tool like any other.
Become tech-smarter on your schedule
Award-winning host Kim Komando is your secret weapon for navigating technology.
Copyright 2025, WestStar Multimedia Entertainment. All rights reserved.