6 Things You Should Never Share with ChatGPT

ChatGPT is a great tool, but you shouldn't trust it with your sensitive and private data. We tell you what to never share.

ChatGPT and other AI chatbots are arguably the most useful time-saving tools since the invention of the computer – but there are certain details you should never share with them, unless you're happy for your private data to potentially end up shared with the world.

Chatbots train on your data, so anything you put into them could be used to shape the responses other users receive. While most queries are unlikely to cause issues, sharing certain information could leave you exposed to fraud, or even jeopardise your job.

We explain some of the biggest things you should never share with ChatGPT and similar platforms.

1. Sensitive Company Data

If you haven’t opted out of ChatGPT storing your data, then anything you put into the platform is considered fair game, and could be used to train the large language model (LLM) that powers it.

That also includes information that might not strictly be yours, but belongs to the company you work for. There have already been examples of private company data surfacing via ChatGPT, with one of the highest-profile cases being Samsung, which clamped down on use of the chatbot this year.


In an internal memo, the company warned staff against using ChatGPT, after a security leak was traced back to an employee sharing sensitive company code on the platform.

Samsung aren’t alone either – plenty of other companies, including Apple, have banned ChatGPT for certain employees and departments.

Being responsible for exposing your company’s sensitive data could see you having a very awkward chat with HR – or, even worse, being fired.

2. Creative Works and Intellectual Property

Written the next great American novel and want ChatGPT to give it an edit? Stop. Never share your original creative work with chatbots, unless you’re happy to have it potentially shared with all other users.

In fact, even copyrighted works aren’t safe. OpenAI and other chatbot makers are currently embroiled in a number of legal cases brought by the likes of Sarah Silverman and George R. R. Martin, who accuse them of training their large language models (LLMs) on published writings without permission.

Your next great idea could well be surfaced in a stranger’s ChatGPT results, so we’d suggest keeping it to yourself.

3. Financial Information

Just like you wouldn’t post your bank details or Social Security number on a public forum online, you shouldn’t be entering them into ChatGPT either.

It’s fine to ask the platform for finance tips, to help you budget, or even tax guidance, but never put in your sensitive financial information. Doing so could well see your private bank details out in the wild, and open to abuse.

It’s also extremely important to be vigilant of fake AI chatbot platforms which may be designed to trick you into sharing such data.

4. Personal Data

Your name, your address, your telephone number, even the name of your first pet…all big no-nos when it comes to ChatGPT.

Personal details like these can be exploited to impersonate you – fraudsters could use them to infiltrate your private accounts or carry out impersonation scams, none of which is good news for you.

So, resist the temptation to put your life story into ChatGPT, and if you are determined to have it write your autobiography for you, think carefully about what you’re sharing.

5. Usernames and Passwords

There’s only one place you should be writing down passwords, and that’s on the app or site that needs them. Best practice states that storing unencrypted passwords elsewhere could leave you vulnerable.

So, if you don’t want your passwords to become publicly available, resist the temptation to get ChatGPT to record them all in one place for easy reference, or to ask it to strengthen your existing passwords for you.

If you’re struggling to remember your passwords (and let’s face it, we all are), then a password manager is a great tool that takes the pain out of juggling multiple passwords at once.

If you want to test your existing passwords, there are plenty of free, secure tools that can do this for you.

6. ChatGPT Chats

So okay, we’ll admit this one sounds slightly paradoxical – it would be very difficult to use ChatGPT without actually talking to it – but it does a great job of demonstrating the danger of entering absolutely anything into ChatGPT.

Yes, even your own ChatGPT requests could be shared with others, and it has happened in the past.

There have even been recent instances where a bug meant that some ChatGPT users could see the chat histories of other users.

There has also been evidence that conversations shared from Google’s Bard chatbot were being indexed by Google Search, making them easy for anyone to find online.

In both cases, the companies promised to rectify the issues, but it illustrates how quickly the tech is progressing, and that nothing, not even the requests you put into the platform, can be considered private. It’s worth keeping this in mind whenever you’re conversing with a chatbot.

Using ChatGPT Safely

ChatGPT is a powerful tool, and it can do lots for you to make your work, and personal life, easier and more efficient.

However, it’s important to remember that the information you share with it could well be used to train the platform, and may appear in other users’ requests in various forms. You can opt out of having your data used to train ChatGPT, and we’d suggest familiarizing yourself with how the platform uses your data.

It’s also important to realize that anything shared on the platform in the past can be extremely difficult to permanently delete, so always use chatbots with caution, and treat them like a distant acquaintance, rather than a close friend.


Written by:
Jack is the Deputy Editor for Tech.co. He has over 15 years experience in publishing, having covered both consumer and business technology extensively, including both in print and online. Jack has also led on investigations on topical tech issues, from privacy to price gouging. He has a strong background in research-based content, working with organisations globally, and has also been a member of government advisory committees on tech matters.