
YOU need to be extremely careful when talking to online chatbots like ChatGPT – and a simple rule could keep you safe.

Don't risk having your identity stolen or bank accounts raided because you made a common mistake.

AI-powered chatbots like OpenAI's ChatGPT can be incredibly useful – but don't spill all of your info when speaking to them. Credit: Getty

The Sun spoke to a top security expert who said artificial intelligence chatbots need to be used with caution.

AI bots are everywhere – even built into apps like WhatsApp and Facebook Messenger – so it's important to know what you should and shouldn't say to them.

"Think of ChatGPT and similar AI tools like a stranger you meet on the street," warned cybersecurity pro Stuart Green.

"You might feel comfortable asking them for directions or when the next bus arrives, but you wouldn’t share personal details or your home address.


"AI systems often have a human-like interface, so it’s important to treat them like a person – just one you don’t fully trust, and suspect might share your secrets with the next person they encounter."

There are specific types of info that you shouldn't send to a chatbot.

Stuart, who works as a cloud security architect at Check Point Software, told The Sun that it's "crucial to avoid" sharing something called "PII".

THREE-LETTER RULE

The little-known three-letter term stands for "personally identifiable information".

"This includes your full name, home address, email address, phone number, or any form of identification such as passport details," Stuart told us.

"Sharing PII can increase the risk of identity theft, doxxing, or unauthorised access to your personal accounts.


"Even if the platform anonymises your data, there is always the chance of a data breach."

Of course, the list of info that you shouldn't be sending to any AI chatbot goes way beyond just PII.

MONEY MATTERS

It's also important to avoid handing over financial info.

Chatbots are humanlike and can be highly convincing, so it's easy to be duped by an AI – especially one that's been created or compromised by a criminal.


"Credit card numbers, bank account details, and passwords should also be kept private," Stuart told The Sun.

"Sharing financial data online exposes you to the risk of fraud, theft, or scams if it is ever exposed.

"It’s equally important not to share your login credentials, such as usernames, passwords, or two-factor authentication codes.

"Doing so can lead to account takeovers, data theft, or unauthorised transactions.

"Similarly, sensitive health or medical data should be kept private, including your medical history, prescriptions, diagnoses, or mental health information.

"If this type of data is leaked, it could result in privacy violations, discrimination, or misuse by malicious actors."

DON'T BE AI-FRAID – BE CAUTIOUS!

Here's advice from The Sun's tech expert ...

AI can be your friend. A trusted colleague. A customer support assistant.

In fact, it can be almost anything. AI chatbots are now very humanlike and highly convincing.

But it's more important than ever to remember that they are not human. They're certainly not your friend.

A chatbot is just lines of code pretending to be a human – so you need to treat its messages like messages from a total stranger.

Don't hand over personal information – not even your full real name.

No one can guarantee the total safety of your messages, even if you're using tools from massive and respected tech companies.

There are plenty of ways crooks could break into your chatbot messages – including compromising your device or even the AI apps themselves.

So feel safe using chatbots to help you get things done, but be mindful that you don't know exactly where those messages might end up.

We're still very early in this wild AI journey, so it's better to be safe than sorry.

Remember: if you're about to send a message to AI, ask yourself if you'd feel comfortable sending that same text to a total stranger.

If the answer is no, don't send it.

"Similarly, sensitive health or medical data should be kept private, including your medical history, prescriptions, diagnoses, or mental health information.

"If this type of data is leaked, it could result in privacy violations, discrimination, or misuse by malicious actors."
