MAKING simple mistakes in a conversation with an AI chatbot leaves you exposed to three devastating types of crime.
That's the warning from a security expert who told The Sun readers what you must never tell AI-powered bots.
Artificial intelligence chatbots are everywhere – from OpenAI's ChatGPT to Meta AI and more.
They can help you find info quickly, get jobs done faster, and even just chat about life.
But cybersecurity expert Jamie Akhtar warned The Sun that you need to be extremely careful when you talk to these online chatbots.
"Tools like ChatGPT and Meta AI have been a revelation, speeding up everything from creating business plans to generating graphics. However, they aren’t risk-free," said Jamie, chief exec at security firm CyberSmart.
"Regardless of whether you’re using AI chatbots for personal or professional use, you need to be extremely careful what you share with them.
"Why? Well, chatbots gather and store your raw conversational data.
"Every question, prompt or message you send is stored, analysed and processed by the companies behind these assistants to train and improve their AI assistants.
"You might think, so what? However, once you share this data, you lose control over where your data goes and how it’s used – not to mention its safety."
Jamie warned that this data is effectively a treasure trove for criminals.
And they could use it for at least three different types of online crime – including stealing your identity to raid your finances.
"The data chatbots collect is typically held in a server. And, while these servers are usually pretty secure, hackers do breach them," Jamie said.
"If the server is breached, cybercriminals can use your data for all sorts of nasty ends.
"They could sell it to the highest bidder on the dark web, steal your identity or launch cyberattacks against your accounts."
PROTECT YOURSELF
Thankfully, the way to stay safe is extremely easy.
You simply need to be careful what you tell an AI chatbot, so that sensitive data never ends up in the wrong hands.
Jamie gave The Sun a simple "rule of thumb" to follow when interacting with bots.
"We recommend limiting the data you feed to your chatbot," Jamie told us.
"Never share personally identifying information such as your full name, address, date of birth or social security number.
DON'T FEAR THE AI FUTURE
Here's what The Sun's Head of Technology and Science has to say...
When it comes to AI, it's all about striking a balance.
There are real safety concerns with AI: you don't want to overshare as you have little control over where that info ends up.
And the tech is so new that cyber-criminals are desperately looking for ways to exploit it.
That said, AI can be massively helpful so you shouldn't avoid it just because of these dangers.
There are many relatively safe ways to use AI chatbots that can give you a helping hand without putting you in danger.
Just try to avoid giving up any personal details (and certainly never any work info) and you'll likely be fine.
It's also important to try to stick to well-known and reputable chatbots.
If you interact with random chatbots you find online, you're more likely to be dealing with a scam operation.
Check reviews for chatbots before using them – and be very careful if you're being asked to download or install any files.
Usually the same old rules apply: don't give strangers private info or money, and avoid clicking unsolicited links and files.
"Likewise, avoid ever giving an AI chatbot login credentials for anything or any financial details.
"You should also be careful when sharing any kind of intimate thoughts with these tools. A good rule of thumb is to refrain from sharing anything you wouldn’t want publicly known.
"The reason behind this is that this is all information a cybercriminal could use."
He added: "Finally, never, under any circumstances share any confidential or proprietary data from your workplace. This poses a huge cyber risk for your employer."