Why you should never ask AI medical advice — and 9 other things to avoid


Author: Brooke Kato
Published on: 2024-12-26 14:21:31
Source: Latest Technology News and Product Reviews | New York Post

Disclaimer: All rights are owned by the respective creators. No copyright infringement is intended.


Chatbots might seem like trustworthy smart assistants, but experts are warning not to get too personal with the AI-powered agents.

Recent survey data from Cleveland Clinic shows that one in five Americans has asked AI for health advice, while survey statistics published last year by Tebra found that approximately 25% of Americans are more likely to turn to a chatbot than to attend therapy sessions.

Experts, however, are warning users against oversharing with AI chatbots, especially when it comes to medical information.


Never share passwords, phone numbers and other identifiable data. Vane Nunes – stock.adobe.com

According to USA Today, people should avoid divulging medical and health data to AI chatbots, which are not bound by the Health Insurance Portability and Accountability Act (HIPAA).

Since chatbots such as ChatGPT are not HIPAA compliant, they should not be used in a clinical setting to summarize patient notes nor should they have access to sensitive data.

That being said, if you’re looking for a quick answer, be sure to omit your name or other identifying information that could potentially be exploited, USA Today reported.

The outlet also warned that explicit content and illegal advice are off limits, as is uploading information about other people.

“Remember: anything you write to a chatbot can be used against you,” Stan Kaminsky, of cybersecurity company Kaspersky, previously told The Sun.

Login credentials, financial information, answers to security questions and your name, number and address should also never be shared with AI chatbots. That sensitive data could be used against you by malicious actors.


If you must ask specific questions, try to omit as much personal data as possible by using asterisks or the word “redacted,” experts say. Charlie’s – stock.adobe.com

“No passwords, passport or bank card numbers, addresses, telephone numbers, names, or other personal data that belongs to you, your company, or your customers must end up in chats with an AI,” Kaminsky continued.

“You can replace these with asterisks or ‘REDACTED’ in your request.”

Sharing confidential information about your company is also a major privacy faux pas.

“There might be a strong temptation to upload a work document to, say, get an executive summary,” Kaminsky said.

“However, by carelessly uploading a multi-page document, you risk leaking confidential data, intellectual property, or a commercial secret such as the release date of a new product or the entire team’s payroll.”

