More than 40 million people turn to ChatGPT daily for health information, according to OpenAI. ECRI also sounded the alarm: the organization received dangerously inaccurate information when asking large language models questions about medical products and technologies.
Misuse of AI chatbots tops list of 2026 health hazards, warns ECRI, highlighting risks from unregulated, inaccurate chatbot guidance in healthcare.
About 1 in 4 teenagers now use AI chatbots for mental health support, with young adults affected by violence being more likely to seek help from chatbots, according to a new study by the Youth ...
Healthcare professionals are finding AI to be a genuine asset for efficient communication and data organization on the job. Clinicians use AI for managing medical records, ...
PsyPost on MSN
AI chatbots tend to overdiagnose mental health conditions when used without structured guidance
A new study published in Psychiatry Research suggests that while large language models are capable of identifying psychiatric diagnoses from clinical descriptions, they are prone to significant ...
About 1 in 8 adolescents and young adults in the U.S. are using AI chatbots for mental health advice, according to a new study published in the Journal of the American Medical Association. The study, ...
Convenience and accessibility are drawing users to AI therapy, but can these tools truly support mental health, or do they ...
Researchers found about 1 in 8 adolescents and young adults are turning to ChatGPT, Gemini, My AI and other types of generative artificial intelligence for advice and help with emotional distress.
Artificial intelligence chatbots help Department of Veterans Affairs doctors document patient visits and make clinical ...
Senator Ed Markey sent a letter this week to OpenAI CEO Sam Altman and other tech leaders expressing concerns over the use of ...
The nonprofit said technologies like ChatGPT have suggested incorrect diagnoses, invented body parts and otherwise provided ...