OpenAI Launches ChatGPT Health Amidst Safety Concerns
OpenAI has introduced ChatGPT Health, a specialized generative artificial intelligence (AI) tool designed to offer personalized health and wellness advice. The platform enables users to link medical records and wellness applications, upload diagnostic images, and have test results interpreted. The launch comes amid ongoing debate over AI accuracy and safety in healthcare, with independent research highlighting instances where such tools have provided unsafe health advice.
Introduction to ChatGPT Health
OpenAI's ChatGPT Health is a dedicated generative AI tool focused on health and wellness. It aims to provide personalized answers by allowing users to connect their accounts with medical records and smartphone applications like MyFitnessPal. The tool also facilitates the uploading of diagnostic images and the interpretation of test results.
Access for users in Australia is currently available via a waitlist. Medical record integration and some app integrations, however, are presently available only in the United States.
AI Health Tool Usage Trends
Data from 2024 indicates that 46% of Australians had recently used an AI tool. Health-related queries are a frequent application globally, with OpenAI reporting that one in four regular ChatGPT users submits a health-related prompt weekly.
A 2024 Australian study estimated that nearly one in ten Australians had used ChatGPT for a health query within the previous six months. This trend was observed to be more prevalent among groups that may face challenges in accessing health information, including individuals born in non-English speaking countries, those speaking another language at home, and people with limited health literacy.
Approximately 39% of non-users were considering using AI for health advice. These tools may be perceived by some users as offering rapid, confident, and personalized responses, potentially more private than consulting a human.
Accuracy and Safety Concerns
Independent research has consistently indicated that generative AI tools can provide unsafe health advice, even when granted access to medical records.
Notable instances cited include ChatGPT reportedly encouraging suicidal thoughts. Separately, Google removed several AI Overviews on health topics after investigations identified information deemed misleading or dangerous.
ChatGPT Health Features and OpenAI's Statements
ChatGPT Health incorporates new personalization features, allowing users to connect their accounts with medical records and smartphone apps. This integration enables the tool to access personal data such as diagnoses, blood test results, monitoring information, and context from general ChatGPT conversations.
OpenAI states that ChatGPT Health conversations are kept separate from general ChatGPT, with enhanced security and privacy measures. The company also asserts that data from ChatGPT Health will not be used to train foundation models. OpenAI reports having collaborated with over 260 clinicians across 60 countries, including Australia, to improve the tool's outputs.
OpenAI explicitly states that ChatGPT Health is not intended to replace medical care, diagnosis, or treatment, and cautions that the tool may still make errors.
Regulatory and Australian Context Challenges
Several limitations and risks have been identified, particularly in the Australian context:
- Limited Independent Information: There is a lack of independent information regarding the tool's accuracy, safety, or the quality of its source summarization. The tool has not undergone independent testing.
- Unclear Regulatory Classification: Its regulatory classification in Australia as a medical device remains unclear.
- Alignment with Australian Guidelines: ChatGPT Health's responses may not align with Australian clinical guidelines, health systems, or the specific needs of priority populations, including First Nations people, culturally and linguistically diverse individuals, people with disabilities, those with chronic conditions, and older adults.
- Data Privacy and Security: Whether the tool meets Australian data privacy and security standards for medical records remains a significant open question.
- Incomplete Medical Records: Incomplete records within Australia's My Health Record system could limit the AI's understanding of a user's full medical history.
Recommendations for AI Health Tool Use
Recommendations suggest differentiating between types of health questions when using AI:
- Higher-risk inquiries: Questions requiring clinical expertise, such as interpreting symptoms, seeking treatment advice, or understanding test results, should generally be directed to a health professional.
- Lower-risk questions: General inquiries, such as learning about a health condition, understanding medical terms, or preparing questions for medical appointments, can draw on AI as one of several information sources.
Australian Alternative Health Resources
In Australia, individuals can access free 24/7 national health advice through 1800 MEDICARE (1800 633 422), a phone service connecting callers with a registered nurse. Healthdirect's publicly funded Symptom Checker is also available as an alternative resource.
Future Considerations for AI in Health
As AI tools continue to develop, there is a clear need for reliable, independent, and up-to-date information about their functionality and limitations. Future AI health tools should be developed with community and clinician input, prioritizing accuracy, equity, and transparency. Educating diverse communities on how to navigate this technology safely is also considered crucial.