
ChatGPT HIPAA Considerations

MARLENE MAHEU, PhD

August 20, 2023 | Reading Time: 4 Minutes

ChatGPT HIPAA compliance is one of the hottest topics at 2023 conferences, and with good reason. AI technologies like ChatGPT offer various healthcare solutions, including organizing and managing back-office tasks such as keeping psychotherapy notes. They also offer unprecedented ways to compromise privacy. Ensuring compliance requires careful consideration by practitioners and organizations alike. This article outlines general steps for ensuring ChatGPT HIPAA compliance, then offers more specific areas to be addressed by clinicians and healthcare organizations.

Here are some steps for healthcare providers and health systems to consider implementing immediately:

  • Policy Development. Develop, implement, and regularly update policies governing the use of AI in handling protected health information (PHI). These policies should align with HIPAA’s Security Rule and include procedures for responding to security incidents and breaches, as well as for updating your security risk assessment.
  • Encryption and Security Measures. Employ encryption and other advanced security measures to protect PHI both in transit and at rest, including firewalls, access controls, and secure authentication methods (a brief encryption sketch appears after this list).
  • Risk Assessment. Conduct regular risk assessments to identify and address potential vulnerabilities in AI-driven processes that handle PHI. This includes implementing ongoing monitoring and audit trails (a brief audit-trail sketch appears after this list).
  • Patient Informed Consent. Ensure clear patient consent processes for collecting and using data in AI algorithms, with transparency about how the data will be used.
  • Data De-Identification. Implement robust de-identification techniques to remove or alter personal identifiers within health data. All identifiers must be meticulously removed or “scrubbed” before engaging any chatbot. This includes sensitive information such as psychotherapy notes within EHRs, where special considerations apply. HIPAA’s Privacy Rule provides essential guidance on de-identification standards (HHS, 2021). A brief scrubbing sketch appears after this list.
  • Ethical Guidelines. Consider ethical guidelines and best practices from your national professional organizations and the American Health Information Management Association (AHIMA).
  • Transparency and Accountability. Maintain transparency with patients and clients regarding the use of AI, and establish clear lines of accountability within your practice or organization for managing and overseeing AI-driven processes, including ChatGPT.
  • Utilize Certified Technology. When possible, use AI technology certified or endorsed by recognized healthcare and technology organizations, demonstrating adherence to privacy and security standards. If third-party AI services are used, ensure that vendor agreements include robust security and privacy protections and that vendors comply with HIPAA regulations. Conduct due diligence in selecting vendors.
  • Monitoring and Continuous Improvement. Continuously monitor and evaluate the effectiveness of compliance measures, making improvements as needed. Engage in regular HIPAA audits to ensure ongoing compliance. 
  • Collaboration with Legal Experts. Work closely with legal and compliance experts to keep abreast of federal, state, and local regulations regarding the use of AI in healthcare.
  • Training and Education. Provide extensive and recurring training for staff on AI’s legal and ethical implications, including specific guidance on handling sensitive information like psychotherapy notes using ChatGPT.
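To make the encryption step above concrete, here is a minimal sketch, assuming the open-source Python `cryptography` package, of encrypting a note before it is stored. The note text, variable names, and key handling are illustrative assumptions only, not a prescribed implementation; in production, keys belong in a managed secrets store and the design should be reviewed by your security team.

```python
# Minimal sketch: symmetric encryption of a note at rest using the
# "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# Assumption for illustration only: in a real system this key would come
# from a managed secrets store, never generated and held in application code.
key = Fernet.generate_key()
cipher = Fernet(key)

note = "Session note: client reported improved sleep this week."  # hypothetical text
encrypted = cipher.encrypt(note.encode("utf-8"))  # ciphertext, safe to store at rest

# Only a process holding the key can recover the plaintext.
decrypted = cipher.decrypt(encrypted).decode("utf-8")
assert decrypted == note
```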
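For the monitoring and audit-trail step, one common pattern is to wrap every function that touches PHI so that who acted, what they did, and when are all recorded. The sketch below is a hypothetical illustration of that pattern, not a complete audit system; the function and field names are invented, and a real audit trail should write to tamper-evident, access-controlled storage.

```python
# Minimal sketch: an audit-trail decorator for PHI-touching functions.
import functools
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("phi_audit")

def audited(action):
    """Record an audit entry each time the wrapped function is called."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(user_id, *args, **kwargs):
            audit_log.info("%s | user=%s | action=%s",
                           datetime.now(timezone.utc).isoformat(),
                           user_id, action)
            return fn(user_id, *args, **kwargs)
        return inner
    return wrap

@audited("read_psychotherapy_note")
def read_note(user_id, note_id):
    # Hypothetical placeholder for an EHR lookup.
    return f"(contents of note {note_id})"

read_note("clinician_42", "note_7")  # emits one audit entry, then reads
```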
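For the de-identification step, the sketch below shows pattern-based scrubbing of obvious identifiers before any text reaches a chatbot. It is a toy illustration only: genuine de-identification must satisfy HIPAA’s Safe Harbor or Expert Determination standards, and names, addresses, and rarer identifiers require far more sophisticated handling than regular expressions can provide.

```python
# Minimal sketch: regex-based scrubbing of a few obvious identifiers.
import re

PATTERNS = {
    r"\b\d{3}-\d{2}-\d{4}\b": "[SSN]",                    # Social Security numbers
    r"\(?\b\d{3}\)?[ .-]?\d{3}[ .-]?\d{4}\b": "[PHONE]",  # US phone numbers
    r"\b[\w.+-]+@[\w-]+\.[\w.]+\b": "[EMAIL]",            # email addresses
    r"\b\d{1,2}/\d{1,2}/\d{2,4}\b": "[DATE]",             # dates like 8/2/2023
}

def scrub(text: str) -> str:
    """Replace obvious identifiers with placeholders before text leaves
    your control. This does NOT catch names, addresses, or many other
    HIPAA identifiers."""
    for pattern, token in PATTERNS.items():
        text = re.sub(pattern, token, text)
    return text

raw = "Jane reached me at (619) 555-0123 on 8/2/2023 about her intake."
print(scrub(raw))  # -> "Jane reached me at [PHONE] on [DATE] about her intake."
```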

For Clinicians 

Navigating the delicate balance of utilizing ChatGPT without breaching confidentiality is challenging. Clients can be compromised if clinicians use ChatGPT without understanding how to ensure compliance. From extracting clinical notes from video recordings of sessions, to organizing psychotherapy notes, to hastening contacts with insurers, ChatGPT can be used in ways that walk a very fine line with regard to HIPAA compliance. Some of the uses we at Telehealth.org have heard discussed recently have raised eyebrows.

Other potentially problematic areas include the following:

  • In-office communications with an admin, a quick supervision question, or an outside digital consultation can inadvertently include protected health information (PHI), such as a client’s residential details or past hospital admissions.
  • PHI can easily be captured by non-HIPAA-compliant text messaging software built into telephones, such as iMessage on iPhones. Other non-HIPAA-compliant technologies, such as consumer cloud storage systems, can be used to mine data; Apple’s iCloud, for example, is not HIPAA compliant. That data can then be transferred to ChatGPT.
  • ChatGPT can then be used behind the scenes to process this information, particularly if the individual has developed an online profile associated with an organization or institution.
  • Every data crumb of PHI released about a client or patient can be used to their disadvantage later, as it is combined with other data crumbs over the years. For over two decades, numerous companies have collected such “data crumbs” to build profiles sold to marketing firms, police departments, and many other groups. While we at Telehealth.org cannot know exactly what is being sold to whom, we can point to the need for all licensed professionals to protect PHI.
  • Data brokers selling PHI from apps to the highest bidder have already been identified by researchers.
  • The Federal Trade Commission (FTC) and the US Department of Health & Human Services’ Office for Civil Rights (OCR) recently issued a joint warning about illegal tracking technologies used in healthcare, largely unbeknownst to practitioners.
  • If you work for an institution or organization, enter your name and other identifying information into ChatGPT to see what it can find about you. (I was surprised at how well-presented and documented my bio is. Search for “Marlene Maheu, San Diego.”)

If ChatGPT can organize information about individuals this year, what will it find next year, next decade, or in 25 to 50 years? Protecting those who entrust us with their care is one of our primary obligations. If you are interested in ChatGPT or are already using it, get essential training to ensure that your actions comply with HIPAA, state law, and the ethical privacy standards of your profession.

For Health Systems 

The responsibility to protect patient privacy extends to offering robust training on the risks associated with ChatGPT, emphasizing both opportunities and potential pitfalls. Training should begin immediately, as chatbot adoption rates are climbing, and it must be ongoing and integrated into annual HIPAA and privacy instruction.

Some health systems may choose a more restrictive approach, limiting chatbot access solely to trained employees or blocking network access to chatbots altogether. As with many emerging technologies, this is a dynamic space in which health systems must continually work to align innovation with regulation and ethical practice.

By following these guidelines, clinicians and health systems alike can explore the potential benefits of publicly available AI tools such as ChatGPT while adhering to the essential principles of patient privacy and HIPAA compliance. The sensitive nature of information such as psychotherapy notes makes these precautions not merely advisable but crucial, in line with legal obligations and the commitment to patient trust and care.

Conclusion

Staying HIPAA compliant while integrating AI into healthcare requires a multifaceted approach combining technology, policy, education, ethics, and continuous vigilance. The above considerations offer a roadmap, but consultation with legal and regulatory experts specific to the healthcare provider’s jurisdiction and practice area is strongly advised.

Disclaimer

The information provided here is a general guideline and does not constitute legal or professional advice. For personalized guidance, consult with legal and compliance professionals in your jurisdiction. Professional training may also be helpful for understanding the many time-saving legal and ethical uses for AI, such as ChatGPT.

References

  1. U.S. Department of Health & Human Services. (2021). Guidance regarding methods for de-identification of protected health information in accordance with the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule. Retrieved from HHS.gov
  2. American Medical Association. (2019). Policy on augmented intelligence in health care. Retrieved from AMA-assn.org
