Proceed with Caution: Protecting Patient Privacy in the Age of AI and Data Breaches
Esther Fiore, MA & Cindy Marino, PsyD
The Growing Threat of Psychotherapy Data Breaches
Accessibility, affordability, and ease of use are likely reasons countless individuals signed up for Vastaamo Psychotherapy Centre, a telehealth therapy service established in 2009 and the largest network of private mental health clinicians in Finland (Looi et al., 2025). On October 4th, 2020, a typical day for 22-year-old Finnish college student Jere, an email notification came across his screen. The subject line contained his name, his personal identity number (akin to a U.S. Social Security number), and "Vastaamo" (Ralston, 2021). Jere had participated in psychotherapy at Vastaamo Psychotherapy Centre as a teenager, during a time when he was struggling with substance use, suicidality, and physical abuse from his parents, all of which he had discussed in detail with his treatment provider. The email threatened to publish his protected health information unless the hackers received €200 worth of Bitcoin within 24 hours. He was one of roughly 30,000 individuals targeted in the ransom attack on the company.
Data breaches like the one that impacted Vastaamo Psychotherapy Centre are more common than most providers realize. Although breaches caused by loss or theft of healthcare records, improper disposal, and unauthorized access have decreased, breaches caused by hacking and ransomware attacks have increased significantly (Alder, 2025). The U.S. Department of Health and Human Services Office for Civil Rights reported that, between 2018 and 2023, data breaches related to hacking increased by 239% and data breaches related to ransomware attacks by 278% (Alder, 2025). This threat to patient privacy is not going away.
The Rise of AI in Clinical Work
Adding to the ethical and legal complexities of patient privacy in a golden age of information technology is the increased utilization of artificial intelligence (AI) in psychotherapy. New AI platforms use machine learning to transcribe, summarize, and even analyze psychotherapy sessions. Although the extent of their use remains unclear, AI platforms such as Upheal, Mentalyc, Eleos, Praxis, and EPIC are designed to assist in writing clinical progress notes and are now widely available (Bouguettaya et al., 2025).
Many new AI platforms record the entire psychotherapy session and use the recordings to produce drafts of psychotherapy notes. Unfortunately, thousands of people have already been exposed to the risks associated with this level of monitoring. In September 2024, security researcher Jeremiah Fowler discovered a database of protected health information linked to virtual U.S. medical provider Confidant Health, much of which was publicly accessible on the internet (King & Fenno, 2024). The database contained thousands of patients’ sensitive psychotherapy data, including audio and video recordings of therapy sessions. In total, 5.3 terabytes of publicly accessible data were found, the equivalent of 75 million emails (with an average email size of 75KB) or 1,325 full-length HD movies (with an average movie size of 4GB; King & Fenno, 2024).
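The scale comparison above is simple unit arithmetic. As a rough back-of-envelope sketch (assuming decimal units, i.e., 1 TB = 10^12 bytes, and the average sizes cited by King & Fenno, 2024), the equivalents can be checked like this:

```python
# Rough check of the 5.3 TB comparison, assuming decimal units:
# 1 TB = 10**12 bytes, 1 GB = 10**9 bytes, 1 KB = 10**3 bytes.
TB, GB, KB = 10**12, 10**9, 10**3

breach_bytes = 5.3 * TB
avg_email = 75 * KB   # average email size cited by King & Fenno (2024)
avg_movie = 4 * GB    # average HD movie size cited by King & Fenno (2024)

emails = breach_bytes / avg_email  # ~70.7 million with decimal units
movies = breach_bytes / avg_movie  # 1,325 exactly with decimal units

print(f"~{emails / 1e6:.1f} million emails")
print(f"{movies:.0f} full-length HD movies")
```

With decimal units the movie figure matches the published 1,325 exactly; the email figure comes out slightly below the published "~75 million," which likely reflects rounding or binary (1,024-based) units in the original comparison.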
While mental health clinicians are not expected to be experts in cybersecurity or information technology, they must understand the associated risks, benefits, and privacy considerations anytime a new technological advance is rolled out. Compared to traditionally written progress notes, psychotherapy session recording increases the amount and type of data stored. Where data used to be limited to a note, data now include audio and/or video recordings and transcripts. The risk lies in what happens to these data after they are recorded, and valid privacy concerns exist at every turn. The storage, deidentification and reidentification, access, and use of data all carry unique risks and vulnerabilities (Khalid et al., 2023; Shojaei et al., 2024). As providers, do we truly understand how AI platforms store data on third-party cloud servers? Are we aware of how long data are kept, what deidentification measures are in place, or who ultimately owns and has access to such data? These questions demand our attention. In fact, the U.S. Department of Health and Human Services encourages us to consider that even in HIPAA-compliant environments, the risk of data breaches remains, particularly with the integration of new and evolving technologies (Office for Civil Rights, 2022).
What Can Clinicians Do?
- Provide Thorough Informed Consent.
Informed consent is a crucial consideration for clinicians, especially when patients are being video- or audio-recorded (Ethics Code Standards 3.10, 4.02, 4.03, and 10.01; American Psychological Association [APA], 2017). Informed consent should include as much education as is necessary for the patient to fully understand what information is being collected, where it is stored, who may have access to it, and what the associated risks are. These risks are especially important to consider when working with individuals whose identities may inherently put them at greater risk if the data were leaked; consider, for example, immigration status, gender identity, and sexual orientation (Ethics Code Principle E; APA, 2017).
- Consider the Rationale for Including Information in Progress Notes.
Psychologists have a professional and ethical responsibility to develop and maintain records that contain the necessary information while also protecting the client (Ethics Code Standard 6.01; APA, 2017). Clinicians also have an obligation to take reasonable steps to protect confidential information stored in any medium (Ethics Code Standard 4.01; APA, 2017). What level of detail is necessary for a specific situation? For example, say a patient is upset because of a political disagreement with their family member. When you write the progress note, are you identifying directly or indirectly the patient's political leanings? Are you identifying the involved family member? If you are, what is the clinical utility of doing so? Clinicians are often reminded, “If it’s not documented, it didn’t happen,” but also encouraged to take a less-is-more approach to protect client privacy. Striking this balance means consistently documenting events or disclosures that carry ethical and legal obligations—such as suicidal intent or clinician-client conflict—while avoiding unnecessary detail. With that in mind, policies should support documentation that includes what is ethically and legally required, keeping notes clear, concise, and purposeful.
- Consider Potential Impact of Record Exposure for the Patient.
Psychologists must always strive to benefit patients and do no harm, which extends to documentation (Ethics Code Principle A and Standard 3.04; APA, 2017). Is it necessary to include information about a patient's identity that falls under a protected status? In what circumstances is it necessary? How might the revelation of that information affect the patient should a data breach occur? For example, say you are conducting an intake with an undocumented patient, and you include this information in the report. How might its inclusion affect the care the patient receives from other providers? What safety concerns could arise? If future or concurrent care takes place in a state such as Missouri, which has introduced a Senate bill putting a 'bounty' on undocumented immigrants (Creates Provisions Related to Illegal Aliens Act, 2025), how might this affect the patient's psychological well-being?
- Carefully Inspect Third-Party AI Tools.
Psychologists are ethically obligated to seek appropriate education and training when planning to provide services involving new or emerging technologies (Ethics Code Standard 2.01; APA, 2017). Electronic medical record (EMR) and software utilization decisions are often made at a systems level, far outside the control of individual clinicians. However, clinicians can help improve security by raising important questions about the platforms being used. Below is a starter list of questions to ask organizations and vendors to better understand the risks of AI tools (APA, 2024; Inkster et al., 2023; National Institutes of Health, n.d.; U.S. Department of Health and Human Services, n.d.):
- Is the platform HIPAA-compliant?
- Does the platform have any certifications to show evidence of following strict security and privacy standards, such as HITRUST, ISO, or SOC (Elendu et al., 2024; Stavrakas, 2022)?
- What specific safeguards exist (e.g., access control, encryption, audit logs)?
- Where are patient data stored?
- How long are session data retained?
- Can clinicians delete patient data upon request?
- Does the company offer business associate agreements to attest to compliance with privacy laws?
- Advocate for Data Privacy Within Your Organization.
Psychologists have an ethical duty to advocate for ethically aligned laws, policies, and regulations, especially when existing ones have the potential to harm patients (Ethics Code Standards 1.02 and 1.03; APA, 2017). Decisions about which software to implement at a system level are often made without clinician input, and the risk to patients may therefore be overlooked. Here are ways clinicians can advocate (APA, 2024; Inkster et al., 2023; National Institutes of Health, n.d.; U.S. Department of Health and Human Services, n.d.):
- Raise ethical concerns in team meetings. Frame concerns about implementing new platforms around the ethics of protecting patient privacy. Cite recent breaches such as Vastaamo or Confidant Health to illustrate present risks. Advocate for increased cybersecurity training for clinicians and staff.
- Propose Privacy Impact Assessments (PIAs). A PIA analyzes how personally identifiable information is collected, used, shared, and maintained (U.S. Department of Defense, n.d.). Encouraging leadership, including compliance officers or ethics committees, to conduct a formal PIA can help identify risks and plan for how to address them. For a guide to PIAs, see the National Privacy Commission's (2018) Privacy Impact Assessment Guide.
Summary
Issues with the use of AI technology and the potential for data breaches in clinical work should not be taken lightly. Jere's experience as a victim of the Vastaamo Psychotherapy Centre data breach illustrates how little control clinicians ultimately have once a note is entered into an electronic record system. It is therefore all the more critical for psychologists to carefully consider how their practices might put patients at heightened privacy risk in an age of technology-driven progress.
References
Alder, S. (2025, April 17). Healthcare data breach statistics. HIPAA Journal.
https://www.hipaajournal.com/healthcare-data-breach-statistics/
American Psychological Association. (2017). Ethical principles of psychologists and code of conduct (2002, amended June 1, 2010, and January 1, 2017). https://www.apa.org/ethics/code
American Psychological Association. (2024, October 25). APA’s AI tool guide for practitioners.
https://www.apaservices.org/practice/business/technology/tech-101/evaluating-artificial-intelligence-tool
Bouguettaya, A., Team, V., Stuart, E. M., & Aboujaoude, E. (2025). AI-driven report-generation tools in mental healthcare: A review of commercial tools. General Hospital Psychiatry, 94, 150–158. https://doi.org/10.1016/j.genhosppsych.2025.02.018
Creates Provisions Related to Illegal Aliens Act, S.B. 72, Missouri Senate (2025).
https://www.senate.mo.gov/25info/BTS_Web/Bill.aspx?SessionType=R&BillID=523
Elendu, C., Omeludike, E. K., Oloyede, P. O., Obidigbo, B. T., & Omeludike, J. C. (2024). Legal implications for clinicians in cybersecurity incidents: A review. Medicine, 103(39), Article e39887. https://doi.org/10.1097/MD.0000000000039887
Inkster, B., Knibbs, C., & Bada, M. (2023). Cybersecurity: A critical priority for digital mental health. Frontiers in Digital Health, 5, 1–7.
Looi, J. C., Allison, S., Bastiampillai, T., Maguire, P. A., Kisely, S., Reutens, S., & Looi, R. C. (2025). Cybersecurity lessons from the Vastaamo psychotherapy data breach for psychiatrists and other mental healthcare providers. Australasian Psychiatry: Bulletin of Royal Australian and New Zealand College of Psychiatrists, 33(1), 106–110. https://doi.org/10.1177/10398562241291340
Khalid, N., Qayyum, A., Bilal, M., Al-Fuqaha, A., & Qadir, J. (2023). Privacy-preserving artificial intelligence in healthcare: Techniques and applications. Computers in Biology and Medicine, 158, Article 106848. https://doi.org/10.1016/j.compbiomed.2023.106848
King, D., & Fenno, L. (2024). Data security, data privacy, and telehealth. Psychiatric News, 59(12). https://psychiatryonline.org/doi/epub/10.1176/appi.pn.2024.12.12.33
National Institutes of Health. (n.d.). Privacy impact assessments. Office of Management Assessment. Retrieved April 10, 2025, from https://oma.od.nih.gov/DMS/Pages/Privacy-Program-Privacy-Impact-Assessments.aspx
National Privacy Commission. (2018). Privacy impact assessment guide.
https://privacy.gov.ph/wp-content/uploads/2022/01/NPC_PIA_0618.pdf
Office for Civil Rights. (2022). Cybersecurity newsletter: HIPAA and health IT developers. U.S. Department of Health and Human Services. https://www.hhs.gov/hipaa/for-professionals/security/guidance/index.html
Ralston, W. (2021, May 4). They told their therapists everything. Hackers leaked it all. WIRED.
https://www.wired.com/story/vastaamo-psychotherapy-patients-hack-data-breach/
Shojaei, P., Vlahu-Gjorgievska, E., & Chow, Y.-W. (2024). Security and privacy of technologies in health information systems: A systematic literature review. Computers, 13(2), 41. https://doi.org/10.3390/computers13020041
Stavrakas, H. (2022, December 21). SOC 2 in healthcare: Why do SOC reports matter for audit compliance? Linford & Company LLP.
https://linfordco.com/blog/soc-2-healthcare-audits/
U.S. Department of Health and Human Services. (n.d.). Cloud computing. Health Information Privacy. Retrieved April 10, 2025, from https://www.hhs.gov/hipaa/for-professionals/special-topics/health-information-technology/cloud-computing/index.html
U.S. Department of Defense. (n.d.). Privacy impact assessment. Privacy and Civil Liberties Directorate. Retrieved April 10, 2025, from https://pclt.defense.gov/DIRECTORATES/Privacy-and-Civil-Liberties-Directorate/Privacy/Privacy-Impact-Assessment/