Don’t get me wrong: I love privacy as much as the next guy. Even more, actually. I’m a psychiatrist. Also an American, in fact a Yankee, and one who respects individualism and rebellion. But I’m afraid privacy’s dead.
With patients I’ve always been strict about confidentiality: “You are free to say whatever you want, but as for me, what’s said in this room stays in this room. If you want me to give out any information, a signed release is not enough. I’d need your signature on the document itself so you can approve exactly what I am sending, and you can keep a copy. Or if I talk with another doctor, you’re in charge of what I can and can’t say, and you will know whatever is said.” That, plus a no-action/no-immediate-consequences rule and a three-session notice before quitting psychotherapy, is designed to remove all realistic reasons for the patient not to be fully honest and forthcoming, leaving the unrealistic reasons as fodder for exploration.
This works, but of course it is not the actual assurance of privacy that is operative here; rather, two other aspects are. One is how the confidentiality and no-impulsive-action agreements help define a professional relationship, setting limits on my power (and threat) and emphasizing the importance of dialogue in the therapy. The second is how they create an ambience, an impression, a fantasy of confidentiality. It’s that fantasy that counts, not the reality.
In reality I’m into the psychotechnologies (you know the type) with microphones, cameras, recording equipment, computers, modems, cables and other stuff variously poised or scattered around in my office along with journals, books, pamphlets and other… well… stuff. So this new patient came in and early on, totally ignoring all that electronic stuff, gazed suspiciously at the ceiling panels and asked, “Is this room bugged?” He took “no” for an answer and everything went well after that. (We did not get right into exploratory psychotherapy, by the way.)
What’s with actual privacy? Dead. Room bugged? Could be. Probably is. No. Actually, is.
Computers and cell phones have built-in microphones constantly sending signals to the central processing unit. Officially, the signals are ignored unless the User activates some app or program calling for sound input. Unofficially there are ways to bypass User control. Officially a little red light is supposed to tell when the camera is recording or streaming what it sees. Unofficially government agency hackers have reportedly kept the little red light off when enabling the cameras of “suspects” in an effort to keep the world safe for democracy. “Location” data from citizens’ mobile devices (both from GPS and plain cellphone company records) are routinely sent to central locations to enhance the User’s experience, to improve the developer’s products and for who knows what else. Our newest television monitors monitor us and are at the point of spotting what’s up when we are cuddling our sweetheart on the couch, all the better to send us timely ads for candy, matching pajamas or condoms. It’s getting worse.
Experience shows that, in general, a therapist’s assurance of confidentiality has little or no impact on therapeutic outcome. Patients can engage in meaningful therapeutic dialogue even when they know that trainees behind the one-way mirror or on closed-circuit TV are observing the session. Patients get absorbed in their work with a therapist despite a clear lack of privacy, and they deliberately withhold information despite reasonable proof of security and confidentiality.
We tell our telemental health patients: “Let us help you install HIPAA-compliant communication technology and anti-virus software. Make sure doors are closed so that other people in your home won’t overhear. Scan the room with your laptop as we watch so we can confirm that you are alone, in case you are afraid to expel an abusive significant other from the room.” That’s all fine for starters. But informed-consent-wise we really have to ’fess up and admit that even encrypted communications via a well-set-up router do not guarantee privacy, that we know the federal government secretly intercepts and stores conversations on a massive scale, and that hackers explore and exploit even hardened federal government databases, so we cannot guarantee privacy, not even a little bit. That kind of makes a mockery of ethics, but there’s no harm disclosing all this. Rather than produce shock and awe, our consent ritual fades away as the patient’s fantasies take over, allowing history-taking and psychotherapy to proceed as if privacy were alive and well.
Myron L. Pulier, MD is a clinical associate professor of psychiatry at Rutgers New Jersey Medical School and serves on the faculty and the Scientific Advisory Board of the Telemental Health Institute.
Disclaimer: The views and opinions expressed in this blog post are those of the authors. They do not necessarily reflect the views, opinions, or position of the Telebehavioral Health Institute (TBHI). Any content written by the authors is their opinion and is not intended to malign any organization, company or individual.