Is AI Really Ready for Therapists? An interview with Dr. Maelisa McCaffrey
Curt and Katie interview Dr. Maelisa McCaffrey of QA Prep about her assessment of the AI tools available for therapists. We chat about the high expectations many clinicians have for note writing tools (and whether these expectations are really reasonable right now). We also look at what therapists are getting wrong when starting to use these tools, exploring some concerns related to HIPAA compliance and who is actually putting together these tech tools.
Click here to scroll to the podcast transcript.
An Interview with Dr. Maelisa McCaffrey, QA Prep
Dr. Maelisa McCaffrey is a psychologist, nail design enthusiast, and multi-passionate entrepreneur. Through her business QA Prep, she empowers therapists with trainings and consultation on clinical documentation. Maelisa focuses on the “why” behind the usual recommendations and encourages clinicians to think outside the box, while also keeping their ethics intact. As someone with ADHD who’s had to figure out what works through trial and error, Maelisa aims to make sure her trainings are practical, while also allowing for plenty of laughter and fun.
In this podcast episode, we talk about the development of AI tools for therapists
Curt and Katie asked Dr. Maelisa McCaffrey to come on and talk about what she thinks about AI for documentation.
What are therapists getting wrong about AI?
“AI is not your silver bullet to getting paperwork done, to getting progress notes written magically, quickly and perfectly.” – Dr. Maelisa McCaffrey
- Therapists believe that AI can do their notes, but it is often a lot of work and/or requires an expensive application
- When using a free and/or untrusted platform, you are either writing an insufficient note or entering PHI, which breaks HIPAA
- AI for notes can take a long time, both because the platforms process information slowly and because you need to review and edit each note
- Some platforms are claiming to be HIPAA compliant and are not
What are the different ways that AI works to provide documentation?
- AI listens to the session and/or you upload the recording or a transcript
- You enter the information on what happened in the session and AI writes the formal notes
What do therapists need to know about an AI platform before using it?
- Checking for actual HIPAA compliance versus a false statement about HIPAA compliance
- Understand how it is telling you to use the software
- Do they give you a BAA?
- The pricing is a bit cheaper than an EHR – roughly $10–$50 per month
- This is not an electronic health record or practice management system
- Some of the AI applications do not have access to the diagnosis, assessment, or treatment plan, so you will have to ensure that you prove medical necessity and demonstrate the clinical loop
Are these AI platforms really ready for therapists?
- The tech experts are taking care of data security
- Many companies are still figuring these things out
- All of the platforms Maelisa tested created fake elements of sessions (i.e., put things in the notes that did not happen)
What else can therapists use AI for?
“For things where you have no idea where to start, [AI] can be a really good starting place to help you filter information.” – Dr. Maelisa McCaffrey
- Creating templates for progress notes (i.e., not for a specific client, no PHI)
- Creating resources for clients
What do therapists need to do for their clients if they are using AI?
- You must inform your client that you are using AI (informed consent)
- AI is experimental, so it must be optional for your clients to opt in
- You must ensure that the platform is actually secure and HIPAA compliant
- Ethics codes don’t yet address AI specifically, but statements on the ethics of how to use AI with or for clients could be coming out soon
What do therapists need to know about AI?
“We might as well be a part of making these AI therapists and these AI therapy services better. I think if we could collaborate [with the people developing AI therapists and therapy services]…it could really be a very beautiful partnership.” – Dr. Maelisa McCaffrey
- It is very new and changing constantly
- This is going to happen, so we need to understand and participate in this transition
- New resources will be created through AI
Resources for Modern Therapists mentioned in this Podcast Episode:
We’ve pulled together resources mentioned in this episode and put together some handy-dandy links. Please note that some of the links below may be affiliate links, so if you purchase after clicking below, we may get a little bit of cash in our pockets. We thank you in advance!
QA Prep website: www.qaprep.com
Social Media Links:
LinkedIn Maelisa McCaffrey, Psy.D.
Relevant Episodes of MTSG Podcast:
Who we are:
Curt Widhalm, LMFT
Curt Widhalm is in private practice in the Los Angeles area. He is the cofounder of the Therapy Reimagined conference, an Adjunct Professor at Pepperdine University and CSUN, a former Subject Matter Expert for the California Board of Behavioral Sciences, former CFO of the California Association of Marriage and Family Therapists, and a loving husband and father. He is 1/2 great person, 1/2 provocateur, and 1/2 geek, in that order. He dabbles in the dark art of making “dad jokes” and usually has a half-empty cup of coffee somewhere nearby. Learn more at: http://www.curtwidhalm.com
Katie Vernoy, LMFT
Katie Vernoy is a Licensed Marriage and Family Therapist, coach, and consultant supporting leaders, visionaries, executives, and helping professionals to create sustainable careers. Katie, with Curt, has developed workshops and a conference, Therapy Reimagined, to support therapists navigating through the modern challenges of this profession. Katie is also a former President of the California Association of Marriage and Family Therapists. In her spare time, Katie is secretly siphoning off Curt’s youthful energy, so that she can take over the world. Learn more at: http://www.katievernoy.com
A Quick Note:
Our opinions are our own. We are only speaking for ourselves – except when we speak for each other, or over each other. We’re working on it.
Our guests are also only speaking for themselves and have their own opinions. We aren’t trying to take their voice, and no one speaks for us either. Mostly because they don’t want to, but hey.
Stay in Touch with Curt, Katie, and the whole Therapy Reimagined #TherapyMovement:
Consultation services with Curt Widhalm or Katie Vernoy:
Connect with the Modern Therapist Community:
Modern Therapist’s Survival Guide Creative Credits:
Voice Over by DW McCann https://www.facebook.com/McCannDW/
Music by Crystal Grooms Mangano https://groomsymusic.com/
Transcript for this episode of the Modern Therapist’s Survival Guide podcast (Autogenerated):
Transcripts do not include advertisements, just a reference to the advertising break (as such, timing does not account for advertisements).
You’re listening to the Modern Therapist’s Survival Guide where therapists live, breathe and practice as human beings. To support you as a whole person and a therapist, here are your hosts, Curt Widhalm and Katie Vernoy.
Curt Widhalm 0:15
Welcome back modern therapists. This is the Modern Therapist’s Survival Guide. I’m Curt Widhalm, with Katie Vernoy. And this is the podcast for therapists about things going on in our practices, things going on in our world. And we are joined, once again, by one of our very, very first guests from the podcast. Maelisa McCaffrey is coming back, adding to the conversations about AI as it comes to our practices. When we had first kind of started talking about this about a year ago, AI coming to the therapy world was in its infancy. It has now been around for well over a year at this point. And the documentation maven, I think, is how I’ve always kind of pictured Maelisa, and just very excited to have you back today.
Dr. Maelisa McCaffrey 1:04
Yeah, thank you. I’m excited to be here.
Katie Vernoy 1:07
So, as you know, what we ask our guests to start out with is who are you and what are you putting out into the world?
Dr. Maelisa McCaffrey 1:14
I am a lot of things. I’m a psychologist. I’m a creative. And I’m an upcoming author.
Katie Vernoy 1:24
Exciting!
Dr. Maelisa McCaffrey 1:24
Yes. Yes. Nature lover, nail design enthusiast, people lover. Those are the big things. And things I’m putting out into the world right now are a lot of videos. So, like literally YouTube videos on all things documentation, and groups. So, I’ve been running a lot of groups for therapists who are behind in their paperwork. That’s not what this episode is about. But if that’s you, you’re not alone. And that’s been awesome. Yeah.
Katie Vernoy 1:59
And that’s through QA Prep, right?
Dr. Maelisa McCaffrey 2:01
Yes, yes, with with my business, QA Prep. So it’s been a lot of fun.
Curt Widhalm 2:06
So when we had first talked about AI on our podcast, it was kind of around, here’s legal and ethical ideas around what it should be. And a lot of it was kind of what was available at the time, as far as the interplay of: here’s what it could be, here’s maybe an introduction of things to consider when it comes to protected health information and that kind of stuff. And some cautionary, like, let’s not maybe jump headfirst into this. And now it’s approximately 12 months later, and everybody has jumped headfirst into it. And we start a lot of our episodes with the question of, like, from a learning place, not a shaming place: what are people getting wrong about AI and therapy?
Dr. Maelisa McCaffrey 2:51
Yes, so obviously, this has been a really popular topic in my community. So you know, I do focus on helping people with their documentation. And I would say, I was really surprised. I went to a training in person, like probably the only one I’ve been to in years. And most of the therapists there were not excited about AI. The presenter talked about it and had everyone raise hands, and it was like two people. But in my community, a community of people who are struggling with paperwork, they’re really excited, because it feels like the silver bullet. So I would say the biggest thing, to shatter your dreams, is that AI is not your silver bullet to getting paperwork done, to getting progress notes written magically, quickly and perfectly. That’s kind of an overarching theme I’m seeing, is that people are really hoping, I’ll sign up for this thing, and it’s going to do my notes for me. There’s so much here. I mean, the whole episode could just be on this one question. I’ll just pick a couple of the big things. The first thing they’re getting wrong about AI is that there is no such thing as writing a good individualized progress note and de-identifying information. So this is probably the most common topic that comes up with ChatGPT specifically, and it comes up with others. I don’t want to call them out by name because I want to be 100% sure, and right now, I’m not 100% sure, but there is a platform, an AI platform specifically for therapists to help them with notes, that is not HIPAA compliant, and that has misleading information on their homepage about being HIPAA compliant. So, you do want to really, you know, be diligent about that. But essentially, even that platform says don’t put in PHI, and then it gives you all this information that you need to put into that system in order to write a progress note, and all of that information would be considered PHI. And PHI is protected health information.
That is a really, really big problem. And I don’t want to really disappoint people, but that’s just the fact. And when I consulted with an attorney who’s also a licensed professional counselor about this, even he said to me, you know, as a therapist, you’re either paying for it with your own money, paying a platform to be HIPAA compliant so that you can enter PHI ethically, or you’re paying for it by essentially selling your clients’ data to a company that’s not HIPAA compliant. And so kind of thinking from that mindset of, like, you cannot write a note about a session without including that person’s symptoms, diagnosis, their identification, like gender, age, their location. There are a lot of factors that would be very easy for an AI, an intelligent platform, to determine about someone once you put in enough information to write a good progress note. So, that’s the number one thing: there’s no such thing as de-identifying a progress note.
Katie Vernoy 6:09
Well and I think…
Dr. Maelisa McCaffrey 6:10
That doesn’t match up.
Katie Vernoy 6:11
Yeah, so I just want to make sure I’m understanding. So like, if I go into a free platform and say, well, I’m going to vaguely talk about what happened in the session, and ask for a note, I’m not getting a good note. But if I put in enough information to get a good note, now I’ve broken HIPAA, and I’ve provided PHI to a platform that has no need to keep that information private.
Dr. Maelisa McCaffrey 6:36
Exactly. And I am not, like, a doom and gloom type person. I’m a, hey, let’s go ahead and hang out in the gray area type of ethics person, too. But this is just too new to make that excuse. You don’t have to use AI to write your progress notes. It’s not like a necessary thing at this point. And it’s still so new and so unknown. It’s not worth taking that risk. And I say this as someone who’s not necessarily risk averse. You know, to me, it’s just too much. So that would be the big thing, probably the biggest thing. The next thing that therapists are getting wrong about AI is kind of what I alluded to earlier, which is that it’s not going to save you all the time you want it to. I’ve tested a bunch of these platforms, and the thing I’ve been most consistently surprised by is how slow they are. Not ChatGPT. I tested ChatGPT, you know, and I’ve used it for other things. ChatGPT is awesome. It’s fast. It’s amazing. And if you give it a fake session or a real session, which we’re saying not to do, it’ll create a progress note in like five seconds. I mean, it’s crazy. These other platforms that are HIPAA compliant are not doing that. I mean, it’s literally like, put in your information, go and get your coffee, come back, sit down, maybe it’s done. It…
Katie Vernoy 6:36
And then do you have to edit it afterwards?
Dr. Maelisa McCaffrey 8:05
And then you have to review it and edit it because it does get things wrong. All of them. Every single one I have tested so far, has added things that did not happen in the session.
Dr. Maelisa McCaffrey 8:20
Every single one. So you do have to check it. Now, you know, and I’m not saying this as, like, a perfectionist. Like, if a few of these things get by, is it going to end your career? No, right. But this is not something we want to rely on. You know, that’s the thing I want people to really know. So it does take time. And then on top of that, there are really only two main ways to use AI for progress notes. Either the AI kind of logs in, like we are on something like Zoom, some kind of platform, and listens to your session, or you record your session, upload a recording of the session, and then the AI writes a note. The other big way is you enter the information from what happened in your session, and then the AI writes a note from that. Well, if you have to enter that data into an AI platform, you have already taken up as much time as you would just writing the note on your own. And then on top of it, you have to then go, you know, hit submit, go get your coffee, wait for it to produce the note, come back, and then review the note. And I swear to you, I can teach you how to write a note faster than that. So I would say those are the two big things. It’s not the saving grace you want it to be right now; that might improve. And you can’t de-identify enough information to get a good progress note.
Katie Vernoy 9:50
For the ones that listen, how does that work?
Dr. Maelisa McCaffrey 9:52
So you can either upload a recording of your session. So there are people who, for example, like, use their phone. And now we could get into all the tech stuff, like, is it HIPAA compliant to use your phone to record it and then upload it? We won’t even get into all those details. Or it’s very similar to using whatever platform you already log in to to do the session, if you’re doing it virtually. And it’s just like a little AI bot in the corner. Most of them do have this now, where it at least would say this is being recorded, so a client couldn’t accidentally be recorded without their knowledge. And then you end the session, and it automatically generates a note within, you know, 10-15 minutes.
Curt Widhalm 10:38
What are some of the things that people should be looking at in some of these companies, as far as how they handle stuff? You’re not the first person that I’ve heard say that there are companies out there that will put the little HIPAA logo on their website, and then you get deep into the user terms and you see, oh no, they aren’t actually HIPAA compliant at all. So I’m wondering if you can talk maybe a little bit more kind of on the tech end. How much of tech experts do therapists need to be? Like, what understanding of technology do they need to have and be able to apply in their decision making process, so that they can make the decision of: Company A has this stuff advertised and they follow through on it, where Company B says that they’re going to do it, but they don’t. What are some of the identifying things that people should be looking for?
Dr. Maelisa McCaffrey 11:33
Yeah, that’s a great question. The big giveaway is how it tells you to use the platform. So for example, the kind of popular one that’s out there that isn’t HIPAA compliant, but has a nice little symbol on their homepage. When you look at how to use the platform, it says, Do not enter any PHI.
Curt Widhalm 11:54
And I think, to interrupt you for a second, one of the questions that we get back from our audience all of the time is, who is this company? And why are we afraid of naming names?
Dr. Maelisa McCaffrey 12:05
Oh, yeah. Okay. Well, I just wanted to make sure; I’m not thinking of, there are a couple I get mixed up sometimes. But yeah, AutoNotes, right now, I would say don’t use. Because as of right now, maybe this will change, but they say they’re HIPAA compliant on their homepage; when you look at how to use it, it says not to enter PHI. And then when you dig into their terms and conditions, it has details that, I forget exactly what it says, but essentially it says they’re not HIPAA compliant. That’s really, really concerning. Because I don’t want everyone to feel like you have to dig into everyone’s terms and conditions every time you sign up for something electronic. And usually you don’t. But I would say, without feeling like you need to become a tech expert in everything, do look at how it’s telling you to use the software. So, for example, the ones that are being very careful about being HIPAA compliant, and being very mindful of all of these things, they have some type of aspect that, if there is a recording, for example, they ensure that the client is aware they’re being recorded. Right. So you know, kind of these practical things that we would consider as therapists, is the tech company thinking of that, too? So Mentalyc is another one that started pretty early, and I think they’re pretty good as far as, like, the HIPAA compliance stuff and being mindful of that. And they adopted things like that very early, compared with some of these other AI platforms. They have, like, a consent form and recommendations for how you can talk with your client about getting consent. You know, a company that is coming with that type of information is very different than a company that is just giving you claims about saving you time, and then when you dig in, they’re talking about not putting in PHI. Right. So, it’s not a black and white answer. I mean, aside from read the terms and conditions.
But I guess one quick thing is do they give you a BAA. So, AutoNotes won’t give you a business associate agreement, a BAA, which is what you need to be HIPAA compliant, you know, here in the US. So, if you’re not getting a BAA, it’s not HIPAA compliant. You can’t use it for client records. Like that’s a black and white, easy thing.
Katie Vernoy 14:34
What are the price points on some of these? Do you have a sense of it?
Dr. Maelisa McCaffrey 14:38
I do. So they’re typically a little bit cheaper than an EHR. So it’s anywhere from like $10 a month to maybe like $40 to $50 a month. Most of them are in that price range.
Curt Widhalm 14:51
So, I’m glad you’re bringing up EHRs, because I’ve talked with a couple of these companies, and you know, this is a place that has notes; they have client records at some point. EHRs are where we store client records. What makes some of these AI note generating companies not EHRs? Because I don’t know that I’ve had satisfactory answers from them, other than just kind of, well, we’re not an EHR. So…
Curt Widhalm 15:18
What might be some of that distinguishing? Kind of like, what makes an EHR like SimplePractice or TherapyNotes an EHR, versus here’s a company that I upload a video to, and they’ve got all sorts of PHI, and BAAs, and all of this kind of stuff? You know, if a client is sending out subpoenas, what is an EHR getting versus what is an AI company getting?
Dr. Maelisa McCaffrey 15:43
Yes, and there’s some nuance here, right, with, like, we’re saying EHR, electronic health record. But a lot of these would call themselves a practice management system, right? Like, it’s more of a practice management system where it has your calendar, and it has things integrated, and it has client records, but it also has a way of messaging clients, right. So there’s some nuance there. But in general, we say EHR, you know, most of us, when we’re talking with one another. So an EHR would have a full client record. Meaning if you did get a subpoena, or your client’s like, I want a copy of my records, everything you would need to copy, everything about that person’s treatment, is in that file, right, that electronic file. These AI platforms are time saving documentation tools, they’re AI tools, but they’re not a client record keeping system. So, there are none that I’m aware of that have a treatment plan, for example, or where you can have intake documents or informed consent forms signed. I think Mentalyc does allow you to have that one AI consent signed in their platform, but not all of your other forms that you might have. Like, not your intake assessment, your biopsychosocial data, you know, all those other things that you would need to have in a client record are not in these AI platforms. So they only have your progress notes, your session notes. And some of them delete the session notes, some of them don’t, and some of them allow you to choose if you want the session notes to be deleted or not. So that’s how they get around, you know, the HIPAA stuff and whether or not they’re saving the information. That, I mean, does that answer the question?
Curt Widhalm 17:28
I’m going to take what you’re saying, and I’m going to maybe oversimplify it here just a little bit for entertainment purposes. So, if I’m comparing this to kind of the old school, keep things on pen and paper kinds of things: if I scattered the client’s records across enough different file cabinets, and enough different places, that’s not a client record, that’s just notes kept separately. That’s what you’re kind of saying here, is that an EHR has all of that stuff shared in one folder, but if it’s kind of broken apart and separated across different companies, that doesn’t make, like, a Mentalyc an EHR system in this example.
Dr. Maelisa McCaffrey 18:06
Correct. So, if you use something like Mentalyc, or Upheal is another common one, you would have to go into that system, so go into Mentalyc or Upheal, do the note, then copy it and paste it into your SimplePractice or TherapyNotes or other EHR. Which is part of why it goes back to when I said it is not the time saving silver bullet you want it to be. It’s actually adding time, in a lot of cases.
Katie Vernoy 18:36
Well, I’m also thinking, in the future, I think these EHRs should really have some of this built in. Because I was going to ask you, how does the clinical loop function if these AI note writing bots don’t have what the goals are, or what the overall treatment is, or what the diagnosis is? To me, it seems like, you know, if you’re doing it in the session, maybe it comes out in the note. But if the bot doesn’t know what you’re working on, it would be hard to write any kind of documentation that shows medical necessity.
Dr. Maelisa McCaffrey 19:12
Exactly. And that’s one of the big things that I have talked with a lot of these companies about. Is that, you know, insurance requirements are very real, and it’s a common concern, and how are we addressing that? And it needs to be something written into their platform to tell the AI to look for and evaluate. But even so, even if they can say, okay, hey, AI, always include this type of statement, you know, pull out whatever information from the session to create a progress statement, which is what I call it for insurance, does that actually align with the treatment goals, right? Does that align with the symptoms, with the diagnosis? That’s something you still have to know: how to write a note, and what needs to go in the note. Right? So I do believe that a lot of these AI platforms can save a lot of people time and effort. Even if it takes you the same amount of time, it saves you the mental effort of writing a note; I can see benefit from that. But it is something that is still new enough that the reason a lot of them couldn’t tell you why they’re different from an EHR is, frankly, they don’t know. Like, these are not therapists creating most of these platforms. These are people that I’ve had to explain what medical necessity is to. Like, I have literally talked with these people and said, what do we do about insurance? And they’re like, oh, well, what should we do about insurance? Right?
Curt Widhalm 20:37
Doesn’t that bring up the concern that maybe we’re not quite ready to hand over client information? I’ve seen plenty of legal people throw out therapy terms, whether it’s, you know, lawyers for companies like this; I’ve seen judges put things down on court orders for therapy that’s like, this is conflicting information even within your own writing that refers to different things. Like, is this something where, all right, maybe you’re a little bit more out on kind of your risk tolerance than I am with this stuff. But is my feeling just kind of right, that maybe they’re not quite ready yet, until they do this a little bit more, before we’re ready to jump in on this?
Dr. Maelisa McCaffrey 21:18
We’re still testing it, for sure. I’m not as concerned about the tech part of it, because the people doing this stuff, as far as security, are the tech experts. So as far as the security aspect, they actually know a lot more about that than we do. Right. And they can explain a lot of that stuff way better than I ever could. So that part I’m okay with, as long as they’re being honest and upfront.
Dr. Maelisa McCaffrey 21:46
You know, and yeah, they’re still figuring it out. They’re still improving things. I will say, like, Mentalyc heard about the fact that all of them created fake information in the notes that I was having them create, and made some changes in their system to try and avoid that, you know. So some of these companies are really actively trying to improve. And they’re still new enough that there are things like that coming up that they need to improve, like, last month, you know. So it is something to be mindful of.
Katie Vernoy 22:23
I’ve mostly heard about using AI, you know, if it’s for documentation, I’ve mostly heard about it for notes, right? That this is the thing we all hate to do. And I picture this time in the future when my EHR that has built in telehealth also has this built in AI. It knows the whole chart, it helps me build the whole note, and it’s gorgeous. It’s immediately inputted into the right chart, and all I have to do is a little bit of editing and sign, right? Like, that would be an amazing thing. It’s not there yet, it sounds like. And so the part that makes me want to, like, cringe, you know, my progress notes, that’s not really getting solved yet. Like, people who are willing to do it, they’re the beta testers, they’re helping folks figure it out. Maybe it’ll be ready for primetime, ready for Curt, in a year or two. But in, you know, kind of the prep for this episode, you said that there are other uses for documentation beyond progress notes. What do you mean by that?
Dr. Maelisa McCaffrey 23:21
Absolutely. And I should say that I actually have discovered there is one combo EHR/AI platform.
Dr. Maelisa McCaffrey 23:29
It’s called Orchid. I have tested it a little bit, and it looks pretty awesome. I haven’t broken it yet. I haven’t found anything super negligent about it yet; it seems great. There’s an attorney that’s part of the team that started it, so they’re kind of on top of that stuff. And it does what you are saying, right, where you go in and you write your notes. And you can choose to either do your note yourself, or at the top of the note is kind of a text field where you can put in the information, or you can just have it upload a transcript, or you can log into their system and use their video platform. All the options that…
Dr. Maelisa McCaffrey 24:10
It’ll give you a note. And even that platform, when I was getting the walkthrough from the creator of this platform said, I could see how AI would be really helpful initially, but over time, I think just using the checkboxes is faster than using the AI in the EHR. So, so even like it’s useful, but it has some really cool potential features, like if you can set up your account in that EHR, and have all of your specialties and have certain treatment plans based on your specialties. So, let’s say you’re an EMDR therapist, it will automatically pull out specifically EMDR interventions for you knowing that that is, you know, a treatment plan, treatment plan you’ve created. You know, so there are some really awesome ways I think it can help integrate the entire record together. Are and solve a lot of these these problems that we’re discussing. And it’s still really new. And even with that seeming like a really awesome platform, like I said it, even the creator himself was like this seems easier to just do on your own. I’m doing. So there’s that. So beyond that, there are a ton of ways you can use AI. So I mean, I still use ChatGPT for other things. You can use AI to create templates. So while you can’t use it for a specific like, ChatGPT let’s, let’s just go with that. Because it is kind of the best one out there. There’s no reason you can’t say create a treatment plan template for someone with depression. And use it to give you a starting place. Right. So it gives you kind of a framework. And then you can pick out what you like, what you don’t like, phrase it, it’ll give you some phrases, and then you’re kind of picking and choosing. So, in a very similar way to how you would use one of these progress notes planners or treatment planner books, they could order on Amazon or something, right, it’s kind of similar in that way. Use it to create some cheat sheets for yourself and get ideas of the type of language that you would like to use. 
You can also use it for things like creating resources for clients. Like, how many of us have spent hours collecting information? That’s something ChatGPT could do for you in 20 seconds. Like, please collect all of the emergency numbers for Los Angeles County in California, and it’ll give you a whole list of everything. You do still need to fact check it, but that’s way easier than going to 10 different websites and collecting numbers that you might want. Give me a list of psychiatrists in this city. For things where you have no idea where to start, it can be a really good starting place to help you filter information.
Curt Widhalm 27:04
How are you seeing people talk with their clients about this stuff? Like, how much are we obligated to tell clients? Obviously, you’ve referred to recording sessions, but if a therapist out there isn’t recording a session, but just kind of jots down some information and types it into one of these companies, do we ethically have an obligation to tell clients, hey, here’s what I’m doing to create your notes?
Dr. Maelisa McCaffrey 27:31
That’s a gray area, because it’s so new that nobody’s ethics code has addressed it yet. Now, we have principles, right, based on technology in general. And I would say, if a technology is so new that nothing has addressed it yet, in general, most of our ethics codes would tell us that you need to inform your clients. You would treat it as an experimental type of thing, where you would need to inform people of what you’re doing. And it is not acceptable to then say, oh, well, because this saves me time, if you don’t want me to use it, I can’t see you. It’s an optional thing, and it’s your client’s option, not yours. So, if your client is not comfortable with it, you have to be okay with that. To use a comparison, it’s a little different than, say, an EHR. I could tell a client, I use this secure system to keep your record safe, and if someone’s like, I don’t want you to use that system, I could say, this is not a good fit. Because there are tons of options, and it’s just such an established and safe practice that we can all feel comfortable with. AI is not like that. So you do need to tell people. And I would say there’s one ethics code that has created a statement on AI specifically. I don’t know if they’ve officially published it yet, but it’s coming any time now, or it has recently been published. I think it’s AMHCA, the American Mental Health Counselors Association, and they’re essentially saying exactly that: if you want to use AI, then you need to be conscious about it being HIPAA compliant and secure, and you need to inform clients. They’re specifically outlining informed consent as a requirement for using AI.
Curt Widhalm 29:28
Makes it easy when it’s a new client coming into a practice, but what about those who are considering, hey, I’ve been working with you for years, and I am now switching over, I am team AI, and I will be doing this with your information? Is this just kind of an, alright, this is going to create termination issues, where it’s like, you can adopt what I’m adopting in my practice, otherwise you can get out in three sessions and here are some referrals that are generated by ChatGPT?
Dr. Maelisa McCaffrey 30:01
So, I mean, obviously, if it creates enough of an issue, everything is clinical fodder, so we can have lots of conjecture around that. But you can’t choose to make it an issue. As the therapist, you have to give the client options; you need to present it as optional, and not as: this is what I’m doing now. I will say that as a general rule, based on the therapists I’ve worked with who are using it, it appears the general population is way more open to this than we are. I haven’t heard anything negative from therapists who have talked with their current clients about starting to use it. Most of their clients think it’s pretty cool, or they get it, or they’re like, oh yeah, that makes sense, I wouldn’t have thought of that, and they’re fine with it. That doesn’t mean every client’s gonna be okay with it. But I think we tend to be a little bit more hesitant about this stuff.
Katie Vernoy 30:59
It seems like a lot of folks are just embracing AI in a lot of different ways. I think I tried out ChatGPT and then forgot about it for a while. And then in prepping for an episode today, I was like, I don’t have time to read all this, and it feels really confusing, so let me just pop some of this in and say, explain this to me like I’m 12. And it totally helped.
Dr. Maelisa McCaffrey 31:20
Katie Vernoy 31:21
But what do you think therapists need to know about AI generally, and advances in technology? Because I think that’s the issue around being a therapist: some folks are state of the art, they know everything, and they’re ready to jump in. And then there are folks that are like, I know I should have an electronic health record, but I’m still writing my paper notes.
Dr. Maelisa McCaffrey 31:41
Yeah, well, and there’s no should. Like, if it works for you, it works for you. I think in general, the thing to know is that it’s still so new. I mean, it is brand new. Electronic health records are not new; they’ve been around for 20-plus years now. AI is so, so new that it will be different in six months. You know, you did an episode on this a year ago, and it was different, right?
Katie Vernoy 32:11
Dr. Maelisa McCaffrey 32:13
If we do this episode again in a year, I expect it will be different. I do expect a lot of these EHRs will start collaborating with these AI platforms, and having those two things integrated is going to become the way it really works. So five years from now, this may not even be a big topic, because it’s going to be integrated into all the EHRs, right? In that way, it’s still very, very new. And it is here. Don’t be afraid of it. Don’t shy away from it. It’s not going anywhere; enough people like it and are on the bandwagon that this is it. So figure out how to feel comfortable with it, how to use it, and how to do that in an ethical way.
Curt Widhalm 32:58
So you’re talking about how the ethics codes haven’t really been updated yet. If there are people on those ethics committees who are listening in on this, and I mean, I may or may not be one of them, what would you recommend seeing in some of the ethics codes to be prepared for this transition?
Dr. Maelisa McCaffrey 33:23
I think the client consent issue is huge. That comes up over and over again, regardless of what we’re talking about with technology, with all the things we do. So absolutely having informed consent as part of it. And then I think making sure that a therapist has at least checked to see: is something HIPAA compliant? And I know the ethics codes don’t specifically say HIPAA compliant, but…
Curt Widhalm 33:52
Laws around technology.
Dr. Maelisa McCaffrey 33:53
Yeah. However, they frame the…
Katie Vernoy 33:55
Dr. Maelisa McCaffrey 33:56
Yeah. But there are already enough AI platforms for therapists for progress notes that are HIPAA compliant. There’s no reason not to use one of those. There’s no reason to use ChatGPT for your progress notes if you can pay 15 bucks a month to have something that secures your client records. So I think that’s what it needs to address. At least on my end, those are the two big things that come up: is this platform actually secure, and are clients informed of the process?
Katie Vernoy 34:32
So the question that always comes up around AI for me, and this is kind of switching gears just a little bit before we wrap up: everyone’s worried that AI is going to take our jobs. And to me, uploading a whole bunch of sessions and having AI either transcribe them or make them into notes or write reports or whatever, it’s a really good way to train AI on what therapy looks like. What do you think about that fear? Am I just totally paranoid? Or is this…?
Dr. Maelisa McCaffrey 35:04
Yeah, I think it is happening in some ways. But I don’t think we have to be worried about it taking our jobs. I think it’s okay for people to have AI services that help them with something, you know. I don’t think it’s a bad thing, allowing people more access to mental health resources, which is really what it’s going to be. And if we can be involved in that process, it will vastly improve it, which is why I’ve been willing to talk to a lot of these AI platforms from the beginning. Because I’m like, hey, you haven’t thought about insurance, and none of them had. I would rather be a part of them improving that system than have them roll it out without even considering that, and then having other therapists take the fall. So we might as well be a part of making these AI therapists and these AI therapy services better. And there are a lot of people with really, really good intentions and really good hearts who are trying to do that. There are tech companies who are just trying to make money off of it and seize the opportunity, but there are also really good people involved in this that we can work with, and make this a good thing. And I think if we could collaborate, it could really be a very beautiful partnership. Like, what if your clients also had this AI service that they could use throughout the week when they’re not seeing you? It gives them little tips, or it reminds them of things you two have talked about in a session, or homework you’ve assigned, or it even brings up to you things that maybe you haven’t brought up. What if you could just click a button when you realize, oh yeah, I forgot to tell them about this, and it shows up in their app? The client can actually access it now, and you don’t have to remember to bring it up next week.
There are a lot of benefits to integrating traditional therapy with AI therapy. And if we’re too scared to look at that, too scared it’s gonna take away our jobs, then I think we’re gonna miss a lot of those benefits.
Katie Vernoy 35:40
I actually think that sounds pretty good. So I’m with you. My paranoia is gone.
Curt Widhalm 37:37
Where can people find out more about you and the courses that you’re doing and all the wonderful things that you’re up to?
Dr. Maelisa McCaffrey 37:44
Yeah, QAPrep.com is the go-to place for everything. And you can find me on YouTube under my name, Maelisa McCaffrey. I have a whole AI series there, and I am doing specific reviews on all the different AI platforms. So, you can check that out, too.
Curt Widhalm 38:01
And we’ll include links to those in our show notes over at mtsgpodcast.com. Our show notes are totally not written by AI and…
Katie Vernoy 38:12
They actually aren’t, Curt. You made it sound like they were.
Curt Widhalm 38:14
They aren’t, I…
Katie Vernoy 38:19
It’s painstaking. What has been written by AI is our transcripts (though they are double checked and corrected by a human…).
Curt Widhalm 38:25
You can also find the transcripts over at mtsgpodcast.com. And follow us on our social media. Join us on our Facebook group, the Modern Therapists Group to continue on this conversation and until next time, I’m Curt Widhalm with Katie Vernoy and Maelisa McCaffrey.
Thank you for listening to the Modern Therapist’s Survival Guide. Learn more about who we are and what we do at mtsgpodcast.com. You can also join us on Facebook and Twitter. And please don’t forget to subscribe so you don’t miss any of our episodes.