Photo of a smartphone and smart device. Text over the image reads,

Who’s in the Room? Siri, Alexa, and Confidentiality

Curt and Katie chat about how therapists can maintain confidentiality in a world of AI assistants and smart devices. What duty do clinicians have to inform clients? How can we balance confidentiality with the reality of how commonly these devices are present during therapy? Can telehealth therapy be completely confidential and data secure? We discuss the shift in clinical responsibility, best practices, and how we can minimize exposure of clinical data to ensure the confidentiality our clients expect and deserve.

Transcript

Click here to scroll to the podcast transcript.

In this podcast episode we talk about something therapists might not consider: smart devices and AI assistants

We received a couple of requests to talk about the impact of smart devices on confidentiality and their compliance with HIPAA within a therapeutic environment. We tackle this question in depth:

What are best practices for protecting client confidentiality with smart devices?

  • Turning off the phone, or placing the phone on “airplane mode”
  • Warning clients about their own smart devices and confidentiality risks
  • The ethical responsibilities to inform about limits of confidentiality and take precautions
  • It’s all about giving clients choice and information

What should therapists consider when smart devices and AI assistants are in the room?

“It’s not to say we have to be Luddites, it’s that we have to disclose the potential limits of confidentiality that clients have come to expect.”

– Curt Widhalm

  • Whistle-blower reports on how often these devices are actually listening
  • Turning off your phone is a lot cheaper than identity theft
  • Consider your contacts, geolocation, and Wi-Fi connection
  • Some of this, as we progress into a more technological world, might be unavoidable

How do Alexa and Siri impact HIPAA compliance for therapists?

  • The importance of end-to-end encryption for all HIPAA activities (and your smart device may not be compliant)
  • The cost of HIPAA violations if identity theft can be traced back
  • Understand the risks you are taking, do what you can, and remember no one is perfect

What can modern therapists do with their smart devices?

“Whether it’s convenience or practicality that has you putting your clients’ contacts into your phone, we have to think beyond that because it really can harm our ability to keep that data safe.” – Katie Vernoy

  • GPS location services can be left on for safety reasons; emergency services use GPS location
  • Adjusting settings for voice activation, data sharing, when apps are running, locations, etc.
  • Turning off and airplane mode are also options
  • Always let the client know the limits of confidentiality

Our Generous Sponsor for this episode of the Modern Therapist’s Survival Guide:

Buying Time logo

Buying Time LLC

Buying Time is a full team of Virtual Assistants with a wide variety of skill sets to support your business, from basic admin support, customer service, and email management to marketing and bookkeeping. They’ve got you covered. Don’t know where to start? Check out the systems inventory checklist, which helps business owners figure out what they don’t want to do anymore and get those delegated ASAP. You can find that checklist at http://buyingtimellc.com/systems-checklist/

Buying Time’s VAs support businesses by managing email communications, CRM or automation systems, website admin and hosting, email marketing, social media, bookkeeping, and much more. Their sole purpose is to create the opportunity for you to focus on supporting those you serve while ensuring that your back office runs smoothly. With a full team of VAs, you can hire for one role and get multiple areas of support. There’s no reason to be overwhelmed with running your business with this solution available. Book a consultation to see where and how you can get started getting the support you need – https://buyingtimellc.com/book-consultation/

Resources for Modern Therapists mentioned in this Podcast Episode:

We’ve pulled together resources mentioned in this episode and put together some handy-dandy links. Please note that some of the links below may be affiliate links, so if you purchase after clicking below, we may get a little bit of cash in our pockets. We thank you in advance!

Psychotherapy in Ontario: How Confidential is my Therapy? By Beth Mares, Registered Psychotherapist

The Privacy Problem with Digital Assistants by Kaveh Waddell

Hey Siri and Alexa: Let’s Talk Privacy Practices by Elizabeth Weise, USA Today

Patient and Consumer Safety Risks When Using Conversational Assistants for Medical Information: An Observational Study of Siri, Alexa, and Google Assistant, 2018

Hey Siri: Did you Break Confidentiality, or did I? By Nicole M. Arcuri Sanders, Counseling Today

Alexa, Siri, Google Assistant Not HIPAA Compliant, Psychiatry Advisor

Hey Alexa, are you HIPAA compliant? 2018

Person-Centered Tech

Relevant Episodes of MTSG Podcast:

Which Theoretical Orientation Should You Choose?

Is Your Practice Ready for Paid Digital Marketing? An Interview with John Sanders

Waiving Goodbye to Telehealth Progress: An interview with Dr. Ben Caldwell, LMFT

Malpractice is No Joke

Who we are:

Picture of Curt Widhalm, LMFT, co-host of the Modern Therapist's Survival Guide podcast; a nice young man with a glorious beard.

Curt Widhalm, LMFT

Curt Widhalm is in private practice in the Los Angeles area. He is the cofounder of the Therapy Reimagined conference, an Adjunct Professor at Pepperdine University and CSUN, a former Subject Matter Expert for the California Board of Behavioral Sciences, former CFO of the California Association of Marriage and Family Therapists, and a loving husband and father. He is 1/2 great person, 1/2 provocateur, and 1/2 geek, in that order. He dabbles in the dark art of making “dad jokes” and usually has a half-empty cup of coffee somewhere nearby. Learn more at: http://www.curtwidhalm.com

Picture of Katie Vernoy, LMFT, co-host of the Modern Therapist's Survival Guide podcast.

Katie Vernoy, LMFT

Katie Vernoy is a Licensed Marriage and Family Therapist, coach, and consultant supporting leaders, visionaries, executives, and helping professionals to create sustainable careers. Katie, with Curt, has developed workshops and a conference, Therapy Reimagined, to support therapists navigating through the modern challenges of this profession. Katie is also a former President of the California Association of Marriage and Family Therapists. In her spare time, Katie is secretly siphoning off Curt’s youthful energy, so that she can take over the world. Learn more at: http://www.katievernoy.com

A Quick Note:

Our opinions are our own. We are only speaking for ourselves – except when we speak for each other, or over each other. We’re working on it.

Our guests are also only speaking for themselves and have their own opinions. We aren’t trying to take their voice, and no one speaks for us either. Mostly because they don’t want to, but hey.

Stay in Touch with Curt, Katie, and the whole Therapy Reimagined #TherapyMovement:

http://www.mtsgpodcast.com

http://www.therapyreimagined.com

https://www.facebook.com/therapyreimagined/

https://twitter.com/therapymovement

https://www.instagram.com/therapyreimagined/

Consultation services with Curt Widhalm or Katie Vernoy:

The Fifty-Minute Hour

Connect with the Modern Therapist Community:

Our Facebook Group – The Modern Therapists Group

Modern Therapist’s Survival Guide Creative Credits:

Voice Over by DW McCann https://www.facebook.com/McCannDW/

Music by Crystal Grooms Mangano http://www.crystalmangano.com/

Transcript for this episode of the Modern Therapist’s Survival Guide podcast (Autogenerated):

Curt Widhalm 00:00

This episode of the Modern Therapist’s Survival Guide is sponsored by Buying Time.

Katie Vernoy  00:04

Buying Time has a full team of virtual assistants with a wide variety of skill sets to support your business, from basic admin support, customer service, and email management to marketing and bookkeeping. They’ve got you covered. Don’t know where to start? Check out the systems inventory checklist, which helps business owners figure out what they don’t want to do anymore and get those delegated ASAP. You can find that checklist at buyingtimellc.com/systems-checklist.

Curt Widhalm  00:31

Listen at the end of the episode for more information.

Announcer  00:34

You’re listening to the Modern Therapist Survival Guide, where therapists live, breathe, and practice as human beings, to support you as a whole person and a therapist. Here are your hosts, Curt Widhalm and Katie Vernoy.

Curt Widhalm  00:50

Welcome back, modern therapists. This is the Modern Therapist Survival Guide. I’m Curt Widhalm, with Katie Vernoy. And this is the podcast for therapists about all things therapy, the things that we consider and the things that we don’t. And today is one of those days where we’re going to be talking about some of the things that we might not consider. This really comes with some of those smart devices in our homes, our offices, potentially even in our clients’ homes, and what it means for confidentiality, especially in terms of compliance with things like HIPAA, and who’s always listening. You know, Google a few years ago changed their motto from “don’t be evil” to whatever it is now; I just know that they’re no longer committing to not doing evil. But I want to start with this idea: especially when we start with telehealth clients, but this is also going to be true when it comes to our in-person sessions, with things like smartphones and just being cool in the modern era and having things like Amazon Echos or Google Homes or any of these kinds of things in our offices, are those things always listening, and what does this mean for client data?

Katie Vernoy  02:07

That’s a big intro. Yeah, I’ve worried about this for a while, and that’s why I don’t have any kind of AI in my office. Although, after reading some of these articles, I actually do, because I have my phone in my office, because I receive messages and do all kinds of stuff. So it’s a little bit scary to think about what might be listening.

Curt Widhalm  02:33

So, I mean, this is where I think any of us who have a Windows laptop, there’s Cortana; if you have one of these Amazon devices, there’s Alexa; if somebody, you know, has Siri: these things are listening. And while some of the tech coverage might say that they’re only listening for keywords that would activate them, the articles that we’re looking at here are what we’re going to dive into today. As far as: are our sessions with clients actually as confidential as we say they are? And what does this mean for our own best practices as we go forward, having smart devices in our offices, in our homes, and potentially even in our clients’ homes? The way that this conversation initially came up was I was at a dinner party with some other therapists, talking about the great dinner party talk that happens wherever I’m at with other therapists, which is...

Katie Vernoy  03:34

Yeah, only with therapists. With me, I tend to...

Curt Widhalm  03:37

Get people asking a lot of ethics questions. And one of the questions that was up for discussion was our duties when it comes to talking with clients about confidentiality, particularly when it comes to telehealth. I was describing that we have a responsibility to talk with our clients about the limits of confidentiality, which may include privacy in their own homes, if there’s potentially somebody walking down the hallway outside their bedroom or office door, wherever they’re doing sessions from. And one of the other therapists at this party said, “Well, what about any of the smart devices? Do you ever warn them about Google or Alexa or Siri actually listening?” And that’s what sparked this. So if you ever want a podcast episode, I am available for dinner parties for you to float ideas by.

Katie Vernoy  04:28

Okay, okay, there we go. And so this

Curt Widhalm  04:31

Has led to some research on our part here as far as what is our responsibility? And what do we need to do with our clients as it pertains to some of this AI discussion, even when we don’t think that it’s happening?

Katie Vernoy  04:47

Well, to me, when you proposed this idea for the podcast, the first thing that came to mind was really around convenience versus confidentiality. Because when we’re looking at a lot of these things, when we don’t turn off voice activation, when we don’t make sure that we’re not connected to everything through our phones and all of the contacts and everything within our phones, data is at risk. I mean, even if it’s as little as a GPS geolocation, it could be a contact, it could be content that you’re actually discussing. There are a lot of different ways that folks use their phones to kind of just live their lives, and the convenience of having Google read through your emails, or whatever it is, to be able to scan for things that need to go on your calendar. You know, in the before times, when I was traveling, I loved that Google knew where I was flying to and what flight I was on, and I would get that information and notifications like “you should be leaving for the airport right now.” So I think it’s something where the convenience of having the AI tracking us and listening to us and reading our emails has sometimes trumped our need for privacy.

Curt Widhalm  06:09

The first article that I came across on this is from Counseling Today, a publication of the American Counseling Association. The article, by Nicole M. Arcuri Sanders, is called “Hey Siri: Did you break confidentiality, or did I?” In it, Dr. Arcuri Sanders cites an article from The Guardian about an Apple contractor who is a whistleblower. This contractor is quoted as saying that they regularly hear confidential medical information, drug deals, and recordings of couples having sex as part of the contractor’s job providing quality control. So these devices have, at least historically, listened. Now, this flies in the face of some of the tech articles I’m seeing out there, which say that these devices are only listening for the keywords that activate them; but they’re actually constantly on, and according to the contractor cited in this Guardian article, they are recording and sharing this information. So it’s theoretically very easy to believe that it’s also listening in on your therapy sessions. If that’s the case, with Apple having this information, then for everybody who’s got an iPhone being brought into your session, these devices are potentially listening to everything that’s being discussed in your sessions. Which is scary, because I imagine that most therapists are not talking about this as a potential break in the limits of confidentiality and the promise of confidentiality that makes therapy so sacred.

Katie Vernoy  07:55

And I think that, as a society, we have kind of co-signed on this lack of privacy. I mean, Siri or Alexa or Google or whatever are potentially constantly listening to all of us, and that’s part of life. So are we responsible beyond this risk that all of us are willing to take by having phones in our pockets?

Curt Widhalm  08:20

And I don’t think many of us are, and we’ll include the links to what we’re talking about here in our show notes. You can find those over at mtsgpodcast.com. The next thing that I’m looking at here is a blog post on psychiatryadvisor.com called “Alexa, Siri, Google Assistant Not HIPAA Compliant.” Obviously, we all at this point should know that you shouldn’t be doing your notes on one of these devices using voice prompts. But this article also warns: don’t add clients to your schedule using one of these either, because it’s not an end-to-end encrypted device, which is one of the requirements of HIPAA, and HIPAA violations can cost people hundreds or thousands of dollars if identity theft can be traced back to them. Think of how convenient it is to just turn off your phone, and how much money this may end up saving you, by just doing the simplest of things.

Katie Vernoy  09:28

Yes, yes, I again, but I still want to you know, we’re I know we’re

Curt Widhalm  09:34

Any good ethics discussion should leave people anxious.

Katie Vernoy  09:39

But my question still stands. If I do my part because I am a HIPAA provider: I put my phone on Do Not Disturb or airplane mode, I don’t have any other devices with listening capability in my room, I only use my electronic health record for scheduling and communication, and HIPAA-compliant email, and so on. Like, I do all the things, and my client still has a smartphone in their pocket. Do I actually need to warn them about that smartphone in their pocket, when they already theoretically are agreeing to this constant surveillance by having that smartphone in their pocket?

Curt Widhalm  10:22

I think that we have a duty, and this is reflected in our ethics codes, to tell our clients even things that they may not have considered as they pertain to therapy, about where limits of confidentiality may lie. There may be the constant surveillance of these devices in everyday life, but we should further prompt them, at least and especially in our first telehealth session with them: “Hey, just in case you haven’t considered this, your smart devices in the room may also be listening to your therapy session. And while it’s not the same thing as a sibling or somebody else, a brother, a parent, a child, you know, walking down the hallway, there is the potential that some of this information may be transmitted to people that you don’t want it going to. If that’s a consideration, if you want to unplug those devices in the general listening area right now, now would be the time to do so.”

Katie Vernoy  11:24

Okay. I mean, that seems fair. I think there are going to be people talking about this now that we’ve put this podcast episode out. So I think we also don’t want to freak people out. I mean, I think about it like: yes, the data is being transmitted, but how much data are people actually looking at? It’s such an inundation of all of this surveillance data that the likelihood of someone homing in on a therapy session as part of quality control feels small. And I’m not saying we shouldn’t do anything about it, I’m just saying, I’m gonna...

Curt Widhalm  11:59

Wave your argument away and say that the likelihood of somebody breaking into your office and looking at client files is also very small. But that does not absolve you of your responsibility to take precautions and to let your clients know about the limits of confidentiality.

Katie Vernoy  12:17

I think, in talking about it with clients, the way you just said it sounded a little paranoid. You...

Curt Widhalm  12:22

Are being listened to.

Katie Vernoy  12:25

You’re being listened to. I think there’s potentially a clinically relevant way to talk about it. I mean, as you know, smart devices that have voice activation can potentially get activated by words that we use, so you may want to turn that off; if there are devices in your room, turn them off, turn off voice activation, whatever. But, like, “there are devices listening in your room, you may want to unplug them”? You sounded a little paranoid. It’s true, but I don’t know, it just feels a little bit paranoid to me.

Curt Widhalm  13:03

Your paranoia is my legal precaution. And it doesn’t have to be presented in that paranoid sort of way. It’s just, you know: hey, it’s known at this point. Little disclosure here: we have a little, you know, Alexa thing sitting in our living room, and sometimes our TV activates it, and then we get little ads on the Alexa based on whatever show activated it. So all of a sudden we’re getting, you know, Airbnb recommendations for wherever the TV show we just watched was located. It’s not that much of a stretch of the imagination to think these things are listening; it’s happened a couple of times with my phone just in this episode. It doesn’t have to be done in a paranoid sort of way. It’s just kind of a “hey, if your privacy means that much to you, and you’re going to be talking about these sensitive things, you might want to consider shutting off those voice-activated things in your room.”

Katie Vernoy  13:55

Well, I mean, the other thing that we talked about before starting to record is also the geolocation and potentially the contacts on your phone. And so, to me, I feel like, at some point, if we are going to be in a technological society, there may be things that we just cannot avoid. And maybe I’m wrong. I mean, do I just never turn on my phone when another person’s in my office? Like, if someone’s actually physically coming to my office, and our phones have crossed GPS, and all of our apps say, “Oh, they’re in the same room, they must like the same things,” and then start feeding us ads for the things that either we’ve talked about, because voice activation is on, or the things that each other have searched for... I mean, it starts to get a little bit nutty to basically be Luddites at the moments during which we’re doing therapy.

Curt Widhalm  14:53

It’s not to say that we have to be Luddites, it’s that we have to disclose the potential limits of confidentiality that clients may have come to expect. Sharing on a Wi-Fi network: if you’re a well-intentioned therapist who lets a parent onto the Wi-Fi network in your office while their kid is doing therapy, that’s one way that some of these algorithms work to match up people who “should be connected” on some of the social media sites. If you’ve got a client’s phone number saved in your phone, and you’ve given third-party apps permission to scan through your phone book, these are other ways that you’re potentially transmitting data in ways you may have misrepresented in the Notice of Privacy Practices that you give to your clients, if what you say you’re doing with their information and how it’s going to be shared doesn’t match the inadvertent ways you’re actually sharing it. I’m not aware of any court cases where a therapist has been taken to court on this, but I could see where a therapist could be held liable for having some of this data shared in ways that were never covered in their Notice of Privacy Practices. You know, they take their boilerplate language from somebody down the street, who took it from somebody down the street, who took it from somebody down the street, who took it from a paid lawyer that they were actually responsible with. So since we tend to copy and paste and borrow and pay homage to other people’s paperwork, by just borrowing and stealing and calling it our own, we may not actually be aware of everything in some of these Notices of Privacy Practices that we give out. If what you’re doing is transmitting some of this client data, you at least should document that you’ve had some of these discussions with your clients, as a way of limiting your liability when it comes to having any of these kinds of devices around you.
And if the conversation and your own anxieties haven’t pointed it out so far: we all have these devices. This should be a regular part of the conversation, especially if you’re talking about a lot of protected health information and you’re already a HIPAA-covered entity. You have to be aware of this.

Katie Vernoy  17:11

Going back to the original thought that I had around this: whether it’s convenience or practicality that has you putting the contacts in your phone, for example, I think that we have to think beyond that, because it really can harm our ability to keep that data safe. I mean, I think about inadvertent sharing. I have done a really good job at keeping my data away from Facebook: I don’t log into anything with Facebook, I’ve tried to keep Facebook fairly separate, and I use a really old email that’s not connected to my practice in any way. I’m not sure that anybody else wants to do that. But, like, I don’t share contacts with any of my social media, so my phone is never mined for those things; I actively go through and deny those permissions. But to me, it could be as simple as a slip of a button press, so to speak, where you’ve shared all your contacts to LinkedIn, Twitter, or any other social media platform, because you allow all of the permissions on your phone, because it’s easier, because, “Oh, well, I’ll find my friends, I don’t have to go search for them individually.” I mean, there are so many ways, very seductive ones, that we could end up in this inadvertent data sharing.

Curt Widhalm  18:33

You know, this is no commentary on you, but you identified yourself as not, like, a super tech-savvy person, and yet I would say that what you just described is more tech-savvy than what most people would think about. And that’s why we have some of the responsibilities that we do in talking with clients about how their health information may go beyond just our therapy sessions. Some of these articles that we’ve seen talk about, you know, don’t do things like write your notes by saying, “Hey Google, write in this patient chart X, Y, and Z.” Those things would seem obvious, especially to a lot of our modern therapist community, who would be like, yeah, that totally makes sense. But just the actual presence of any of these devices around us is, you know, a matter of lifestyle for some people, and it’s knowing how to go in and shut off some of these things, or being able to talk with some of our clients about this. Because something that’s happened during the COVID pandemic, with a lot of telehealth, is that we’ve also become de facto IT people when it comes to explaining to some of our clients just how to make some of the telehealth stuff work. And so if, with our EHR platforms, as simple as they get made for user experiences, people are still having trouble with those, then knowing where to go on a phone to find “here’s where data gets shared back and forth,” well, that might be a little bit outside the scope of what we want to talk about with clients. Sometimes it’s more simple: “If you have these devices, and you don’t want this conversation being shared with any of the apps on your phone, best practice might be just to turn them off during our sessions. But if you leave them on, just know that we can’t guarantee complete confidentiality.” That’s it.

Katie Vernoy  20:20

That seems fair. Um, one of the things that you said earlier, though, struck me, because I think that you and I obviously wouldn’t, you know, transcribe our notes or do voice-dictated notes on our phone. But that’s kind of an accessibility issue for some folks who can’t type or handwrite their notes, and I would be very curious how to protect in that regard. You know, if I’ve got a voice recorder that helps me do my notes, is it within a HIPAA-compliant platform that goes directly into my notes? This might be something people need to research: how do all of my apps interact, and how do I make sure there’s not more than what I’m working on open and listening? Because I think that’s hard. And I would say I actually am tech savvy, but understanding how privacy and data work, how things interact with each other, how there are data handoffs, those types of things feel like they are beyond the scope of being a therapist. But I like what you’re saying: then just turn these devices off. I guess the only problem is, I have clients that use their phone for their telehealth session. I use SimplePractice, and I don’t know if SimplePractice then makes sure that other apps on the phone are not listening. I don’t know if there’s even a way to do that, or if there is a way for people to, you know, go through and disable each of the apps that you don’t want listening. I mean, it feels like there’s a challenge here to really having a practical solution, unless we can be certain that the platform we’re using for our video calls on the phone is actually secure. My assumption is that’s the case; I just don’t know what else is listening, if that’s even possible.

Curt Widhalm  22:11

In preparation for this episode, I did not do a deep dive into how our EHR platforms work when they are used on our devices. With the more popular EHR companies, like SimplePractice, which you mentioned, those video sessions, if they are HIPAA compliant and they have signed a BAA with you, are end-to-end encrypted communications. Now, what I did not do a deep dive on is whether that also prevents other apps and things from listening if it is being used on the solitary device that your session is on. TBD. You know, follow us on our social media or whatever, and we’ll sort through that. It does come back to this point, especially as we can see some of these tech companies moving more and more into the healthcare space: they’re going to make closer and closer efforts to become HIPAA compliant. And this is always kind of a cautionary sort of thing. I’m a part of a lot of Facebook groups with therapists, a lot of online communities, and I see a number of people wanting to do things as inexpensively as possible. But without those BAAs, business associate agreements, you’re not guaranteed to have the same HIPAA protections if that data does get leaked or shared in other ways. And so these are your responsibilities as therapists when it comes to confidentiality and this AI conversation.

Katie Vernoy  23:44

And there are a lot of different ways to try to do that. While you were talking, I was thinking about a conversation I had with Roy Huggins from Person Centered Tech, who unfortunately recently died, a very tragic loss for our profession. The way that he would talk about HIPAA compliance, and I’m sure Person Centered Tech will continue that work, was that you have to understand the risks that you’re taking, do what you can, and then be comfortable with the risk you’re still taking, because no one is perfect. And so I think it’s hard, because it can be very scary, because we can’t necessarily get to a place where we’ve taken every single precaution. I mean, we could go to a black site, have everyone come in separate ways, no GPS, phones left at their houses, and then be in a room together and then leave. There might be other liability if nobody knows where you are and you’re alone in a room with a client. As a society, I don’t think we can protect ourselves from every single thing. But these are things that we can protect ourselves against pretty simply; I mean, you just turn it off. And that’s something that I don’t know a lot of people were thinking about until now.

Curt Widhalm  24:57

One of the questions that got asked on one of these articles, which I think is worth discussing here, is about people who are working at sites that require you to have a cell phone on you for safety reasons, whether it be on the floor of a hospital milieu system, or if you’re working for an agency where you go and visit clients’ houses, or whatever. What did you see in the responses to those articles?

Katie Vernoy  25:26

The main thing is to turn off voice activation, so that there’s not a voice activation element and it’s not recording the content. Making your phone a regular cell phone, and trying to get rid of some of the smart elements of it, I think can be very helpful. The thing that you can’t avoid, if you’re going for safety, is that you’ve got to keep GPS on: if you need to make an emergency call, they need to be able to ping your cell phone. So there are some privacy issues that you can’t avoid if you need to have a cell phone for safety reasons. But the voice assistant settings, those things are maybe not that easy to find, but you can find them; there are some instructions on this, and I’ll put them in the show notes. Turning off voice activation, making sure that you’ve made yourself as tight as possible as far as any kind of data that’s going out, turning off all of the apps, making sure there’s nothing running in the background, even going through your apps and having the permissions set to only while the app is in use, I think is helpful. Because if Facebook is tracking your location, and Instagram is tracking your location, and Google and whatever, if those are tracking your location all the time, then there’s a lot of data being shared. But if you only have those on when you have those apps open, and you consciously close them before you go in, my hope is that they’re not also running in the background. I’ve also put my phone on a really low battery use mode before, where it only allows for phone calls, so it basically shuts down anything running in the background, so that you don’t have things going that you don’t know about.
But if you’re wanting safety, going all the way to turning it off or airplane mode is maybe not advisable.

Curt Widhalm  27:24

And in these conversations, what I would suggest is to let your clients know what the limits of confidentiality are. And this doesn’t have to be a huge, in-depth conversation. Some of your clients may have more interest in what you’re talking about, or paranoia, depending on why you’re seeing those clients. But we would love to hear your experiences with this kind of stuff, or thoughts or considerations that you have. You can share those with us on our social media; you can find links to those in our show notes. And once again, those are over at mtsgpodcast.com. You can join our Facebook group, the Modern Therapist Group, and spill your data to us and Mark Zuckerberg. And until next time, I’m Curt Widhalm with Katie Vernoy and Siri.

Katie Vernoy  28:17

Thanks again to our sponsor, Buying Time.

Curt Widhalm  28:20

Buying Time’s VAs support businesses by managing email communications, CRM or automation systems, website admin and hosting, email marketing, social media, bookkeeping, and much more. Their sole purpose is to create the opportunity for you to focus on supporting those you serve while ensuring that your back office runs smoothly. With a full team of VAs, you get the opportunity to hire for one role and get multiple areas of support. There’s no reason to be overwhelmed with running your business with this solution available.

Katie Vernoy  28:48

Book a consultation to see where and how you can get started getting the support you need. That’s buyingtimellc.com/book-consultation. Once again, buyingtimellc.com forward slash book dash consultation.

Announcer  29:04

Thank you for listening to the Modern Therapist’s Survival Guide. Learn more about who we are and what we do at mtsgpodcast.com. You can also join us on Facebook and Twitter. And please don’t forget to subscribe so you don’t miss any of our episodes.
