From the course: Handling Sensitive Data with Cloud and Local AI

Configure AI assistants for maximum data security

When we use AI assistants, we want to make sure we configure them to protect data to the best of our ability. Now, this reduces exposure, it doesn't guarantee safety, and we should always aim to have these as organization defaults. So if you have admin privileges, it's a good idea to set these defaults so that when somebody is onboarded, they are there and ready. It's also important to revisit these settings once in a while to see if anything changed as far as protecting personal data. We're going to look at ChatGPT and Claude and the settings that relate to data privacy.

The first one is here in ChatGPT settings, under Personalization. While this is not under Data Controls, memory has a lot to do with protecting personal data and keeping things from popping up at times when you may not want them to. A lot of times you may prefer to have memory off to keep conversations isolated, so that data can't leak from one conversation to another. There's also record mode, and you can ask it not to reference recording history here. We also see web search. You can decide to disable web search, but in Data Controls you have more control over browsing data, so you can have remote browser data off. Now, it's important to note that this option here, improve the model for everyone, involves using some form of your conversations as training data. It's probably a good idea to have this off so you prevent a situation where a model is exposed to your data or your users' data and then somehow regurgitates that data back. And once in a while, you may want to look at your retention policy and delete chats; it's really up to your organization to decide how you want to manage that.

Claude has similar options, where you have this help improve Claude setting, which you want to make sure you toggle off. There's also location metadata, which you may or may not be comfortable sharing.
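If you're responsible for organization defaults, it can help to write the recommended posture down as a checklist you can audit accounts against when settings change or new people are onboarded. Here is a minimal sketch of that idea; the setting names are hypothetical labels for the toggles discussed above, since these assistants expose them through their settings UIs rather than a standard API.

```python
# Hypothetical privacy-settings audit: the keys below are illustrative
# names for the toggles covered in this lesson, not real API fields.
RECOMMENDED_DEFAULTS = {
    "memory": False,                      # keep conversations isolated
    "reference_chat_history": False,      # no cross-conversation context
    "improve_model_for_everyone": False,  # opt out of training on your data
    "remote_browser_data": False,         # restrict browsing data
    "location_metadata": False,           # don't attach location info
}


def audit_settings(current: dict) -> list[str]:
    """Return the names of settings that differ from the recommended defaults.

    A setting missing from `current` is treated as enabled (True), since
    privacy-relevant toggles often default to on.
    """
    return [
        name
        for name, recommended in RECOMMENDED_DEFAULTS.items()
        if current.get(name, True) != recommended
    ]


# Example: a workspace where only memory and browsing were turned off.
issues = audit_settings({"memory": False, "remote_browser_data": False})
print(issues)  # the remaining three toggles still need attention
```

Treating unset values as enabled is deliberate: when you revisit these settings after a product update, any newly introduced toggle shows up as a flag rather than silently passing the audit.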
Now, you want to explore not just the privacy settings, but all of the general settings as well, and see which of them pertain to your use cases. Another thing you want to look at is the privacy policy, which is linked right here; look through it to see if it aligns with the work that you're doing and whether it's sufficient to protect your personal data. The next piece of advice is just my opinion, and that is: if you expect more privacy, you don't want to use free tools. If you're using a free tool, there's a limit to the privacy you can expect for your data. When you pay for a service, you will likely get more privacy features, and you can reasonably expect a bit more when it comes to data privacy.