From the course: Handling Sensitive Data with Cloud and Local AI


Choosing an inference platform

Where we run our AI models, our inference platform, is one of the most important choices when it comes to data privacy and security in AI systems. Commonly, we have assistants as a service; this would be something like ChatGPT, Gemini, Copilot, or Claude. We also have cloud-hosted inference solutions like AWS Bedrock and Azure AI Foundry. Then we have on-premise inference, which means having your own hardware where you run your AI solutions. Now, assistant-as-a-service solutions are pretty straightforward to set up, and that makes them very approachable. They should always be configured for maximum safety, and they may be susceptible to data disclosure since we're using third-party solutions. Cloud-hosted inference solutions may reduce exposure, especially since many businesses already trust cloud providers with their data. They could present a vulnerability called unbounded consumption, where a malicious actor drains resources from the system. This can be mitigated with…
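The transcript cuts off before naming the mitigation, so as an illustration only: one commonly used control against unbounded consumption is per-client rate limiting. Below is a minimal token-bucket sketch in Python; the class name, capacity, and refill values are assumptions for the example, not something the course prescribes.

```python
import time


class TokenBucket:
    """Minimal token-bucket rate limiter for inference requests (illustrative only)."""

    def __init__(self, capacity: int, refill_per_sec: float):
        self.capacity = capacity          # maximum burst size
        self.tokens = float(capacity)     # start full
        self.refill_per_sec = refill_per_sec
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Return True if a request may proceed, consuming one token."""
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, capped at capacity.
        self.tokens = min(
            self.capacity,
            self.tokens + (now - self.last) * self.refill_per_sec,
        )
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False


# Example: a bucket of 3 with no refill lets the first 3 requests
# through, then rejects further ones until tokens are replenished.
bucket = TokenBucket(capacity=3, refill_per_sec=0.0)
results = [bucket.allow() for _ in range(5)]
print(results)  # → [True, True, True, False, False]
```

In practice, managed platforms such as AWS Bedrock expose quota and throttling controls that serve the same purpose, so hand-rolled limiters like this are usually only needed for self-hosted, on-premise inference.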