From the course: Handling Sensitive Data with Cloud and Local AI
Implement an LLM proxy with liteLLM
An LLM proxy is a piece of software that sits between the language model and our users or team members. It can help us log requests, inspect the data passing through the proxy, control cost, and fine-tune model access. It also makes it easier to migrate from one large language model to another, since it provides a consistent interface. A very popular implementation of a proxy is LiteLLM, which offers an enterprise product as well as an open source toolset for dealing with LLMs. Here I've set up a Docker environment for my proxy, and I've set some temporary environment variables. This proxy is not as robust as one we would use in a production environment, but to get going with this, you'll want to export your OpenAI API key and set this to your key. And of course, to work with this in a more robust way, you would change this master key as well as this Postgres password…
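As a rough sketch of the setup described above, the exports might look like the following. The variable names LITELLM_MASTER_KEY and POSTGRES_PASSWORD are assumptions based on LiteLLM's sample Docker configuration rather than something shown in this lesson, and the values are placeholders, not real secrets:

```bash
# Temporary environment variables for a local LiteLLM proxy (not production-ready).
export OPENAI_API_KEY="sk-..."          # set this to your real OpenAI API key
export LITELLM_MASTER_KEY="sk-1234"     # assumed variable name; change before any real deployment
export POSTGRES_PASSWORD="dbpassword"   # assumed variable name; change this as well

# Start the proxy container (assumes a docker-compose.yml like the one in LiteLLM's repo).
docker compose up -d
```

Once the proxy is running, clients can point any OpenAI-compatible SDK at it (LiteLLM's proxy listens on port 4000 by default) and authenticate with the master key instead of a raw provider key, which is what gives the proxy its consistent interface across models.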