Keep your code where it belongs: Running GitHub Copilot CLI with your own models on Azure Local + GitHub Enterprise Server

For a lot of the customers I work with — regulated industries, public sector, sovereign clouds, and teams operating under strict data residency requirements — "just send your prompts to a public AI endpoint" isn't an option. Source code, IP, and context can't leave the boundary. But that shouldn't mean giving up the developer productivity gains of an AI coding assistant.

That's exactly the gap that GitHub Copilot CLI's Bring Your Own Key (BYOK) support closes. Pair it with Azure Local and GitHub Enterprise Server (GHES), and you get a fully on-premises, residency-aligned AI coding workflow — your repos in GHES, your model running on Azure Local, and Copilot CLI stitching it all together on the developer's machine.

What BYOK gives you

Copilot CLI can be pointed at any of three provider types instead of GitHub-hosted models:

  • OpenAI – any OpenAI Chat Completions–compatible endpoint (Ollama, vLLM, Foundry Local, etc.)
  • Azure – Azure OpenAI Service (including deployments running in your tenant and region of choice)
  • Anthropic – Anthropic Claude models

The model you bring must support tool calling and streaming, and GitHub recommends a context window of at least 128k tokens for best results.
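If you're not sure whether a candidate model qualifies, you can probe the endpoint directly before wiring it into Copilot CLI. A minimal sketch, assuming an OpenAI Chat Completions–compatible runtime — the default base URL (Ollama's), the model name, and the `get_time` tool are all placeholders, not anything from the Copilot docs:

```shell
#!/bin/sh
# Probe an OpenAI-compatible endpoint for streaming + tool-calling support.
# Placeholders: the base URL and model name must match your own runtime.
probe_tool_calling() {
  base="${1:-http://localhost:11434/v1}"   # Ollama's default OpenAI-compatible base
  body='{"model":"YOUR-MODEL-NAME","stream":true,
    "messages":[{"role":"user","content":"What time is it in Oslo?"}],
    "tools":[{"type":"function","function":{"name":"get_time",
      "description":"Return the current time for a city",
      "parameters":{"type":"object","properties":{"city":{"type":"string"}},
                    "required":["city"]}}}]}'
  if reply=$(curl -sf --max-time 10 "$base/chat/completions" \
               -H "Content-Type: application/json" -d "$body" 2>/dev/null); then
    case "$reply" in
      *tool_calls*) echo "tool calling supported" ;;
      *)            echo "no tool_calls in response -- model may lack tool support" ;;
    esac
  else
    echo "unreachable: $base"
  fi
}

# Usage: probe_tool_calling http://your-local-endpoint:PORT/v1
```

If the streamed response never includes a tool_calls delta, pick a different model before going further.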

The on-prem / sovereign pattern

A typical setup for a data-residency-sensitive customer looks like this:

  1. GitHub Enterprise Server hosts the repos, inside the customer's network.
  2. Azure Local runs the inference workload — either Azure OpenAI in a region you control, or an OpenAI-compatible runtime (Foundry Local, vLLM, Ollama) hosting an approved model on-prem.
  3. Copilot CLI on the developer workstation is pointed at that endpoint via environment variables.

Connecting to Azure OpenAI

export COPILOT_PROVIDER_BASE_URL=https://YOUR-RESOURCE-NAME.openai.azure.com/openai/deployments/YOUR-DEPLOYMENT-NAME
export COPILOT_PROVIDER_TYPE=azure
export COPILOT_PROVIDER_API_KEY=YOUR-AZURE-API-KEY
export COPILOT_MODEL=YOUR-DEPLOYMENT-NAME
copilot
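
Before launching, it can help to fail fast on missing configuration rather than debugging a half-configured session. A small sketch of my own — the variable names mirror the exports above, but the check itself is not an official Copilot CLI feature:

```shell
#!/bin/sh
# Fail fast if any of the BYOK variables from the exports above is unset.
check_byok_env() {
  missing=""
  for v in COPILOT_PROVIDER_BASE_URL COPILOT_PROVIDER_TYPE \
           COPILOT_PROVIDER_API_KEY COPILOT_MODEL; do
    eval "val=\${$v:-}"
    if [ -z "$val" ]; then
      missing="$missing $v"
    fi
  done
  if [ -z "$missing" ]; then
    echo "BYOK environment complete"
  else
    echo "missing:$missing"
    return 1
  fi
}

# Usage: check_byok_env && copilot
```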

Connecting to a local OpenAI-compatible runtime (Foundry Local, vLLM, Ollama)

export COPILOT_PROVIDER_BASE_URL=http://your-local-endpoint:PORT
export COPILOT_MODEL=YOUR-MODEL-NAME
copilot

No API key is required if your local runtime doesn't use authentication.
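To find the exact string to put in COPILOT_MODEL, most OpenAI-compatible runtimes expose a model listing. A hedged sketch — the /v1/models path and the default port are assumptions (vLLM's OpenAI-compatible server defaults to 8000; Ollama uses 11434):

```shell
#!/bin/sh
# List the models a local OpenAI-compatible runtime exposes, to find the
# exact name for COPILOT_MODEL. The /v1/models path and default port are
# assumptions about your runtime, not Copilot CLI requirements.
list_local_models() {
  base="${1:-http://localhost:8000}"
  curl -sf --max-time 5 "$base/v1/models" 2>/dev/null \
    || echo '{"error":"runtime not reachable at '"$base"'"}'
}

# Usage: list_local_models http://your-local-endpoint:PORT
```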

The piece people miss: offline mode

For air-gapped and sovereign environments:

export COPILOT_OFFLINE=true

With COPILOT_OFFLINE=true, Copilot CLI will not phone home to GitHub's servers. Combined with a provider endpoint that's also inside your boundary (Azure Local, on-prem Foundry Local, etc.), prompts and code context stay entirely within your environment.

One caveat straight from the docs: offline mode only guarantees full network isolation if your provider is also local or inside the same isolated environment. Point COPILOT_PROVIDER_BASE_URL at a remote endpoint and your context travels to that endpoint — regardless of the offline flag. Architect accordingly.
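One way to operationalize that caveat is a pre-flight check that refuses to treat a remote URL as "offline". A sketch — the in-boundary heuristic (loopback plus common RFC 1918 ranges) is my assumption, not part of Copilot CLI, so align it with your own network plan before relying on it:

```shell
#!/bin/sh
# Warn when COPILOT_OFFLINE=true is paired with an endpoint that is not
# obviously in-boundary. The heuristic below is an assumption -- extend it
# to match your actual network topology.
check_offline_boundary() {
  url="${COPILOT_PROVIDER_BASE_URL:-}"
  if [ "${COPILOT_OFFLINE:-}" != "true" ]; then
    echo "offline mode not enabled"
    return 0
  fi
  case "$url" in
    http://localhost*|http://127.*|http://10.*|http://192.168.*)
      echo "ok: offline mode with in-boundary endpoint" ;;
    *)
      echo "WARNING: COPILOT_OFFLINE=true but endpoint may be remote: $url"
      return 1 ;;
  esac
}
```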

Why this matters

For customers who have been told "AI coding assistants and data residency can't coexist," the combination of GHES + Azure Local + Copilot CLI BYOK + offline mode is a concrete answer:

  • Code stays in GHES, inside your boundary.
  • Inference happens on Azure Local, in the region and jurisdiction you control.
  • The developer experience is the same Copilot CLI workflow — just pointed at your infrastructure.

If you're in financial services, healthcare, government, defense, or any team with a "nothing leaves our tenant" mandate, this is worth a closer look.

Full reference: Using your own LLM models in GitHub Copilot CLI

Views are my own and do not represent my employer.

