Copilot CLI now supports BYOK and local models - GitHub Changelog
GitHub Copilot CLI now lets you connect your own model provider or run fully local models instead of using GitHub-hosted model routing.
Key facts
- Configure Copilot CLI to use Azure OpenAI, Anthropic, or any OpenAI-compatible endpoint by setting a few environment variables before launching the CLI
- See "Using your own LLM models in GitHub Copilot CLI" for setup instructions
- If you also sign in to GitHub, you get the best of both worlds: your preferred model for AI responses, plus access to features like /delegate, GitHub Code Search, and the GitHub MCP server
Summary
Configure Copilot CLI to use Azure OpenAI, Anthropic, or any OpenAI-compatible endpoint by setting a few environment variables before launching the CLI. Set COPILOT_OFFLINE=true to prevent Copilot CLI from contacting GitHub’s servers. When using your own model provider, GitHub authentication is not required. Your model must support tool calling and streaming.
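The setup described above can be sketched as a short shell session. Only `COPILOT_OFFLINE` is named in this changelog; the provider variable names below are illustrative placeholders, not confirmed Copilot CLI settings — consult "Using your own LLM models in GitHub Copilot CLI" for the actual variables.

```shell
# Sketch: point Copilot CLI at your own OpenAI-compatible endpoint
# before launching it. The *_BASE_URL / *_API_KEY names here are
# hypothetical placeholders; see the linked setup docs for the real ones.
export MY_PROVIDER_BASE_URL="http://localhost:11434/v1"  # e.g. a local model server
export MY_PROVIDER_API_KEY="not-needed-for-local"

# COPILOT_OFFLINE is documented in this changelog: it prevents the CLI
# from contacting GitHub's servers, so no GitHub sign-in is needed.
export COPILOT_OFFLINE=true

# Then launch the CLI in the same shell session; the model you point it
# at must support tool calling and streaming.
# copilot
```

Signing in to GitHub anyway re-enables the GitHub-hosted features (`/delegate`, GitHub Code Search, the GitHub MCP server) alongside your own model.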