GitHub Copilot bring your own key (BYOK) enhancements #184350
Replies: 4 comments
Hoping to have this feature in the Teams plan in the future.
Hope the BYOK feature can also be made available for individual accounts.
Hi, this is a feature we could really benefit from. Unfortunately, we struggle to configure Anthropic via Microsoft Foundry: it looks like Copilot does not recognize that Anthropic with the Messages endpoint is configured.

Bring your own key (BYOK) for GitHub Copilot just got more powerful. Enterprises can now connect a wider range of LLM providers, unlock structured outputs, fine-tune context windows, and stream responses in real time—giving you complete control over your AI infrastructure.
What's new
New provider options
Connect API keys from AWS Bedrock, Google AI Studio, and any OpenAI-compatible provider. These join Anthropic, Microsoft Foundry, OpenAI, and xAI as supported BYOK choices—giving your organization maximum flexibility in choosing models that fit your stack.
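In practice, "OpenAI-compatible" means the provider accepts the Chat Completions wire format: a `POST` to `<base_url>/chat/completions` with a bearer token and a `model`/`messages` JSON body. Here is a minimal sketch of that request shape; the base URL and model name are placeholders, not real endpoints.

```python
import json

BASE_URL = "https://llm.example.com/v1"  # hypothetical compatible provider

def build_chat_request(model: str, user_prompt: str, api_key: str):
    """Return the URL, headers, and JSON body for an OpenAI-compatible provider."""
    url = f"{BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",  # standard bearer-token auth
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_prompt}],
    }).encode()
    return url, headers, body

url, headers, body = build_chat_request("example-model", "Hello", "sk-test")
print(url)
```

Any provider that serves this shape, whether a hosted service or a self-hosted gateway, can be plugged in as a BYOK choice.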
Support for the Responses API
BYOK now supports models using the Responses API, unlocking structured outputs and richer multimodal interactions. This enables more sophisticated use cases and cleaner integrations with your workflows.
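Structured outputs work by attaching a JSON schema to the request so the model is constrained to emit valid instances of it. The sketch below follows the field names used by the OpenAI Responses API (`input`, `text.format` with `json_schema`); the model name and schema are illustrative, and your BYOK provider's exact shape may differ.

```python
import json

# Illustrative schema for a bug-triage record the model must conform to.
issue_schema = {
    "type": "object",
    "properties": {
        "title": {"type": "string"},
        "severity": {"type": "string", "enum": ["low", "medium", "high"]},
    },
    "required": ["title", "severity"],
    "additionalProperties": False,
}

request_body = {
    "model": "example-model",  # placeholder model name
    "input": "Summarize this bug report as a triage record.",
    "text": {
        "format": {
            "type": "json_schema",
            "name": "triage_record",
            "schema": issue_schema,
            "strict": True,  # reject any output that falls outside the schema
        }
    },
}

print(json.dumps(request_body, indent=2))
```

With `strict` enabled, the response body is guaranteed-parseable JSON, which is what makes the "cleaner integrations" above possible.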
Maximum context window configuration
Admins can now define a maximum context window for BYOK models, balancing cost, performance, and response quality based on your team's needs.
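The practical effect of a maximum context window is that the client must trim conversation history so the prompt fits the configured limit before each request. A minimal sketch of that trimming, using a rough chars/4 token approximation rather than a real tokenizer:

```python
def rough_tokens(text: str) -> int:
    # Crude approximation: ~4 characters per token. Real clients use the
    # provider's tokenizer; this is only to make the budgeting visible.
    return max(1, len(text) // 4)

def trim_to_window(messages: list[dict], max_context_tokens: int) -> list[dict]:
    """Keep the most recent messages whose combined size fits the window."""
    kept: list[dict] = []
    budget = max_context_tokens
    for msg in reversed(messages):  # newest messages are most relevant
        cost = rough_tokens(msg["content"])
        if cost > budget:
            break
        kept.append(msg)
        budget -= cost
    return list(reversed(kept))  # restore chronological order

history = [
    {"role": "user", "content": "a" * 400},       # ~100 tokens
    {"role": "assistant", "content": "b" * 400},  # ~100 tokens
    {"role": "user", "content": "c" * 40},        # ~10 tokens
]
print(len(trim_to_window(history, 120)))  # → 2
```

A smaller window means older turns are dropped sooner, trading context quality for lower per-request cost, which is exactly the knob this setting exposes.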
Streaming responses for faster interaction
Watch responses stream in real time as Copilot generates them, rather than waiting for full completion. This keeps your workflow smooth and responsive.
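Under the hood, streamed responses typically arrive as server-sent events: one `data: {...}` line per token delta, terminated by `data: [DONE]`. The sketch below parses that format, assuming the common Chat Completions streaming chunk shape (`choices[0].delta.content`); your provider's chunk fields may vary.

```python
import json

def accumulate_stream(lines):
    """Concatenate content deltas from an SSE-style token stream."""
    text = []
    for line in lines:
        if not line.startswith("data: "):
            continue  # skip keep-alives and blank lines
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"].get("content", "")
        text.append(delta)
    return "".join(text)

# Canned stream standing in for a live HTTP response body.
stream = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]
print(accumulate_stream(stream))  # → Hello
```

Rendering each delta as it arrives is what lets the editor show partial answers instead of blocking until the full completion is ready.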
Start using the new BYOK features today
These capabilities are available now in public preview for GitHub Enterprise and Business customers. Head to your enterprise or organization settings, connect your LLM provider's API key, and start using your models in Copilot Chat and supported IDEs.
For detailed setup and configuration, visit our BYOK documentation.
Help us shape the future
We're just getting started, and your feedback will guide what comes next. Join the conversation below to share how BYOK is transforming your enterprise's Copilot experience.