VS Code Official Blog · October 23, 02:20
VS Code Expands Model Choice, Adds Bring-Your-Own-Key Support

 

In its recent v1.104 release, VS Code introduced the Language Model Chat Provider API, further expanding the range of available models. The update lets model providers contribute models directly through VS Code extensions, building an open, extensible ecosystem. The previously released BYOK (Bring Your Own Key) feature already supports hundreds of models from providers such as OpenRouter, Ollama, Google, and OpenAI. The new API allows any provider to offer models through a simple extension install, going a long way toward meeting developers' demand for model variety. Models provided through this API are currently available to users on individual GitHub Copilot plans. The post also introduces several recommended extensions, including AI Toolkit, Cerebras Inference, and Hugging Face Provider, and covers support for OpenAI-compatible models along with future plans.

🚀 **Expanded model choice**: By introducing the Language Model Chat Provider API, VS Code lets model providers contribute models directly through extensions, replacing a centralized system with an open, extensible ecosystem that better meets developers' demand for model variety.

🔑 **BYOK upgraded**: Building on the existing Bring Your Own Key (BYOK) feature, users can now connect to hundreds of models from different providers (such as OpenRouter, Ollama, Google, and OpenAI) for chat experiences in VS Code through a simple extension install.

💡 **Recommended extensions**: The post highlights several practical VS Code extensions, including AI Toolkit for Visual Studio Code, Cerebras Inference, and Hugging Face Provider for GitHub Copilot Chat, which respectively provide support for custom models, local models, and frontier open-source LLMs on Hugging Face, speeding up developer workflows.

🌐 **OpenAI-compatible models and what's next**: VS Code also supports OpenAI-compatible models and provides configuration options. Future plans include an improved model management UI, a smoother model installation flow, and refinements to the built-in language model providers, aiming at a more native, complete model experience.

Expanding Model Choice in VS Code with Bring Your Own Key

October 22, 2025 by Olivia Guzzardo McVicker, Pierce Boggan

We know that model choice is important to you. Our team has been hard at work making the latest models like Claude Haiku 4.5 and GPT 5 available to you on the same day they were announced. But we've also heard your feedback that you want support for even more models in VS Code, whether locally or in the cloud.

In March, we released the bring your own key (BYOK) functionality to let you pick from hundreds of models from supported providers like OpenRouter, Ollama, Google, OpenAI, and more to power chat experiences in VS Code.

Now, we're taking BYOK to the next level. In the v1.104 release, we introduced the Language Model Chat Provider API that enables model providers to contribute their models directly through VS Code extensions.

What is Bring Your Own Key (BYOK)?

BYOK lets you use any model from a supported provider by bringing your own API key for that provider. This means you can access a vast ecosystem of models beyond those built into VS Code. Whether you want to use a specialized model for code generation, a different model for general chat, or experiment with local models through providers like Ollama, BYOK makes it possible with just your API key. You can configure this through the Chat: Manage Language Models command.

But managing an ever-growing list of supported providers presented challenges for both users and our team. That's why we've released the Language Model Chat Provider API, allowing model providers to contribute their models directly through VS Code extensions.

The Language Model Chat Provider API

The Language Model Chat Provider API shifts BYOK from a centralized system to an open, extensible ecosystem where any provider can offer their models with a simple extension install. We will still support a subset of built-in providers, but this extensible ecosystem will allow us to scale out our model choice to meet developers' needs.

Note

Models provided through the Language Model Chat Provider API are currently available to users on individual GitHub Copilot plans (Free, Pro, and Pro+).

Here are some of our favorite extensions you can install right now to get access to more models in VS Code:

    The AI Toolkit for Visual Studio Code extension gives you access to its provided models directly in VS Code, whether it's a custom model you've tuned in Azure AI Foundry, a local model via Foundry Local, or any of the models in GitHub Models.

    Cerebras Inference powers the world's top coding models, making code generation near-instant and great for rapid iteration. It runs Qwen3 Coder and GPT OSS 120B at 2,000 tokens/s, which is 20x faster than most inference APIs.

    The Hugging Face Provider for GitHub Copilot Chat extension enables you to use frontier open LLMs like Kimi K2, DeepSeek V3.1, and GLM 4.5 directly in VS Code. Hugging Face's Inference Providers give developers access to hundreds of LLMs, powered by world-class inference providers built for high availability and low latency.

For extension developers interested in contributing their own models, check out our Language Model Chat Provider API documentation and sample extension to get started building today.
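To give a feel for the shape of such an extension, here is a minimal sketch of a provider that registers one model. This is an illustration, not the authoritative implementation: the type and method names follow the Language Model Chat Provider API surface at the time of writing, and `my-vendor`, `MyModelProvider`, and the model metadata are all hypothetical placeholders; verify the exact signatures against the API documentation before building on this.

```typescript
// Sketch of a minimal Language Model Chat Provider extension.
// Names here follow the VS Code 1.104 API surface; the vendor id,
// class name, and model metadata are illustrative placeholders.
import * as vscode from 'vscode';

class MyModelProvider implements vscode.LanguageModelChatProvider {
  // Tell VS Code which models this provider offers.
  async provideLanguageModelChatInformation(
    _options: { silent: boolean },
    _token: vscode.CancellationToken
  ): Promise<vscode.LanguageModelChatInformation[]> {
    return [{
      id: 'my-model',
      name: 'My Model',
      family: 'my-family',
      version: '1.0.0',
      maxInputTokens: 128000,
      maxOutputTokens: 4096,
      capabilities: {},
    }];
  }

  // Stream a response for a chat request back to VS Code.
  async provideLanguageModelChatResponse(
    _model: vscode.LanguageModelChatInformation,
    _messages: readonly vscode.LanguageModelChatRequestMessage[],
    _options: unknown,
    progress: vscode.Progress<vscode.LanguageModelResponsePart>,
    _token: vscode.CancellationToken
  ): Promise<void> {
    // Call your model backend here and report chunks as they arrive.
    progress.report(new vscode.LanguageModelTextPart('Hello from my model!'));
  }

  // Rough token estimate used for context-window accounting.
  async provideTokenCount(
    _model: vscode.LanguageModelChatInformation,
    text: string | vscode.LanguageModelChatRequestMessage,
    _token: vscode.CancellationToken
  ): Promise<number> {
    return typeof text === 'string' ? Math.ceil(text.length / 4) : 1;
  }
}

export function activate(context: vscode.ExtensionContext) {
  // The vendor string must match the extension's language model
  // provider contribution declared in package.json.
  context.subscriptions.push(
    vscode.lm.registerLanguageModelChatProvider('my-vendor', new MyModelProvider())
  );
}
```

The extension also needs to declare the provider in its `package.json` contribution points so VS Code can surface it in the model picker; the linked documentation and sample extension cover that wiring.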

OpenAI-compatible Models

For developers using OpenAI-compatible models, you can use the custom OpenAI Compatible provider for any OpenAI-compatible API endpoint and configure the models for use in chat. This feature is currently available in VS Code Insiders only.

Additionally, you can explicitly configure the list of edit tools through the github.copilot.chat.customOAIModels setting, giving you fine-grained control over which capabilities are available for your custom models.
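As a rough illustration, a custom OpenAI-compatible model might be declared in `settings.json` along these lines. The field names below are illustrative assumptions, not the authoritative schema; check the setting's description in the VS Code Settings editor for the exact supported keys, and note that the endpoint URL and model id here are placeholders.

```json
{
  "github.copilot.chat.customOAIModels": {
    "my-model": {
      "name": "My Model",
      "url": "https://my-endpoint.example.com/v1",
      "toolCalling": true,
      "maxInputTokens": 128000,
      "maxOutputTokens": 4096,
      "requiresAPIKey": true
    }
  }
}
```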

What's Next?

The Language Model Chat Provider API is just the beginning of bringing more model choice to you. As this ecosystem grows, we expect to see:

    Model management UI that allows you to learn about model capabilities and manage models

    Smoother flow for installing extensions that contribute language models

    Improvements to the built-in language model providers, using the latest provider APIs and having specialized prompts depending on the model

We're continuously investing in the BYOK experience. Recent enhancements include improved edit tools for better integration with VS Code's built-in tools, but we know there's work to be done to make the experience feel more native to VS Code - for example, BYOK does not currently work with completions. We'd love to hear your feedback on our GitHub repository!

Happy coding!
