VS Code Open-Sources AI Inline Suggestions, Reaching a New Milestone
The VS Code team has announced that GitHub Copilot's inline code suggestion feature is now open source, an important step in turning VS Code into an open source AI editor. GitHub Copilot's chat functionality was open sourced earlier, and inline suggestions are a core part of AI-assisted coding. Going forward, all Copilot functionality will be consolidated into a single extension named Copilot Chat, and the original Copilot extension will be phased out. The user experience stays the same: intelligent code suggestions and chat continue to work as before. The refactoring also improves suggestion detection, caching, request reuse, prompt construction, model inference, post-processing, and multi-line intelligence, noticeably improving performance and responsiveness while reducing latency. The VS Code team welcomes community contributions and feedback and plans to integrate more AI functionality into VS Code core.

🚀 **AI feature consolidation and open sourcing**: VS Code is gradually consolidating GitHub Copilot's AI features, including the previously open-sourced chat functionality and the newly open-sourced inline code suggestions, into a single extension named Copilot Chat, aiming to provide a unified, more capable AI coding assistance experience and to encourage community participation.

💡 **Technical improvements to inline suggestions**: The newly open-sourced inline suggestion system uses "typing-as-suggested" detection, smart caching, reuse of in-flight LLM requests, optimized prompt construction, model inference across multiple providers, and code-style adaptation to noticeably improve the speed and quality of generated suggestions while lowering latency.

🔧 **User experience and transition**: The consolidation is largely invisible to users, who keep getting intelligent code suggestions and chat. The original Copilot extension will be deprecated by early 2026; users can optionally revert to the previous two-extension setup if they run into problems, and community feedback will help polish the experience.

🌐 **Community collaboration and outlook**: The inline suggestion code is now public in the vscode-copilot-chat repository, and developers are invited to explore it, contribute code, and help shape the direction of the open source AI editor. The next step is to refactor some AI features into VS Code core.

Open Source AI Editor: Second Milestone

November 6th, 2025 by the VS Code Team

In May, we announced our initial plan to make VS Code an open source AI editor, and in June, we reached our first milestone by open sourcing the GitHub Copilot Chat extension.

While chat was a significant step forward, an important part of our AI functionality still remained: the inline suggestions that appear as you type. Today, we're reaching that next milestone in our journey: inline suggestions are now open source.

One extension, same user experience

For the past few years, GitHub Copilot in VS Code has been split across two extensions: the GitHub Copilot extension (for ghost text suggestions) and the GitHub Copilot Chat extension (for chat and next edit suggestions). We are working towards providing all Copilot functionality in a single VS Code extension: Copilot Chat.

To achieve this, we are now testing disabling the Copilot extension and serving all inline suggestions from Copilot Chat. We have ported the vast majority of features into the chat extension, so the progressive rollout of a single extension experience should feel consistent and transparent to everyone.

Nothing should change in your experience. You'll continue to get the same intelligent code suggestions as you type, plus all the chat and agent mode features you're already using. If you encounter any problems, please report an issue or see how to use the previous experience if needed.

As part of this refactoring, the GitHub Copilot extension will be deprecated by early 2026, which means it will be removed from the VS Code Marketplace.

We've also simplified our terminology: we now use inline suggestions to refer to all AI-generated code suggestions that appear as you type (including ghost text and next edit suggestions). We continue working to unify the actual product experiences as well, including the UX and timing for different kinds of suggestions.
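As a purely illustrative aside - these names are hypothetical and not types from the vscode-copilot-chat codebase - the unified terminology can be read as one umbrella type covering both kinds of as-you-type suggestions:

```typescript
// Hypothetical names for illustration only; not the extension's real API.
// "Inline suggestion" is the umbrella term for both kinds of suggestions
// that appear as you type.
type InlineSuggestionKind = 'ghostText' | 'nextEdit';

interface InlineSuggestion {
  kind: InlineSuggestionKind; // ghost text at the cursor, or a predicted edit elsewhere
  text: string;               // the suggested code
}
```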

Explore and contribute

With inline suggestions available in the vscode-copilot-chat repository, you can explore and contribute to how they work:

[Figure: Flow diagram displaying how inline suggestions work]

    "Typing-as-suggested" detection - As you type, the extension first checks if you're following a previous suggestion and can continue showing it without making a new requestCaching - If not typing as suggested, the extension checks if cached suggestions can be reused to improve performanceReusing ongoing requests - If no cached suggestions are available, the extension checks if there's an ongoing LLM request from the previous keystroke that hasn't finished streaming back yet. Since this ongoing request is likely similar to the current request, the extension reuses it instead of firing off a new request and canceling the ongoing one, which significantly improves performancePrompt construction - If no ongoing request can be reused, the extension gathers relevant context from your current file, open files, and workspace, then formats it into a prompt to send to the LLMModel inference - The extension requests inline suggestions from multiple providers: ghost text suggestions for the current cursor position, and next edit suggestions that predict where you might edit next. Ghost text suggestions at the cursor are prioritized when available; otherwise, next edit suggestions are usedPost-processing - Raw model outputs are refined to ensure they fit your code style, indentation, and syntaxMulti-line intelligence - The extension decides whether to show a single line or multiple lines, based on confidence and context

Performance improvements

Along with consolidating into a single extension, this refactoring has led to technical improvements to inline suggestions:

- Reduced latency - We fixed networking issues to optimize how suggestions are delivered, enabling the chat extension to serve ghost text faster
- Quality validation - We ran extensive experiments to ensure there are no regressions in either latency or suggestion quality

Troubleshooting

As with all changes, despite our best efforts, there is a chance that we missed something! If you encounter any issues with the unified extension experience, you can temporarily revert to the previous two-extension behavior:

[Figure: Screenshot of the setting that reverts to the two-extension experience]

What's next?

The next phase of our OSS journey is to refactor some AI features and components from the Copilot Chat extension into VS Code core. We're excited to continue this journey with the community and shape the future of development as an open source AI editor.

We'll continue actively improving our inline suggestions experience - as always, you can follow along in the vscode-copilot-chat repository.

We welcome your feedback and contributions. Feel free to open pull requests and file issues.

Happy coding! 💙

The VS Code Team
