TechCrunch News · October 10, 07:51
Reflection AI raises $2 billion to build an American open-source large model

Startup Reflection AI recently announced it has raised $2 billion at an $8 billion valuation, a sharp jump from seven months ago. Founded by former DeepMind researchers, the company aims to become an open-source alternative to closed labs such as OpenAI and Anthropic, and to stand against Chinese AI firms like DeepSeek. Reflection AI plans to use the funding to build an advanced AI training stack and to release a frontier language model next year trained on tens of trillions of tokens. CEO Misha Laskin said the company has identified a scalable business model aligned with its "open intelligence strategy": monetizing through customized, controllable AI models for large enterprises and governments. The effort is intended to keep the United States competitive in artificial intelligence.

🌟 **Funding and repositioning**: Reflection AI, a startup founded by former DeepMind researchers, has raised $2 billion, pushing its valuation to $8 billion. Initially focused on autonomous coding agents, the company has repositioned itself as an open-source alternative to closed AI labs such as OpenAI and Anthropic, and as a competitor to Chinese firms like DeepSeek. The strategic shift is meant to inject new momentum into American AI and secure its standing in global competition.

🚀 **Technical strength and vision**: Reflection AI's founders helped build advanced AI systems such as AlphaGo and Gemini, which underpins their claim that they can train frontier models. The company says it has assembled a team of top researchers from DeepMind and OpenAI and built an advanced AI training stack it promises to open to everyone. Its goal is to release a large-scale language model and apply it to general agentic reasoning, challenging the dominance of today's large closed labs.

💰 **Business model and open strategy**: Reflection AI intends to monetize through an "open intelligence strategy." Model weights will be released publicly so developers and researchers can use and customize them, while datasets and the full training pipeline remain proprietary. Revenue will come primarily from large enterprises building products on top of Reflection AI's models and from governments developing "sovereign AI" systems. The approach is meant to meet enterprise demand for ownership, cost control, and customization while keeping the technology at the frontier.

Reflection AI, a startup founded just last year by two former Google DeepMind researchers, has raised $2 billion at an $8 billion valuation, a whopping 15x leap from its $545 million valuation just seven months ago. The company, which originally focused on autonomous coding agents, is now positioning itself as both an open source alternative to closed frontier labs like OpenAI and Anthropic, and a Western equivalent to Chinese AI firms like DeepSeek.

The startup was launched in March 2024 by Misha Laskin, who led reward modeling for DeepMind’s Gemini project, and Ioannis Antonoglou, who co-created AlphaGo, the AI system that famously beat the world champion in the board game Go in 2016. Their background developing these very advanced AI systems is central to their pitch, which is that the right AI talent can build frontier models outside established tech giants.

Along with its new round, Reflection AI announced that it has recruited a team of top talent from DeepMind and OpenAI, and built an advanced AI training stack that it promises will be open for all. Perhaps most importantly, Reflection AI says it has “identified a scalable commercial model that aligns with our open intelligence strategy.”

Reflection AI’s team currently numbers about 60 people — mostly AI researchers and engineers across infrastructure, data training, and algorithm development, per Laskin, the company’s CEO. Reflection AI has secured a compute cluster and hopes to release a frontier language model next year that’s trained on “tens of trillions of tokens,” he told TechCrunch.

“We built something once thought possible only inside the world’s top labs: a large-scale LLM and reinforcement learning platform capable of training massive Mixture-of-Experts (MoEs) models at frontier scale,” Reflection AI wrote in a post on X. “We saw the effectiveness of our approach first-hand when we applied it to the critical domain of autonomous coding. With this milestone unlocked, we’re now bringing these methods to general agentic reasoning.”

MoE refers to a specific architecture that powers frontier LLMs — systems that, previously, only large, closed AI labs were capable of training at scale. DeepSeek had a breakthrough moment when it figured out how to train these models at scale in an open way, followed by Qwen, Kimi, and other models in China.
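For readers unfamiliar with the term, the minimal sketch below (in PyTorch, with made-up module names and layer sizes; it is not Reflection AI's code) illustrates the core idea of a Mixture-of-Experts layer: a router sends each token to only a few of many expert networks, so total parameter count can grow far faster than per-token compute.

```python
# Minimal, illustrative Mixture-of-Experts (MoE) layer sketch in PyTorch.
# Hypothetical names and sizes; not Reflection AI's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoELayer(nn.Module):
    def __init__(self, d_model=64, d_hidden=256, num_experts=4, top_k=2):
        super().__init__()
        # Each "expert" is an ordinary feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(), nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )
        # The router scores every token against every expert.
        self.router = nn.Linear(d_model, num_experts)
        self.top_k = top_k

    def forward(self, x):  # x: (batch, seq_len, d_model)
        scores = self.router(x)                           # (batch, seq, num_experts)
        weights, indices = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)              # normalize over the chosen experts
        out = torch.zeros_like(x)
        # Only the top-k experts run for each token, which is why MoE models can carry
        # a huge total parameter count while keeping per-token compute modest.
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[..., slot] == e            # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return out

layer = TinyMoELayer()
tokens = torch.randn(2, 8, 64)
print(layer(tokens).shape)  # torch.Size([2, 8, 64])
```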

“DeepSeek and Qwen and all these models are our wake-up call because if we don’t do anything about it, then effectively, the global standard of intelligence will be built by someone else,” Laskin said. “It won’t be built by America.”

Laskin added that this puts the U.S. and its allies at a disadvantage because enterprises and sovereign states often won’t use Chinese models due to potential legal repercussions.

“So you can either choose to live at a competitive disadvantage or rise to the occasion,” Laskin said.

American technologists have largely celebrated Reflection AI’s new mission. David Sacks, the White House AI and Crypto Czar, posted on X: “It’s great to see more American open source AI models. A meaningful segment of the global market will prefer the cost, customizability, and control that open source offers. We want the U.S. to win this category too.”

Clem Delangue, co-founder and CEO of Hugging Face, an open and collaborative platform for AI builders, told TechCrunch of the round, “This is indeed great news for American open-source AI.” Added Delangue, “Now the challenge will be to show high velocity of sharing of open AI models and datasets (similar to what we’re seeing from the labs dominating in open-source AI).”

Reflection AI’s definition of being “open” seems to center on access rather than development, similar to strategies from Meta with Llama or Mistral. Laskin said Reflection AI would release model weights — the core parameters that determine how an AI system works — for public use while largely keeping datasets and full training pipelines proprietary.

“In reality, the most impactful thing is the model weights, because the model weights anyone can use and start tinkering with them,” Laskin said. “The infrastructure stack, only a select handful of companies can actually use that.”

That balance also underpins Reflection AI’s business model. Researchers will be able to use the models freely, Laskin said, but revenue will come from large enterprises building products on top of Reflection AI’s models and from governments developing “sovereign AI” systems, meaning AI models developed and controlled by individual nations.

“Once you get into that territory where you’re a large enterprise, by default you want an open model,” Laskin said. “You want something you will have ownership over. You can run it on your infrastructure. You can control its costs. You can customize it for various workloads. Because you’re paying some ungodly amount of money for AI, you want to be able to optimize it as much as possible, and really that’s the market that we’re serving.”

Reflection AI hasn’t yet released its first model, which will be largely text-based, with multimodal capabilities in the future, according to Laskin. It will use the funds from this latest round to get the compute resources needed to train the new models, the first of which the company is aiming to release early next year.

Investors in Reflection AI’s latest round include Nvidia, Disruptive, DST, 1789, B Capital, Lightspeed, GIC, Eric Yuan, Eric Schmidt, Citi, Sequoia, CRV, and others.
