All Content from Business Insider · September 22
OpenAI to launch new compute-intensive services, with some features limited to Pro users

OpenAI CEO Sam Altman announced that the company will launch new compute-intensive services in the coming weeks. Because of the costs involved, some new features will initially be available only to Pro subscribers, and some new products may carry extra fees. Altman said the move is intended to explore what new ideas become possible when large amounts of compute are applied at today's model costs. OpenAI is working to balance AI accessibility against steep compute costs, and remains committed to driving down the cost of intelligence and making its services widely available over the long term. The article also notes the AI industry's enormous demand for GPUs and the spending of companies such as Meta, underscoring the central role of computing power in AI development.

🚀 **New compute-intensive services**: OpenAI plans to launch a series of more compute-demanding services in the coming weeks. CEO Sam Altman announced the news on X, noting that the goal is to explore what new possibilities emerge when large amounts of compute are applied at today's model costs.

💰 **Pricing strategy and user tiers**: Given the increased compute costs, some new features will first be offered to Pro subscribers. In addition, some entirely new products may carry extra fees to cover their high operating costs, while the company keeps its long-term goal of driving down the cost of AI and making its services widely available.

📊 **GPU demand and industry competition**: The article highlights the AI industry's enormous demand for GPUs. OpenAI's chief product officer said the company's GPU usage is massive and that it plans to add a large number of GPUs by the end of the year. Like other tech giants such as Meta, OpenAI is competing hard for GPU resources, reflecting how compute has become a key factor in the AI race.

OpenAI CEO Sam Altman.

Sam Altman is bracing OpenAI users for pricier, compute-heavy products — and hinting at what's next.

The OpenAI CEO said in a post on X on Monday that the company is launching "new compute-intensive offerings" over the next few weeks.

Altman said because of the costs involved, some features will initially be limited to Pro subscribers, while certain new products will have extra fees.

Altman framed the push as an experiment in stretching AI infrastructure to its limits: "We also want to learn what's possible when we throw a lot of compute, at today's model costs, at interesting new ideas," he wrote.

The announcement adds on to OpenAI's balancing act: making advanced AI accessible while covering the steep costs of compute. Altman said the company's intention "remains to drive the cost of intelligence down as aggressively as we can and make our services widely available."

"We are confident we will get there over time," he added.

Altman and OpenAI did not respond to a request for comment from Business Insider.

The race for GPUs

OpenAI has been vocal about its insatiable demand for computing power.

"Every time we get more GPUs, they immediately get used," OpenAI's chief product officer, Kevin Weil, said on an episode of the "Moonshot" podcast published last month.

Weil said the need for compute is simple: "The more GPUs we get, the more AI we'll all use." He drew a parallel to how added bandwidth made the explosion of video possible.

Altman said in July that the company will bring on more than 1 million GPUs by the end of the year.

"Very proud of the team but now they better get to work figuring out how to 100x that lol," Altman wrote on X in July.

For comparison, Elon Musk's xAI disclosed that it used a supercluster of over 200,000 GPUs called Colossus to help train Grok 4.

Other tech giants have also been blunt about their appetite for GPUs.

Mark Zuckerberg said on an episode of the "Access" podcast published Thursday that Meta is making "compute per researcher" a competitive advantage and is outspending rivals on GPUs and the custom infrastructure needed to power them.

Read the original article on Business Insider
