TMTPost (钛媒体) · May 19
Nvidia Launches Lepton Platform to Streamline AI Chip Marketplace

Nvidia has launched the Lepton platform, which aims to create a centralized marketplace for AI computing power by connecting cloud providers that offer Nvidia GPUs with software developers seeking compute capacity to train AI models. Emerging "neoclouds" such as CoreWeave and Nebius Group have already joined. Lepton is designed to simplify the process of sourcing GPU resources and to address the current information asymmetry between compute supply and demand. While hyperscale providers such as Microsoft, Amazon, and Google have not yet joined, Nvidia says they are welcome to participate. Lepton will eventually support geographic filtering, helping developers find GPUs in specific countries to meet data residency requirements. The launch is seen as a strategic move by Nvidia to give its vast developer community broader and easier access to Nvidia technology.

🚀The Lepton platform aims to connect cloud providers offering Nvidia GPUs with software developers that need compute capacity to train AI models, simplifying how GPU resources are sourced and addressing the information asymmetry in today's GPU compute market.

🏢Emerging "neocloud" providers including CoreWeave and Nebius Group, along with Crusoe, Firmus, Foxconn, GMI Cloud, Lambda, Nscale, SoftBank, and Yotta Data Services, are among the first to join the Lepton marketplace.

🌍Lepton will eventually support geographic filtering, allowing developers to locate GPU resources in specific countries to meet data residency and storage requirements and better serve developers worldwide.

👨‍💻IDC analysts view Lepton as a strategic move by Nvidia to give its developer community of nearly 5 million broader and easier access to Nvidia technology.


Credit: CFP


AsianFin -- Nvidia unveiled a new software platform on Monday aimed at creating a centralized marketplace for AI computing power, as demand for the company's high-performance GPUs continues to soar.

The platform, called Lepton, is designed to connect cloud providers offering Nvidia graphics processing units (GPUs) with software developers seeking computing capacity for training AI models. Nvidia's GPUs are widely considered the industry standard for artificial intelligence workloads.

A growing wave of specialized cloud providers—dubbed "neoclouds"—such as CoreWeave and Nebius Group, has emerged to lease Nvidia GPUs to developers. These companies, along with Crusoe, Firmus, Foxconn, GMI Cloud, Lambda, Nscale, SoftBank Corp, and Yotta Data Services, are among the first to join the Lepton marketplace.

Despite booming demand, the process of sourcing available GPUs remains inefficient, said Alexis Bjorlin, vice president of Nvidia's cloud business. "It's almost like everyone's calling everyone to find out what compute capacity is available," she said. "We're trying to make it seamless—to grow the ecosystem and open up access to Nvidia's developer community."

Notably absent from Lepton's launch are hyperscale cloud providers like Microsoft, Amazon Web Services, and Google Cloud. However, Bjorlin said the platform is open for them to participate and list their own capacity if they choose.

Lepton will eventually support geographic filtering, allowing developers to locate GPUs in specific countries to comply with data residency and storage requirements. It will also enable companies that already own Nvidia hardware to easily find additional capacity to rent.

IDC's Group Vice President Mario Morales said the platform is a strategic move by Nvidia. "They have close to 5 million developers. This is about giving them broader and easier access to Nvidia technology," he noted.

Nvidia has not yet revealed the business model for Lepton, including whether it will collect commissions or fees. However, Bjorlin confirmed that developers will continue to maintain direct relationships with the compute providers they contract with.

 
