Newsroom Anthropic September 13
Anthropic API rolls out the Message Batches API

 

The Anthropic API has officially released the Message Batches API, a powerful solution for processing large volumes of queries asynchronously. The API lets developers submit up to 10,000 queries at once; each batch is processed in under 24 hours and costs 50% less than standard API calls, making it efficient and economical for non-time-sensitive tasks. The API currently supports Claude 3.5 Sonnet, Claude 3 Opus, and Claude 3 Haiku on the Anthropic API, batch inference is available in Amazon Bedrock, and support for batch processing of Claude on Google Cloud's Vertex AI is coming soon. The update makes large-scale data processing far more affordable: for example, analyzing an entire corporate document repository spanning millions of files becomes economically viable thanks to the batching discount.

🔍 The Message Batches API lets developers submit up to 10,000 queries at once, processed within 24 hours at a 50% discount over standard API calls, greatly improving the efficiency and economy of non-time-sensitive tasks.

🚀 The API currently supports Claude 3.5 Sonnet, Claude 3 Opus, and Claude 3 Haiku on the Anthropic API, with batch inference available in Amazon Bedrock; support for batch processing of Claude on Google Cloud's Vertex AI is coming soon.

💡 The batching discount makes large-scale data processing economically viable; for example, analyzing an entire corporate document repository spanning millions of files can now be done far more cheaply, unlocking new data-processing possibilities.

📈 The API makes processing large volumes of data (such as analyzing customer feedback or translating languages) more efficient: developers no longer need to manage complex queuing systems or worry about rate limits; they simply submit a batch of queries, Anthropic handles the processing, and they receive a 50% discount.

🌐 The Message Batches API opens up new possibilities for large-scale data processing, making previously impractical or cost-prohibitive tasks, such as analyzing an entire corporate document repository with millions of files, far more affordable.

Update: As of December 17, 2024, the Message Batches API is Generally Available on the Anthropic API. Customers using Claude in Amazon Bedrock can use batch inference. Batch predictions are also available in preview on Google Cloud’s Vertex AI.


We’re introducing a new Message Batches API—a powerful, cost-effective way to process large volumes of queries asynchronously.

Developers can send batches of up to 10,000 queries per batch. Each batch is processed in less than 24 hours and costs 50% less than standard API calls. This makes processing non-time-sensitive tasks more efficient and cost-effective.
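As an illustration, here is a minimal sketch of submitting a batch with the Anthropic Python SDK, which exposes the endpoint as client.messages.batches; the model ID, custom_id scheme, prompts, and batch size are placeholders chosen for the example, not values from the announcement.

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Each entry in a batch pairs a caller-chosen custom_id with ordinary
# Messages API parameters; a single batch may hold up to 10,000 of these.
batch = client.messages.batches.create(
    requests=[
        {
            "custom_id": f"review-{i}",  # used later to match results to inputs
            "params": {
                "model": "claude-3-5-sonnet-20241022",  # placeholder model ID
                "max_tokens": 512,
                "messages": [
                    {"role": "user", "content": f"Summarize customer review #{i}."}
                ],
            },
        }
        for i in range(100)  # illustrative batch of 100 requests
    ]
)

print(batch.id, batch.processing_status)  # e.g. "in_progress"
```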

The Batches API is available today in public beta with support for Claude 3.5 Sonnet, Claude 3 Opus, and Claude 3 Haiku on the Anthropic API. Customers using Claude in Amazon Bedrock can use batch inference. Support for batch processing for Claude on Google Cloud’s Vertex AI is coming soon.

High throughput at half the cost

Developers often use Claude to process vast amounts of data—from analyzing customer feedback to translating languages—where real-time responses aren't necessary.

Instead of managing complex queuing systems or worrying about rate limits, you can use the Batches API to submit groups of up to 10,000 queries and let Anthropic handle the processing at a 50% discount. Batches will be processed within 24 hours, though often much quicker. Additional benefits include:

    Enhanced throughput: Enjoy higher rate limits to process much larger request volumes without impacting your standard API rate limits.
    Scalability for big data: Handle large-scale tasks such as dataset analysis, classification of large datasets, or extensive model evaluations without infrastructure concerns.
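To illustrate the submit-and-let-Anthropic-handle-it flow described above, the sketch below polls a batch and reads back its results with the Anthropic Python SDK; the batch ID, the ten-second polling interval, and the assumption of plain text responses are illustrative choices rather than guidance from the announcement.

```python
import time
import anthropic

client = anthropic.Anthropic()
batch_id = "msgbatch_..."  # ID returned when the batch was created

# Poll until processing has ended (always within 24 hours, often much sooner).
# A production job might use a longer interval or a scheduled task instead.
while True:
    batch = client.messages.batches.retrieve(batch_id)
    if batch.processing_status == "ended":
        break
    time.sleep(10)

# Stream per-request results; each entry carries the custom_id supplied at
# submission time, so result order does not matter.
for entry in client.messages.batches.results(batch_id):
    if entry.result.type == "succeeded":
        # Assumes the first content block is text, which holds for plain prompts.
        print(entry.custom_id, entry.result.message.content[0].text)
    else:
        print(entry.custom_id, "did not succeed:", entry.result.type)
```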

The Batches API unlocks new possibilities for large-scale data processing that were previously less practical or cost-prohibitive. For example, analyzing entire corporate document repositories—which might involve millions of files—becomes more economically viable by leveraging our batching discount.

Pricing

The Batches API allows you to take advantage of infrastructure cost savings and is offered at a 50% discount for both input and output tokens.

Claude 3.5 Sonnet (our most intelligent model to date; 200K context window)
    Batch input:  $1.50 / MTok
    Batch output: $7.50 / MTok

Claude 3 Opus (powerful model for complex tasks; 200K context window)
    Batch input:  $7.50 / MTok
    Batch output: $37.50 / MTok

Claude 3 Haiku (fastest, most cost-effective model; 200K context window)
    Batch input:  $0.125 / MTok
    Batch output: $0.625 / MTok
Pricing table for the Batches API
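As a worked example at the Claude 3 Haiku batch rates listed above (the per-request token counts are assumptions made for illustration, not figures from the announcement), a full 10,000-query batch can cost only a few dollars:

```python
# Illustrative cost estimate for a 10,000-query batch on Claude 3 Haiku,
# using the batch prices from the table above. Token counts are assumed.
requests = 10_000
input_tokens_per_request = 1_000
output_tokens_per_request = 500

BATCH_INPUT_PER_MTOK = 0.125   # USD per million input tokens
BATCH_OUTPUT_PER_MTOK = 0.625  # USD per million output tokens

input_cost = requests * input_tokens_per_request / 1e6 * BATCH_INPUT_PER_MTOK
output_cost = requests * output_tokens_per_request / 1e6 * BATCH_OUTPUT_PER_MTOK

print(f"input:  ${input_cost:.2f}")                 # $1.25
print(f"output: ${output_cost:.2f}")                # $3.12
print(f"total:  ${input_cost + output_cost:.2f}")   # $4.38, half the standard rate
```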

Customer Spotlight: Quora

Quora, a community-driven question-and-answer platform, leverages Anthropic's Batches API for summarization and highlight extraction to create new end-user features.

"Anthropic's Batches API provides cost savings while also reducing the complexity of running a large number of queries that don't need to be processed in real time," said Andy Edmonds, Product Manager at Quora. "It's very convenient to submit a batch and download the results within 24 hours, instead of having to deal with the complexity of running many parallel live queries to get the same result. This frees up time for our engineers to work on more interesting problems.”

Get started

To start using the Batches API in public beta on the Anthropic API, explore our documentation and pricing page.


Related tags

Anthropic API, Message Batches API, batch processing, asynchronous processing, large-scale data processing