AWS Machine Learning Blog
Thomson Reuters promotes its no-code AI platform Open Arena

 

Thomson Reuters (TR) has launched Open Arena, a no-code AI solution built on Amazon Bedrock and other AWS services, designed to empower all employees, regardless of technical background, to create customized AI applications. The platform has reached roughly 70% employee adoption, serves 19,000 monthly active users, and has produced thousands of custom AI solutions. Open Arena evolved rapidly from an early prototype into an enterprise-grade solution, significantly raising AI awareness, fostering cross-team collaboration, and accelerating AI capability development. It addresses enterprise challenges in AI enablement, security and quality, speed and reuse, and resource and cost management, and supports a wide range of AI use cases such as content creation, data analysis, and problem solving. TR chose AWS for its comprehensive AI/ML capabilities, enterprise-grade security, scalable infrastructure, and strong existing partnership with AWS.

💡 **AI democratization with the no-code platform Open Arena:** Thomson Reuters (TR) has launched Open Arena, an innovative no-code AI solution designed to bring AI capabilities to every employee in the organization, regardless of technical background. Built on Amazon Bedrock and other AWS services, the platform lets users easily create customized AI applications, greatly expanding the reach of AI and improving productivity. Open Arena has reached roughly 70% employee adoption, serves 19,000 monthly active users, and has produced thousands of custom AI solutions, demonstrating the potential of no-code AI at enterprise scale.

🚀 **Rapid evolution and enterprise-grade enablement:** Open Arena began as a rapid prototype, built by TR Labs in under six weeks at the onset of the generative AI boom. Its marked success and the demand for new features drove its evolution into an enterprise version. Built on the TR AI Platform, the enterprise version of Open Arena offers secure, scalable, and standardized services covering the entire AI development lifecycle, significantly shortening time to market. By providing fully self-service capabilities, it enables users of any technical background to develop, evaluate, and deploy generative AI solutions, truly democratizing AI.

✅ **Addressing enterprise AI challenges across broad use cases:** Open Arena tackles four key challenges enterprises face in adopting AI: enablement (a consistent AI solution-building experience for different user groups), security and quality (evaluation and monitoring services that ensure solution quality and compliance with data governance and ethics policies), speed and reuse (automated workflows and reuse of existing AI solutions and prompts), and resource and cost management (tracking and surfacing the resource consumption of AI solutions for transparency and efficiency). The platform supports a wide range of AI experiences, including tech support, content creation, coding assistance, data extraction and analysis, proofreading, project management, content summarization, personal development, translation, and problem solving, meeting the diverse needs of users across the organization.

☁️ **The strategic choice of AWS and the technical architecture:** Thomson Reuters chose AWS as the primary cloud provider for Open Arena based on AWS's comprehensive AI/ML capabilities (particularly Amazon Bedrock), enterprise-grade security and governance, scalable infrastructure, and the deep existing partnership and expertise between TR and AWS. Open Arena's architecture prioritizes scalability, extensibility, and security while remaining simple for non-technical users. It uses Amazon Bedrock for foundation model interactions and Amazon Bedrock Flows as a custom workflow builder, letting users create sophisticated AI workflows by dragging and dropping components. The architecture also uses services such as Amazon OpenSearch Service, Amazon DynamoDB, Amazon API Gateway, and AWS Lambda to ensure data isolation, security, and efficient resource management.

This post is cowritten by Laura Skylaki, Vaibhav Goswami, Ramdev Wudali, and Sahar El Khoury from Thomson Reuters.

Thomson Reuters (TR) is a leading AI and technology company dedicated to delivering trusted content and workflow automation solutions. With over 150 years of expertise, TR provides essential solutions across legal, tax, accounting, risk, trade, and media sectors in a fast-evolving world.

TR recognized early that AI adoption would fundamentally transform professional work. According to TR’s 2025 Future of Professionals Report, 80% of professionals anticipate AI significantly impacting their work within five years, with projected productivity gains of up to 12 hours per week by 2029. To unlock this immense potential, TR needed a solution to democratize AI creation across its organization.

In this blog post, we explore how TR addressed key business use cases with Open Arena, a highly scalable and flexible no-code AI solution powered by Amazon Bedrock and other AWS services such as Amazon OpenSearch Service, Amazon Simple Storage Service (Amazon S3), Amazon DynamoDB, and AWS Lambda. We’ll explain how TR used AWS services to build this solution, including how the architecture was designed, the use cases it solves, and the business profiles that use it. The system demonstrates TR’s successful approach of using existing TR services for rapid launches while supporting thousands of users, showcasing how organizations can democratize AI access and enable business profiles (for example, AI explorers and SMEs) to create applications without coding expertise.

Introducing Open Arena: No-code AI for all

TR introduced Open Arena so that non-technical professionals can create their own customized AI solutions. With Open Arena, users can tap cutting-edge AI powered by Amazon Bedrock in a no-code environment, exemplifying TR’s commitment to democratizing AI access.

Today, Open Arena supports:

The Open Arena journey: From prototype to enterprise solution

Conceived as a rapid prototype, Open Arena was developed in under six weeks at the onset of the generative AI boom in early 2023 by TR Labs – TR’s dedicated applied research division focused on the research, development, and application of AI and emerging technologies. The goal was to support internal team exploration of large language models (LLMs) and discover unique use cases by merging LLM capabilities with TR company data.

Open Arena’s introduction significantly increased AI awareness, fostered developer-SME collaboration for groundbreaking concepts, and accelerated AI capability development for TR products. The rapid success and demand for new features quickly highlighted Open Arena’s potential for AI democratization, so TR developed an enterprise version of Open Arena. Built on the TR AI Platform, Open Arena enterprise version offers secure, scalable, and standardized services covering the entire AI development lifecycle, significantly accelerating time to production.

The Open Arena enterprise version uses existing system capabilities for enhanced data access controls, standardized service access, and compliance with TR’s governance and ethical standards. This version introduced self-served capabilities so that every user, irrespective of their technical ability, can create, evaluate, and deploy customized AI solutions in a no-code environment.

“The foundation of the AI Platform has always been about empowerment; in the early days it was about empowering Data Scientists, but with the rise of generative AI, the platform adapted and evolved to empower users of any background to leverage and create AI solutions.”

– Maria Apazoglou, Head of AI Engineering, CoCounsel

As of July 2025, the TR Enterprise AI Platform consists of 15 services spanning the entire AI development lifecycle and user personas. Open Arena remains one of its most popular services, serving 19,000 users each month, with usage still growing.

Addressing key enterprise AI challenges across user types

Using the TR Enterprise AI Platform, Open Arena helped thousands of professionals transition into using generative AI. AI-powered innovation is now readily in the hands of everyone, not just AI scientists.

Open Arena successfully addresses four critical enterprise AI challenges:

- Enablement – providing a consistent AI solution-building experience for different user groups
- Security and quality – ensuring the quality of AI solutions through evaluation and monitoring services, in compliance with data governance and ethics policies
- Speed and reuse – automating workflows and reusing existing AI solutions and prompts
- Resource and cost management – tracking and surfacing the resource consumption of AI solutions for transparency and efficiency

The solution currently supports several AI experiences, including tech support, content creation, coding assistance, data extraction and analysis, proofreading, project management, content summarization, personal development, translation, and problem solving, catering to different user needs across the organization.

Figure 1. Examples of Open Arena use cases.

AI explorers use Open Arena to speed up day-to-day tasks, such as summarizing documents, engaging in LLM chat, building custom workflows, and comparing AI models. AI creators and Subject Matter Experts (SMEs) use Open Arena to build custom AI workflows and experiences and to evaluate solutions without requiring coding knowledge. Meanwhile, developers can build and ship new AI solutions at speed: training models, creating new AI skills, and deploying AI capabilities.

Why Thomson Reuters selected AWS for Open Arena

TR strategically chose AWS as a primary cloud provider for Open Arena based on several critical factors:

- Comprehensive AI/ML capabilities, particularly Amazon Bedrock
- Enterprise-grade security and governance
- Scalable infrastructure
- An established partnership and deep expertise between TR and AWS

“Our long-standing partnership with AWS and their robust, flexible and innovative services made them the natural choice to power Open Arena and accelerate our AI initiatives.”

– Maria Apazoglou, Head of AI Engineering, CoCounsel

Open Arena architecture: Scalability, extensibility, and security

Designed for a broad enterprise audience, Open Arena prioritizes scalability, extensibility and security while maintaining simplicity for non-technical users to create and deploy AI solutions. The following diagram illustrates the architecture of Open Arena.

Figure 2. Architecture design of Open Arena.

The architecture design facilitates enterprise-grade performance with clear separation between capability and usage, aligning with TR’s enterprise cost and usage tracking requirements.

The following are key components of the solution architecture:

TR developed Open Arena using AWS services such as Amazon Bedrock, Amazon OpenSearch Service, Amazon DynamoDB, Amazon API Gateway, AWS Lambda, and AWS Step Functions. It uses Amazon Bedrock for foundation model interactions, supporting both simple chat and complex Retrieval Augmented Generation (RAG) tasks. Open Arena uses Amazon Bedrock Flows as the custom workflow builder, where users can drag and drop components such as prompts, agents, knowledge bases, and Lambda functions to create sophisticated AI workflows without coding. The system also integrates with Amazon OpenSearch Service for knowledge bases and with external APIs for advanced agent capabilities.
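To make the foundation model interaction concrete, here is a minimal sketch of a single-turn chat call through the Amazon Bedrock Converse API with boto3. The helper names, model ID, and inference settings are illustrative assumptions, not Open Arena's actual implementation:

```python
def build_converse_request(prompt: str, max_tokens: int = 512) -> dict:
    """Build the message payload for the Bedrock Converse API.

    Hypothetical helper -- Open Arena's real backend is not public.
    """
    return {
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": max_tokens, "temperature": 0.2},
    }


def ask_model(prompt: str, model_id: str = "anthropic.claude-3-haiku-20240307-v1:0") -> str:
    """Send a single-turn chat request to a Bedrock foundation model."""
    import boto3  # deferred so the payload helper works without boto3 installed

    client = boto3.client("bedrock-runtime")
    response = client.converse(modelId=model_id, **build_converse_request(prompt))
    # The Converse API returns the assistant message as a list of content blocks
    return response["output"]["message"]["content"][0]["text"]
```

The same Converse payload shape works across Bedrock model providers, which is one reason a no-code front end can let users switch and compare models without changing the calling code.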

For data separation, orchestration is managed in the Enterprise AI Platform AWS account, which captures operational data. Flow instances and user-specific data reside in the user’s dedicated AWS account, stored in a database. Each user’s data and workflow executions are isolated within their respective AWS accounts, as required to comply with Thomson Reuters’ data sovereignty and enterprise security policies, which impose strict regional controls. The system integrates with Thomson Reuters’ SSO solution to automatically identify users and grant secure, private access to foundation models.
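The cross-account pattern described above can be sketched as follows: before touching any user data, the orchestration account assumes a role in the user's dedicated account via AWS STS. The role and session names here are hypothetical, not TR's actual configuration:

```python
def execution_role_arn(account_id: str, role_name: str = "OpenArenaExecutionRole") -> str:
    """Build the ARN of the per-user execution role (role name is hypothetical)."""
    return f"arn:aws:iam::{account_id}:role/{role_name}"


def client_in_user_account(account_id: str, service: str = "bedrock-agent-runtime"):
    """Return a boto3 client scoped to the user's dedicated AWS account via STS."""
    import boto3  # deferred so the ARN helper above works without boto3 installed

    creds = boto3.client("sts").assume_role(
        RoleArn=execution_role_arn(account_id),
        RoleSessionName="open-arena-orchestration",
    )["Credentials"]
    # Temporary credentials keep every downstream call inside the user's account
    return boto3.client(
        service,
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )
```

Because the assumed role lives in the user's account, its IAM policy (and any SCPs on that account) bounds what the orchestration layer can do there, which is what makes the isolation enforceable rather than merely conventional.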

The orchestration layer, centrally hosted within the Enterprise AI Platform AWS account, manages AI workflow activities, including scheduling, deployment, resource provisioning, and governance across user environments.

The system features fully automated provisioning of Amazon Bedrock Flows directly within each user’s AWS account, avoiding manual setup and accelerating time to value. Using AWS Lambda for serverless compute and DynamoDB for scalable, low-latency storage, the system dynamically allocates resources based on real-time demand. This architecture ensures that flows and supporting infrastructure are deployed and scaled to match workload fluctuations, optimizing performance, cost, and user experience.
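To give a flavor of what automated flow provisioning might look like, here is a minimal sketch that defines and creates a one-prompt flow (input → prompt → output) with the Amazon Bedrock `create_flow` API. The node layout follows the public Bedrock Flows definition schema; the flow name, node names, model ID, and prompt template are illustrative assumptions:

```python
def one_prompt_flow_definition(model_id: str) -> dict:
    """Build a minimal flow definition: input -> prompt node -> output."""
    return {
        "nodes": [
            {
                "name": "FlowInput",
                "type": "Input",
                "configuration": {"input": {}},
                "outputs": [{"name": "document", "type": "String"}],
            },
            {
                "name": "Summarize",
                "type": "Prompt",
                "configuration": {
                    "prompt": {
                        "sourceConfiguration": {
                            "inline": {
                                "modelId": model_id,
                                "templateType": "TEXT",
                                "templateConfiguration": {
                                    "text": {"text": "Summarize the following text: {{input}}"}
                                },
                            }
                        }
                    }
                },
                "inputs": [{"name": "input", "type": "String", "expression": "$.data"}],
                "outputs": [{"name": "modelCompletion", "type": "String"}],
            },
            {
                "name": "FlowOutput",
                "type": "Output",
                "configuration": {"output": {}},
                "inputs": [{"name": "document", "type": "String", "expression": "$.data"}],
            },
        ],
        "connections": [
            {
                "name": "InToPrompt",
                "source": "FlowInput",
                "target": "Summarize",
                "type": "Data",
                "configuration": {"data": {"sourceOutput": "document", "targetInput": "input"}},
            },
            {
                "name": "PromptToOut",
                "source": "Summarize",
                "target": "FlowOutput",
                "type": "Data",
                "configuration": {"data": {"sourceOutput": "modelCompletion", "targetInput": "document"}},
            },
        ],
    }


def provision_flow(name: str, execution_role_arn: str, model_id: str) -> str:
    """Create and prepare the flow in the target account, returning its ID."""
    import boto3  # deferred so the definition builder works without boto3 installed

    client = boto3.client("bedrock-agent")
    flow = client.create_flow(
        name=name,
        executionRoleArn=execution_role_arn,
        definition=one_prompt_flow_definition(model_id),
    )
    client.prepare_flow(flowIdentifier=flow["id"])
    return flow["id"]
```

A no-code front end would generate a definition like this from the user's drag-and-drop canvas, so the same provisioning code path serves every workflow shape.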

“Our decision to adopt a cross-account architecture was driven by a commitment to enterprise security and operational excellence. By isolating orchestration from execution, we make sure that each user’s data remains private and secure within their own AWS account, while still delivering a seamless, centrally-managed experience. This design empowers organizations to innovate rapidly without compromising compliance or control.”

– Thomson Reuters’ architecture team

Evolution of Open Arena: From classic to Amazon Bedrock Flows-powered chain builder

Open Arena has evolved to cater to varying levels of user sophistication:

Thomson Reuters uses Amazon Bedrock Flows as a core feature of Chain Builder. Users can define, customize, and deploy AI-driven workflows using Amazon Bedrock models. Bedrock Flows supports advanced workflows that combine multiple prompt nodes, incorporate AWS Lambda functions, and drive sophisticated RAG pipelines. Operating seamlessly across user AWS accounts, Bedrock Flows facilitates secure, scalable execution of personalized AI solutions, serving as the fundamental engine for Chain Builder workflows and driving TR’s ability to deliver robust, enterprise-grade automation and innovation.
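At runtime, executing such a workflow on the user's behalf could look like the following sketch, which calls the Bedrock agent runtime's `invoke_flow` API and collects the streamed output events. The input node name and the flow/alias identifiers are placeholders, not TR's actual values:

```python
def collect_output(events) -> str:
    """Concatenate the document chunks carried by flowOutputEvent entries."""
    return "".join(
        str(e["flowOutputEvent"]["content"]["document"])
        for e in events
        if "flowOutputEvent" in e
    )


def run_flow(flow_id: str, alias_id: str, document: str) -> str:
    """Invoke a Bedrock flow and gather its streamed output into a string."""
    import boto3  # deferred so collect_output works without boto3 installed

    client = boto3.client("bedrock-agent-runtime")
    response = client.invoke_flow(
        flowIdentifier=flow_id,
        flowAliasIdentifier=alias_id,
        inputs=[
            {
                "content": {"document": document},
                "nodeName": "FlowInput",  # must match the flow's input node name
                "nodeOutputName": "document",
            }
        ],
    )
    # The response is an event stream; output arrives as flowOutputEvent chunks
    return collect_output(response["responseStream"])
```

Streaming the events rather than waiting for a single response is what lets a chat-style UI show partial output while a multi-node workflow is still running.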

What’s next?

TR continues to expand Open Arena’s capabilities through the strategic partnership with AWS, focusing on:

“From innovating new product ideas to reimagining daily tasks for Thomson Reuters employees, we continue to push the boundaries of what’s possible with Open Arena.”

– Maria Apazoglou, Head of AI Engineering, CoCounsel

Conclusion

In this blog post, we explored how Thomson Reuters’ Open Arena demonstrates the successful democratization of AI across an enterprise by using AWS services, particularly Amazon Bedrock and Bedrock Flows. With 19,000 monthly active users and 70% employee adoption, the system proves that no-code AI solutions can deliver enterprise-scale impact while maintaining security and governance standards.

By combining the robust infrastructure of AWS with innovative architecture design, TR has created a blueprint for AI democratization that empowers professionals across technical skill levels to harness generative AI for their daily work.

As Open Arena continues to evolve, it exemplifies how strategic cloud partnerships can accelerate AI adoption and transform how organizations approach innovation with generative AI.


About the authors

Laura Skylaki, PhD, leads the Enterprise AI Platform at Thomson Reuters, driving the development of generative AI services that accelerate the creation, testing, and deployment of AI solutions, enhancing product value. A recognized expert with a doctorate in stem cell bioinformatics, she has extensive experience in AI research and practical application spanning the legal, tax, and biotech domains. Her machine learning work is published in leading academic journals, and she is a frequent speaker on AI and machine learning.

Vaibhav Goswami is a Lead Software Engineer on the AI Platform team at Thomson Reuters, where he leads the development of the Generative AI Platform that empowers users to build and deploy generative AI solutions at scale. With expertise in building production-grade AI systems, he focuses on creating tools and infrastructure that democratize access to cutting-edge AI capabilities across the enterprise.

Ramdev Wudali is a Distinguished Engineer, helping architect and build the AI/ML Platform to enable enterprise users, data scientists, and researchers to develop generative AI and machine learning solutions by democratizing access to tools and LLMs. In his spare time, he loves to fold paper into origami tessellations and to wear irreverent T-shirts.

As the director of AI Platform Adoption and Training, Sahar El Khoury guides users to seamlessly onboard and successfully use the platform services, drawing on her experience in AI and data analysis across robotics (PhD), financial markets, and media.

Vu San Ha Huynh is a Solutions Architect at AWS with a PhD in Computer Science. He helps large Enterprise customers drive innovation across different domains with a focus on AI/ML and Generative AI solutions.

Paul Wright is a Senior Technical Account Manager with over 20 years of experience in the IT industry and over 7 years of dedicated cloud focus. Paul has helped some of the largest enterprise customers grow their business and improve their operational excellence. In his spare time, Paul is a huge football and NFL fan.

Mike Bezak is a Senior Technical Account Manager in AWS Enterprise Support. He has over 20 years of experience in information technology, primarily in disaster recovery and systems administration. Mike’s current focus is helping customers streamline and optimize their AWS Cloud journey. Outside of AWS, Mike enjoys spending time with family and friends.
