AWS Machine Learning Blog

Implement cross-account integration between an Amazon Bedrock agent and a Redshift knowledge base

This post presents a solution for connecting an Amazon Bedrock agent and an Amazon Redshift knowledge base that reside in different AWS accounts. Direct integration is challenging when AI agents need to access structured data across accounts. The approach uses AWS Lambda as an intermediary layer, enabling cross-account data access through a secure serverless architecture while preserving access controls. The post walks through the overall flow, key components, prerequisites, detailed implementation steps, best practices, and resource cleanup, helping enterprises with multi-account architectures put the structured data in Redshift to work for AI agents.

🎯 **Cross-account data access**: The core of the solution is resolving the integration gap between an Amazon Bedrock agent and an Amazon Redshift knowledge base in a different AWS account. By introducing AWS Lambda as an intermediary, it builds a secure serverless architecture that lets the AI agent query structured data across accounts without copying data or breaking the security isolation between accounts.

⚙️ **Core components and workflow**: The key components are an Amazon Bedrock agent and a Lambda function in the "agent account", plus an Amazon Redshift Serverless workgroup and an Amazon Bedrock knowledge base in the "agent-kb account", together with the IAM roles and policies that enable cross-account access. The workflow: a user asks a question of the Bedrock agent; the agent invokes the Lambda function through an action group; the Lambda function assumes an IAM role created in the agent-kb account to connect to the knowledge base there; and the Bedrock knowledge base uses an IAM role in its own account to access Redshift and query the data.

🔧 **Implementation details and security**: The post provides step-by-step implementation guidance, including setting up the Redshift Serverless workgroup, creating the knowledge base, configuring IAM roles and policies for secure cross-account access, and deploying the Bedrock agent with CloudFormation. Creating and managing the IAM resources through scripts keeps the deployment repeatable and secure, and grants the permissions needed for model access and data queries.

💡 **Best practices and resource management**: To optimize agent performance and user experience, the post recommends asking more specific questions, using terminology that matches the table descriptions, and adding safeguards with Amazon Bedrock Guardrails. It also provides detailed cleanup steps for deleting the deployed CloudFormation stacks, IAM roles, and policies to avoid unnecessary charges.

Organizations need seamless access to their structured data repositories to power intelligent AI agents. However, when these resources span multiple AWS accounts, integration challenges can arise. This post explores a practical solution for connecting Amazon Bedrock agents to knowledge bases in Amazon Redshift clusters residing in different AWS accounts.

The challenge

Organizations that build AI agents using Amazon Bedrock often maintain their structured data in Amazon Redshift clusters. When these data repositories exist in AWS accounts separate from their AI agents, they face a significant limitation: Amazon Bedrock Knowledge Bases doesn't natively support cross-account Redshift integration.

This creates a challenge for enterprises with multi-account architectures that want to keep their AI agents and data warehouses in separate accounts while still letting the agents query that data securely.

Solution overview

Our solution enables cross-account knowledge base integration through a serverless architecture that maintains secure access controls while allowing AI agents to query structured data. The approach uses AWS Lambda as an intermediary to facilitate cross-account data access.

The action flow is as follows:

    1. Users enter their natural language question in the Amazon Bedrock agent, which is configured in the agent account.
    2. The agent invokes a Lambda function through an action group, which provides access to the Amazon Bedrock knowledge base configured in the agent-kb account.
    3. The action group Lambda function running in the agent account assumes an IAM role created in the agent-kb account to connect to the knowledge base in that account.
    4. The Amazon Bedrock knowledge base in the agent-kb account uses an IAM role created in the same account to access the Amazon Redshift data warehouse and query its data.
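
The heart of this flow is the action group Lambda function. The actual function is created by the CloudFormation template later in this post; what follows is a minimal, illustrative sketch in Python (boto3) of the pattern it implements, assuming the target role ARN, knowledge base ID, and model ARN arrive as environment variables and the action group's OpenAPI schema exposes a question parameter. It is a sketch of the technique, not the deployed code.

    import json
    import os

    import boto3

    # Illustrative placeholders; real values come from the CloudFormation parameters.
    TARGET_ROLE_ARN = os.environ.get(
        "TARGET_ROLE_ARN", "arn:aws:iam::999999999999:role/bedrock_kb_access_role")
    KNOWLEDGE_BASE_ID = os.environ.get("KNOWLEDGE_BASE_ID", "XXXXXXXXXX")
    KB_MODEL_ARN = os.environ.get(
        "KB_MODEL_ARN",
        "arn:aws:bedrock:us-west-2::foundation-model/meta.llama3-1-70b-instruct-v1:0")

    def lambda_handler(event, context):
        # 1. Assume the cross-account role created in the agent-kb account.
        sts = boto3.client("sts")
        creds = sts.assume_role(
            RoleArn=TARGET_ROLE_ARN,
            RoleSessionName="bedrock-kb-cross-account",
        )["Credentials"]

        # 2. Build a Bedrock client that acts inside the agent-kb account.
        kb_runtime = boto3.client(
            "bedrock-agent-runtime",
            region_name="us-west-2",
            aws_access_key_id=creds["AccessKeyId"],
            aws_secret_access_key=creds["SecretAccessKey"],
            aws_session_token=creds["SessionToken"],
        )

        # 3. Pull the user's question out of the action group event and query the
        #    structured knowledge base, which generates SQL against Redshift.
        params = {p["name"]: p["value"] for p in event.get("parameters", [])}
        answer = kb_runtime.retrieve_and_generate(
            input={"text": params.get("question", "")},
            retrieveAndGenerateConfiguration={
                "type": "KNOWLEDGE_BASE",
                "knowledgeBaseConfiguration": {
                    "knowledgeBaseId": KNOWLEDGE_BASE_ID,
                    "modelArn": KB_MODEL_ARN,
                },
            },
        )["output"]["text"]

        # 4. Return the response in the action group contract the agent expects.
        return {
            "messageVersion": "1.0",
            "response": {
                "actionGroup": event["actionGroup"],
                "apiPath": event["apiPath"],
                "httpMethod": event["httpMethod"],
                "httpStatusCode": 200,
                "responseBody": {
                    "application/json": {"body": json.dumps({"answer": answer})}
                },
            },
        }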

The solution includes these key components:

    Amazon Bedrock agent in the agent account that handles user interactions.
    Amazon Redshift Serverless workgroup in a VPC and private subnet in the agent-kb account, containing the structured data.
    Amazon Bedrock knowledge base using the Amazon Redshift Serverless workgroup as its structured data source.
    Lambda function in the agent account.
    Action group configuration connecting the agent in the agent account to the Lambda function.
    IAM roles and policies that enable secure cross-account access.

Prerequisites

This solution requires you to have the following:

    Two AWS accounts. Create an AWS account if you do not have one.
    Specific permissions for both accounts, which will be set up in subsequent steps.
    The AWS CLI installed (version 2.24.22 at the time of writing).
    Authentication set up with IAM user credentials for the AWS CLI in each account.
    jq installed; jq is a lightweight command-line JSON processor. For example, on macOS you can install it with the command brew install jq (jq-1.7.1-apple at the time of writing).
    Model access enabled in the Amazon Bedrock console: the meta.llama3-1-70b-instruct-v1:0 model in the agent-kb account and the us.amazon.nova-pro-v1:0 model in the agent account, both in the us-west-2, US West (Oregon) AWS Region. A quick way to confirm access from code appears after this list.
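
Seeing a model in the console does not always mean access has been granted. As an optional check, the following sketch (assuming the agent and agent-kb CLI profiles described later) sends a one-word prompt to each model; an AccessDeniedException indicates that model access still needs to be enabled. Note that the check incurs a tiny inference cost.

    import boto3
    from botocore.exceptions import ClientError

    CHECKS = [
        ("agent", "us.amazon.nova-pro-v1:0"),
        ("agent-kb", "meta.llama3-1-70b-instruct-v1:0"),
    ]

    for profile, model_id in CHECKS:
        runtime = boto3.Session(
            profile_name=profile, region_name="us-west-2").client("bedrock-runtime")
        try:
            # A one-word prompt capped at a few tokens keeps the cost negligible.
            runtime.converse(
                modelId=model_id,
                messages=[{"role": "user", "content": [{"text": "ping"}]}],
                inferenceConfig={"maxTokens": 5},
            )
            print(f"{profile}: access to {model_id} confirmed")
        except ClientError as err:
            print(f"{profile}: cannot invoke {model_id}: {err.response['Error']['Code']}")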

Assumptions

Let's call the AWS account profile that has the Amazon Bedrock agent the agent profile. Similarly, the AWS account profile that has the Amazon Bedrock knowledge base with Amazon Redshift Serverless and the structured data source will be called agent-kb. We will use the us-west-2 US West (Oregon) AWS Region, but feel free to choose another AWS Region as necessary (the prerequisites apply to whichever AWS Region you choose to deploy this solution in). We will use the meta.llama3-1-70b-instruct-v1:0 model for the agent-kb account. This model is available on demand in us-west-2. You are free to choose other models with cross-Region inference, but that would mean changing the roles and policies accordingly and enabling model access in all Regions where they are available. Based on our model choice for this solution, the AWS Region must be us-west-2. For the agent, we will use an Amazon Bedrock agent-optimized model such as us.amazon.nova-pro-v1:0.
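
These assumptions depend on meta.llama3-1-70b-instruct-v1:0 being available on demand in us-west-2. If you switch Regions or models, a quick sketch like the following can confirm on-demand availability before you proceed:

    import boto3

    bedrock = boto3.Session(
        profile_name="agent-kb", region_name="us-west-2").client("bedrock")
    # List models that support on-demand inference in this Region.
    models = bedrock.list_foundation_models(byInferenceType="ON_DEMAND")["modelSummaries"]
    print(any(m["modelId"] == "meta.llama3-1-70b-instruct-v1:0" for m in models))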

Implementation walkthrough

The following is a step-by-step implementation guide. Make sure to perform all steps in the same AWS Region in both accounts.

These steps deploy and test an end-to-end solution from scratch; if you are already running some of these components, you may skip the corresponding steps.

      Make a note of the AWS account numbers for the agent and agent-kb accounts. In the implementation steps we will refer to them as follows:

      Profile     AWS account     Description
      agent       111122223333    Account for the Bedrock agent
      agent-kb    999999999999    Account for the Bedrock knowledge base

      Note: These steps use example profile names and account numbers; replace them with actual values before running.
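
      Because every subsequent script and console step depends on the two profiles pointing at the right accounts, a quick programmatic sanity check can save debugging later. A small sketch (using the example account numbers above; replace with actuals):

          import boto3

          # Example account numbers from the table above; replace with actuals.
          EXPECTED = {"agent": "111122223333", "agent-kb": "999999999999"}

          for profile, expected in EXPECTED.items():
              account = boto3.Session(
                  profile_name=profile).client("sts").get_caller_identity()["Account"]
              print(f"{profile}: {account} ({'OK' if account == expected else 'MISMATCH'})")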

      Create the Amazon Redshift Serverless workgroup in the agent-kb account:
        Log on to the agent-kb account.
        Follow the workshop link to create the Amazon Redshift Serverless workgroup in a private subnet.
        Make a note of the namespace, workgroup, and other details, and follow the rest of the hands-on workshop instructions.
      Set up your data warehouse in the agent-kb account.
      Create your AI knowledge base in the agent-kb account. Make a note of the knowledge base ID.
      Train your AI assistant in the agent-kb account.
      Test natural language queries in the agent-kb account.
      You can find the code in the aws-samples Git repository: sample-for-amazon-bedrock-agent-connect-cross-account-kb.
      Create the necessary roles and policies in both accounts. Run the script create_bedrock_agent_kb_roles_policies.sh with the following input parameters.
      --agent-profile agent: The agent profile that you set up with the AWS CLI, as mentioned in the prerequisites.
      --agent-kb-profile agent-kb: The agent knowledge base profile that you set up with the AWS CLI with aws_access_key_id and aws_secret_access_key, as mentioned in the prerequisites.
      --lambda-role lambda_bedrock_kb_query_role: The IAM role in the agent account that the Bedrock agent action group Lambda will use to connect to Redshift cross-account.
      --kb-access-role bedrock_kb_access_role: The IAM role in the agent-kb account that lambda_bedrock_kb_query_role in the agent account assumes to connect to Redshift cross-account.
      --kb-access-policy bedrock_kb_access_policy: IAM policy attached to the IAM role bedrock_kb_access_role.
      --lambda-policy lambda_bedrock_kb_query_policy: IAM policy attached to the IAM role lambda_bedrock_kb_query_role.
      --knowledge-base-id XXXXXXXXXX: Replace with the actual knowledge base ID created in Step 4.
      --agent-account 111122223333: Replace with the 12-digit AWS account number where the Bedrock agent is running (agent account).
      --agent-kb-account 999999999999: Replace with the 12-digit AWS account number where the Bedrock knowledge base is running (agent-kb account).
      Download the script (create_bedrock_agent_kb_roles_policies.sh) from the aws-samples GitHub repository. Open a terminal on macOS or a similar bash shell on other platforms. Change to the download directory and provide executable permissions:
      cd /my/location
      chmod +x create_bedrock_agent_kb_roles_policies.sh
      If you are not clear on the script usage or inputs, you can run the script with the --help option and it will display the usage:
      ./create_bedrock_agent_kb_roles_policies.sh --help

      Run the script with the right input parameters as described in the previous table:
      ./create_bedrock_agent_kb_roles_policies.sh --agent-profile agent \
        --agent-kb-profile agent-kb \
        --lambda-role lambda_bedrock_kb_query_role \
        --kb-access-role bedrock_kb_access_role \
        --kb-access-policy bedrock_kb_access_policy \
        --lambda-policy lambda_bedrock_kb_query_policy \
        --knowledge-base-id XXXXXXXXXX \
        --agent-account 111122223333 \
        --agent-kb-account 999999999999

      On successful execution, the script shows a summary of the IAM roles and policies created in both accounts.
      Log on to both the agent and agent-kb accounts to verify that the IAM roles and policies were created.
            For the agent account: Make a note of the ARN of the lambda_bedrock_kb_query_role as that will be the value of CloudFormation stack parameter AgentLambdaExecutionRoleArn in the next step.
            For the agent-kb account: Make a note of the ARN of the bedrock_kb_access_role as that will be the value of CloudFormation stack parameter TargetRoleArn in the next step.
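
      Alternatively, you can verify the roles and capture both ARNs from code. The following sketch prints the two ARNs needed as CloudFormation parameters and the trust policy of bedrock_kb_access_role, which should allow sts:AssumeRole for the agent account's Lambda role. That trust relationship is what makes the cross-account hop work:

          import json

          import boto3

          # Print the two role ARNs needed for the CloudFormation parameters below.
          for profile, role_name, parameter in [
              ("agent", "lambda_bedrock_kb_query_role", "AgentLambdaExecutionRoleArn"),
              ("agent-kb", "bedrock_kb_access_role", "TargetRoleArn"),
          ]:
              iam = boto3.Session(profile_name=profile).client("iam")
              print(parameter, "=", iam.get_role(RoleName=role_name)["Role"]["Arn"])

          # Inspect the trust policy on the agent-kb side; expect a statement allowing
          # sts:AssumeRole to arn:aws:iam::111122223333:role/lambda_bedrock_kb_query_role.
          iam_kb = boto3.Session(profile_name="agent-kb").client("iam")
          role = iam_kb.get_role(RoleName="bedrock_kb_access_role")["Role"]
          print(json.dumps(role["AssumeRolePolicyDocument"], indent=2))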
      Run the AWS CloudFormation script to create a Bedrock agent:
              Download the CloudFormation template cloudformation_bedrock_agent_kb_query_cross_account.yaml from the aws-samples GitHub repository.
              Log on to the agent account, navigate to the CloudFormation console, and verify you are in the us-west-2 (Oregon) Region. Choose Create stack and then With new resources (standard).
              In the Specify template section, choose Upload a template file, then Choose file, and select the template downloaded in the first sub-step. Then choose Next. Enter the following stack details and choose Next.
              Stack name: bedrock-agent-connect-kb-cross-account-agent (you can choose any name)
              AgentFoundationModelId: us.amazon.nova-pro-v1:0 (do not change)
              AgentLambdaExecutionRoleArn: arn:aws:iam::111122223333:role/lambda_bedrock_kb_query_role (replace with your agent account number)
              BedrockAgentDescription: Agent to query inventory data from Redshift Serverless database (keep this as default)
              BedrockAgentInstructions: You are an assistant that helps users query inventory data from our Redshift Serverless database using the action group. (do not change)
              BedrockAgentName: bedrock_kb_query_cross_account (keep this as default)
              KBFoundationModelId: meta.llama3-1-70b-instruct-v1:0 (do not change)
              KnowledgeBaseId: XXXXXXXXXX (knowledge base ID from Step 4)
              TargetRoleArn: arn:aws:iam::999999999999:role/bedrock_kb_access_role (replace with your agent-kb account number)

              Complete the acknowledgement and choose Next. Scroll through the page and choose Submit. You will see the CloudFormation stack being created, as shown by the status CREATE_IN_PROGRESS. It will take a few minutes; the status will change to CREATE_COMPLETE, indicating creation of all resources. Choose the Outputs tab and make a note of the resources that were created.
              In summary, the CloudFormation template does the following in the agent account:

                    Creates a Bedrock agent.
                    Creates an action group.
                    Creates a Lambda function that is invoked by the action group.
                    Defines the OpenAPI schema.
                    Creates the necessary roles and permissions for the Bedrock agent.
                    Finally, prepares the Bedrock agent so that it is ready to test.
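
      If you prefer scripting to the console walkthrough, the same stack can be created with boto3. A sketch, assuming the template file sits in the working directory and the remaining parameters keep the defaults shown in the table above:

          import boto3

          cfn = boto3.Session(
              profile_name="agent", region_name="us-west-2").client("cloudformation")

          with open("cloudformation_bedrock_agent_kb_query_cross_account.yaml") as f:
              template_body = f.read()

          stack_name = "bedrock-agent-connect-kb-cross-account-agent"
          cfn.create_stack(
              StackName=stack_name,
              TemplateBody=template_body,
              Capabilities=["CAPABILITY_NAMED_IAM"],  # the template creates IAM resources
              Parameters=[
                  {"ParameterKey": "AgentLambdaExecutionRoleArn",
                   "ParameterValue": "arn:aws:iam::111122223333:role/lambda_bedrock_kb_query_role"},
                  {"ParameterKey": "TargetRoleArn",
                   "ParameterValue": "arn:aws:iam::999999999999:role/bedrock_kb_access_role"},
                  {"ParameterKey": "KnowledgeBaseId", "ParameterValue": "XXXXXXXXXX"},
              ],
          )
          # Block until every resource is created before testing the agent.
          cfn.get_waiter("stack_create_complete").wait(StackName=stack_name)
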
      Check for model access in Oregon (us-west-2)
              Verify Nova Pro (us.amazon.nova-pro-v1:0) model access in the agent account. Navigate to the Amazon Bedrock console and choose Model access under Configure and learn. Search for the model name Nova Pro to verify access. If access is not enabled, enable it.
              Verify access to the meta.llama3-1-70b-instruct-v1:0 model in the agent-kb account. This should already be enabled because we set up the knowledge base earlier.
      Run the agent. Log on to the agent account. Navigate to the Amazon Bedrock console and choose Agents under Build. Choose the name of the agent and choose Test. You can test the following questions, as mentioned in the workshop's Stage 4: Test Natural Language Queries page. For example:
              Who are the top 5 customers in Saudi Arabia?
              Who are the top parts suppliers in the United States by volume?
              What is the total revenue by region for the year 1998?
              Which products have the highest profit margins?
              Show me orders with the highest priority from the last quarter of 1997.

      Choose Show trace to investigate the agent traces.
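
You can also exercise the agent from code rather than the console test pane. A sketch, assuming the agent ID from the CloudFormation stack outputs and the built-in draft test alias (TSTALIASID); enableTrace surfaces the same traces as Show trace:

    import uuid

    import boto3

    runtime = boto3.Session(
        profile_name="agent", region_name="us-west-2").client("bedrock-agent-runtime")

    response = runtime.invoke_agent(
        agentId="AGENT_ID_FROM_STACK_OUTPUTS",  # placeholder; see the stack Outputs tab
        agentAliasId="TSTALIASID",              # built-in draft test alias
        sessionId=str(uuid.uuid4()),
        inputText="Who are the top 5 customers in Saudi Arabia?",
        enableTrace=True,
    )

    # The completion arrives as an event stream of answer chunks and trace events.
    for event in response["completion"]:
        if "chunk" in event:
            print(event["chunk"]["bytes"].decode("utf-8"), end="")
        elif "trace" in event:
            pass  # inspect event["trace"] for orchestration and action group details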

Some recommended best practices:

        Phrase your question to be more specific.
        Use terminology that matches your table descriptions.
        Try questions similar to your curated examples.
        Verify your question relates to data that exists in the TPC-H dataset.
        Use Amazon Bedrock Guardrails to add configurable safeguards to questions and responses (a minimal sketch follows).
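
As a starting point for the last recommendation, the following sketch creates a minimal guardrail with one illustrative denied topic, which you can then associate with the agent. The guardrail name, topic, and messaging are placeholders, and Guardrails supports much richer policies (content filters, word filters, sensitive information filters):

    import boto3

    bedrock = boto3.Session(
        profile_name="agent", region_name="us-west-2").client("bedrock")

    guardrail = bedrock.create_guardrail(
        name="inventory-agent-guardrail",  # placeholder name
        description="Keeps the agent focused on inventory questions",
        topicPolicyConfig={
            "topicsConfig": [
                {
                    "name": "financial-advice",  # illustrative denied topic
                    "definition": "Requests for investment or financial advice.",
                    "type": "DENY",
                }
            ]
        },
        blockedInputMessaging="Sorry, I can only help with inventory data questions.",
        blockedOutputsMessaging="Sorry, I can only help with inventory data questions.",
    )
    print(guardrail["guardrailId"], guardrail["version"])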

Clean up resources

We recommend that you clean up any resources you no longer need to avoid unnecessary charges:

        Navigate to the CloudFormation console in both the agent and agent-kb accounts, search for the stack, and choose Delete. S3 buckets need to be deleted separately (see the sketch after these steps).
        To delete the roles and policies created in both accounts, download the script delete-bedrock-agent-kb-roles-policies.sh from the aws-samples GitHub repository.
          Open a terminal on macOS or a similar bash shell on other platforms. Change to the download directory and provide executable permissions:
        cd /my/location
        chmod +x delete-bedrock-agent-kb-roles-policies.sh
        If you are not clear on the script usage or inputs, you can run the script with the --help option and it will display the usage:
        ./delete-bedrock-agent-kb-roles-policies.sh --help

        Run the script delete-bedrock-agent-kb-roles-policies.sh with the same values for the same input parameters as in Step 7, when you ran the create_bedrock_agent_kb_roles_policies.sh script. Note: Enter the correct account numbers for agent-account and agent-kb-account before running.
        ./delete-bedrock-agent-kb-roles-policies.sh --agent-profile agent \
          --agent-kb-profile agent-kb \
          --lambda-role lambda_bedrock_kb_query_role \
          --kb-access-role bedrock_kb_access_role \
          --kb-access-policy bedrock_kb_access_policy \
          --lambda-policy lambda_bedrock_kb_query_policy \
          --agent-account 111122223333 \
          --agent-kb-account 999999999999

        The script will ask for confirmation; enter yes and press Enter.
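
        For the S3 buckets mentioned in the first cleanup step, CloudFormation will not delete a non-empty bucket. A sketch for emptying and removing a leftover bucket (the bucket name and profile are placeholders; use whichever account owns the bucket):

            import boto3

            # Placeholder bucket name and profile; adjust to the owning account.
            bucket = boto3.Session(
                profile_name="agent-kb").resource("s3").Bucket("your-leftover-bucket-name")

            bucket.objects.all().delete()          # remove all current objects
            bucket.object_versions.all().delete()  # and all versions, if versioning was enabled
            bucket.delete()                        # finally remove the empty bucket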

Summary

This solution demonstrates how the Amazon Bedrock agent in the agent account can query the Amazon Bedrock knowledge base in the agent-kb account.

Conclusion

This solution uses Amazon Bedrock Knowledge Bases for structured data to create a more integrated approach to cross-account data access. The knowledge base in the agent-kb account connects directly to Amazon Redshift Serverless in a private VPC. The Amazon Bedrock agent in the agent account invokes an AWS Lambda function as part of its action group to make a cross-account connection and retrieve responses from the structured knowledge base.

This architecture offers several advantages:

        Uses Amazon Bedrock Knowledge Bases capabilities for structured data
        Provides a more seamless integration between the agent and the data source
        Maintains proper security boundaries between accounts
        Reduces the complexity of direct database access code

As Amazon Bedrock continues to evolve, you can take advantage of future enhancements to knowledge base functionality while maintaining your multi-account architecture.


About the Authors

Kunal Ghosh is an expert in AWS technologies. He is passionate about building efficient and effective solutions on AWS, especially involving generative AI, analytics, data science, and machine learning. Besides family time, he likes reading, swimming, biking, and watching movies, and he is a foodie.

Arghya Banerjee is a Sr. Solutions Architect at AWS in the San Francisco Bay Area, focused on helping customers adopt and use the AWS Cloud. His focus areas are big data, data lakes, streaming and batch analytics services, and generative AI technologies.

Indranil Banerjee is a Sr. Solutions Architect at AWS in the San Francisco Bay Area, focused on helping customers in the high-tech and semiconductor sectors solve complex business problems using the AWS Cloud. His special interests are in the areas of legacy modernization and migration, building analytics platforms, and helping customers adopt cutting-edge technologies such as generative AI.

Vinayak Datar is a Sr. Solutions Manager based in the Bay Area, helping enterprise customers accelerate their AWS Cloud journey. He focuses on helping customers convert ideas from concepts to working prototypes to production using AWS generative AI services.
