This project sets up an AWS Bedrock Knowledge Base integrated with an Aurora Serverless PostgreSQL database. It also includes scripts for database setup and file upload to S3.
- Project Overview
- Prerequisites
- Project Structure
- Deployment Steps
- Using the Scripts
- Customization
- Troubleshooting
## Project Overview

This project consists of several components:

- Stack 1 - Terraform configuration for creating:
  - A VPC
  - An Aurora Serverless PostgreSQL cluster
  - An S3 bucket to hold documents
  - The necessary IAM roles and policies
- Stack 2 - Terraform configuration for creating:
  - A Bedrock Knowledge Base
  - The necessary IAM roles and policies
- A set of SQL queries to prepare the Postgres database for vector storage
- A Python script for uploading files to an S3 bucket

The goal is to create a Bedrock Knowledge Base that can leverage data stored in an Aurora Serverless database, with the ability to easily upload supporting documents to S3. This will allow us to ask the LLM for information from the documentation.
## Prerequisites

Before you begin, ensure you have the following:
- AWS CLI installed and configured with appropriate credentials
- Terraform installed (version 0.12 or later)
- Python 3.10 or later
- pip (Python package manager)
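Before running Terraform or the scripts, it can help to confirm that your AWS credentials are actually being picked up. Below is a minimal sketch (not part of the project) that uses boto3 and STS `get_caller_identity`, which requires no special permissions:

```python
# check_credentials.py - quick sanity check that boto3 can find working AWS credentials.
# This helper is only a convenience sketch and is not part of the project.
import boto3

def main():
    sts = boto3.client("sts")
    identity = sts.get_caller_identity()
    print(f"Authenticated as: {identity['Arn']}")
    print(f"Account: {identity['Account']}")

if __name__ == "__main__":
    main()
```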
## Project Structure

```
project-root/
│
├── stack1/
│   ├── main.tf
│   ├── outputs.tf
│   └── variables.tf
│
├── stack2/
│   ├── main.tf
│   ├── outputs.tf
│   └── variables.tf
│
├── modules/
│   ├── aurora_serverless/
│   │   ├── main.tf
│   │   ├── variables.tf
│   │   └── outputs.tf
│   └── bedrock_kb/
│       ├── main.tf
│       ├── variables.tf
│       └── outputs.tf
│
├── scripts/
│   ├── aurora_sql.sql
│   └── upload_to_s3.py
│
├── spec-sheets/
│   └── machine_files.pdf
│
└── README.md
```
## Deployment Steps

1. Clone this repository to your local machine.
2. Navigate to the Stack 1 directory. This stack includes the VPC, the Aurora Serverless cluster, and the S3 bucket.
3. Initialize Terraform by running `terraform init`.
4. Review and modify the Terraform variables in `main.tf` as needed, particularly:
   - AWS region
   - VPC CIDR block
   - Aurora Serverless configuration
   - S3 bucket
5. Deploy the infrastructure with `terraform apply`. Review the planned changes and type "yes" to confirm.
6. After the Terraform deployment is complete, note the outputs, particularly the Aurora cluster endpoint.
7. Prepare the Aurora PostgreSQL database by running the SQL queries in the `scripts/` folder. This can be done through the Query Editor in the Amazon RDS console (or programmatically, as sketched after these steps).
8. Navigate to the Stack 2 directory. This stack includes the Bedrock Knowledge Base.
9. Initialize Terraform by running `terraform init`.
10. Use the outputs from Stack 1 to update the values in `main.tf` as needed, particularly the Bedrock Knowledge Base configuration.
11. Deploy the infrastructure with `terraform apply`. Review the planned changes and type "yes" to confirm.
12. Upload PDF files to S3: place your files in the `spec-sheets` folder and run `python scripts/upload_to_s3.py`. Make sure to update the S3 bucket name in the script before running.
13. Sync the data source in the Knowledge Base to make it available to the LLM (a programmatic option is sketched after these steps).
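Step 7 can also be done without the console. The sketch below uses the RDS Data API (boto3 `rds-data` client) to execute the statements in `scripts/aurora_sql.sql`; the cluster ARN, secret ARN, and database name are placeholders you would replace with the Stack 1 outputs, and it assumes the Data API is enabled on the cluster.

```python
# run_aurora_sql.py - optional sketch for executing scripts/aurora_sql.sql via the
# RDS Data API instead of the RDS console Query Editor. ARNs below are placeholders.
import boto3

CLUSTER_ARN = "arn:aws:rds:us-east-1:123456789012:cluster:example-cluster"   # placeholder
SECRET_ARN = "arn:aws:secretsmanager:us-east-1:123456789012:secret:example"  # placeholder
DATABASE = "postgres"                                                         # placeholder

def main():
    client = boto3.client("rds-data")
    with open("scripts/aurora_sql.sql") as f:
        # A naive split on ';' is enough for simple DDL statements.
        statements = [s.strip() for s in f.read().split(";") if s.strip()]
    for sql in statements:
        client.execute_statement(
            resourceArn=CLUSTER_ARN,
            secretArn=SECRET_ARN,
            database=DATABASE,
            sql=sql,
        )
        print(f"Executed: {sql[:60]}...")

if __name__ == "__main__":
    main()
```

Similarly, the data source sync in step 13 can be triggered programmatically. A minimal sketch using the boto3 `bedrock-agent` client; the Knowledge Base and data source IDs are placeholders taken from the Stack 2 outputs or the Bedrock console:

```python
# sync_knowledge_base.py - optional sketch for starting a Knowledge Base ingestion job.
# KB_ID and DS_ID are placeholders; use the values from Stack 2 / the Bedrock console.
import boto3

KB_ID = "EXAMPLEKBID"  # placeholder
DS_ID = "EXAMPLEDSID"  # placeholder

def main():
    client = boto3.client("bedrock-agent")
    job = client.start_ingestion_job(knowledgeBaseId=KB_ID, dataSourceId=DS_ID)
    print("Started ingestion job:", job["ingestionJob"]["ingestionJobId"])

if __name__ == "__main__":
    main()
```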
## Using the Scripts

The `upload_to_s3.py` script does the following:

- Uploads all files from the `spec-sheets` folder to a specified S3 bucket
- Maintains the folder structure in S3

To use it:

1. Update the `bucket_name` variable in the script with your S3 bucket name.
2. Optionally, update the `prefix` variable if you want to upload to a specific path in the bucket.
3. Run `python scripts/upload_to_s3.py`.
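For reference, the script's logic is roughly the following. This is a simplified sketch, not the exact contents of `scripts/upload_to_s3.py`; it assumes boto3 and the `bucket_name`/`prefix` variables described above:

```python
# Simplified sketch of the upload logic: walk spec-sheets/ and upload each file,
# preserving the relative folder structure under the given prefix.
import os
import boto3

bucket_name = "your-bucket-name"  # replace with the bucket created by Stack 1
prefix = ""                        # optional key prefix inside the bucket
local_folder = "spec-sheets"

def upload_folder():
    s3 = boto3.client("s3")
    for root, _, files in os.walk(local_folder):
        for name in files:
            local_path = os.path.join(root, name)
            # The key mirrors the path relative to spec-sheets/, keeping folder structure.
            relative_path = os.path.relpath(local_path, local_folder).replace(os.sep, "/")
            key = f"{prefix}{relative_path}" if prefix else relative_path
            print(f"Uploading {local_path} to s3://{bucket_name}/{key}")
            s3.upload_file(local_path, bucket_name, key)

if __name__ == "__main__":
    upload_folder()
```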
## Customization

- Open the `bedrock_utils.py` file and complete the following functions:
  - `query_knowledge_base`
  - `generate_response`
- Open the `bedrock_utils.py` file and complete the following function:
  - `valid_prompt`

  Hint: categorize the user prompt.
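As a starting point, one possible shape for the first two functions is sketched below. It uses the boto3 `bedrock-agent-runtime` and `bedrock-runtime` clients; the Knowledge Base ID and model ID are placeholders, and the actual signatures in `bedrock_utils.py` may differ.

```python
# Possible shape for the bedrock_utils.py helpers; IDs and signatures are assumptions.
import boto3

agent_runtime = boto3.client("bedrock-agent-runtime")
bedrock_runtime = boto3.client("bedrock-runtime")

def query_knowledge_base(query: str, kb_id: str) -> list[str]:
    """Retrieve the most relevant document chunks for the query."""
    response = agent_runtime.retrieve(
        knowledgeBaseId=kb_id,
        retrievalQuery={"text": query},
        retrievalConfiguration={"vectorSearchConfiguration": {"numberOfResults": 5}},
    )
    return [result["content"]["text"] for result in response["retrievalResults"]]

def generate_response(query: str, chunks: list[str], model_id: str) -> str:
    """Ask the model to answer the query using the retrieved chunks as context."""
    prompt = "Answer the question using only the context below.\n\nContext:\n"
    prompt += "\n\n".join(chunks) + f"\n\nQuestion: {query}"
    response = bedrock_runtime.converse(
        modelId=model_id,
        messages=[{"role": "user", "content": [{"text": prompt}]}],
    )
    return response["output"]["message"]["content"][0]["text"]
```

For `valid_prompt`, a similar `converse` call can be used to classify the user prompt into allowed categories before answering.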
## Troubleshooting

- If you encounter permissions issues, ensure your AWS credentials have the necessary permissions for creating all the resources.
- For database connection issues, check that the security group allows incoming connections on port 5432 from your IP address (a quick connectivity check is sketched after this list).
- If S3 uploads fail, verify that your AWS credentials have permission to write to the specified bucket.
- For any Terraform errors, ensure you're using a compatible version and that all module sources are correctly specified.
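A quick way to test the database connectivity path is a plain TCP check against the cluster endpoint. This is a minimal sketch; the endpoint is a placeholder taken from the Stack 1 outputs, and it only verifies network reachability, not authentication:

```python
# port_check.py - minimal TCP reachability check for the Aurora endpoint on port 5432.
# It does not authenticate; it only verifies the network path / security group rules.
import socket

ENDPOINT = "your-cluster.cluster-xxxxxxxx.us-east-1.rds.amazonaws.com"  # placeholder
PORT = 5432

def main():
    try:
        with socket.create_connection((ENDPOINT, PORT), timeout=5):
            print(f"TCP connection to {ENDPOINT}:{PORT} succeeded.")
    except OSError as exc:
        print(f"TCP connection to {ENDPOINT}:{PORT} failed: {exc}")

if __name__ == "__main__":
    main()
```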
For more detailed troubleshooting, refer to the error messages and logs provided by Terraform and the Python scripts.