New serverless pattern - apigw-lambda-bedrock-nova-reel-dynamodb-s3 #2581

10 changes: 10 additions & 0 deletions apigw-lambda-bedrock-dynamodb-s3-cdk-python/.gitignore
@@ -0,0 +1,10 @@
*.swp
package-lock.json
__pycache__
.pytest_cache
.venv
*.egg-info

# CDK asset staging directory
.cdk.staging
cdk.out
144 changes: 144 additions & 0 deletions apigw-lambda-bedrock-dynamodb-s3-cdk-python/README.md
@@ -0,0 +1,144 @@
# Reel Generation with Amazon Bedrock Nova, API Gateway, Lambda, DynamoDB, and S3

This pattern demonstrates how to build a serverless reel generation service using the Amazon Nova Reel model on Amazon Bedrock. The service exposes REST APIs through API Gateway that allow users to submit reel generation requests and check their status. The generated reels are stored in S3.

![architecture](architecture/architecture.png)

Important: this application uses various AWS services and there are costs associated with these services after the Free Tier usage - please see the [AWS Pricing page](https://aws.amazon.com/pricing/) for details. You are responsible for any AWS costs incurred. No warranty is implied in this example.

## Requirements

* [Create an AWS account](https://portal.aws.amazon.com/gp/aws/developer/registration/index.html) if you do not already have one and log in. The IAM user that you use must have sufficient permissions to make necessary AWS service calls and manage AWS resources.
* [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2.html) installed and configured
* [Git Installed](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git)
* [AWS Cloud Development Kit](https://docs.aws.amazon.com/cdk/v2/guide/getting_started.html) installed
* [Amazon Bedrock Nova model Access](https://docs.aws.amazon.com/bedrock/latest/userguide/model-access.html#add-model-access)

## Amazon Bedrock setup instructions
You must request access to a model before you can use it. If you try to use the model (with the API or console) before you have requested access to it, you receive an error message. For more information, see [Model access](https://docs.aws.amazon.com/bedrock/latest/userguide/model-access.html).

1. In the AWS console, select the region from which you want to access Amazon Bedrock.

![Region Selection](bedrock_setup/region-selection.png)

1. Find **Amazon Bedrock** by searching in the AWS console.

![Bedrock Search](bedrock_setup/bedrock-search.png)

1. Expand the side menu.

![Bedrock Expand Menu](bedrock_setup/bedrock-menu-expand.png)

1. From the side menu, select **Model access**.

![Model Access](bedrock_setup/model-access-link.png)

1. Select the **Edit** button.

![Model Access View](bedrock_setup/model-access-view.png)

6. Use the checkboxes to select the models you wish to enable and review the applicable EULAs as needed. Click **Save changes** to activate the models in your account. For this pattern, you only need the Amazon Nova Reel model (model ID: `amazon.nova-reel-v1:0`); a quick way to verify it is available in your region is sketched below.

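Once access is granted, you can optionally confirm that the model is available in your chosen region before deploying. The snippet below is a small sanity check, not part of the pattern itself; it assumes your AWS credentials are configured and uses the control-plane `bedrock` client from boto3.

```python
import boto3

# The control-plane "bedrock" client (not "bedrock-runtime") describes models.
bedrock = boto3.client("bedrock", region_name="us-east-1")  # adjust to your region

# Raises an error if the model ID is unknown in this region.
model = bedrock.get_foundation_model(modelIdentifier="amazon.nova-reel-v1:0")
print(model["modelDetails"]["modelId"], model["modelDetails"]["modelName"])
```
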
## Deployment Instructions

1. Create a new directory, navigate to that directory in a terminal and clone the GitHub repository:
```
git clone https://github.com/aws-samples/serverless-patterns
```
1. Change directory to the pattern directory:
```
cd serverless-patterns/apigw-lambda-bedrock-dynamodb-s3-cdk-python
```
1. Create a virtual environment for Python:
```
python -m venv .venv
```
For a Windows platform, activate the virtualenv like this:
```
.venv\Scripts\activate.bat
```
For a macOS / Linux platform, activate the virtualenv like this:
```
source .venv/bin/activate
```
1. Install the required Python dependencies:
```
pip install -r requirements.txt
```
1. Run the command below to bootstrap your account. CDK needs this to deploy:
```
cdk bootstrap
```
1. Review the CloudFormation template that CDK generates for your stack using the following AWS CDK CLI command:
```
cdk synth
```
1. From the command line, use AWS CDK to deploy the AWS resources.
```
cdk deploy
```
1. After deployment completes, take a look at the Outputs section. There will be an entry containing the URL of the API Gateway resource you just created. Copy that URL as you'll need it for your tests.

The format of the URL will be something like `https://{id}.execute-api.{region}.amazonaws.com/prod`


## How it works

- Users submit video generation requests through an API endpoint

- The request is processed by a Lambda function that creates a job entry in DynamoDB (sketched below)

- A second Lambda function processes the request using the Amazon Nova Reel model on Amazon Bedrock (also sketched below)

- The generated video is stored in S3

- Users can check the job status through another API endpoint

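The handlers live in the `function_code/` directory of this pattern. The snippet below is an illustrative sketch of the submit flow only, not the exact code shipped in `submit.py`; it assumes the environment variables wired up by the CDK stack (`TABLE_NAME`, `PROCESS_FUNCTION_NAME`) and hands the long-running work to the second function asynchronously so the API can respond within its 29-second timeout.

```python
# Illustrative sketch of the submit flow (not the exact handler in function_code/).
import json
import os
import uuid

import boto3

dynamodb = boto3.resource("dynamodb")
lambda_client = boto3.client("lambda")


def handler(event, context):
    body = json.loads(event.get("body") or "{}")
    prompt = body.get("prompt", "")

    # Record the job in DynamoDB so its status can be queried later.
    job_id = str(uuid.uuid4())
    table = dynamodb.Table(os.environ["TABLE_NAME"])
    table.put_item(Item={"job_id": job_id, "status": "SUBMITTED", "prompt": prompt})

    # Fire-and-forget invocation of the processing Lambda.
    lambda_client.invoke(
        FunctionName=os.environ["PROCESS_FUNCTION_NAME"],
        InvocationType="Event",
        Payload=json.dumps({"job_id": job_id, "prompt": prompt}),
    )

    return {"statusCode": 202, "body": json.dumps({"job_id": job_id})}
```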

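The generation step itself uses Bedrock's asynchronous invocation API for Nova Reel and points the output at the S3 bucket, which matches the `bedrock:StartAsyncInvoke`/`bedrock:GetAsyncInvoke` permissions granted in the CDK stack. Again, this is a hedged sketch rather than the shipped `process.py`; the `modelInput` fields follow the published Nova Reel text-to-video schema, so check the Bedrock documentation if they have changed.

```python
# Illustrative sketch of the processing flow (not the exact handler in function_code/).
import json
import os
import time

import boto3

bedrock_runtime = boto3.client("bedrock-runtime")
dynamodb = boto3.resource("dynamodb")


def handler(event, context):
    table = dynamodb.Table(os.environ["TABLE_NAME"])
    job_id = event["job_id"]

    # Start the asynchronous Nova Reel job; the finished video is written to S3.
    response = bedrock_runtime.start_async_invoke(
        modelId=os.environ["MODEL_ID"],
        modelInput={
            "taskType": "TEXT_VIDEO",
            "textToVideoParams": {"text": event["prompt"]},
            "videoGenerationConfig": {"durationSeconds": 6, "fps": 24, "dimension": "1280x720"},
        },
        outputDataConfig={
            "s3OutputDataConfig": {"s3Uri": f"s3://{os.environ['BUCKET']}/{job_id}/"}
        },
    )
    invocation_arn = response["invocationArn"]

    # Poll until Bedrock reports a terminal state, then persist it for the status API.
    while True:
        job = bedrock_runtime.get_async_invoke(invocationArn=invocation_arn)
        if job["status"] != "InProgress":
            break
        time.sleep(15)

    table.update_item(
        Key={"job_id": job_id},
        UpdateExpression="SET #s = :s",
        ExpressionAttributeNames={"#s": "status"},
        ExpressionAttributeValues={":s": job["status"]},
    )
    return {"job_id": job_id, "status": job["status"]}
```
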
## Testing

We'll make requests to the *reel_gen* endpoint with a prompt of your choice.

Follow the example below, replacing `{your-api-url}` with the API URL you copied in step 8 (it already includes the `/prod` stage).

```bash
curl -X POST \
{your-api-url}/reel_gen \
-H "Content-Type: application/json" \
-d '{"prompt": "Your text description of the video"}'
```
Then use the `job_id` from the response above in the following request to get the latest status of the job.

```bash
curl {your-api-url}/status/{job_id}
```
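
If you'd rather script the status check than re-run curl, a small polling loop like the one below works. It is a hypothetical helper: the `status` field and its values are assumptions about what the status Lambda returns, so adapt them to the payload you actually see.

```python
# Hypothetical polling helper; adjust the field names to your API's actual response.
import json
import time
import urllib.request

API_URL = "https://{id}.execute-api.{region}.amazonaws.com/prod"  # replace with your API URL
JOB_ID = "replace-with-your-job-id"

while True:
    with urllib.request.urlopen(f"{API_URL}/status/{JOB_ID}") as resp:
        job = json.load(resp)
    print(job)
    if job.get("status") not in ("SUBMITTED", "InProgress"):  # assumed status values
        break
    time.sleep(30)
```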

Example Prompts:
1. Noodles falling into a bowl of soup.
2. The camera pans left across a cozy, well-equipped kitchen, with sunlight streaming through large windows and illuminating the gleaming countertops and appliances. A steam-filled pot bubbles on the stovetop, hinting at the culinary creations to come.
3. A teddy bear in a leather jacket, baseball cap, and sunglasses playing guitar in front of a waterfall.


## Review results

Go to Amazon S3 and navigate to the bucket created by the stack; its name will be similar to `apigwlambdabedrockdynamodb-reelvideobucket...`. Inside the bucket you should see `output.mp4`.

Here is an example of a generated reel:
![Generated reel](example/output.mp4)



## Cleanup

1. Run the command below in the `apigw-lambda-bedrock-dynamodb-s3-cdk-python` directory to delete the AWS resources created by this stack.
```bash
cdk destroy
```

## Extra Resources
* [Bedrock API Reference](https://docs.aws.amazon.com/bedrock/latest/APIReference/welcome.html)

----
Copyright 2025 Amazon.com, Inc. or its affiliates. All Rights Reserved.

SPDX-License-Identifier: MIT-0
128 changes: 128 additions & 0 deletions apigw-lambda-bedrock-dynamodb-s3-cdk-python/apigw_lambda_bedrock_dynamodb_s3/apigw_lambda_bedrock_dynamodb_s3_stack.py
@@ -0,0 +1,128 @@
from aws_cdk import (
Stack,
aws_lambda as _lambda,
aws_iam as iam,
Duration,
aws_apigateway as apigw,
aws_s3 as s3,
RemovalPolicy,
CfnOutput,
aws_dynamodb as dynamodb
)
from constructs import Construct

class ApigwLambdaBedrockDynamodbS3Stack(Stack):
def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
super().__init__(scope, construct_id, **kwargs)

# Create DynamoDB table for job tracking
job_table = dynamodb.Table(
self, "JobTrackingTable",
partition_key=dynamodb.Attribute(
name="job_id",
type=dynamodb.AttributeType.STRING
),
time_to_live_attribute="ttl",
removal_policy=RemovalPolicy.DESTROY
)

# Bedrock policy
invoke_model_policy = iam.Policy(
self, "InvokeModelPolicy",
statements=[
iam.PolicyStatement(
actions=[
"bedrock:InvokeModel",
"bedrock:StartAsyncInvoke",
"bedrock:ListAsyncInvokes",
"bedrock:GetAsyncInvoke"
],
resources=[
f"arn:aws:bedrock:{self.region}::foundation-model/amazon.nova-reel-v1:0",
f"arn:aws:bedrock:{self.region}:{self.account}:async-invoke/*",
f"arn:aws:bedrock:{self.region}:{self.account}:foundation-model/amazon.nova-reel-v1:0"
]
)
]
)

# Create S3 bucket
video_bucket = s3.Bucket(
self,
"ReelVideoBucket",
versioned=True,
removal_policy=RemovalPolicy.DESTROY,
auto_delete_objects=True
)

# Define Lambda layers
boto_layer = _lambda.LayerVersion.from_layer_version_arn(
self, "Boto3Layer",
f"arn:aws:lambda:{self.region}:770693421928:layer:Klayers-p311-boto3:19"
)

# Create the processing Lambda function FIRST
process_function = _lambda.Function(
self, "ProcessReelGeneration",
runtime=_lambda.Runtime.PYTHON_3_11,
handler="process.handler",
code=_lambda.Code.from_asset("./function_code"),
layers=[boto_layer],
timeout=Duration.minutes(15),
environment={
"BUCKET": video_bucket.bucket_name,
"MODEL_ID": "amazon.nova-reel-v1:0",
"TABLE_NAME": job_table.table_name
}
)

# Then create the submit function with process function name
submit_function = _lambda.Function(
self, "SubmitReelGeneration",
runtime=_lambda.Runtime.PYTHON_3_11,
handler="submit.handler",
code=_lambda.Code.from_asset("./function_code"),
layers=[boto_layer],
timeout=Duration.seconds(29),
environment={
"BUCKET": video_bucket.bucket_name,
"MODEL_ID": "amazon.nova-reel-v1:0",
"TABLE_NAME": job_table.table_name,
"PROCESS_FUNCTION_NAME": process_function.function_name
}
)

# Grant permissions
video_bucket.grant_read_write(process_function)
video_bucket.grant_write(submit_function)
job_table.grant_read_write_data(submit_function)
job_table.grant_read_write_data(process_function)
invoke_model_policy.attach_to_role(process_function.role)

# Grant permission for submit function to invoke process function
process_function.grant_invoke(submit_function)

# Create API Gateway
api = apigw.RestApi(
self, "ReelGenAPI",
default_cors_preflight_options=apigw.CorsOptions(
allow_origins=['*'],
allow_methods=['POST', 'GET'],
allow_headers=['Content-Type']
)
)

# Add resources and methods
reel_gen = api.root.add_resource("reel_gen")
reel_gen.add_method("POST", apigw.LambdaIntegration(submit_function))

# Add status check endpoint
status = api.root.add_resource("status")
status.add_resource("{jobId}").add_method(
"GET",
apigw.LambdaIntegration(submit_function)
)

# Outputs
CfnOutput(self, "S3-Video-Bucket", value=video_bucket.bucket_name)
CfnOutput(self, "ApiGatewayUrl", value=api.url)
28 changes: 28 additions & 0 deletions apigw-lambda-bedrock-dynamodb-s3-cdk-python/app.py
@@ -0,0 +1,28 @@
#!/usr/bin/env python3
import os

import aws_cdk as cdk

from apigw_lambda_bedrock_dynamodb_s3.apigw_lambda_bedrock_dynamodb_s3_stack import ApigwLambdaBedrockDynamodbS3Stack


app = cdk.App()
ApigwLambdaBedrockDynamodbS3Stack(app, "ApigwLambdaBedrockDynamodbS3Stack",
# If you don't specify 'env', this stack will be environment-agnostic.
# Account/Region-dependent features and context lookups will not work,
# but a single synthesized template can be deployed anywhere.

# Uncomment the next line to specialize this stack for the AWS Account
# and Region that are implied by the current CLI configuration.

#env=cdk.Environment(account=os.getenv('CDK_DEFAULT_ACCOUNT'), region=os.getenv('CDK_DEFAULT_REGION')),

# Uncomment the next line if you know exactly what Account and Region you
# want to deploy the stack to.

#env=cdk.Environment(account='123456789012', region='us-east-1'),

# For more information, see https://docs.aws.amazon.com/cdk/latest/guide/environments.html
)

app.synth()