Merge pull request #1725 from abha1376/abha-s3-event-sfn-lambda-ddb-sns

Showing 5 changed files with 370 additions and 0 deletions.

---

**README.md**
# Amazon S3 to AWS Step Functions via Amazon EventBridge, with notification back via Amazon SNS

The SAM template deploys an Amazon S3 bucket that publishes events to Amazon EventBridge, and sets up an AWS Step Functions workflow to show how to consume these events via an EventBridge rule. It deploys the IAM resources required to run the application, along with a Lambda function, an SNS topic and a DynamoDB table. EventBridge consumes events directly from S3 buckets when the bucket's NotificationConfiguration enables EventBridge, as shown in the template.

The template contains a sample Step Functions workflow that reads the uploaded file and passes the data to a Lambda function. The Lambda function writes the data to DynamoDB and returns the task token. Using the callback pattern, the workflow pauses until the token is returned and then sends a success message via SNS. Replace this workflow with your own state machine.

Important: this application uses various AWS services and there are costs associated with these services after the Free Tier usage - please see the [AWS Pricing page](https://aws.amazon.com/pricing/) for details. You are responsible for any AWS costs incurred. No warranty is implied in this example.
## Requirements

- [Create an AWS account](https://portal.aws.amazon.com/gp/aws/developer/registration/index.html) if you do not already have one and log in. The IAM user that you use must have sufficient permissions to make necessary AWS service calls and manage AWS resources.
- [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2.html) installed and configured
- [Git installed](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git)
- [AWS Serverless Application Model](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install.html) (AWS SAM) installed
## Deployment Instructions

1. Create a new directory, navigate to that directory in a terminal and clone the GitHub repository:
    ```
    git clone https://github.com/aws-samples/serverless-patterns
    ```
1. Change directory to the pattern directory:
    ```
    cd s3-event-sfn-lambda-ddb-sns
    ```
1. From the command line, use AWS SAM to deploy the AWS resources for the pattern as specified in the template.yaml file:
    ```
    sam deploy --guided
    ```
1. During the prompts:

    - Enter a stack name
    - Enter the desired AWS Region
    - Allow SAM CLI to create IAM roles with the required permissions.

    Once you have run `sam deploy --guided` mode once and saved arguments to a configuration file (samconfig.toml), you can use `sam deploy` in future to use these defaults.

1. Note the outputs from the SAM deployment process. These contain the resource names and/or ARNs which are used for testing.
## How it works

This pattern sets up the following resources:

- An Amazon S3 bucket that is configured to send any events regarding its content, such as the `Object Created` event that is emitted when an object is uploaded to the bucket, to Amazon EventBridge.
- An Amazon EventBridge rule that triggers a Step Functions workflow when a new `Object Created` event is emitted by the S3 bucket. The workflow receives an [EventBridge event message](https://docs.aws.amazon.com/AmazonS3/latest/userguide/ev-events.html) as its input, which contains information such as the name of the S3 bucket and the key of the uploaded CSV file.
- A sample AWS Step Functions workflow that calls a [Lambda function](https://docs.aws.amazon.com/lambda/latest/dg/welcome.html) to process objects uploaded to the S3 bucket. The workflow possesses an IAM role that authorizes it to read from the S3 bucket and to call Lambda and SNS.
- A DynamoDB table and an SNS topic.
- A Step Functions task that uses the [wait for callback pattern](https://docs.aws.amazon.com/step-functions/latest/dg/connect-to-resource.html#connect-wait-token), pausing the workflow until the Lambda function returns the task token.
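The state machine forwards the whole EventBridge event to the Lambda function under a `data` key and injects the task token as `token` (see the `Payload` mapping in the template). A minimal sketch of extracting the bucket and key from such a payload - the event body here is an abbreviated, hypothetical example of the documented `Object Created` event shape:

```python
# Abbreviated, hypothetical payload as the workflow passes it to Lambda:
# the EventBridge event under "data", plus the task token injected by
# the waitForTaskToken service integration.
payload = {
    "data": {
        "source": "aws.s3",
        "detail-type": "Object Created",
        "detail": {
            "bucket": {"name": "my-ingestion-bucket"},
            "object": {"key": "friends.csv"},
        },
    },
    "token": "example-task-token",
}

# The Lambda function reads the bucket and key from the nested detail.
bucket_name = payload["data"]["detail"]["bucket"]["name"]
object_key = payload["data"]["detail"]["object"]["key"]
print(bucket_name, object_key)
```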
## Testing

1. In the Outputs tab of the AWS CloudFormation console, note the `IngestionBucket`, `UpdateTableStateMachine`, `TableName`, `LambdaFunction` and `MySnsTopicName` outputs. You can use the provided friends.csv for testing.
2. Add subscribers to the SNS topic `MySnsTopic`.
3. In the Amazon S3 console, upload the friends.csv file, containing a list of items, to the `IngestionBucket`.
4. In the AWS Step Functions console, find the new workflow execution for the `UpdateTableStateMachine` workflow. The workflow sends the data to Lambda and publishes a success message once the callback pattern completes.
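The same test can be scripted. The sketch below assumes AWS credentials and uses placeholder names in place of the stack's actual output values; `expected_items` mirrors the Lambda function's CSV-to-item mapping (including its `N`/`S` attribute names) so the table contents can be checked after the workflow runs:

```python
def expected_items(csv_text):
    """Mirror the Lambda's mapping of CSV rows to DynamoDB items."""
    items = []
    for line in csv_text.splitlines():
        if not line.strip():
            continue  # ignore blank lines, e.g. a trailing newline
        transaction_id, name, subject = line.split(",")
        items.append({"transaction_id": transaction_id, "N": name, "S": subject})
    return items


def run_smoke_test(bucket, table_name):
    """Upload the sample CSV and read back the table (requires AWS credentials
    and the real output values from the deployed stack)."""
    import boto3  # only needed when running against AWS

    boto3.client("s3").upload_file("friends.csv", bucket, "friends.csv")
    # Give the workflow a moment to run, then compare the scan result
    # (order-insensitively) with expected_items(open("friends.csv").read()).
    return boto3.resource("dynamodb").Table(table_name).scan()["Items"]
```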
## Cleanup

Delete the stack:

```bash
sam delete
```

---

Copyright 2021 Amazon.com, Inc. or its affiliates. All Rights Reserved.

SPDX-License-Identifier: MIT-0
---

**example-pattern.json**
{
  "title": "Amazon S3 to AWS Step Functions via Amazon EventBridge, with notification back via Amazon SNS",
  "description": "The SAM template deploys an Amazon S3 bucket that publishes events to Amazon EventBridge, and sets up an AWS Step Functions workflow to show how to consume these events via an EventBridge rule. It deploys the IAM resources required to run the application. It also deploys a Lambda function, an SNS topic and a DynamoDB table. EventBridge consumes events directly from S3 buckets when the NotificationConfiguration is enabled, as shown in the template.",
  "language": "Python",
  "level": "200",
  "framework": "SAM",
  "introBox": {
    "headline": "How it works",
    "text": [
      "This pattern sets up the following resources:",
      "An Amazon S3 bucket that is configured to send any events regarding its content, such as the `Object Created` event that is emitted when an object is uploaded to the bucket, to Amazon EventBridge.",
      "An Amazon EventBridge rule that triggers a Step Functions workflow when a new `Object Created` event is emitted by the S3 bucket. The workflow receives an [EventBridge event message](https://docs.aws.amazon.com/AmazonS3/latest/userguide/ev-events.html) as its input, which contains information such as the name of the S3 bucket and the key of the uploaded CSV file.",
      "A sample AWS Step Functions workflow that calls a [Lambda function](https://docs.aws.amazon.com/lambda/latest/dg/welcome.html) to process objects uploaded to the S3 bucket. The workflow possesses an IAM role that authorizes it to read from the S3 bucket and to call Lambda and SNS.",
      "A DynamoDB table and an SNS topic.",
      "A Step Functions task that uses the [wait for callback pattern](https://docs.aws.amazon.com/step-functions/latest/dg/connect-to-resource.html#connect-wait-token)."
    ]
  },
  "gitHub": {
    "template": {
      "repoURL": "https://github.com/aws-samples/serverless-patterns/tree/main/s3-event-sfn-lambda-ddb-sns",
      "templateURL": "serverless-patterns/s3-event-sfn-lambda-ddb-sns",
      "projectFolder": "s3-event-sfn-lambda-ddb-sns",
      "templateFile": "template.yaml"
    }
  },
  "deploy": {
    "text": [
      "sam deploy"
    ]
  },
  "testing": {
    "text": [
      "See the GitHub repo for detailed testing instructions."
    ]
  },
  "cleanup": {
    "text": [
      "Delete the stack: <code>sam delete</code>."
    ]
  },
  "authors": [
    {
      "name": "Abha Tripathi",
      "image": "https://avatars.githubusercontent.com/u/36122588?s=400&u=209552eb3fa55e5c4176e54a24a7737ce899083b&v=4",
      "bio": "Cloud Engineer @ AWS",
      "linkedin": "www.linkedin.com/in/abha-tripathi-344a0a146"
    },
    {
      "name": "Sushir V R",
      "image": "https://avatars.githubusercontent.com/u/42473846?s=400&u=4c7b19858e4ec3d9ae24e379c35848219ee57a86&v=4",
      "bio": "Cloud Engineer @ AWS",
      "linkedin": "www.linkedin.com/in/sushirvr"
    }
  ]
}
---

**friends.csv**
1,John,Maths
2,Mike,Arts
3,Shobha,Science
4,Noel,History
5,Rocky,Physics
---

**src/lambda-dynamoDb.py**
import boto3
import os
import json

s3_client = boto3.client("s3")
dynamodb = boto3.resource("dynamodb")
sfn = boto3.client("stepfunctions")
ddbtable = os.environ["TableName"]
print("Table name:", ddbtable)
table = dynamodb.Table(ddbtable)


def lambda_handler(event, context):
    # Extract the bucket name and object key from the EventBridge event,
    # which the state machine passes in under the "data" key.
    bucket_name = event["data"]["detail"]["bucket"]["name"]
    s3_file_name = event["data"]["detail"]["object"]["key"]
    resp = s3_client.get_object(Bucket=bucket_name, Key=s3_file_name)
    data = resp["Body"].read().decode("utf-8")

    # The code below is tailored to the provided example CSV
    # (transaction_id,name,subject). Customize it for your own data.
    students = data.split("\n")
    for friend in students:
        if not friend.strip():
            # Skip blank lines, such as the trailing newline at end of file.
            continue
        print(friend)
        friend_data = friend.split(",")
        # Add the row to DynamoDB. "N" and "S" here are plain attribute
        # names in the item, not DynamoDB type descriptors.
        put = table.put_item(
            Item={
                "transaction_id": friend_data[0],
                "N": friend_data[1],
                "S": friend_data[2],
            }
        )
        print("Response:", put)

    # Return the task token so the waiting state machine can resume.
    output_str = json.dumps({"outcome": "success"})
    sfn.send_task_success(
        taskToken=event["token"],
        output=output_str,
    )

    return {
        "statusCode": 200,
        "body": "uploaded successfully",
    }
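If processing throws, the handler above lets the exception propagate, and the state machine's Catch and timeout take over. An alternative - a hypothetical sketch, not part of this pattern - is to report the error back explicitly with `send_task_failure`, so the workflow resumes immediately instead of waiting out the 3600-second task timeout. The `parse_rows` helper and the `csv_data` field are stand-ins for the real S3 read:

```python
import json


def parse_rows(data):
    """Split CSV text into (transaction_id, name, subject) tuples, skipping blank lines."""
    rows = []
    for line in data.split("\n"):
        if not line.strip():
            continue
        transaction_id, name, subject = line.split(",")
        rows.append((transaction_id, name, subject))
    return rows


def handler_with_failure_report(event, context, sfn_client):
    """Hypothetical variant that reports errors back to Step Functions."""
    try:
        rows = parse_rows(event["csv_data"])  # placeholder for the real S3 read
        # ... table.put_item calls would go here ...
        sfn_client.send_task_success(
            taskToken=event["token"],
            output=json.dumps({"outcome": "success", "rows": len(rows)}),
        )
    except Exception as exc:
        # Resume the workflow immediately with a failure instead of
        # leaving it paused until the task timeout fires.
        sfn_client.send_task_failure(
            taskToken=event["token"],
            error="CsvProcessingError",
            cause=str(exc),
        )
```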
---

**template.yaml**
AWSTemplateFormatVersion: "2010-09-09"
Transform: AWS::Serverless-2016-10-31
Description: Serverless patterns - Object upload to S3 triggers a state machine which uses a Lambda function to put records into DynamoDB, after which a notification is sent to the user via SNS.

Resources:
  #=====
  # Creates an Amazon S3 bucket to which objects are uploaded to trigger the workflow.
  #=====
  IngestionBucket:
    Type: "AWS::S3::Bucket"
    Properties:
      # Amazon EventBridge receives notifications for all events in the bucket.
      NotificationConfiguration:
        EventBridgeConfiguration:
          EventBridgeEnabled: true

  #=====
  # Creates a Step Functions workflow.
  #=====
  UpdateTableStateMachine:
    Type: AWS::Serverless::StateMachine
    Properties:
      # State machine definition
      Definition:
        StartAt: LambdaInvoke
        States:
          LambdaInvoke:
            Type: Task
            Resource: arn:aws:states:::lambda:invoke.waitForTaskToken
            TimeoutSeconds: 3600
            Parameters:
              FunctionName: MyLambdaFunction
              Payload:
                data.$: "$"
                token.$: "$$.Task.Token"
            Retry:
              - ErrorEquals:
                  - Lambda.ServiceException
                  - Lambda.AWSLambdaException
                  - Lambda.SdkClientException
                  - Lambda.TooManyRequestsException
                IntervalSeconds: 1
                MaxAttempts: 3
                BackoffRate: 2
            Catch:
              - ErrorEquals:
                  - Lambda.ServiceException
                  - Lambda.UnknownError
                  - Lambda.AWSLambdaException
                  - Lambda.SdkClientException
                  - Lambda.TooManyRequestsException
                Next: PrepareSNSMessage
                ResultPath: "$.error"
            Next: PrepareSNSMessage
          PrepareSNSMessage:
            Type: Pass
            Result:
              TopicArn:
                Fn::Sub: "arn:aws:sns:${AWS::Region}:${AWS::AccountId}:my-sns-topic"
              Message.$: "$"
            ResultPath: "$.snsInput"
            Next: SNSPublish
          SNSPublish:
            Type: Task
            Resource: arn:aws:states:::sns:publish
            Parameters:
              Message: "Successfully completed the workflow!"
              TopicArn.$: "$.snsInput.TopicArn"
            End: true

      Role: !GetAtt StateMachineExecutionRole.Arn

      # The Step Functions workflow is triggered each time an object is created in the S3 bucket.
      Events:
        StateChange:
          Type: EventBridgeRule
          Properties:
            EventBusName: default
            Pattern:
              source:
                - aws.s3
              detail-type:
                - Object Created
              detail:
                bucket:
                  name:
                    - !Ref IngestionBucket

  #=====
  # DynamoDB Table
  #=====
  DynamoDBTable:
    Type: AWS::DynamoDB::Table
    Properties:
      BillingMode: PAY_PER_REQUEST
      AttributeDefinitions:
        - AttributeName: transaction_id
          AttributeType: "S"
      KeySchema:
        - AttributeName: transaction_id
          KeyType: HASH

  #=====
  # Lambda Function
  #=====
  LambdaFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src/
      Runtime: python3.11
      Environment:
        Variables:
          TableName: !Ref DynamoDBTable
      FunctionName: MyLambdaFunction
      Handler: lambda-dynamoDb.lambda_handler
      MemorySize: 128
      Policies:
        - CloudWatchLogsFullAccess
        - AWSStepFunctionsFullAccess
        - AmazonS3FullAccess
        - DynamoDBCrudPolicy:
            TableName: !Ref DynamoDBTable
      Timeout: 30

  #=====
  # SNS Topic
  #=====
  MySnsTopic:
    Type: AWS::SNS::Topic
    Properties:
      TopicName: my-sns-topic

  #=====
  # IAM Roles
  #=====
  StateMachineExecutionRole:
    Type: "AWS::IAM::Role"
    Properties:
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: "Allow"
            Principal:
              Service:
                - !Sub states.${AWS::Region}.amazonaws.com
            Action: "sts:AssumeRole"
      Path: "/"
      Policies:
        - PolicyName: Access
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Effect: Allow
                Action:
                  - "lambda:InvokeFunction"
                  - "sns:Publish"
                Resource: "*"

Outputs:
  IngestionBucket:
    Description: "S3 bucket name"
    Value: !Ref IngestionBucket
  UpdateTableStateMachine:
    Description: "Example state machine"
    Value: !Ref UpdateTableStateMachine
  LambdaFunction:
    Value: !Ref LambdaFunction
    Description: Lambda function name
  TableName:
    Value: !Ref DynamoDBTable
    Description: DynamoDB table name
  MySnsTopicName:
    Description: SNS topic name
    Value: !GetAtt MySnsTopic.TopicName
    Export:
      Name: MySNSTopicName