Building an AWS Lambda Function with Amazon S3 Trigger
In this specific workflow, the three primary services used are Amazon S3 (the event source), AWS Lambda (the compute), and AWS IAM (the permissions), with Amazon CloudWatch capturing logs and metrics.
Visual flow summary:
User --> S3 Bucket --> (Trigger) --> Lambda Function (authorized by IAM) --> CloudWatch Logs & Metrics
Step 1: Create the S3 Bucket: Go to the S3 service in the AWS Management Console and create a bucket. Make sure it is in the same region where you are going to run the Lambda function.
The bucket name must be globally unique; leave the rest of the settings at their defaults and choose Create bucket.
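If you prefer to create the bucket programmatically, here is a minimal boto3 sketch; the bucket name and region are placeholders for illustration, not values from this walkthrough:
import boto3

region = 'us-east-1'  # assumption: use the same region as your Lambda function
s3 = boto3.client('s3', region_name=region)

# In us-east-1 the LocationConstraint must be omitted; in other regions it is required.
if region == 'us-east-1':
    s3.create_bucket(Bucket='my-lambda-trigger-bucket-example')
else:
    s3.create_bucket(
        Bucket='my-lambda-trigger-bucket-example',
        CreateBucketConfiguration={'LocationConstraint': region},
    )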
Step 2: Create the Lambda Function: Go to the Lambda service and choose Create function. Select Author from scratch, give the function a name, and choose the desired runtime (e.g., Python 3.12).
Set Permissions: In the additional settings, choose to create a new execution role from an existing policy template and select the AmazonS3ReadOnlyAccess template so the function is granted the necessary access. Keep the remaining settings at their defaults and create the function.
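For reference, a similar execution role can be created with boto3. This is a hedged sketch rather than the exact role the console template generates: the role name is a placeholder, and the console may attach a scoped-down inline policy instead of the managed policies shown here.
import json
import boto3

iam = boto3.client('iam')

# Trust policy that lets the Lambda service assume the role
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

iam.create_role(
    RoleName='s3-trigger-lambda-role-example',  # placeholder role name
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# Read-only access to S3 objects plus basic CloudWatch Logs permissions
iam.attach_role_policy(
    RoleName='s3-trigger-lambda-role-example',
    PolicyArn='arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess',
)
iam.attach_role_policy(
    RoleName='s3-trigger-lambda-role-example',
    PolicyArn='arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole',
)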
Deploy the Code: Once the function is created, write the code in the code editor and click the Deploy button to save and activate the changes.
Here is the code:
import json
import urllib.parse
import boto3

print('Loading function')

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # print("Received event: " + json.dumps(event, indent=2))
    # Get the object from the event and show its content type
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    try:
        response = s3.get_object(Bucket=bucket, Key=key)
        print("CONTENT TYPE: " + response['ContentType'])
        return response['ContentType']
    except Exception as e:
        print(e)
        print('Error getting object {} from bucket {}. Make sure they exist and your bucket is in the same region as this function.'.format(key, bucket))
        raise e
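To make the event structure concrete, here is a hedged local sketch of the S3 record the handler reads. Only the fields the handler actually uses are shown; real notifications carry many more, and the bucket and key names are placeholders:
# Minimal fake S3 "object created" event containing only the fields the handler uses
sample_event = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "my-lambda-trigger-bucket-example"},
                "object": {"key": "uploads/hello%20world.txt"},  # URL-encoded, as S3 delivers it
            }
        }
    ]
}

# Calling the handler locally would hit the real bucket with s3.get_object,
# so it is only meaningful with valid AWS credentials and an existing object:
# lambda_handler(sample_event, None)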
Step 3:
Create the Trigger: Navigate to the function overview and choose Add trigger. Select S3 as the source, pick the bucket you created, and set the event type to All object create events. Take note of the warning about possible recursive invocation loops (for example, a function that writes back to the same bucket that triggers it).
Step 4:
Event notifications and resource-based policy statements work together to enable the trigger mechanism between S3 and Lambda:
Event Notifications: When you configure an S3 bucket to trigger a Lambda function, S3 creates an event notification. You can view it in the S3 bucket's Properties tab under Event notifications; it tells the bucket to notify the Lambda function whenever a specific action, such as a file upload, occurs.
Resource-Based Policy Statements: Adding the trigger automatically updates the Lambda function's configuration and permissions. A new resource-based policy statement is added, which explicitly grants the S3 service the necessary permission to invoke that specific Lambda function. This backend configuration ensures that the S3 bucket has the authority to execute the code in response to the event.
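Adding the trigger in the console performs both of these steps automatically, but the equivalent API calls make the relationship clearer. The following is a hedged boto3 sketch; the account ID, region, bucket name, and function name are placeholders:
import boto3

lambda_client = boto3.client('lambda')
s3 = boto3.client('s3')

bucket_name = 'my-lambda-trigger-bucket-example'  # placeholder
function_arn = 'arn:aws:lambda:us-east-1:123456789012:function:s3-trigger-demo'  # placeholder

# 1) Resource-based policy statement: allow the S3 service to invoke this function
lambda_client.add_permission(
    FunctionName=function_arn,
    StatementId='AllowS3Invoke',
    Action='lambda:InvokeFunction',
    Principal='s3.amazonaws.com',
    SourceArn=f'arn:aws:s3:::{bucket_name}',
)

# 2) Event notification: tell the bucket to notify Lambda on object creation
s3.put_bucket_notification_configuration(
    Bucket=bucket_name,
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [
            {
                'LambdaFunctionArn': function_arn,
                'Events': ['s3:ObjectCreated:*'],
            }
        ]
    },
)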
Step 5:
Test the Function: Upload a file to the S3 bucket. Once the upload is complete, navigate to the Monitor tab in your Lambda function and click View logs in CloudWatch to verify the execution output.
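You can also exercise the trigger from a script. The sketch below uploads a test object and then reads the function's most recent log events; the bucket name, function name, and test file are illustrative, and the log group follows the usual /aws/lambda/<function-name> convention:
import time
import boto3

s3 = boto3.client('s3')
logs = boto3.client('logs')

bucket_name = 'my-lambda-trigger-bucket-example'  # placeholder
function_name = 's3-trigger-demo'                 # placeholder

# Upload a test object; this should fire the S3 trigger
s3.put_object(Bucket=bucket_name, Key='test-upload.txt', Body=b'hello from the trigger test')

# Give the function a moment to run, then pull its recent CloudWatch log events
time.sleep(10)
response = logs.filter_log_events(logGroupName=f'/aws/lambda/{function_name}', limit=20)
for event in response['events']:
    print(event['message'], end='')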
Conclusion:
In this serverless architecture, an S3 bucket acts as the trigger source when a file is uploaded. Event notifications automatically signal the Lambda function to execute the specified code in response. An IAM execution role provides the necessary read-only permissions for the function to safely access S3 objects. This event-driven flow eliminates the need to manage servers, as Lambda scales automatically to process the incoming data.
Thank you !!