Building an AWS Lambda Function with Amazon S3 Trigger

In this workflow, the three primary services used are:

  • S3: Stores the files and triggers the function when a file is uploaded.
  • Lambda: The serverless compute service that runs your code in response to the S3 event.
  • CloudWatch: Collects the metrics, invocation status, and execution logs produced by the function.

Visual flow summary:

User --> S3 Bucket --> (Trigger) --> Lambda Function (authorized by IAM) --> CloudWatch Logs & Metrics

Step 1: Create the S3 Bucket: Go to the S3 service in the AWS Management Console and create a bucket. Make sure it is in the same Region where you are going to run the Lambda function.


The bucket name has to be globally unique; keep the rest of the settings at their defaults and choose Create bucket.
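As a side note, "globally unique" is only one of several naming rules S3 enforces. A rough sketch of the main constraints (not exhaustive; the real rules also forbid things like adjacent dots):

```python
import re

def is_valid_bucket_name(name: str) -> bool:
    """Rough check of the main S3 bucket-naming rules."""
    # 3 to 63 characters long.
    if not 3 <= len(name) <= 63:
        return False
    # Lowercase letters, digits, dots, and hyphens only;
    # must begin and end with a letter or digit.
    if not re.fullmatch(r"[a-z0-9][a-z0-9.-]*[a-z0-9]", name):
        return False
    # Must not be formatted like an IP address.
    if re.fullmatch(r"(\d{1,3}\.){3}\d{1,3}", name):
        return False
    return True

print(is_valid_bucket_name("my-lambda-trigger-demo"))  # True
print(is_valid_bucket_name("My_Bucket"))               # False
```

The console rejects invalid names anyway, but knowing the rules up front saves a retry.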


Step 2: Go to the Lambda service and choose Create function. Select Author from scratch, enter a function name, and choose the desired runtime (e.g., Python 3.12).


Set Permissions: Under Additional Settings, choose a custom execution role and create a new role from an existing policy template. Choose the AmazonS3ReadOnlyAccess template so that the function gets the necessary read access to S3, then create the function with the remaining settings at their defaults.
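Under the hood, an execution role is just an IAM role whose trust policy lets the Lambda service assume it. A sketch of that trust document (the structure below is the standard shape for Lambda execution roles):

```python
import json

# Trust policy that allows the Lambda service to assume the execution role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

print(json.dumps(trust_policy, indent=2))
```

The console creates this trust relationship for you; the AmazonS3ReadOnlyAccess template is then attached on top of it as a permissions policy.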


Deploy the Code: Once the function is created, write the code in the code editor and click the Deploy button to save and activate the changes.


Here is the code:

import json
import urllib.parse
import boto3

print('Loading function')

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # print("Received event: " + json.dumps(event, indent=2))

    # Get the object from the event and show its content type
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    try:
        response = s3.get_object(Bucket=bucket, Key=key)
        print("CONTENT TYPE: " + response['ContentType'])
        return response['ContentType']
    except Exception as e:
        print(e)
        print('Error getting object {} from bucket {}. Make sure they exist and your bucket is in the same region as this function.'.format(key, bucket))
        raise e
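The event-parsing part of the handler can be exercised locally, without any AWS access, by feeding it a dictionary shaped like S3's notification payload. S3 URL-encodes object keys in the event (spaces become "+"), which is why the handler uses unquote_plus. The bucket and key names below are placeholders:

```python
import urllib.parse

# Minimal stand-in for an S3 put-event, shaped like the real payload
# (bucket and key names are placeholders for illustration).
sample_event = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "my-lambda-trigger-demo"},
                "object": {"key": "uploads/hello+world.txt"},
            }
        }
    ]
}

# Same extraction the handler performs.
bucket = sample_event["Records"][0]["s3"]["bucket"]["name"]
key = urllib.parse.unquote_plus(
    sample_event["Records"][0]["s3"]["object"]["key"], encoding="utf-8"
)
print(bucket, key)  # my-lambda-trigger-demo uploads/hello world.txt
```

The s3.get_object call still needs real AWS credentials and an existing object, so a full end-to-end test is done in Step 5.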


Step 3:

Create the Trigger: Navigate to the function overview and choose Add trigger. Select S3 as the source, pick the bucket, and set the event type to All object create events. Take note of the caution about possible recursive loops.
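The recursion caution matters if the function ever writes results back into the same bucket it is triggered from: each write would fire the trigger again, endlessly. One simple mitigation is to write outputs under a dedicated prefix and skip events for keys under it (the "processed/" prefix below is an illustrative choice, not something from this walkthrough's code):

```python
# Illustrative guard against recursive invocation: ignore any key under
# the prefix the function itself writes to.
OUTPUT_PREFIX = "processed/"

def should_process(key: str) -> bool:
    """Return True only for objects outside the function's own output prefix."""
    return not key.startswith(OUTPUT_PREFIX)

print(should_process("uploads/report.csv"))    # True  -> process it
print(should_process("processed/report.csv"))  # False -> skip, we wrote it
```

Since the handler in this article only reads objects, recursion cannot occur here, but the guard is worth keeping in mind for read-and-write pipelines.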


Step 4:

Event notifications and resource-based policy statements work together to enable the trigger mechanism between S3 and Lambda:

Event Notifications: When you configure an S3 bucket to trigger a Lambda function, S3 creates an event notification. You can view it in the bucket's Properties tab under Event notifications; it tells the bucket to notify the Lambda function whenever a specific action, such as a file upload, occurs.
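What S3 stores for the bucket is a notification configuration mapping event types to a function ARN. A sketch of its shape, as you would see it via the API (the Region, account ID, and function name in the ARN are placeholders):

```python
# Shape of the notification configuration S3 keeps for the bucket
# (the function ARN below is a placeholder).
notification_config = {
    "LambdaFunctionConfigurations": [
        {
            "LambdaFunctionArn": (
                "arn:aws:lambda:us-east-1:123456789012:function:s3-trigger-demo"
            ),
            # "All object create events" in the console maps to this wildcard.
            "Events": ["s3:ObjectCreated:*"],
        }
    ]
}

print(notification_config["LambdaFunctionConfigurations"][0]["Events"])
```

The console builds this for you when you add the trigger; the same structure can be set programmatically with boto3's put_bucket_notification_configuration.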


Resource-Based Policy Statements: Adding the trigger automatically updates the Lambda function's configuration and permissions. A new resource-based policy statement is added, which explicitly grants the S3 service the necessary permission to invoke that specific Lambda function. This backend configuration ensures that the S3 bucket has the authority to execute the code in response to the event.
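The statement that gets added grants the S3 service principal permission to invoke the function, scoped to the source bucket. A sketch of its typical shape (ARNs and account ID are placeholders; the real statement may also carry a source-account condition):

```python
# Shape of the resource-based policy statement Lambda adds for the trigger
# (ARNs and account ID below are placeholders).
statement = {
    "Effect": "Allow",
    "Principal": {"Service": "s3.amazonaws.com"},
    "Action": "lambda:InvokeFunction",
    "Resource": "arn:aws:lambda:us-east-1:123456789012:function:s3-trigger-demo",
    "Condition": {
        # Restricts the grant to events coming from this one bucket.
        "ArnLike": {"AWS:SourceArn": "arn:aws:s3:::my-lambda-trigger-demo"}
    },
}

print(statement["Action"])
```

You can inspect the live version of this statement under the function's Configuration > Permissions tab.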



Step 5:

Test the Function: Upload a file to the S3 bucket. Once the upload is complete, open the Monitor tab of your Lambda function and click View CloudWatch logs to verify the execution output.
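The button opens the function's CloudWatch log group, which follows a fixed naming convention, so you can also find the logs directly (the function name below is a placeholder):

```python
def log_group_for(function_name: str) -> str:
    """Lambda writes each function's logs to a CloudWatch group
    named /aws/lambda/<function-name>."""
    return f"/aws/lambda/{function_name}"

print(log_group_for("s3-trigger-demo"))  # /aws/lambda/s3-trigger-demo
```

Inside that group, each invocation's "CONTENT TYPE: ..." print from the handler appears in the most recent log stream.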


Conclusion:

In this serverless architecture, an S3 bucket acts as the trigger source when a file is uploaded. Event notifications automatically signal the Lambda function to execute the specified code in response. An IAM execution role provides the necessary read-only permissions for the function to safely access S3 objects. This event-driven flow eliminates the need to manage servers, as Lambda scales automatically to process the incoming data.


Thank you!
