AWS: Lambda Security

Business Scenario:

The head of the IT department wants a solution that helps ensure that the IT cloud infrastructure meets strict data security rules. The company requires that no company data is exposed to the public internet.

Objectives

  • Determine how to configure an AWS Lambda function to access private resources in a VPC.
  • Identify the benefits and features of AWS Secrets Manager.
  • Identify the different uses of VPC endpoints.
  • Create a Lambda function that connects to private subnets in a VPC.
  • Configure the Lambda function to retrieve credentials from Secrets Manager.
  • Create a gateway VPC endpoint to access an S3 bucket.

Let's start with Lambda. You can configure your Lambda functions to connect to private subnets in a virtual private cloud, which we call a VPC. After the Lambda function is VPC-enabled, it can securely access private resources, such as an Amazon RDS database.

For the database credentials, you can use AWS Secrets Manager. Secrets Manager protects secrets, such as passwords, and lets you rotate, manage, and retrieve them throughout their lifecycle. The Lambda function can quickly retrieve the credentials with API calls to Secrets Manager.

You can use a VPC endpoint, which provides a connection between a VPC and supported services, such as AWS Secrets Manager and Amazon S3, without requiring an internet gateway. Calling a service through a VPC endpoint keeps all the request traffic on the AWS network. There are different types of VPC endpoints. To access Secrets Manager, for example, you can use an interface VPC endpoint. To access your S3 buckets, you can create a gateway VPC endpoint, which provides a private connection between the VPC and the buckets. The traffic between your VPC and Amazon S3 does not leave the AWS network.
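For example, a VPC-enabled function can fetch the stored database credentials with a short boto3 call. The sketch below is illustrative: the helper takes the Secrets Manager client as an argument, and the secret ID and the key names inside the secret (`username`, `password`, and so on) depend on how the secret was actually stored.

```python
import json

def get_db_credentials(sm_client, secret_id):
    """Retrieve a secret from Secrets Manager and parse its JSON key-value pairs."""
    response = sm_client.get_secret_value(SecretId=secret_id)
    return json.loads(response["SecretString"])

# With a real client (traffic stays private via the interface VPC endpoint):
#   import boto3, os
#   creds = get_db_credentials(boto3.client("secretsmanager"), os.environ["secret_arn"])
```

Passing the client in, rather than creating it inside the helper, keeps the parsing logic easy to test without AWS access.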

The purpose of using VPC endpoints in this solution is to let the Lambda function reach supported AWS services, such as Secrets Manager and Amazon S3, without going through the public internet. Private resources inside the VPC, such as the Amazon RDS database, are reached directly through the function's VPC-attached subnets; the VPC endpoints carry the traffic to services that live outside the VPC while keeping it on the AWS network.
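The console steps later in this article VPC-enable the function, raise its timeout, and set the secret_arn environment variable by hand. The same configuration can be sketched with boto3's `update_function_configuration`; this is an illustrative helper, and every identifier passed to it (subnet IDs, security group IDs, the secret ARN) is a placeholder:

```python
def vpc_config(subnet_ids, security_group_ids):
    """Build the VpcConfig structure expected by update_function_configuration."""
    return {"SubnetIds": list(subnet_ids), "SecurityGroupIds": list(security_group_ids)}

def enable_vpc(lambda_client, function_name, subnet_ids, security_group_ids, secret_arn):
    """VPC-enable a function and point it at the Secrets Manager secret."""
    return lambda_client.update_function_configuration(
        FunctionName=function_name,
        VpcConfig=vpc_config(subnet_ids, security_group_ids),
        Timeout=180,  # 3 minutes, matching the lab's General configuration change
        Environment={"Variables": {"secret_arn": secret_arn}},
    )

# With a real client:
#   import boto3
#   enable_vpc(boto3.client("lambda"), "labFunction",
#              ["subnet-aaaa1111", "subnet-bbbb2222"], ["sg-cccc3333"], "arn:aws:...")
```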

References:

- https://docs.aws.amazon.com/lambda/latest/operatorguide/networking-vpc.html
- https://docs.aws.amazon.com/lambda/latest/dg/configuration-vpc-endpoints.html

The different types of VPC endpoints are:

- Gateway endpoint: A gateway endpoint is a target for a route in your route table, used for traffic destined to Amazon S3 or DynamoDB. There is no charge for using gateway endpoints.

- Interface endpoint: An interface endpoint is an elastic network interface with a private IP address from the IP address range of your subnet. It serves as an entry point for traffic destined to a service that is owned by AWS or owned by an AWS customer or partner. You are billed for hourly usage and data processing charges.

- Gateway Load Balancer endpoint: A Gateway Load Balancer endpoint is an elastic network interface with a private IP address from the IP address range of your subnet. It serves as an entry point to intercept traffic, and route it to a network or security service that you've configured using a Gateway Load Balancer.
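A gateway endpoint for S3, like the one created in the steps below, can also be provisioned programmatically with the EC2 API. This is a minimal sketch; the VPC ID and route table IDs are placeholders, and gateway endpoints are attached to route tables rather than subnets:

```python
def s3_service_name(region):
    """Gateway endpoints for S3 use a region-qualified service name."""
    return f"com.amazonaws.{region}.s3"

def create_s3_gateway_endpoint(ec2_client, vpc_id, route_table_ids, region="us-east-1"):
    """Create a gateway VPC endpoint so S3 traffic stays on the AWS network."""
    return ec2_client.create_vpc_endpoint(
        VpcEndpointType="Gateway",
        VpcId=vpc_id,
        ServiceName=s3_service_name(region),
        RouteTableIds=list(route_table_ids),  # routes to S3 are added to these tables
    )

# With a real client:
#   import boto3
#   create_s3_gateway_endpoint(boto3.client("ec2"), "vpc-0abc...", ["rtb-0def..."])
```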

References: https://docs.aws.amazon.com/whitepapers/latest/ec2-networking-for-telecom/aws-privatelink-and-service-endpoint.html

Steps:

  1. In the top navigation bar search box, type: lambda. In the search results, under Services, click Lambda. In the left navigation pane, click Functions. In the Functions section, click Create function. For Create function, choose Author from scratch. For Function name, type: labFunction. For Runtime, on the dropdown menu, choose Python 3.10. Scroll down to Permissions.
  2. Under Permissions, click to expand Change default execution role. For Execution role, choose Use an existing role. For Existing role, choose lambda_security_role. Click to expand Advanced settings. Scroll down to Enable VPC.
  3. To connect the Lambda function to your VPC, choose the check box to select Enable VPC. For VPC, choose the VPC with Name: LabVPC. Scroll down to Subnets. For Subnets, in the search box, type: lambda. Choose the check boxes to select the two subnet names that contain lambda_subnet. - You are selecting lambda_subnetSubnet1 and lambda_subnetSubnet2.
  4. For Security groups, choose the check box to select the default VPC security group. Click Create function. - The function might take 5–10 minutes to be created. In the success alert, review the message. - You might need to wait for the success alert to appear. To add a layer to your Lambda function, in the Function overview section, click Layers.
  5. In the Layers section, click Add a layer. For Layer source, choose Custom layers. For Custom layers, choose the layer name that starts with labfunctionlayer. For Version, choose a version. - Only one version should be available in the list. The available version number might not be 1, depending on the number of times you created the Lambda function with this custom layer. Any version number will work. Click Add.
  6. In the Function overview section, next to Layers, review the number (1). - One layer is added. On the Code tab, click Upload from to expand the dropdown menu. Choose .zip file. In the pop-up box, click Upload, and then choose the lambda_security_code.zip file that you downloaded at the beginning of the lab. Click Save.
  7. In the success alert, review the message. Click the Code tab. In the Code source section, in left Environment window, open (double-click) the lambda_function.py file. To view the lambda_function code window, on the Code source navigation bar, click the expand icon. In the lambda_function code, review lines 30 and 31. - The Lambda function retrieves the database credentials from AWS Secrets Manager. - The value of the secret_name is read from the environment variable with the key, secret_arn.
  8. In the top navigation bar search box, type: secrets. In the search results, under Services, click Secrets Manager. In the Secrets section, click the secret name that starts with LabDatabaseSecret. In the Secret details section, under Secret ARN, click the copy icon to copy the provided ARN, and then paste it in the text editor of your choice on your device. - You will use this ARN in later steps. Scroll down to Secret value. For Secret value, click Retrieve secret value.
  9. In the Secret value section, review the key-value pairs of secrets stored in Secrets Manager. - Secrets Manager stores the database credentials, such as the database login username and password, database name, and database engine. Scroll down to Sample code. In the Sample code section, review the provided code samples. - The code samples are written in different programming languages to retrieve the secret in your application.
  10. Navigate to the labFunction page on the AWS Lambda console. - Remember, on the top navigation bar, you can use the Services search box (or click Services) to navigate to a different service console. Click the Configuration tab. On the Configuration tab, click General configuration. Click Edit. For Timeout, in the first (min) text box, type: 3. - This increases the timeout value to 3 minutes. Click Save. In the success alert, review the message. Click the Configuration tab. Click Environment variables. Click Edit. Click Add environment variable.
  11. For Key, type: secret_arn. For Value, paste the Secrets Manager ARN that you copied in an earlier step. Click Save. In the success alert, review the message. Click the Code tab. On the Code source navigation bar, click Test to expand the dropdown menu. Choose Configure test event.
  12. In the pop-up box, for Test event action, choose Create new event. For Event name, type a name that you like, such as testEvent. For Event sharing settings, keep the default setting of Private. For Template, choose hello-world. Scroll down to the bottom of the page, and then click Save.
  13. After the test event is successfully saved, click Test. On the Execution results tab, review Status: Succeeded. Under Function Logs, review the logs. - Database credentials were successfully retrieved from AWS Secrets Manager. To return to the code, click the lambda_function tab. To view the code, click the expand icon.
  14. In the lambda_function code, review lines 53–69. - This code block tests the Amazon Relational Database Service (Amazon RDS) connection with your Lambda function, using the retrieved database username and password. - The Amazon RDS database "talentpool" includes the first name, last name, occupation, company, date of birth, and country of each person. Review lines 73–78. - This code block defines a custom query, which finds all the records in the database with Toxicologist as the occupation.
  15. To uncomment the code block of lines 53–69, select (highlight) lines 53–69. On the Code source navigation bar, click Edit to expand the dropdown menu, and then choose Toggle Comment. - Be sure to keep the indentations in the Python code blocks. To save the updated function, click Deploy. In the success alert, review the message. To test the Amazon RDS connection, click Test. - The test might take 1–2 minutes because it has to load the data.
  16. Navigate to the Amazon VPC console. - You can type "vpc" in the top navigation bar search box. In the left navigation pane, click Endpoints. In the Endpoints section, click Create endpoint. In the Endpoint settings section, for Name tag, type a name that you like. For Service category, choose AWS services. To search for Amazon S3 services, in the Services section search box, type: s3. Choose Service Name: com.amazonaws.us-east-1.s3.
  17. Choose the service name that has a Type of Gateway. For VPC, choose the VPC name that includes (LabVPC). Scroll down to Route tables. In the Route tables section, choose the two check boxes to select the route tables whose names include lambda_subnet. - You are selecting Lab/LabVPC/lambda_subnetSubnet1 and Lab/LabVPC/lambda_subnetSubnet2. Scroll down to the bottom of the page, and then click Create endpoint.
  18. Navigate to the Amazon S3 console. In the Buckets section, click the bucket name that ends with -practice. Above the Objects tab, select (highlight) and copy the S3 bucket name, and then paste it in your text editor. - You will use this bucket name in a later step. On the Objects tab, review to ensure that the S3 bucket is currently empty.
  19. Navigate to the labFunction page on the AWS Lambda console. To view the lambda_function code, on the Code tab, click the expand icon. In the lambda_function code, review the code block of lines 93–104. - The code tests the Lambda connection with Amazon S3. - If successful, File Uploaded Successfully is printed in the log. On line 98, to replace Enter_your_bucket_name, paste the -practice S3 bucket name that you copied in an earlier step. Uncomment lines 93–104. - You practiced uncommenting lines in an earlier step. - Be sure to keep the indentations in the Python code blocks. To save the updated function, click Deploy. To test the Amazon S3 connection, click Test.
  20. On the Execution results tab, review Status: Succeeded. Under Function Logs, review the logs. - The query results of the Amazon RDS were successfully uploaded to the S3 bucket. Navigate to the Amazon S3 console. In the Buckets section, click the bucket name that ends with -practice.
  21. On the Objects tab, click the results.json file. - This new object was added to the S3 bucket. To download the Amazon RDS queried results to your device, click Download. On your device, open the downloaded results.json file with a text editor or JSON viewer, and then review the Amazon RDS queried results. - The results include the people in the database with Toxicologist as their occupation.
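Putting the steps together, the function's overall flow is: retrieve the credentials from Secrets Manager, query the private RDS database, and upload the results to S3 through the gateway endpoint. The sketch below is not the lab's actual lambda_security_code.zip; it is a minimal illustration in which the secret key names, the table and column names, the bucket_name environment variable, and the use of pymysql (assumed to come from the attached layer) are all assumptions:

```python
import json
import os

def build_results(rows, fields=("first_name", "last_name", "occupation")):
    """Shape raw cursor rows into the list of dicts written to results.json."""
    return [dict(zip(fields, row)) for row in rows]

def lambda_handler(event, context):
    import boto3
    import pymysql  # assumed to be supplied by the labfunctionlayer custom layer

    # 1. Pull database credentials from Secrets Manager (key names depend on the secret).
    secret = json.loads(
        boto3.client("secretsmanager")
        .get_secret_value(SecretId=os.environ["secret_arn"])["SecretString"]
    )

    # 2. Query the private RDS database over the VPC connection.
    conn = pymysql.connect(
        host=secret["host"], user=secret["username"],
        password=secret["password"], database=secret["dbname"],
    )
    with conn.cursor() as cur:
        # Table and column names are illustrative; the lab's schema may differ.
        cur.execute(
            "SELECT first_name, last_name, occupation FROM people "
            "WHERE occupation = %s",
            ("Toxicologist",),
        )
        rows = cur.fetchall()

    # 3. Upload the query results through the gateway VPC endpoint.
    boto3.client("s3").put_object(
        Bucket=os.environ["bucket_name"],  # placeholder environment variable
        Key="results.json",
        Body=json.dumps(build_results(rows)),
    )
    return "File Uploaded Successfully"
```

No traffic in this flow crosses the public internet: Secrets Manager is reached through the interface endpoint, RDS through the function's private subnets, and S3 through the gateway endpoint.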



