Infrastructure as Code and AWS Lambda
Infrastructure as Code
Infrastructure as code is one of the fundamental practices when working with serverless applications in production. It means using a high-level language to control the infrastructure of IT systems. When your infrastructure can be treated as code, you can apply the same techniques to it that you apply to application code: code reviews, automated testing, and so on. And when the infrastructure is in coded form, it can be replicated many times with minimal errors. All of this improves the quality of your infrastructure.

Infrastructure as code means you are not going to your cloud provider's console and clicking through checkboxes in different tools to create your infrastructure. Instead, you write scripts where the whole infrastructure is defined. These scripts are saved in the same repository as the application code and put through the same code reviews and tests. And as with code, when the scripts are changed, there is a record of who made the changes and why. This practice is fundamental for cloud development, for microservices, and for serverless.

Some advantages of infrastructure as code are:
- The infrastructure can be managed programmatically, with no manual configuration steps that can introduce errors.
- The same infrastructure can be repeated in many different environments, such as production, testing, and development.
- One employee can manage a massive infrastructure.
- Development speed increases, because you can reuse pieces of infrastructure you built in other projects and evolve them.
- The infrastructure is more secure, and the chances of introducing bugs are reduced.
AWS Lambda
Lambda is referred to by AWS as Serverless Computing. Serverless is a relatively new term in the software world for services that adhere to three criteria: you pay only for what you use, you don't need to manage infrastructure, and the service scales automatically up and down depending on the traffic. AWS Lambda lets you run code without provisioning or managing servers. You pay for how much you use, and the servers scale according to your needs; that's why it's called Serverless Computing. Lambda supports many different programming languages, including JavaScript, Python, Java, C#, Go, and others.
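To make this concrete, here is a minimal sketch of what a Lambda function looks like in Python. The function name `handler` and the event shape are illustrative; Lambda simply calls the function you configure, passing the invocation payload and a context object:

```python
# Minimal AWS Lambda handler in Python.
# Lambda invokes this function with the event payload and a context object.
def handler(event, context):
    # event: a dict with the invocation payload (its shape depends on the trigger)
    name = event.get("name", "world")
    return {"message": f"Hello, {name}!"}
```

There is no server to provision: you upload this code, configure `handler` as the entry point, and the platform runs it on demand.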
Lambda use case scenarios
Lambda has been adopted for many different use cases. The following list is a brief summary of the most common ones.
- AWS Lambda can be used to execute code when data changes. For example, to enrich data streams or do real-time analytics of data.
- Lambda is a great replacement for instances for building back ends including IoT or mobile back ends.
- Changes in your infrastructure can trigger events that execute code in Lambda for maintaining your Cloud infrastructure. For example, you can use Lambda to turn off your instances if nobody's using them.
- Lambda can execute on a scheduled basis. For example, if you need to send a message at 2 p.m. every day, you can use a scheduled Lambda to send that message for you.
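As a sketch of the scheduled use case above, the handler below assumes it is invoked by an EventBridge schedule, whose event includes an ISO-8601 `time` field; the function name and message contents are hypothetical:

```python
import json

# Hypothetical handler for a scheduled (cron-like) Lambda invocation.
# EventBridge scheduled events include an ISO-8601 "time" field.
def send_daily_message(event, context):
    fired_at = event.get("time", "unknown")
    message = {"text": "Daily 2 p.m. reminder", "fired_at": fired_at}
    # A real function would publish this to SNS/SQS or call an API;
    # here we just return the payload that would be sent.
    return json.dumps(message)
```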
We can break the Lambda programming model into three basic parts: triggers, a handler function, and the function's own code.

One typical trigger is an HTTP request that comes into our system through API Gateway, which then triggers a Lambda function. Another example is a record in a DynamoDB table that has been modified, which can trigger a Lambda function to execute. The creation of a new file in S3, our file storage, can trigger a Lambda function. And a new message in an SQS queue, the standard messaging queue that AWS provides, can also trigger a Lambda function to execute.

The handler function is invoked when the function is called. It receives an event object as an input parameter: the data sent with the function call. The event object changes depending on who invokes the Lambda; the event object when API Gateway triggers the Lambda will be different from the event object when Lambda is triggered by Kinesis. Another input parameter is the context object, which contains methods for interacting with runtime information. The last input parameter is the callback object. This parameter is not mandatory and not available in all cases; it is used to return information to the caller that invoked the Lambda.

The anatomy of the Lambda function takes us to the execution models of a Lambda function. There are three: synchronous, asynchronous, and poll/stream based. Let's look first at the synchronous model. In this model, the Lambda function can respond back to its invoker. For example, this is the case with API Gateway, which is in charge of handling HTTP requests and responses. In the synchronous model, services can be invoked by other services and must wait for a reply. This is considered a blocking request, because the invoking service cannot finish executing until a response is received.
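The synchronous API Gateway case above can be sketched as follows. With API Gateway's Lambda proxy integration, the event carries the HTTP details and the handler must return a dict with `statusCode` and `body`, which API Gateway turns back into an HTTP response (the handler name and paths here are illustrative):

```python
import json

# Sketch of a synchronous handler behind API Gateway (proxy integration).
# API Gateway passes HTTP details in the event and expects a dict with
# statusCode/body in return, which it converts into an HTTP response.
def api_handler(event, context):
    path = event.get("path", "/")
    method = event.get("httpMethod", "GET")
    body = {"message": f"{method} {path} handled"}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }
```

Because the caller is blocked until this dict is returned, the function's runtime directly determines the latency of the HTTP response.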
The asynchronous model handles requests that are non-blocking. A service can invoke another service directly, or it can use another type of communication channel to queue the information. The caller typically only waits for an acknowledgment that the request was received. For example, this is the case with S3: when a new file is created, it can trigger a Lambda function, but the entity that created the file is not blocked waiting for the Lambda function to execute. In the poll/stream-based model, the Lambda platform polls a stream or message queue and invokes a Lambda function synchronously. For example, SQS is poll-based: messages are put into the queue, the Lambda platform polls the queue, and when there is a new message it triggers the corresponding Lambda function.
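The poll-based SQS case can be sketched like this. When the Lambda platform polls the queue, it invokes the function with a batch of messages under the `Records` key, each carrying its payload in `body` (the handler name and the uppercase "processing" step are illustrative):

```python
# Sketch of a poll-based handler for SQS. The Lambda service polls the queue
# and invokes the function with a batch of records under the "Records" key.
def sqs_handler(event, context):
    processed = []
    for record in event.get("Records", []):
        # Each SQS record carries the message payload in "body".
        processed.append(record["body"].upper())
    return {"processed": processed}
```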
API Gateway
API Gateway is one of the most popular triggers for AWS Lambda for creating serverless back ends. Amazon API Gateway is a fully managed service that makes it easy for developers to create, publish, maintain, monitor, and secure APIs at any scale. You can create an API that acts as a front door for your applications to access data, business logic, or functionality from your back-end services. Amazon API Gateway handles all the tasks involved in accepting and processing up to hundreds of thousands of concurrent API calls, including traffic management, authorization, access control, monitoring, and API version management. API Gateway has no minimum fees or startup costs; you pay only for the API calls you receive and the amount of data transferred out.
Serverless Application Model (SAM)
The Serverless Application Model (SAM) is an open-source framework for building serverless applications. SAM helps you build your infrastructure, as it has a simple notation for creating functions, APIs, and database tables, the resources most commonly needed in serverless applications. The infrastructure is defined in a YAML file, which can then be deployed to the cloud with the help of SAM. SAM syntax is transformed into CloudFormation during the deployment process; CloudFormation is the AWS infrastructure-as-code syntax for defining all AWS resources. This makes the templates very portable across different accounts and maintainable over time. SAM comes with a rich set of features available to developers. The SAM template contains the definitions of all the components of the serverless application and their configuration, and all of them can be deployed as one single entity. When we build our serverless projects with SAM, we define our Lambdas, API Gateways, DynamoDB tables, and other resources in code, using YAML. We don't need to go to the AWS console to create the resources. We can store all the code in a code repository, and we can even replicate this code in multiple AWS accounts.
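As an illustration of the notation described above, here is a minimal SAM template sketch that defines one Lambda function with an API Gateway trigger. The logical names, handler path, and route are all illustrative:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Minimal serverless API (names and paths are illustrative)

Resources:
  HelloFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.handler        # function "handler" in file app.py
      Runtime: python3.12
      Events:
        HelloApi:
          Type: Api               # creates an API Gateway trigger
          Properties:
            Path: /hello
            Method: get
```

During deployment, the `Transform` line tells CloudFormation to expand this compact SAM syntax into the full set of underlying resources (the function, the API, permissions, and so on).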