Testing cloud applications without breaking the bank: Testcontainers and LocalStack
Cloud applications nowadays
Many enterprise applications now run in the cloud and use multiple services available through cloud providers. As of 2024, 94 percent of companies worldwide have adopted cloud computing in some capacity. Those cloud services include virtual machines, file storage, message queues, databases, monitoring, logging, security tools, and many others.
You can even build an entire application that is 100 percent cloud-based. The most popular cloud providers are currently AWS, Google Cloud, and Azure; together, those three run most cloud applications. In 2023, enterprises spent approximately 270 billion USD on cloud infrastructure services, a 45-billion increase from the previous year.
Most cloud providers charge you based on how much you use their services: the more you use them, the higher your bill at the end of the month. Providers generally offer a free tier, but you'll be charged once your usage exceeds a certain limit. The problem is that many companies depend heavily on those services, which makes it difficult to stay within that limit: 49 percent of businesses find it challenging to control cloud costs, and 33 percent exceed their cloud budgets by 40 percent.
What are the challenges of testing cloud applications?
Generally, the more customers you have, the more you need to scale your application. That means you will need more computational resources: better virtual machines with more CPU and memory, more file storage and database capacity to store application data, more message queues to handle incoming events, and so on.
Profits from adding customers certainly help to justify these increased costs, but it’s still advisable to find ways to save on cloud expenses. And those expenses include cloud usage by testers who verify that the application is behaving as expected. Testing, on its own, doesn’t increase revenue, and you will incur charges for testing in the cloud. So you need a good strategy for dealing with these costs. 28 percent of public cloud spending is wasted annually, often due to overprovisioning and lack of proper management. Planning ahead can ensure that your organisation’s testing efforts don’t become part of that problem.
Which options do you have to minimize testing costs?
To avoid undue cloud usage costs, you can adopt strategies that validate the application against all expected quality criteria while reducing the use of dependent cloud services as much as possible during testing.
That may seem like a big challenge, since it's easy to assume that good software quality requires extensive tests covering many points of the application. In fact, what you need is the right strategy for validating the application.
Fortunately, you have a choice of good tools to support this strategy. Read on to find out more.
What is LocalStack?
LocalStack is a free and open-source tool that simulates cloud services on your own computer. It is like you have a cloud running on your computer, but it won’t cost you anything to use it.
Using LocalStack, you can simulate the cloud services that your application depends on. The simulated services cost your organisation nothing, and you avoid the overhead of maintaining separate sandbox environments and accounts.
It is important to highlight that LocalStack works specifically with AWS services. The tool is mature, has a large user community, and covers almost all existing AWS services. However, if your application uses other cloud providers, like GCP or Azure, you may need to look at similar tools to achieve the same goal, such as Azurite for Azure Storage or the official Google Cloud emulators.
Some of those alternatives may not have the same maturity and coverage that LocalStack can provide at the moment, but they are frequently updated and improved.
What is Testcontainers?
Testcontainers is a free and open-source library that provides easy and lightweight test dependencies with real services wrapped in Docker containers. You can spin up any Docker container as a setup to run your tests.
You can also create specific configurations as needed for your test execution. For example, if you need to spin up a database, you can write code to create specific tables and insert custom data.
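As a minimal sketch of that idea (not part of the article's project; it assumes the Testcontainers PostgreSQL module is on the classpath and Docker is available), you could seed a database container with a schema file before any test runs:

```java
import org.testcontainers.containers.PostgreSQLContainer;

public class DatabaseSetupExample {
    public static void main(String[] args) {
        // try-with-resources stops and removes the container when done
        try (PostgreSQLContainer<?> postgres =
                 new PostgreSQLContainer<>("postgres:16-alpine")
                     .withDatabaseName("reviews")
                     .withUsername("test")
                     .withPassword("test")
                     // Runs a classpath SQL file that creates tables and inserts seed data;
                     // "init/schema.sql" is a hypothetical path for this sketch
                     .withInitScript("init/schema.sql")) {
            postgres.start();
            // Testcontainers maps the container port to a random host port,
            // so always read the JDBC URL from the container object
            System.out.println(postgres.getJdbcUrl());
        }
    }
}
```

The `withInitScript` call is what gives each run the same known starting data, so tests never depend on leftovers from a previous run.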
After test execution is complete, Testcontainers will also take care of executing all the required clean-up steps. This gives you a clean state for your next tests.
Another interesting feature of Testcontainers is that it supports modules that are preconfigured integrations with your application’s dependencies. That makes the process of writing your tests even easier. One of those modules is LocalStack, whose integration is discussed below. You can view the full list of available modules: Testcontainers modules.
The real power of LocalStack is achieved when you use it in combination with Testcontainers. In that way, you can integrate dependent cloud services into your test code without much effort.
You can use Testcontainers with many different programming languages and frameworks, including Java, Go, .NET, Node.js, Python, Ruby, and Rust.
LocalStack and Testcontainers in action
To illustrate how to use LocalStack and Testcontainers together in practice, I will show you some code examples from a personal project I have been working on.
It’s a simple application to analyze reviews and comments from users. The components are: a website, two backend microservices, AWS SQS, and an AWS S3 Bucket.
This is the architecture:
After the user submits a new review through the website, the Review Collector service sends it to the SQS queue. The Review Analyzer service then processes each new message in the queue, analyzes the review, and pushes the results to the S3 bucket.
Even though this application is simple, it’s a good use case to demonstrate the advantages of using Testcontainers and LocalStack as part of your test strategy.
You can create two types of tests using LocalStack and Testcontainers: integration and end-to-end (E2E). This article will focus on the E2E tests, but you can also find some examples of integration tests for this project in my repository: Review Analysis Cloud Microservices - Fernando Teixeira.
Setting up your E2E test
The diagram below illustrates the E2E test process:
Now, let’s see what the code for this whole idea looks like.
Test setup
Test setup is the most important part of the code, since the Testcontainers and LocalStack configuration is stored here.
First, you need a file with a base configuration and the setup steps. The E2E test class will import and use this test setup.
Below is a sample base configuration class:
package com.teixeirafernando.e2e.tests;
import org.junit.jupiter.api.BeforeAll;
import org.testcontainers.containers.GenericContainer;
import org.testcontainers.containers.Network;
import org.testcontainers.containers.localstack.LocalStackContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
import org.testcontainers.lifecycle.Startables;
import org.testcontainers.utility.DockerImageName;
import java.io.IOException;
@Testcontainers
public class TestContainersConfiguration {
private static final Network SHARED_NETWORK = Network.newNetwork();
protected static GenericContainer<?> ReviewCollectorService;
protected static GenericContainer<?> ReviewAnalyzerService;
protected static final String BUCKET_NAME = "review-analysis-bucket";
protected static final String QUEUE_NAME = "review-analysis-queue";
@Container
protected static LocalStackContainer localStack = new LocalStackContainer(
DockerImageName.parse("localstack/localstack:4.0.3")
).withNetwork(SHARED_NETWORK).withNetworkAliases("localstack");
@BeforeAll
static void beforeAll() throws IOException, InterruptedException {
localStack.execInContainer("awslocal", "s3", "mb", "s3://" + BUCKET_NAME);
localStack.execInContainer(
"awslocal",
"sqs",
"create-queue",
"--queue-name",
QUEUE_NAME
);
ReviewCollectorService = createReviewCollectorServiceContainer(8080);
ReviewAnalyzerService = createReviewAnalyzerServiceContainer(8081);
Startables.deepStart(ReviewCollectorService, ReviewAnalyzerService).join();
}
private static GenericContainer<?> createReviewCollectorServiceContainer(int port) {
return new GenericContainer<>("teixeirafernando/review-collector:latest")
.withEnv("AWS_ENDPOINT", "http://localstack:4566")
.withExposedPorts(port)
.withNetwork(SHARED_NETWORK);
}
private static GenericContainer<?> createReviewAnalyzerServiceContainer(int port) {
return new GenericContainer<>("teixeirafernando/review-analyzer:latest")
.withEnv("AWS_ENDPOINT", "http://localstack:4566")
.withExposedPorts(port)
.withNetwork(SHARED_NETWORK);
}
}
Let’s understand the code piece by piece:
import org.testcontainers.containers.localstack.LocalStackContainer;
import org.testcontainers.containers.GenericContainer;
These imports bring in the LocalStack-specific container class and the generic container class used to run the two microservices.
private static final Network SHARED_NETWORK = Network.newNetwork();
A shared network object is needed to keep all the different containers within one network and make them communicate with each other.
@Container
protected static LocalStackContainer localStack = new LocalStackContainer(
DockerImageName.parse("localstack/localstack:4.0.3")
).withNetwork(SHARED_NETWORK).withNetworkAliases("localstack");
You then provide the LocalStackContainer with a specific Docker image from LocalStack. You also set it to use the shared network and give it an alias within that network, so the microservice containers can reach it at http://localstack:4566.
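Once the container is up, your integration tests can talk to it through the normal AWS SDK. As a sketch (assuming the AWS SDK for Java v2 S3 artifact is on the classpath; this helper is not part of the article's project), you could build an S3 client pointed at LocalStack instead of real AWS:

```java
import org.testcontainers.containers.localstack.LocalStackContainer;
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;

public class LocalStackClientExample {
    // Builds an S3 client that talks to the LocalStack container instead of real AWS.
    static S3Client s3ClientFor(LocalStackContainer localStack) {
        return S3Client.builder()
                // LocalStackContainer exposes the emulated endpoint and dummy credentials
                .endpointOverride(localStack.getEndpointOverride(LocalStackContainer.Service.S3))
                .credentialsProvider(StaticCredentialsProvider.create(
                        AwsBasicCredentials.create(
                                localStack.getAccessKey(), localStack.getSecretKey())))
                .region(Region.of(localStack.getRegion()))
                .build();
    }
}
```

Because the endpoint, credentials, and region all come from the container object, the same test code works regardless of which random host port LocalStack was mapped to.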
@BeforeAll
static void beforeAll() throws IOException, InterruptedException {
localStack.execInContainer("awslocal", "s3", "mb", "s3://" + BUCKET_NAME);
localStack.execInContainer(
"awslocal",
"sqs",
"create-queue",
"--queue-name",
QUEUE_NAME
);
ReviewCollectorService = createReviewCollectorServiceContainer(8080);
ReviewAnalyzerService = createReviewAnalyzerServiceContainer(8081);
Startables.deepStart(ReviewCollectorService, ReviewAnalyzerService).join();
}
In the BeforeAll block, you define all the steps that should run before test execution begins: creating the S3 bucket and the SQS queue inside LocalStack via awslocal, then starting the two microservice containers.
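If the awslocal commands silently fail, later tests fail in confusing ways. One way to guard against that (a hypothetical addition, placed at the end of beforeAll() after the create calls, reusing that method's declared exceptions) is a quick sanity check that the queue really exists:

```java
// Hypothetical sanity check at the end of beforeAll():
// confirm the SQS queue was actually created inside LocalStack.
org.testcontainers.containers.Container.ExecResult result =
        localStack.execInContainer("awslocal", "sqs", "list-queues");
if (!result.getStdout().contains(QUEUE_NAME)) {
    throw new IllegalStateException(
            "SQS queue was not created: " + result.getStderr());
}
```

Failing fast here gives a clear setup error instead of a cryptic assertion failure deep inside the E2E test.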
E2E Test
Now, you can create your E2E test, which will use the setup configuration.
To assist with the validations performed in your E2E tests, you will use the REST Assured library, which helps you make REST API requests and check the services' responses with assertions.
This is what the whole test looks like:
package com.teixeirafernando.e2e.tests;
import static io.restassured.RestAssured.given;
import static org.awaitility.Awaitility.await;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.notNullValue;
import org.apache.http.HttpStatus;
import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.api.Test;
import java.time.Duration;
import java.util.Optional;
public class ReviewAnalysisE2ETest extends TestContainersConfiguration {
@Test
@DisplayName("Create a new Review and make the sentiment analysis")
void analyzeReviewSuccessfully(){
String fullReviewCollectorURL = "http://" + TestContainersConfiguration.ReviewCollectorService.getHost() + ":" + TestContainersConfiguration.ReviewCollectorService.getMappedPort(8080);
String fullReviewAnalyzerURL = "http://" + TestContainersConfiguration.ReviewAnalyzerService.getHost() + ":" + TestContainersConfiguration.ReviewAnalyzerService.getMappedPort(8081);
String review = """
{
"productId": "da6037a6-a375-40e2-a8a6-1bb5f9448df0",
"customerName": "my-customer-name",
"reviewContent": "This is my comment for why my rating for this product was 5",
"rating": 5.0
}
""";
String id = given()
.contentType("application/json")
.body(review)
.when()
.post(fullReviewCollectorURL + "/api/review")
.then()
.assertThat()
.statusCode(HttpStatus.SC_OK)
.body("productId", equalTo("da6037a6-a375-40e2-a8a6-1bb5f9448df0"))
.extract()
.path("id");
System.out.println("Review Collector Service created a new Review and pushed it to SQS");
await()
.pollInterval(Duration.ofSeconds(5))
.atMost(Duration.ofSeconds(30))
.untilAsserted(() -> {
given()
.contentType("application/json")
.when()
.get(fullReviewAnalyzerURL + "/api/messages/" + id)
.then()
.assertThat()
.statusCode(HttpStatus.SC_OK)
.body("id", equalTo(id))
.body("reviewAnalysis", notNullValue());
});
System.out.println("Review Analyzer Service processed the Review and sent it to S3");
}
}
Let’s understand piece by piece what it does:
public class ReviewAnalysisE2ETest extends TestContainersConfiguration {
First of all, you need to properly extend the TestContainersConfiguration that you created before.
String fullReviewCollectorURL = "http://" + TestContainersConfiguration.ReviewCollectorService.getHost() + ":" + TestContainersConfiguration.ReviewCollectorService.getMappedPort(8080);
String fullReviewAnalyzerURL = "http://" + TestContainersConfiguration.ReviewAnalyzerService.getHost() + ":" + TestContainersConfiguration.ReviewAnalyzerService.getMappedPort(8081);
You need the endpoints of the two services, and you build each one from the container's host address and mapped port. Testcontainers publishes exposed container ports on random host ports, so you should always read the port from the container rather than hard-coding it.
String review = """
{
"productId": "da6037a6-a375-40e2-a8a6-1bb5f9448df0",
"customerName": "my-customer-name",
"reviewContent": "This is my comment for why my rating for this product was 5",
"rating": 5.0
}
""";
We create a simple test review to be processed by the application services.
String id = given()
.contentType("application/json")
.body(review)
.when()
.post(fullReviewCollectorURL + "/api/review")
.then()
.assertThat()
.statusCode(HttpStatus.SC_OK)
.body("productId", equalTo("da6037a6-a375-40e2-a8a6-1bb5f9448df0"))
.extract()
.path("id");
System.out.println("Review Collector Service created a new Review and pushed it to SQS");
In this first part of your test, you make an API request to the Review Collector service. Then, you provide it with the test review that you created before. Finally, you make some assertions to ensure that the test review was properly created and pushed to the SQS queue.
await()
.pollInterval(Duration.ofSeconds(5))
.atMost(Duration.ofSeconds(30))
.untilAsserted(() -> {
given()
.contentType("application/json")
.when()
.get(fullReviewAnalyzerURL + "/api/messages/" + id)
.then()
.assertThat()
.statusCode(HttpStatus.SC_OK)
.body("id", equalTo(id))
.body("reviewAnalysis", notNullValue());
});
System.out.println("Review Analyzer Service processed the Review and sent it to S3");
In the last part of the test, a request is made to the second service (Review Analyzer) to validate that it processed the Review and sent it to S3.
Since the processing of messages coming from SQS is asynchronous, it can take some time for the test review to be processed by the application. To deal with that, you use Awaitility's await with a polling interval and timeout, which retries the assertion and avoids flakiness in your test.
What are the advantages of using this approach?
By using Testcontainers and LocalStack to test your cloud services, you benefit from zero cloud usage costs during testing, isolated and reproducible test environments, automatic clean-up between test runs, and fast feedback without deploying to a real cloud account.
To wrap up
Testing cloud applications doesn't have to be expensive. With tools like LocalStack and Testcontainers, you can create a local version of your cloud services to run all the tests that you need without being shocked by your cloud provider's latest bill.
You can even apply a shift-left strategy, since you don’t need to deploy your whole application to run your tests. You can catch problems early, fix them, and improve your application without paying for actual cloud usage during testing. It’s a smart and budget-friendly way to ensure your cloud application works as expected.
The examples in this article focused on applications that depend on AWS services, but you can apply the same approach with Testcontainers and equivalent emulation tools for other cloud providers.
Long story short: using LocalStack with Testcontainers will allow you to create a great test strategy that fits your situation.
You can find the full project and examples used in this article in my GitHub project.