Testing Multi-Datasource Spring Boot Apps with Testcontainers on Azure Pipelines
My goal was to improve a couple of unit tests where several layers of the application were mocked, and to complement them with integration tests. Unit tests have their place, but they are not as thorough as integration tests. In an application with multiple data sources you need to do a little more if you want to test all layers without mocking the database connections. Integration tests may also yield a better ROI than unit tests in the long run, since they do not mock away potentially business-critical parts of the code. Here I am indirectly referring to Kent C. Dodds' testing trophy: https://kentcdodds.com/blog/the-testing-trophy-and-testing-classifications
The case
I have an application that uses two data sources. When a user logs in, a lookup in one database verifies that the user has the right privileges; then a couple of REST requests send data to the application. The data is processed and stored in a different database. The first datasource is only used at the start of the flow, but even though it wasn't part of my particular test, it still needed to be available in the test. The following is a boiled-down version, promoted to an example application in order to make it easier to follow.
Testing out H2
When I was ready to enhance the tests with a test datasource, I remembered I had used H2 before. H2 was the database backbone for various tests back then, and the world may have been more compatible and a lot simpler in those days, but I thought: why not give it a shot. So I added a POM dependency, and the data.sql script with my database schema was picked up automatically from src/test/resources. It worked fine, so I set out to write my first test - only to discover the limits of H2. Deep in my code I had proprietary MSSQL syntax: the STUFF function, a T-SQL string-manipulation construct often used to concatenate rows, which H2 simply does not support. I got:
SQL Error: 90022, SQLState: 90022 2025-02-18 14:36:32.749 ERROR 2428 --- [onPool-worker-1] o.h.engine.jdbc.spi.SqlExceptionHelper : Function "STUFF" not found; SQL statement:
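To give an idea of the kind of construct involved, here is a hypothetical repository fragment (using the TestTable1 entity from the example application below; my actual production query is not shown in this article). STUFF combined with FOR XML PATH is a common T-SQL idiom for concatenating rows into a single string, and H2 has no STUFF function:
package dk.hjemliebe.data.repositories.testtable1;
import dk.hjemliebe.data.entities.db1.TestTable1;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Query;
public interface NameAggregationRepository extends JpaRepository<TestTable1, Integer> {
    // Native MSSQL query: fails on H2, runs fine on SQL Server.
    @Query(value = "SELECT STUFF((SELECT ',' + name FROM dbo.TestTable1 FOR XML PATH('')), 1, 1, '')",
           nativeQuery = true)
    String getCommaSeparatedNames();
}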
So what to do? Rewrite the code so it fits H2, or work around it with tweaks and twists? Neither is a good idea: you want to fix the test setup, not bend the code to fit the test. The next problem would be getting the rewritten code to work with MSSQL again - and it already worked with MSSQL.
I presented my problem to a colleague, and we discussed it back and forth. That's when I remembered that a former colleague of mine once talked about Testcontainers! The word just came to mind, since I knew they offer a containerized smorgasbord of images: build up, throw away. No need for an extra test database somewhere. That would solve my H2 syntax issue: by replacing H2 with a Testcontainer I could (re-)create the real MSSQL database for my test. So I went back to my desk, removed the H2 dependency and ventured on to the next level of test adventure.
Testing out Testcontainers
With real MSSQL integration, the application would no longer be prone to dialect syntax errors, since the tests use the actual database schema and insert scripts. I would be able to use the STUFF keyword as much as I please. Since I moved away from H2 and a mocked world and into an integration-testing world, I discarded my application-test.yaml and just relied on the ordinary application.yaml from my actual application.
The following is not an excerpt from my application, but a version compiled for test and demonstration purposes.
The process would be as follows:
Create an application that connects to multiple data sources
Go to start.spring.io and create your sample Spring application project, and import the project into your favorite IDE.
Add application.yaml in src/main/resources:
springdoc:
swagger-ui:
path: /swagger-ui.html
disable-swagger-default-url: true
version: 1.0
server:
port: 8080
tomcat:
connection-timeout: 120000
keep-alive-timeout: 60000
max-keep-alive-requests: 100
max-swallow-size: 2097152
spring:
profiles:
active: dev
jpa:
hibernate:
naming:
physical-strategy: dk.hjemliebe.data.SimpleNamingStrategy
show-sql: false
properties:
hibernate:
format_sql: true
dialect: org.hibernate.dialect.SQLServerDialect
logging:
file:
name: /var/log/MultipleDataSources/mds.log
level:
com:
zaxxer:
hikari: OFF
org:
springframework:
jdbc:
core:
JdbcTemplate: DEBUG
StatementCreatorUtils: DEBUG
apache.catalina.connector.ClientAbortException: OFF
hibernate:
type:
descriptor:
sql:
BasicBinder: OFF
sql: TRACE
dk:
hjemliebe:
controller: DEBUG
data: DEBUG
service: DEBUG
charset:
console: UTF-8
file: UTF-8
logback:
rollingpolicy:
max-file-size: 200MB
max-history: 6
file-name-pattern: /var/log/MultipleDataSources/mds.%d{yyyy-MM-dd}.%i.log
---
spring:
config:
activate:
on-profile: dev
datasource:
testdb1:
url: jdbc:sqlserver://localhost:1433;database=testdatabase1;encrypt=true;trustServerCertificate=true;
username: sa
password: "A_Str0ng_Required_Password"
testdb2:
url: jdbc:sqlserver://localhost:1433;database=testdatabase2;encrypt=true;trustServerCertificate=true;
username: sa
password: "A_Str0ng_Required_Password"
jpa:
database-platform: org.hibernate.dialect.SQLServerDialect
generate-ddl: true
show-sql: true
hibernate:
dialect: org.hibernate.dialect.SQLServerDialect
sql:
init:
encoding: UTF-8
---
Why application.yaml? Simply put, I prefer YAML's indentation over the flat key style of application.properties.
Remove the default application.properties file, if it’s there.
Create packages for config, controller, data & service, or whatever structure seems feasible to you.
Let’s start from the bottom up by adding two properties classes for the datasources, so you can use them as typed properties in your application:
package dk.hjemliebe.config;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Configuration;
@Configuration
@ConfigurationProperties(prefix = "spring.datasource.testdb1")
public class DataBase1Configuration {
private String url;
private String username;
private String password;
public String getUrl() {
return url;
}
public void setUrl(String url) {
this.url = url;
}
public String getUsername() {
return username;
}
public void setUsername(String username) {
this.username = username;
}
public String getPassword() {
return password;
}
public void setPassword(String password) {
this.password = password;
}
}
Observe that the prefix maps to the testdb1 datasource under the dev profile in application.yaml.
Don't forget the second class - just copy the one above, switch the prefix to spring.datasource.testdb2, and call the class DataBase2Configuration.
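For reference, a sketch of that second class:
package dk.hjemliebe.config;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Configuration;
@Configuration
@ConfigurationProperties(prefix = "spring.datasource.testdb2")
public class DataBase2Configuration {
    private String url;
    private String username;
    private String password;
    // Setters are required so Spring can bind the spring.datasource.testdb2.* values.
    public String getUrl() { return url; }
    public void setUrl(String url) { this.url = url; }
    public String getUsername() { return username; }
    public void setUsername(String username) { this.username = username; }
    public String getPassword() { return password; }
    public void setPassword(String password) { this.password = password; }
}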
We need two database configuration classes in the same package, one for each datasource. Under the config package, add Database1ManagerConfig:
package dk.hjemliebe.config;
import com.zaxxer.hikari.HikariDataSource;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.orm.jpa.EntityManagerFactoryBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import javax.persistence.EntityManagerFactory;
import javax.sql.DataSource;
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
entityManagerFactoryRef = "db1EntityManagerFactory",
basePackages = {"dk.hjemliebe.data.repositories.testtable1"}
)
public class Database1ManagerConfig {
private DataBase1Configuration dataBase1Configuration;
public Database1ManagerConfig(DataBase1Configuration dataBase1Configuration) {
this.dataBase1Configuration = dataBase1Configuration;
}
@Bean(name = "db1DataSource")
@Primary
public DataSource getDb1DataSource() {
HikariDataSource hikariDataSource = new HikariDataSource();
hikariDataSource.setMaxLifetime(28000); // 28 seconds
hikariDataSource.setIdleTimeout(28000);
hikariDataSource.setMaximumPoolSize(10);
hikariDataSource.setLeakDetectionThreshold(10000);
hikariDataSource.setJdbcUrl(dataBase1Configuration.getUrl());
hikariDataSource.setUsername(dataBase1Configuration.getUsername());
hikariDataSource.setPassword(dataBase1Configuration.getPassword());
return hikariDataSource;
}
@Bean(name = "db1EntityManagerFactory")
@Primary
public LocalContainerEntityManagerFactoryBean db1EntityManagerFactory(EntityManagerFactoryBuilder builder,
@Qualifier("db1DataSource") DataSource dataSource) {
return builder
.dataSource(dataSource)
.packages("dk.hjemliebe.data.entities.db1")
.persistenceUnit("db1")
.build();
}
@Bean(name = "transactionManager")
@Primary
public PlatformTransactionManager db1TransactionManager(
@Qualifier("db1EntityManagerFactory") EntityManagerFactory entityManagerFactory) {
return new JpaTransactionManager(entityManagerFactory);
}
}
And its counterpart, Database2ManagerConfig:
package dk.hjemliebe.config;
import com.zaxxer.hikari.HikariDataSource;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.orm.jpa.EntityManagerFactoryBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import javax.persistence.EntityManagerFactory;
import javax.sql.DataSource;
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
entityManagerFactoryRef = "db2EntityManagerFactory",
transactionManagerRef = "db2TransactionManager",
basePackages = {"dk.hjemliebe.data.repositories.testtable2"}
)
public class Database2ManagerConfig {
private DataBase2Configuration dataBase2Configuration;
public Database2ManagerConfig(DataBase2Configuration dataBase2Configuration) {
this.dataBase2Configuration = dataBase2Configuration;
}
@Bean(name = "db2DataSource")
public DataSource getDb2DataSource() {
HikariDataSource hikariDataSource = new HikariDataSource();
hikariDataSource.setMaxLifetime(28000); // 28 seconds
hikariDataSource.setIdleTimeout(28000);
hikariDataSource.setMaximumPoolSize(10);
hikariDataSource.setLeakDetectionThreshold(10000);
hikariDataSource.setJdbcUrl(dataBase2Configuration.getUrl());
hikariDataSource.setUsername(dataBase2Configuration.getUsername());
hikariDataSource.setPassword(dataBase2Configuration.getPassword());
return hikariDataSource;
}
@Bean(name = "db2EntityManagerFactory")
public LocalContainerEntityManagerFactoryBean db2EntityManagerFactory(EntityManagerFactoryBuilder builder,
@Qualifier("db2DataSource") DataSource dataSource) {
return builder
.dataSource(dataSource)
.packages("dk.hjemliebe.data.entities.db2")
.persistenceUnit("db2")
.build();
}
@Bean(name = "db2TransactionManager")
public PlatformTransactionManager db2TransactionManager(
@Qualifier("db2EntityManagerFactory") EntityManagerFactory entityManagerFactory) {
return new JpaTransactionManager(entityManagerFactory);
}
}
We use these two configs to connect the right datasource to the right repositories. If you don’t mark the beans in the first class as @Primary, you will get a wiring error like
“Consider marking one of the beans as @Primary, updating the consumer to accept multiple beans, or using @Qualifier to identify the bean that should be consumed“
So you could also use @Qualifier to resolve the ambiguity in the ApplicationContext, as sketched below.
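For example (a sketch, not part of the example application; JdbcTemplate is used purely as an illustrative consumer):
package dk.hjemliebe.config;
import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;
@Configuration
public class Db2JdbcConfig {
    // Instead of relying on @Primary, name the exact DataSource bean you want.
    @Bean
    public JdbcTemplate db2JdbcTemplate(@Qualifier("db2DataSource") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}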
You want the transaction manager beans in these classes in order to get correct transactional management in an application with multiple data sources.
Also observe the use of the basePackages property on @EnableJpaRepositories. This makes sure that the repositories in those base packages are mapped to the correct datasource.
Spring Boot is great at auto-configuring a single datasource from the application properties, but with multiple datasources it does not necessarily know which repository should use which datasource, so we define the HikariDataSource beans ourselves.
As for the LocalContainerEntityManagerFactoryBean, we need one per data source: the EntityManager is what talks to the database, and queries must be executed against the right one.
Add two entities that map to our TestTable1 and TestTable2 tables for Hibernate to use:
package dk.hjemliebe.data.entities.db1;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import javax.persistence.*;
@Entity(name="TestTable1")
@JsonIgnoreProperties({"hibernateLazyInitializer"})
public class TestTable1 {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private int id;
@Column(name = "name")
private String name;
public int getId() {
return id;
}
public void setId(int id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
}
The entity is what your repository interfaces query against. Create an equivalent for TestTable2 in the dk.hjemliebe.data.entities.db2 package, but remember to change the entity name and class name (TestTable2).
Add two repository interfaces so you can perform queries.
package dk.hjemliebe.data.repositories.testtable1;
import dk.hjemliebe.data.entities.db1.TestTable1;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Query;
import org.springframework.stereotype.Repository;
@Repository
public interface TestTable1Repository extends JpaRepository<TestTable1, Integer> {
@Query("select DISTINCT name FROM TestTable1")
String getName();
}
Add an equivalent for TestTable2. Make sure you switch the “1” for a “2” in TestTable2Repository if you copy the first class to create the second. These interfaces are where further CRUD operations would live. If you have complex queries, you have to evaluate whether they can be expressed here or whether you need native queries through an EntityManager.
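A sketch of the native-query route (the unitName matches the persistenceUnit("db2") configured in Database2ManagerConfig; the service and query are hypothetical):
package dk.hjemliebe.service;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.springframework.stereotype.Service;
@Service
public class NativeQueryService {
    // Inject the EntityManager bound to the second persistence unit.
    @PersistenceContext(unitName = "db2")
    private EntityManager db2EntityManager;
    // Run proprietary MSSQL syntax that a repository's JPQL can't express.
    public String getTopName() {
        return (String) db2EntityManager
                .createNativeQuery("SELECT TOP 1 name FROM dbo.TestTable2")
                .getSingleResult();
    }
}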
You may also want to add a SimpleNamingStrategy (the one referenced in application.yaml above) in order to avoid some mapping issues.
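The class itself is not shown in this article; a minimal sketch, assuming the goal is to keep table and column names exactly as written instead of Spring Boot's default camelCase-to-snake_case conversion:
package dk.hjemliebe.data;
import org.hibernate.boot.model.naming.Identifier;
import org.hibernate.boot.model.naming.PhysicalNamingStrategyStandardImpl;
import org.hibernate.engine.jdbc.env.spi.JdbcEnvironment;
public class SimpleNamingStrategy extends PhysicalNamingStrategyStandardImpl {
    // Keep TestTable1 as TestTable1 instead of test_table1.
    @Override
    public Identifier toPhysicalTableName(Identifier name, JdbcEnvironment context) {
        return name;
    }
    // Same pass-through for column names.
    @Override
    public Identifier toPhysicalColumnName(Identifier name, JdbcEnvironment context) {
        return name;
    }
}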
Add MultipleDataSourceServiceInterface for your service methods in your service package:
package dk.hjemliebe.service;
public interface MultipleDataSourceServiceInterface {
String getNameFromDataSource1();
String getNameFromDataSource2();
}
And implement the methods in your actual service class, where you use your repositories to query the databases:
package dk.hjemliebe.service;
import dk.hjemliebe.data.repositories.testtable1.TestTable1Repository;
import dk.hjemliebe.data.repositories.testtable2.TestTable2Repository;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import javax.transaction.Transactional;
@Service
public class MultipleDataSourceService implements MultipleDataSourceServiceInterface {
private static final Logger LOG = LoggerFactory.getLogger(MultipleDataSourceService.class);
private TestTable1Repository testTable1Repository;
private TestTable2Repository testTable2Repository;
public MultipleDataSourceService(TestTable1Repository testTable1Repository, TestTable2Repository testTable2Repository) {
this.testTable1Repository = testTable1Repository;
this.testTable2Repository = testTable2Repository;
}
@Override
@Transactional
public String getNameFromDataSource1() {
return testTable1Repository.getName();
}
@Override
@Transactional
public String getNameFromDataSource2() {
return testTable2Repository.getName();
}
}
Now add a REST controller to expose a REST API:
package dk.hjemliebe.controller;
import dk.hjemliebe.service.MultipleDataSourceServiceInterface;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
@RequestMapping("/api")
@RestController
public class MultipleDataSourceController {
private MultipleDataSourceServiceInterface multipleDataSourceService;
@Autowired
public MultipleDataSourceController(MultipleDataSourceServiceInterface multipleDataSourceService) {
this.multipleDataSourceService = multipleDataSourceService;
}
@GetMapping(path = "/name1", produces = "application/json")
public String getName1() {
return multipleDataSourceService.getNameFromDataSource1();
}
@GetMapping(path = "/name2", produces = "application/json")
public String getName2() {
return multipleDataSourceService.getNameFromDataSource2();
}
}
If you want to run your application, get a nice Swagger UI and test the API manually, you can configure your main class like this:
package dk.hjemliebe;
import io.swagger.v3.oas.models.Components;
import io.swagger.v3.oas.models.OpenAPI;
import io.swagger.v3.oas.models.info.Info;
import io.swagger.v3.oas.models.security.SecurityScheme;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.boot.web.servlet.support.SpringBootServletInitializer;
import org.springframework.context.annotation.Bean;
@SpringBootApplication(scanBasePackages = { "dk.hjemliebe",
"dk.hjemliebe.data.repositories" })
public class Application extends SpringBootServletInitializer {
@Bean
public OpenAPI customOpenAPI() {
return new OpenAPI()
.components(new Components().addSecuritySchemes("basicScheme",
new SecurityScheme().type(SecurityScheme.Type.HTTP).scheme("basic")))
.info(new Info().title("Our fantastic api").version("1.0"));
}
@Override
protected SpringApplicationBuilder configure(SpringApplicationBuilder application) {
return application.sources(Application.class);
}
public static void main(String[] args) {
SpringApplication.run(Application.class);
}
}
When you’re done, your application should have a structure looking like this:
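(Reconstructed from the packages above; the original article shows a screenshot of the project tree.)
src/main/java/dk/hjemliebe/
├── Application.java
├── config/
│   ├── DataBase1Configuration.java
│   ├── DataBase2Configuration.java
│   ├── Database1ManagerConfig.java
│   └── Database2ManagerConfig.java
├── controller/
│   └── MultipleDataSourceController.java
├── data/
│   ├── SimpleNamingStrategy.java
│   ├── entities/
│   │   ├── db1/TestTable1.java
│   │   └── db2/TestTable2.java
│   └── repositories/
│       ├── testtable1/TestTable1Repository.java
│       └── testtable2/TestTable2Repository.java
└── service/
    ├── MultipleDataSourceService.java
    └── MultipleDataSourceServiceInterface.java
src/main/resources/application.yaml
src/test/java/dk/hjemliebe/service/MultipleDataSourcesTest.java
src/test/resources/schema.sql
azure/azure-pipelines.yml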
If you want another very good explanation of configuring multiple datasources, head over to Diego Benincasa's Mastering Multi-Database Connections in Spring Boot
Now on to the fun part: the test. Add a MultipleDataSourcesTest class under src/test/java in the dk.hjemliebe.service package.
Create a test that incorporates the use of Testcontainers:
package dk.hjemliebe.service;
import com.zaxxer.hikari.HikariDataSource;
import dk.hjemliebe.controller.MultipleDataSourceController;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.AfterAll;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestInstance;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.autoconfigure.AutoConfigureAfter;
import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.context.TestConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Primary;
import org.springframework.test.context.DynamicPropertySource;
import org.testcontainers.containers.MSSQLServerContainer;
import org.testcontainers.containers.wait.strategy.Wait;
import javax.sql.DataSource;
import org.springframework.test.context.DynamicPropertyRegistry;
@SpringBootTest(properties = "spring.main.allow-bean-definition-overriding=true")
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
public class MultipleDataSourcesTest {
@Autowired
private MultipleDataSourceController multipleDataSourceController;
private final static String DATABASE_NAME1 = "testdatabase1";
private final static String DATABASE_USER1 = "sa";
private final static String DATABASE_PWD1 = "A_Str0ng_Required_Password";
private final static String DATABASE_NAME2 ="testdatabase2";
private final static String DATABASE_USER2 ="sa";
private final static String DATABASE_PWD2 ="A_Str0ng_Required_Password";
private final static String DRIVERNAME = "com.microsoft.sqlserver.jdbc.SQLServerDriver";
static MSSQLServerContainer<?> mssqlContainer = new MSSQLServerContainer<>("mcr.microsoft.com/mssql/server:2019-latest")
.acceptLicense()
.withInitScript("schema.sql")
.waitingFor(Wait.forLogMessage(".*SQL Server is now ready for client connections.*\\n", 1)) // Wait for DB readiness
.withCreateContainerCmdModifier(cmd -> cmd.withName("testmssqlcontainer"));
@DynamicPropertySource
static void dynamicProperties(DynamicPropertyRegistry registry) {
mssqlContainer.start();
String jdbcUrlWithDb = mssqlContainer.getJdbcUrl() + ";databaseName=" + DATABASE_NAME1;
String jdbcUrlWithDb2 = mssqlContainer.getJdbcUrl() + ";databaseName=" + DATABASE_NAME2;
// Point both logical datasources at the container, overriding the static
// values from application.yaml.
registry.add("spring.datasource.testdb1.url", () -> jdbcUrlWithDb);
registry.add("spring.datasource.testdb1.username", () -> DATABASE_USER1);
registry.add("spring.datasource.testdb1.password", () -> DATABASE_PWD1);
registry.add("spring.datasource.testdb2.url", () -> jdbcUrlWithDb2);
registry.add("spring.datasource.testdb2.username", () -> DATABASE_USER2);
registry.add("spring.datasource.testdb2.password", () -> DATABASE_PWD2);
}
@AfterAll
void stopContainer() {
if (mssqlContainer != null) {
mssqlContainer.stop();
}
}
@Test
public void testMultipleDataSourceApi() {
String result1 = multipleDataSourceController.getName1();
String result2 = multipleDataSourceController.getName2();
Assertions.assertEquals("Walter", result1);
Assertions.assertEquals("Matthau", result2);
}
@TestConfiguration
static class TestDatabaseConfig {
@Bean(name = "testDataSource1")
@Primary
public DataSource testDataSource() {
HikariDataSource dataSource = new HikariDataSource();
dataSource.setJdbcUrl(mssqlContainer.getJdbcUrl() + ";databaseName=" + DATABASE_NAME1);
dataSource.setUsername(DATABASE_USER1);
dataSource.setPassword(DATABASE_PWD1);
dataSource.setDriverClassName(DRIVERNAME);
return dataSource;
}
}
@TestConfiguration
@AutoConfigureAfter(DataSourceAutoConfiguration.class)
static class TestAccessDatabaseConfig {
@Bean(name = "testDataSource2")
public DataSource testDataSource() {
HikariDataSource dataSource = new HikariDataSource();
dataSource.setJdbcUrl(mssqlContainer.getJdbcUrl() + ";databaseName=" + DATABASE_NAME2);
dataSource.setUsername(DATABASE_USER2);
dataSource.setPassword(DATABASE_PWD2);
dataSource.setDriverClassName(DRIVERNAME);
return dataSource;
}
}
}
The class defines an MSSQL container and injects an init script that creates our two databases. In the class we add two @TestConfiguration classes, so that both datasources are used in the test and we decide which database backs each of them. Observe that we use the @Primary annotation here as well.
We need to define the database properties under @DynamicPropertySource in order to override the datasource settings from application.yaml; otherwise the datasources would stay tied to those static values. @DynamicPropertySource lets us register properties before the Spring context is created, which is exactly what we want.
If you run on Ubuntu, you may need to enable mixed mode authentication for MSSQL, for example by starting the container like this:
docker run -e "ACCEPT_EULA=Y" -e "MSSQL_PID=Developer" -e "SA_PASSWORD=YourStrong@Passw0rd" -e "MSSQL_AGENT_ENABLED=True" -p 1433:1433 --name sqlserver -d mcr.microsoft.com/mssql/server:2019-latest
At the time of writing I realize that with an upgrade to a newer Spring Boot (3.1 or later), you may actually be able to replace @DynamicPropertySource with just an annotation, @ServiceConnection. So please look into that!
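A sketch of what that could look like on Spring Boot 3.1+ (an assumed version, not the setup used in this article). Note that @ServiceConnection populates the default spring.datasource.* properties, so with custom prefixes like spring.datasource.testdb1 you would most likely still need @DynamicPropertySource:
package dk.hjemliebe.service;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.testcontainers.service.connection.ServiceConnection;
import org.testcontainers.containers.MSSQLServerContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
@Testcontainers
@SpringBootTest
class ServiceConnectionSketchTest {
    // Spring Boot derives url/username/password from the container automatically;
    // no @DynamicPropertySource method is needed for a single default datasource.
    @Container
    @ServiceConnection
    static MSSQLServerContainer<?> mssql =
            new MSSQLServerContainer<>("mcr.microsoft.com/mssql/server:2019-latest")
                    .acceptLicense();
}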
Here is the schema.sql
CREATE DATABASE testdatabase1;
CREATE DATABASE testdatabase2;
use testdatabase1
BEGIN TRANSACTION;
CREATE TABLE dbo.TestTable1 (
id INT IDENTITY(1,1) PRIMARY KEY,
name varchar(200) not null unique
);
INSERT INTO dbo.TestTable1(name) VALUES (N'Walter');
COMMIT TRANSACTION;
use testdatabase2
BEGIN TRANSACTION;
CREATE TABLE dbo.TestTable2 (
id INT IDENTITY(1,1) PRIMARY KEY,
name varchar(200) not null unique
);
INSERT INTO dbo.TestTable2(name) VALUES (N'Matthau');
COMMIT TRANSACTION;
Create an azure-pipelines.yml script so your application can run the test on Azure DevOps
My application repo is on Azure DevOps, where my pipeline script is defined as well.
In my script I initially used windows-latest as the image, which caused some headaches:
I had to start the Docker service before my Maven build. In the Azure script, you can use a PowerShell step to do that.
steps:
- powershell: |
echo "Starting Docker service..."
Start-Service Docker
But then I got errors like
java.lang.IllegalStateException: Could not find a valid Docker environment. Please see logs and check configuration at org.testcontainers.dockerclient.DockerClientProviderStrategy.lambda$getFirstValidStrategy$7(DockerClientProviderStrategy.java:277) ~[testcontainers-1.19.5.jar:1.19.5]
at java.base/java.util.Optional.orElseThrow(Optional.java:403) ~[na:na] .....
With “windows-latest” you do not get a Docker setup that can run Linux containers out of the box, and in order to get it … never mind, I never tried to find out. I just switched to “ubuntu-latest”, since it has Docker out of the box.
I went ahead and defined the ubuntu-latest image instead in my Build artifact stage.
- stage:
displayName: "Build artifact"
jobs:
- job:
displayName: "Prepare docker and build job"
pool:
vmImage: ubuntu-latest
After that it was straightforward to use Docker:
steps:
- script: echo "Ensuring Docker is running..."
displayName: "Check Docker"
- script: docker run hello-world
displayName: "Run Test Container"
By adding that part you will see the Testcontainer spinning up, if you have logging enabled in your application.
This is the full script in the azure folder (shown in the project structure above, just under the root folder):
trigger:
- develop
parameters:
- name: environment
type: string
default: 'dev'
values:
- 'dev'
variables:
- name: environment
value: ${{ parameters.environment }}
pool:
vmImage: ubuntu-latest
stages:
- stage:
displayName: "Build artifact"
jobs:
- job:
displayName: "Prepare docker and build job"
pool:
vmImage: ubuntu-latest
variables:
- name: artifactName
value: ${{parameters.environment}}
steps:
- script: echo "Ensuring Docker is running..."
displayName: "Check Docker"
- script: docker run hello-world
displayName: "Run Test Container"
- task: Maven@4
inputs:
mavenPomFile: 'pom.xml'
mavenOptions: '-Xmx3072m'
javaHomeOption: 'JDKVersion'
jdkVersionOption: '17'
jdkArchitectureOption: 'x64'
publishJUnitResults: true
testResultsFiles: '**/surefire-reports/TEST-*.xml'
goals: 'clean install -P$(artifactName)'
# Do deploy steps here.
I am going to take for granted that you already have an environment to deploy to, that you have a deploy agent set up on a server, and that you now just need to run a test with multiple data sources. For simplicity I call the environment “dev”, like the profile in my application.yaml - that way I can use it as a parameter in the script to control the profile-specific settings I want baked into my artifact. For instance, I can use it in the mvn clean install -P<ENV> command in the script to control which environment to build for. It corresponds to the properties you have defined in your application properties.
Make sure your code is on Azure Repos Git or somewhere else Azure can reach it. To create a new pipeline, click “New pipeline”. My code lives in Azure Repos Git, so I choose my MultiDataSource project. Since I already have my pipelines file, I choose “Existing Azure Pipelines YAML file” and then pick the path where the pipeline file is stored within the project. You then get a chance to review the pipeline YAML; if everything looks OK, you can run it.
I have included a step in my pipeline to test that Docker is running, but it should not be needed, since Docker comes out of the box on Ubuntu images.
When you hit the maven step, make sure it is building the correct profile.
If the Maven step builds and you have HikariConfig debug output enabled, the jdbcUrl for the two databases should look something like:
jdbcUrl................................jdbc:sqlserver://localhost:32769;encrypt=false;databaseName=testdatabase1
2025-03-17 15:12:29.272 DEBUG 2072 --- [ main] com.zaxxer.hikari.HikariConfig : jdbcUrl................................jdbc:sqlserver://localhost:32769;encrypt=false;databaseName=testdatabase2
Now, when that works you can continue working on the script and add stages to deploy it to an environment through a service agent.
When running your tests on an Ubuntu image on Azure, and if your application builds file-system paths, make sure the paths use File.separator; otherwise you may end up fixing a lot of tests that use the Windows-style separator (“\”). Also observe that datasources you can reach from your local machine may not be reachable from Azure: a lot of debugging time can go into discovering that a machine you can access locally is unavailable from the Azure cloud. Make sure to use HikariConfig debug logging so you can see which URL is actually being used for which database. A small illustration of the portable path constructs follows below.
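(Hypothetical paths, purely to show the portable constructs:)
import java.io.File;
import java.nio.file.Path;
import java.nio.file.Paths;
public class PathExample {
    public static void main(String[] args) {
        // Breaks on Linux agents: hard-coded Windows separator.
        String windowsOnly = "logs\\MultipleDataSources\\mds.log";
        // Portable alternatives that work on both windows-latest and ubuntu-latest.
        String portable = "logs" + File.separator + "MultipleDataSources" + File.separator + "mds.log";
        Path alsoPortable = Paths.get("logs", "MultipleDataSources", "mds.log");
        System.out.println(windowsOnly + " vs " + portable + " vs " + alsoPortable);
    }
}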
On Azure, by the way, if it works it should look something like the screenshot of the successful pipeline run shown in the original post.
Recap
We started from unit tests that mocked away whole layers, hit the limits of H2 when proprietary MSSQL syntax like STUFF entered the picture, and replaced it with a Testcontainers-managed MSSQL instance hosting two databases. With @DynamicPropertySource and two @TestConfiguration classes, both datasources were wired into a real Spring Boot integration test, and with an ubuntu-latest image the whole thing runs on an Azure DevOps pipeline. The result is an integration test that exercises every layer against the real database dialect instead of mocking it away.