Developing in Confidence Instead of Hope

At the end of this week, I'm leaving my current role, so it feels like a good time to put pen to paper and describe some development I've recently played a part in. The main areas I want to give an overview of are the use of WireMock, Test-Driven Development and the importance of refactoring.

"Confidence over hope."

The above is a simple idea, but one that has resonated with me ever since a colleague put it to me. When moving to production, do you want to be confident of success, or just hoping it will succeed with crossed fingers and a touch of luck? Personally, I was crossing my fingers on one too many occasions.

"Do the entity classes match the expected schema? Will our application handle the data returned by the API correctly?" I hope that feature we rushed to put in actually works!

Having been tasked with developing a new batch processing service to interact with an external API, it was time to remove the reliance on hope and develop in confidence. With this in mind, the team and I quickly found solace in WireMock and its ability to return canned HTTP responses. As WireMock articulates better than I can:

“WireMock is a simulator for HTTP-based APIs. Some might consider it a service virtualization tool or a mock server.

It enables you to stay productive when an API you depend on doesn't exist or isn't complete. It supports testing of edge cases and failure modes that the real API won't reliably produce. And because it's fast it can reduce your build time from hours down to minutes.”

WireMock comes both as a standalone version and as a JUnit Rule; it was the latter we intended to use to full effect.
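For intuition, the "canned HTTP responses" idea can be sketched in a few lines with the JDK's built-in `HttpServer`. This is a toy stand-in to illustrate the concept, not WireMock's API — WireMock layers request matching, verification and much more on top of this basic idea:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Toy illustration of what a stub server does: return a canned body
// for a known path. WireMock provides this (and much more) out of the box.
public class CannedResponseServer {

    static HttpServer start(int port, String path, String cannedBody) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext(path, exchange -> {
            byte[] body = cannedBody.getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }
}
```

A test pointed at such a server exercises the application's real HTTP client code without the real API ever being available.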

Could we not write smaller unit tests, attain 80-100% code coverage and be totally confident when deploying to production? Why use WireMock? What were the potential benefits?

  • WireMock allowed development to begin, and continue, at a faster pace by removing integration concerns. The test API that was to be made available to us didn't exist yet, but we were able to start developing almost immediately despite the delay.
  • It ensured we were able to develop safely with valuable tests. Using the mock API, we were able to write larger tests that exercised the entire runtime behaviour of the batch processing service. Of course we could have written a number of smaller method-level tests, but these wouldn't have verified the integration with the API. By writing those larger tests, we in essence got more bang for our buck.
  • Ease and speed of setup, as described briefly below.

The process of configuring WireMock responses is well documented, so I won't go into the nitty-gritty details here. A basic overview is as follows:

Place a JSON file containing the required response in the __files directory under the test resources (this is where WireMock's withBodyFile looks by default).

{
	"errCode": 0,
	"reports": [{
		"id": "51724",
		"description": "Expected description"
	}]
}

When configuring the expected response, reference the file.

stubFor(get(urlEqualTo("/reports"))
        .willReturn(aResponse()
                .withBodyFile("response_file.json")));

Assert on the values expected as a result of that response:

private static final String EXPECTED_REPORT_ID = "51724";
private static final String EXPECTED_DESCRIPTION = "Expected description";

//Then
assertEquals(EXPECTED_REPORT_ID, report.getId());
assertEquals(EXPECTED_DESCRIPTION, report.getDescription());

This configuration served us well. When adding a new feature, we would follow the same process of creating a response file, stubbing the response and verifying against hard-coded expected values. We were able to iterate quickly through development tasks and changing requirements, which led to a production-ready service. Our build history looked great!

At the time, I considered this process to truly be TDD: write the failing test, write the code so that it passes, repeat. That was until recently, when I was confronted with a nagging doubt in our ability to add new features safely.

....

Fast forward 3 months and 3 iterations of development and subsequent releases. Using the JSON file configuration, our service had accumulated a collection of 12 response files and 30+ tests.

Each test relied on insight into the information contained in the relevant response files. As a result, we had dozens of hard-coded constants throughout the test classes, as well as a large test constants class.

What had initially enabled us was actually beginning to slow us down. Where we had once been able to configure the REST API context for a (failing!) test with minimal effort, we were instead spending more and more time configuring the tests, which in turn hindered our ability to add new features. What was the source of this frustration?

  • The amount of "clutter" made traversing through our test classes tiresome, frustrating and ultimately caused delays when writing tests. On review, one test class was 1500+ lines long.
  • When using the JSON file configuration, the same response files were shared by multiple tests, which meant multiple tests relied on the same hard-coded values. This left the tests coupled and fragile: a change made for one test could break several others.
  • The inability to scale the amount of data returned by WireMock. Say we were to add a JSON response file which contained a large set of data. It would have added large amounts of hard-coded values to an already crowded test class. On top of this, the effort to manually generate the data would have proved time consuming.

On review, the issue, its cause and the solution became obvious. Configuration of tests was causing delays in feature delivery. We had become over-reliant on those JSON response files which, once upon a time, were the key enabler to our success. The root cause? We had in part overlooked the vital refactoring stage of TDD. Yes, we were refactoring our application code, but we had forgotten to do the same with our test classes.

We knew what we had to do. It was time to roll up our sleeves and refactor the way we stubbed WireMock responses. When doing this we kept 3 factors in mind:

  • Reduce the number of hard-coded values.
  • Tests should be solely responsible for their context.
  • Ensure the amount of data returned can easily be scaled to the requirement.

Our solution is as follows; we feel it could be of use to others faced with a similar scenario.

The first step was to create a factory class to generate the data we require.

@Component
public class ReportResponseFactory {

    @Autowired
    TestReportFactory reportFactory;

    public ReportResponse generateResponse(ReportType... reportTypes) {

        ReportResponse reportResponse = new ReportResponse();
        reportResponse.setErrCode(0);

        List<Report> reports = new ArrayList<>();
        for (ReportType reportType : reportTypes) {
            reports.add(reportFactory.generateReport(reportType));
        }

        reportResponse.setReports(reports);
        return reportResponse;
    }
}
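The factory above delegates per-report generation to a TestReportFactory, which isn't shown in the article. Here's a minimal sketch of what such a helper might look like — the ReportType values and Report fields are my assumptions, based on the enum constants and JSON fields that appear elsewhere in the article:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical report categories, matching the constants used in the tests above.
enum ReportType { UNPROCESSED, FAULTY }

// Minimal Report POJO matching the JSON fields shown earlier (id, description).
class Report {
    private final String id;
    private final String description;
    private final ReportType type;

    Report(String id, String description, ReportType type) {
        this.id = id;
        this.description = description;
        this.type = type;
    }

    public String getId() { return id; }
    public String getDescription() { return description; }
    public ReportType getType() { return type; }
}

// Sketch of the TestReportFactory referenced above: generates one Report
// per requested type, with unique ids so generated test data never collides.
class TestReportFactory {
    private final AtomicInteger nextId = new AtomicInteger(1);

    public Report generateReport(ReportType reportType) {
        String id = String.valueOf(nextId.getAndIncrement());
        return new Report(id, "Generated " + reportType + " report", reportType);
    }
}
```

Because ids are generated rather than copied from a shared JSON file, each test owns its data outright.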

We then configured an ObjectMapper which we would use to transform the object to JSON.

@Bean
public ObjectMapper objectMapper(){
    ObjectMapper objectMapper = new ObjectMapper();
    objectMapper.registerModule(new JodaModule());
    // WRITE_DATES_AS_TIMESTAMPS statically imported from SerializationFeature
    objectMapper.configure(WRITE_DATES_AS_TIMESTAMPS, false);
    return objectMapper;
}

Both components are autowired into the test class. As can be seen below, in the given section we define the response we require for testing. This response is then written as JSON and handed to WireMock as part of its stub.

@Autowired
private ObjectMapper objectMapper;

@Autowired
private ReportResponseFactory reportResponseFactory;

@Rule
public WireMockRule wireMockRule = new WireMockRule();

@Value("${request.url}")
private String requestUrl;
@Test
public void verifyNumberOfReportsProcessed(){


//Given
ReportResponse todaysResponse = reportResponseFactory.generateResponse
                  (UNPROCESSED, UNPROCESSED, UNPROCESSED, FAULTY, FAULTY);
                  
stubFor(post(urlPathEqualTo(requestUrl))
	.willReturn(aResponse()
	.withBody(objectMapper.writeValueAsString(todaysResponse))
	.withHeader(CONTENT_TYPE, MediaType.APPLICATION_JSON_VALUE)
	.withStatus(OK_RESPONSE_CODE)));
//When
testJobLauncher.launchJob();
//Then
int numberOfReports = reportService.selectNumberOfReports();
assertEquals(todaysResponse.getReports().size(), numberOfReports)

}

When asserting, we then use the response object to define the expected result.
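Concretely, because the test owns the generated response object, expected values can be derived from the data itself rather than kept in step with a JSON file by hand. A small sketch of the pattern, using minimal stand-ins for the Report and ReportType shapes (my own simplified versions, not the project's real classes):

```java
import java.util.List;

// Minimal stand-ins so the example compiles on its own.
enum ReportType { UNPROCESSED, FAULTY }

record Report(String id, ReportType type) { }

public class ExpectedFromResponse {

    // Derive the expected count from the response the test itself generated,
    // instead of maintaining a hard-coded constant alongside a response file.
    static long countOfType(List<Report> reports, ReportType type) {
        return reports.stream().filter(r -> r.type() == type).count();
    }

    public static void main(String[] args) {
        List<Report> reports = List.of(
                new Report("1", ReportType.UNPROCESSED),
                new Report("2", ReportType.UNPROCESSED),
                new Report("3", ReportType.FAULTY));

        // The assertion target comes from the data, not from a constants class.
        long expectedUnprocessed = countOfType(reports, ReportType.UNPROCESSED);
        System.out.println(expectedUnprocessed); // prints 2
    }
}
```

If the generated data changes, the expectation changes with it — which is exactly what removes the coupling between tests and shared response files.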

While the above configuration doesn't break any new ground, we feel it's a solution that has already proved fruitful and will continue to do so for some time. We've seen the following results:

  • Reduced lines of code by up to 40% in some test classes. The hard-coded constants class mentioned previously is soon to be removed completely.
  • The addition of tests that verify the service's ability to handle large sets of data.
  • Tests that are totally independent and solely responsible for their context.
  • Reduced cycle time when adding new features.

So what did the team and I learn over the previous 3 months? To rephrase: what do I hope I've conveyed within this article?

The power of WireMock.

A key enabler for our early and continued success. When integrating with an external API, WireMock is a potential option which allows code to be written in confidence.

The refactored solution as an option when using WireMock.

When using WireMock, consider a configuration like the one we refactored to. It makes use of the factory pattern and the Jackson ObjectMapper to easily scale the amount of data required for testing, reduces cluttered code and allows tests to become largely self-sufficient.

The need to refactor test code as well as application code.

Refactoring shouldn't be confined to application code only. Test code is as important as application code and should be treated as such. Refactoring allowed us to continue developing with confidence.

I'd love to hear others' take on this article. Please feel free to comment or drop me a mail if you have an opinion on the above.
