Performance Test Design – No thinking required!

When designing performance tests for servers, the most important property of the test is a meaningful, controllable, quantifiable workload.


Common, commercially available workload-generation tools use the concept of think time (the time between user actions) to regulate the flow of work into the System Under Test (SUT). This has always struck me as odd, for the simple reason that think times are properties of the client side: highly subjective and generally unknowable from the server side. In addition, as demonstrated by Brady & Gunther, the think-time construct results in an ever-decreasing workload presented to the SUT in Internet-facing applications.
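A back-of-the-envelope calculation shows why a think-time (closed-loop) generator backs off as the server degrades. With N virtual users, each cycling through a response time R and a think time Z, the population can offer at most X = N / (R + Z) requests per second, so the offered load falls exactly when the SUT slows down. A minimal sketch (the numbers here are illustrative, not from the article):

```python
def closed_loop_throughput(n_users, response_time_s, think_time_s):
    """Offered request rate of a closed-loop (think-time) generator.

    Each of the N virtual users completes one cycle every R + Z
    seconds, so the population offers X = N / (R + Z) requests/s.
    """
    return n_users / (response_time_s + think_time_s)

# As the SUT slows down (R grows), the generator quietly backs off:
for r in (0.1, 0.5, 2.0, 8.0):
    x = closed_loop_throughput(n_users=100, response_time_s=r, think_time_s=5.0)
    print(f"R = {r:4.1f} s  ->  offered load = {x:5.1f} req/s")
```

Note the feedback loop: the heavier the server's response time, the lighter the workload the tool presents, which is the ever-decreasing behavior described above.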


On the other hand, the arrival rate of requests to the SUT (or to production servers) is easily measured at the server via log mining. Arrivals can also be broken down by request type, giving a complete picture of the workload. Moreover, arrival rate (λ) is a main component of Little's Law (Q = λR), the guidepost for all systems performance analysis. Indeed, arrival rate is the only component you can directly control in a test environment.
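To make the log-mining idea concrete, here is a small worked example, with invented timestamps and an assumed mean response time: count the requests in a log window to get λ, then apply Little's Law to estimate Q, the average number of requests in the system.

```python
# Hypothetical access-log arrival timestamps (seconds within a window).
# Arrival rate is measured directly at the server, then Little's Law
# (Q = lambda * R) gives the average number of requests in the system.
arrivals = [0.1, 0.4, 0.9, 1.2, 1.8, 2.3, 2.9, 3.4, 3.8, 4.6]
window_s = 5.0

lam = len(arrivals) / window_s      # arrival rate: 10 requests / 5 s = 2.0 req/s
mean_response_s = 0.25              # measured R (assumed for this example)
q = lam * mean_response_s           # Little's Law: Q = 2.0 * 0.25 = 0.5

print(f"lambda = {lam} req/s, Q = {q} requests in flight on average")
```

The same arithmetic works per request type: bucket the log lines by URL or transaction name and you get a λ, and hence a Q, for each component of the workload.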


Commercially available tools make manipulating the arrival rate somewhat complicated, usually by varying the number of users and/or the think (wait) time. In your future performance test designs, try to embrace the arrival rate and Little's Law. You can either use the more complicated method demanded by commercial products, or you can do what my partners and I did years ago: design and implement your own workload generator to allow direct control of the arrival rate. Either way, your tests will be more controllable, more directly quantifiable, and more meaningful.
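The article does not show its generator, so as an illustration only, here is a minimal open-loop sketch of the idea: fire requests at a directly controlled rate, independent of how fast the SUT answers. Drawing inter-arrival gaps from an exponential distribution yields a Poisson arrival process with the chosen mean rate; `fire_request` is a hypothetical placeholder for the real client call.

```python
import random
import threading
import time

def fire_request():
    # Placeholder for the real client call; a real generator would
    # issue one request to the SUT here (e.g. an HTTP GET).
    pass

def open_loop_generator(rate_per_s, duration_s, seed=None):
    """Open-loop workload generator with direct arrival-rate control.

    Each request runs on its own thread, so a slow server cannot
    throttle the generator the way it throttles a think-time loop.
    Exponential inter-arrival gaps give Poisson arrivals with mean
    rate `rate_per_s`. Returns the number of requests fired.
    """
    rng = random.Random(seed)
    deadline = time.monotonic() + duration_s
    fired = 0
    while time.monotonic() < deadline:
        time.sleep(rng.expovariate(rate_per_s))   # inter-arrival gap
        threading.Thread(target=fire_request, daemon=True).start()
        fired += 1
    return fired
```

Usage is then one line per workload class, e.g. `open_loop_generator(rate_per_s=50, duration_s=60)`, and the rate you set is the λ you plug straight into Little's Law when analyzing the results.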

— Mark Monaghan