Wednesday, February 12, 2014

Performance Tests made easy


Why performance tests?

When you work on a project with a web interface (a web service, web page, or hosting server), you will eventually reach a point where you need to start doing performance tests. There are several reasons for this:
  • to know how much traffic you can serve without problems
  • to create a realistic Service Level Agreement
  • to check how well you scale when adding new servers
  • to make a baseline for performance improvements
Knowing how much traffic your product can handle is crucial when deciding whether to invest time in new functionality to attract more users, or in scalability and performance changes to handle the users you already have.
It is always important to have performance tests in place before you attempt any performance-related changes. Without a baseline from before the changes, you have no way of knowing whether your work changed anything. It is not rare for developers to spend weeks on something that in the end turns out to be a micro-optimization (or even makes the product run slower), because they didn't take something into consideration. To avoid that scenario, you need to check whether your optimization idea actually works. To do that, you need the baseline and a rerun of the performance tests to compare against.
On the other hand, these tests cannot take forever to create. One needs to find a way to create them quickly, so they will not delay product development. If it took days, some teams might sacrifice such tests (accepting the risks) and ignore the problem until it is too late. If, however, it took up to an hour, any sensible development team would do it and stay on the safe side.

Creating performance tests for a web interface

While creating performance tests, you need to ensure a few crucial properties. They must:
  • be easy to rerun - you should be able to run them one after another, at any time of day or night, without problems. This implies test automation. It is always good to ask yourself whether you could add your test script to cron on some server and only look at the results after the run.
  • be configurable - things such as the host address, number of threads, number of retries, etc. must be read from a config file, so you can easily change them and reuse the tests in different environments.
  • have results that are easy to analyse - dumping results to a file, or calculating only the simplest statistics, is rarely enough. You need to model the test output so that it explains not only what happened but also why. For example, if the mean request duration is around 100 ms, does that mean all requests take around 90-110 ms, or do most requests take around 20 ms while some take over 10 seconds, skewing the statistic?
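To make the last point concrete, here is a small sketch in plain Python, with made-up latency numbers, of how a mean can hide a heavy tail (the nearest-rank percentile helper is written just for this illustration):

```python
# Hypothetical latency sample (ms): 98 fast requests plus two very slow ones.
latencies = [20] * 98 + [4000, 4200]

mean = sum(latencies) / len(latencies)

def percentile(data, p):
    """Nearest-rank percentile: the value below which p% of observations fall."""
    s = sorted(data)
    k = max(0, min(len(s) - 1, round(p / 100 * len(s)) - 1))
    return s[k]

median = percentile(latencies, 50)
p99 = percentile(latencies, 99)

print(f"mean:   {mean:.1f} ms")  # ~101.6 ms -- looks like "around 100 ms"
print(f"median: {median} ms")    # 20 ms     -- the typical request is fast
print(f"p99:    {p99} ms")       # 4000 ms   -- the tail tells the real story
```

The mean alone suggests a service answering in about 100 ms, while in reality almost every request takes 20 ms and a few take seconds - exactly the ambiguity a good report should resolve.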
This tutorial will show how to quickly create configurable, automated tests and how to visualize their results so that they are easy to analyse. It will take less than half an hour!

Creating a test

Let's test Google! Our test will measure the performance of the http://www.google.pl/search?q={query} web interface. As we want our test to be easily configurable, we will look at it as http://{host}/search?q={query}.

First, we need to get the newest JMeter. Then we need to download the zip of jmeter-plugins (the standard set is OK) and copy the contents of its lib/ext into apache-jmeter-X.Y\lib\ext. Jmeter-plugins is a great set of JMeter extensions.

Having our JMeter configured with jmeter-plugins, we run it in graphical (default) mode with the script you can find in the apache-jmeter-X.Y\bin folder. First we will create a Thread Group for our tests:


[Screenshot: Thread Group configuration]

We are using the ${__P(name)} function to read values from a properties file, where:
  • ${__P(test.thread.max)} - the number of threads we want to use
  • ${__P(test.thread.rampUp)} - how long (in seconds) to take to spawn all the threads
  • ${__P(test.baseCount)} - the number of test iterations each thread will perform
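As a side note, the ${__P()} function also accepts a default value as a second argument, which JMeter falls back on when the property is not defined - handy for ad-hoc runs without a properties file. A sketch (the default values here are just examples):

```
# In any JMeter field: use 4 threads unless test.thread.max is set
${__P(test.thread.max,4)}

# Properties can also be overridden per run from the command line:
# jmeter -n -t google.jmx -Jtest.thread.max=16
```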
Then we will add a sampler for our threads - an HTTP Request:

[Screenshot: HTTP Request sampler configuration]
where we use two additional configurable properties:
  • ${__P(google.host)} - to specify tested host
  • ${__P(google.query)} - to specify test query
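For reference, the sampler fields would be filled in roughly like this (a sketch; field names as they appear in JMeter's HTTP Request sampler):

```
Server Name or IP:  ${__P(google.host)}
Path:               /search
Parameters:         q = ${__P(google.query)}
```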
Let's save our test plan as google.jmx. Then let's create google.properties:
test.thread.max=4
test.thread.rampUp=1
test.baseCount=10
google.host=google.pl
google.query=loadosophia
Now we can run our test! Let's do it from the console (after all, that is how cron would run it each night):
java -jar ApacheJMeter.jar -n -t /path/to/google.jmx -q /path/to/google.properties
Created the tree successfully using google.jmx
Starting the test @ Fri Oct 04 11:54:10 CEST 2013 (1380880450276)
Waiting for possible shutdown message on port 4445
Tidying up ...    @ Fri Oct 04 11:54:20 CEST 2013 (1380880460120)
... end of run
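The same non-GUI invocation can be scheduled with cron, as mentioned earlier. A minimal sketch of a crontab entry (the JMeter path and log location are placeholder assumptions; adjust them to your setup):

```
# Run the test plan every night at 02:00 and append the console
# output to a log file for later inspection
0 2 * * * cd /opt/apache-jmeter/bin && java -jar ApacheJMeter.jar -n -t /path/to/google.jmx -q /path/to/google.properties >> /var/log/jmeter-google.log 2>&1
```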
The tests have run! Let's see the results! If you have any problems, you can consult the files in the repo.

Analysing the results with loadosophia.org

Loadosophia is a great site where you can upload JMeter results and receive a rich graphical report that helps to analyse them. It has a Pay What You Want policy, so you may test it and pay if you feel like it. All operations there are over HTTPS and signed in with your Google account, so we may say it's decently safe to use.
We will cover Loadosophia in detail soon, but for now you only need to get your Upload Token and create a new project in your workspace named GoogleQueryTest.
Now let's open google.jmx in the JMeter graphical interface and add the Loadosophia.org Uploader:

[Screenshot: Loadosophia.org Uploader configuration]
We change the project name to the one we've just created, GoogleQueryTest, and paste in the token from Your Upload Token. I also like to specify the folder in which to save the results (relative to where the tests are run from) and to name each test after the time it was run, with ${__time(yyyy-MM-dd-HH:mm:ss)}. That helps when comparing many tests over time.
Let's save the file and open a console. Let's create the results directory and run the tests:
$ mkdir results
$ java -jar ApacheJMeter.jar -n -t /path/to/google.jmx -q /path/to/google.properties
Created the tree successfully using google.jmx
Starting the test @ Fri Oct 04 14:25:58 CEST 2013 (1380889558228)
Waiting for possible shutdown message on port 4445
Tidying up ...    @ Fri Oct 04 14:26:22 CEST 2013 (1380889582299)
... end of run
Now we can see our results (I've made the results public for you to see; normally you have to be a member of a project to see its results).


What's next

This tutorial gives you the basic knowledge needed to create performance tests in a matter of minutes. The next post will cover tips on how to read a Loadosophia report and how to use it to make your application better.


Graphic courtesy of lasvegassportsperformance.com