How we set up performance testing

June 18, 2020
Setting up performance tests for a complex web application can be tricky, and you sometimes run into obstacles you couldn't have predicted. Here we sum up our experience from performance testing a web application for one of our clients.

As part of one of our projects, we needed to introduce performance/load testing. This kind of testing can be initiated in many different ways, depending mostly on customer requirements and goals. The goal we set in our project was to perform API calls against a back-end server (HTTP requests) in a given scenario and measure response times. The scenario of HTTP requests was discussed with the customer beforehand, and it covered many different real user behaviors in the web application.

While developing these performance tests, we tried three different solutions for getting, storing, and visualizing reliable results to the customer's satisfaction. I will describe the three approaches below:

Solution 1

Testing tools: JMeter, Taurus

The first solution we came up with used the user-friendliness of JMeter to create the request scenarios, which we then converted into a Taurus test (JMX file into a YAML file) for better integration into our pipeline.

Afterwards, the tests needed to be modified a little by hand, because the integrated converter is not always precise and sometimes does not include all test elements. And last but not least, we had to set up the Taurus execution elements (for example, the number of users, run time, etc.).
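A Taurus execution block of the kind we had to add by hand looks roughly like this (the concurrency and timing values are illustrative, not the ones from our project):

```yaml
execution:
- concurrency: 50     # number of simulated users
  ramp-up: 1m         # time to reach full concurrency
  hold-for: 10m       # how long to keep the full load running
  scenario: api-calls

scenarios:
  api-calls:
    # the YAML scenario produced by converting the JMX file
    script: converted-test.yml
```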

In our first solution, we chose Taurus as the execution software instead of sticking with JMeter all the way, because Taurus was much easier to integrate on the server via a Docker image. This solution, however, was not perfect: we needed to store the test result data for long periods of time (to create reports and analyze the data later if needed), and while BlazeMeter shows very nice visualizations of the test results, its data retention in the free version is only 7 days. Also, the user cannot customize which data will be visualized.
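Running Taurus on the server then comes down to mounting the test configuration into the public blazemeter/taurus image. A sketch of the wrapper script we might keep in the repository (file names and paths are illustrative):

```shell
# Write a small helper script that runs the Taurus test inside Docker.
# /bzt-configs is where the official image looks for configuration files.
cat > run-taurus.sh <<'EOF'
#!/bin/sh
docker run --rm \
  -v "$(pwd):/bzt-configs" \
  -v "$(pwd)/artifacts:/tmp/artifacts" \
  blazemeter/taurus load-test.yml
EOF
chmod +x run-taurus.sh
cat run-taurus.sh
```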

Solution 2

Testing tools:
Data storage: InfluxDB
Results: Grafana

The testing tool we chose is modern JavaScript-based software for creating and executing performance tests. It offers very well-written documentation and easy integration (via a Docker image), and because our standard UI tests were written in Cypress (also JavaScript/TypeScript), test preparation was really easy.

The local InfluxDB database gave us a proper way to store test result data on our own servers without retention limits. This raw data can later be used in any way we choose, or analyzed and filtered, without needing modification by other tools.
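Starting such a local instance is a one-liner with the public influxdb 1.x image; a sketch with the data directory mounted on the host so stored results survive container restarts (container name, port, and host path are illustrative):

```shell
# Helper script that starts a local InfluxDB 1.x container and keeps
# its data directory on the host filesystem:
cat > start-influxdb.sh <<'EOF'
#!/bin/sh
docker run -d --name influxdb \
  -p 8086:8086 \
  -v "$PWD/influxdb-data:/var/lib/influxdb" \
  influxdb:1.8
EOF
chmod +x start-influxdb.sh
cat start-influxdb.sh
```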

For data visualization, we used Grafana, as its integration with InfluxDB is prepared perfectly and we already had some experience with it. Test result data was sent to the InfluxDB database in real time, so we could see results in Grafana nearly in real time too, as it works with a minimal automatic refresh rate of 5 seconds.
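Grafana can pick the InfluxDB datasource up automatically through its provisioning mechanism; a minimal sketch of such a provisioning file (the URL and database name are illustrative):

```yaml
# e.g. /etc/grafana/provisioning/datasources/influxdb.yml
apiVersion: 1
datasources:
  - name: InfluxDB
    type: influxdb
    access: proxy
    url: http://influxdb:8086   # the local InfluxDB instance
    database: loadtests          # illustrative database name
```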

A Grafana dashboard for the tests can be found in the Grafana store. Of course, we had to make some small modifications to the dashboard to show the desired visualizations. In our case, the testing tool showed some minor problems: we suspected that some of the test results were slightly modified by the tool, and we did not receive enough information about each sent HTTP request (for example, error details). So we moved on to solution number 3…

Solution 3 (Final)

Testing tools: JMeter
Data storage: InfluxDB

As our third and final solution, we kept the InfluxDB and Grafana integration, but for the test execution software we chose JMeter.

We eventually found a way to integrate JMeter as a Docker image on our servers with the right plugins, and we found JMeter very reliable to use.
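A minimal sketch of how such an image can be built, assuming the plugin jars are kept next to the Dockerfile (the JMeter version, base image, and paths are illustrative, not the exact ones from our setup):

```dockerfile
FROM openjdk:8-jre-slim
ENV JMETER_VERSION=5.4.1
# Official JMeter binaries from the Apache archive
ADD https://archive.apache.org/dist/jmeter/binaries/apache-jmeter-${JMETER_VERSION}.tgz /opt/
RUN tar -xzf /opt/apache-jmeter-${JMETER_VERSION}.tgz -C /opt \
 && rm /opt/apache-jmeter-${JMETER_VERSION}.tgz
# Plugin jars (e.g. downloaded from jmeter-plugins.org) go into lib/ext
COPY plugins/*.jar /opt/apache-jmeter-${JMETER_VERSION}/lib/ext/
ENV PATH="/opt/apache-jmeter-${JMETER_VERSION}/bin:${PATH}"
ENTRYPOINT ["jmeter"]
```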

We only needed to solve one minor problem with its default memory allocation, which is only 256 MB – too little for our performance/load testing.

This small memory allocation caused interruptions in the test executions. Fortunately, for the Docker image usage of JMeter, we needed just a small modification: we increased the allocation to a minimum of 2 GB and a maximum of 4 GB, which quickly solved all our problems.
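JMeter's startup script appends the JVM_ARGS environment variable to the Java options, so the heap can be raised without editing the script itself; the 2 GB / 4 GB values below are the ones from our setup, while the image name is illustrative:

```shell
# Raise JMeter's heap via the JVM_ARGS environment variable:
export JVM_ARGS="-Xms2g -Xmx4g"
echo "JMeter will run with: $JVM_ARGS"
# With a Docker image this is typically passed at run time, e.g.:
# docker run --rm -e JVM_ARGS="-Xms2g -Xmx4g" our-jmeter-image test.jmx
```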

The Grafana store includes dashboards for JMeter tests too, and one of them was specially prepared for the JMeter/InfluxDB integration. It fully satisfied our usage, as it was set up well for the measurements we needed to see in the visualization.
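With the raw results in InfluxDB, the same data the dashboard shows can also be queried directly; a sketch (the database name is illustrative, while the "jmeter" measurement and "transaction" tag are what JMeter's InfluxDB backend listener writes):

```shell
# Save an InfluxQL query that averages response times per transaction
# over the last hour of a test run:
cat > response-times.iql <<'EOF'
SELECT MEAN("avg") FROM "jmeter"
WHERE time > now() - 1h
GROUP BY time(1m), "transaction"
EOF
# Executed against the database with the influx CLI, e.g.:
# influx -database 'loadtests' -execute "$(cat response-times.iql)"
cat response-times.iql
```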

In summary, we found the JMeter/InfluxDB/Grafana stack to be the most reliable and customizable of the three solutions. It gave us the best way to execute tests, store raw results in our local environment (our servers), and visualize the results on a timeline with graphs and with lists where values can be marked with threshold colors.

These visualizations of the test results helped us reveal some problematic areas of the tested application, and we hope they can help other developers find the source of their apps' performance problems.

Martin is a skillful tester, quick-learning DevOps engineer, and reliable teammate. He has deep knowledge of HTML and CSS, JavaScript, Cypress, and Robot Framework. He is experienced in automated testing and recently became our brave performance testing guru. He is also a great rock'n'roll dancer and energy drink lover.
