Performance Testing Experience with OctoPerf in a Tight Time Frame
Mar 7, 2022
Last week we used the performance testing tool OctoPerf in a concrete project. It was a short project that had to be completed within two weeks, which was a new experience for me. Was it possible to do relevant performance testing in such a tight time frame? Yes, it was! We managed to make important recommendations and succeeded in raising performance awareness.
On paper, the project looked straightforward: the client had just upgraded the site to a new version of Drupal, while continuing to use the same underlying infrastructure.
In practice, we were faced with the following challenges: no defined test scenarios, no idea of the number of concurrent users, and no indication of the expected response times. The client only had an estimate of the number of visitors per month.
It often happens in performance testing that the necessary information is difficult to obtain. In this case, we had to help the customer answer these questions. We proposed a number of scenarios and decided to take 20% of the monthly visits as the maximum simultaneous peak to simulate. In any case, discussing these kinds of details already made them think about performance.
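To illustrate this kind of sizing exercise, here is a minimal back-of-the-envelope sketch of the heuristic described above. The traffic figure is invented for illustration; it is not the client's actual number.

```python
# Sizing sketch: derive the peak concurrency to simulate from monthly
# visits. All numbers are hypothetical, not the client's real traffic.

monthly_visits = 10_000  # assumed rough estimate from the client

# Heuristic used in the project: take 20% of the monthly visits
# as the maximum simultaneous peak to simulate.
peak_concurrent_users = int(monthly_visits * 0.20)

print(peak_concurrent_users)  # 2000
```

A crude rule of thumb like this is obviously debatable, but as noted above, the value of the discussion was as much in making the client think about expected load as in the exact figure.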
The scripts had to be created on a staging environment and then executed on the shadow production environment. The problem was that when we started working on the scripts, the staging environment was not ready: many links were missing or configured incorrectly. Meanwhile, the client had a hard deadline, as they wanted to go into production in two weeks. We therefore documented which actions worked and which caused problems.
We then had to wait to validate our scripts on the shadow production environment, because the client needed time to deploy everything. The test execution was postponed and we had to be flexible. By Thursday, however, everything was ready to run both the validation tests and the actual load tests.
Besides an initial validation test to check the technical correctness of our scripts, four load tests were planned, each time with an increasing concurrent load. During the first test, we already noticed that the response times were increasing and that some requests were not answered. With the help of their monitoring tool, we were able to determine that the network was saturated.
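A test series with increasing concurrent load like the one described above can be sketched as a simple stepped plan. The step count and the helper function below are illustrative, not taken from the actual project setup.

```python
def stepped_load_plan(peak_users: int, steps: int = 4) -> list[int]:
    """Return increasing concurrency targets for a series of load tests,
    ending at the intended peak (e.g. four tests at 25/50/75/100% of peak).
    """
    return [round(peak_users * (i + 1) / steps) for i in range(steps)]

# Four load tests ramping up to a hypothetical peak of 2000 users:
print(stepped_load_plan(2000))  # [500, 1000, 1500, 2000]
```

Stepping up the load this way is useful precisely because of what happened here: problems (rising response times, dropped requests) surface at the lowest step that triggers them, instead of only at full peak.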
This raised the following questions: Was the load realistic? And how many users were actually using the application at the same time? The simulated load was likely (much) higher than actual usage, as the customer had no idea of the real load on the current production site. Still, the fact that the network had been saturated made them think about performance and about how many simultaneous users they should plan for.
After some discussion, and in agreement with the customer, we decided to adjust the approach slightly. Instead of focusing on response times, we set out to determine the point at which the network would become saturated, taking into account the normal load of a working day. We had to be flexible here and even worked on Sunday, given their deadline of the following Tuesday.
After a few iterations, we were able to pinpoint the load threshold at which the new site started to show issues. We completed the project on Sunday evening by delivering a final report with an overview of all the tests, the results, and our recommendations for improving performance.
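One pragmatic way to narrow down such a saturation threshold in only a few iterations is to bisect between a load level known to be healthy and one known to saturate the network. The sketch below assumes a `passes_at` callback standing in for an actual load-test run; none of these names or numbers come from the project itself.

```python
def find_saturation_threshold(low: int, high: int, passes_at,
                              tolerance: int = 50) -> int:
    """Bisect between a load level known to pass (`low`) and one known
    to saturate (`high`). Each probe costs one full load-test run, so
    the search stops once the bracket is within `tolerance` users.
    """
    while high - low > tolerance:
        mid = (low + high) // 2
        if passes_at(mid):
            low = mid   # still healthy at this load
        else:
            high = mid  # already saturated at this load
    return low          # highest load level observed to pass

# Fake pass/fail check standing in for a real test run: pretend the
# network saturates somewhere around 900 concurrent users.
print(find_saturation_threshold(100, 2000, lambda users: users < 900))
```

Because each probe is a full load test, the bisection keeps the number of expensive runs small, which matters when, as here, the test window is a single weekend.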
Working on this project was an interesting experience for me. The client was happy with our services, and the project created the necessary performance awareness on their side. Returning to the initial question: can you do relevant performance testing in a very tight time frame? Of course you can, as long as you are pragmatic and set the right expectations.
Want to know more about CTG’s performance services? Check them out here.