
Test Early Part 1: How to integrate automation and agile in performance tests

  • 02/06/2014
  • Posted by EuroSTAR

Is there any way we can seamlessly run performance tests in an agile environment? It only provides a benefit if the tests are automated, fast and valuable. This article is an attempt to give an insight into the areas where performance tests can be useful in agile, while at the same time recognizing the limitations: a thorough performance test still requires a human brain to analyse the results and reach meaningful conclusions.

For an organization, time to market is a top priority, so everyone is working to produce high-quality software in less time. To achieve this goal, many have adopted an agile framework. An agile environment puts various teams to work and develops the product in a manner that is iterative and incremental and promotes self-organization. In theory, performance testing, which usually takes place at the end of the lifecycle in a waterfall model, moves to the beginning in agile. However, many products neglect to include performance testing in agile cycles because it is expensive and time consuming. It also serves little purpose in an early phase, when you are testing the product from the perspective of whether it can succeed in the market and whether people will buy into your idea. This typically happens in a startup environment, where performance testing only becomes crucial once the big idea has been verified through a beta version that is available to the public and the product suddenly becomes popular.

In my experience, there is an opportunity to utilize the available open source tools and reap the benefits in agile performance testing. It is not end to end, as you will still need to analyse the results to reach meaningful conclusions, but frequent repetition can be avoided and, best of all, it is automated.

[Figure 1: Performance test with continuous integration workflow]

Referring to Figure 1, this is how we are doing it. In our environment we are using the following tools:

Jenkins (with the Performance plugin and TAP plugin). Jenkins is a continuous integration tool that we are using for running and monitoring performance tests continuously.

JMeter. JMeter is a load testing tool for measuring performance in terms of response time; in our case the focus is the web application.

YSlow for PhantomJS. YSlow is heavily used in our environment for front-end testing. It suggests ways to improve a web application based on a set of high-performance website rules.
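For concreteness, this is roughly how YSlow for PhantomJS slots into a Jenkins build step. This is only a sketch: the URL and output file name are placeholders, and the TAP output is what the Jenkins TAP plugin consumes.

```
# Sketch: grade a page with YSlow for PhantomJS and emit TAP output for the
# Jenkins TAP plugin (URL and file name are placeholders, not our setup).
phantomjs yslow.js --info grade --format tap --threshold B \
    http://staging.example.com/ > yslow.tap
```

With a threshold set, any rule graded below a B shows up as a failing TAP test, which is what lets Jenkins flag the build.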

1. In step 1, the code and automated test scripts are committed by the tester and developer. We are using Subversion for this purpose. For the tester specifically, JMeter scripts are stored and maintained here.

2. In step 2, Subversion triggers Jenkins whenever changes happen, either in the JMeter scripts or, more importantly, in the web application source code (a sketch of this trigger and the resulting build step follows the list). Depending on the stability of your web application, you may want to schedule the runs instead of running every time there is a code change.

3. In step 3, Jenkins notifies and triggers the attached node/slave.

4. In step 4, Jenkins's node/slave proceeds with an automated deployment and then the performance tests.

5. In step 5, the verification results are sent by email to whoever needs to be notified.
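To make steps 2 and 4 concrete, here is a minimal sketch. The Jenkins job name, build token and file paths are assumptions for illustration, not our actual configuration.

```
# Step 2 (sketch): Subversion post-commit hook that pings Jenkins through its
# remote build trigger URL (job name and token are hypothetical).
curl -s "http://jenkins.example.com/job/perf-tests/build?token=BUILD_TOKEN"

# Step 4 (sketch): shell build step on the Jenkins node/slave, run after the
# automated deployment. JMeter runs non-GUI (-n), loads the test plan (-t)
# and writes a JTL results file (-l) for the Jenkins Performance plugin.
jmeter -n -t scripts/webapp_load.jmx -l results/webapp_load.jtl
```

The Performance plugin then picks up the JTL file to draw the trend charts discussed below.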

Most of the time, we will check the result from YSlow first to verify how the front end is doing, and that can be achieved by reading the report as in Figure 2. In a nutshell, your web application page is benchmarked against the best practices stated in the Yahoo! Developer Network article Best Practices for Speeding Up Your Web Site. Hence, whenever there is a breach of best practices, everybody is notified by email and the fix can be made quickly.

[Figure 2: YSlow report in Jenkins]

The strength of continuous performance testing is the ability to run automated load tests too. The simulation of a given number of concurrent users through JMeter can be run either on a schedule or whenever there is a change in the code. In Jenkins, you are able to set the threshold at which the performance test fails. For example, when the response time exceeds 5 seconds, you may want everybody to look into the problem and rectify it, so marking the performance test job as failed is probably what you want to do. As per Figure 3, with the JMeter report in Jenkins you are always able to see an overview of performance over time and spot any unusually high variations.

[Figure 3: JMeter performance trend in Jenkins]

To drill down further, you are also able to display a breakdown of response times and errors per HTTP request. This is very useful: whenever a new web service or even a minor new image is added, you always want to know whether it will have an adverse impact on overall performance. In Figure 4, as an example, one of the items in red contributes an unusually high error rate, and it appears to be due to HTTP response code 404. That means that while the HTTP link is written in the code, the particular resource does not exist at that path.

[Figure 4: breakdown of response times and errors per HTTP request]
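If you want a scriptable version of these checks alongside the Jenkins reports, the JTL results file can also be inspected directly. This is a rough sketch, assuming CSV-style JTL output where the elapsed time, label and response code are fields 2, 3 and 4; the 5000 ms limit mirrors the 5-second threshold mentioned above.

```
# Sketch: exit non-zero (failing the build) if any sample exceeded 5000 ms
# (assumes CSV JTL output; elapsed time is field 2).
awk -F',' 'NR > 1 && $2 + 0 > 5000 { slow++ } END { exit slow > 0 }' \
    results/webapp_load.jtl

# Sketch: count 404 responses per request label to spot missing resources
# (label is field 3, response code is field 4).
awk -F',' '$4 == "404" { print $3 }' results/webapp_load.jtl | sort | uniq -c
```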

When we talk about performance testing in agile, it is not a priority, nor mandatory, to run a test with thousands of concurrent users. A baseline load of normally 50 to 100 users will be good enough to get feedback whenever there is a code change or a new component is integrated. You may want to check the system's breaking point later on, once the code has been frozen. Before that, the faster the result the better, so that you can iteratively improve your web application along the way. That's agile! Thanks.
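To make that baseline concrete, one way to keep it flexible is to parameterize the thread count in the test plan. A sketch, assuming the Thread Group's number of threads is set to the JMeter property expression ${__P(threads,50)} (a hypothetical property name chosen for illustration):

```
# Per-commit baseline: falls back to the property default of 50 users.
jmeter -n -t scripts/webapp_load.jmx -l results/baseline.jtl

# Heavier run on demand, e.g. after code freeze, by overriding the property.
jmeter -n -t scripts/webapp_load.jmx -Jthreads=100 -l results/heavy.jtl
```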

P/S: I'm more than happy, and looking forward, to share the detailed configuration of Jenkins, JMeter and YSlow for PhantomJS in the next article if there's a request. We can also touch on how to integrate memory, CPU and SQL profilers, which have proven useful for root cause analysis in performance testing.

 

Biography

Fairul is a Test Engineering Manager of Product Quality and Reliability Engineering at MIMOS Berhad, a leader in pioneering new ICT market creation for partners in Malaysia through patentable technologies for economic growth. Fairul oversees performance and test automation for more than 15 projects, and has led an effort to reduce cycle time by managing Application Lifecycle Management (ALM) tools. He is passionate about better and faster delivery of quality products through designing and building intelligent test automation systems.

Find Fairul on Twitter @fairul82 or LinkedIn

 
