
Enterprises have often treated performance testing as an afterthought. That approach has cost them business opportunities and, in certain cases, damaged their brands. This blog series shows how enterprises can improve the return on the investments made in their business applications. The five fundamental spend categories in any performance testing engagement for a business application are

  • Business application (non-functional requirements),
  • Test data volumes,
  • Test infrastructure,
  • Performance test tools and
  • Test personnel skills.

We will take a look at them one by one over a series of blogs in the coming weeks.

Performance requirements need to be clearly defined early in the lifecycle of the business application. What eventually happens in most cases, however, is that business and architecture teams advise, “let us get the numbers from the tests”, “from your past experience, what do you think the ideal number is”, or “what is your recommended configuration”, placing the onus on the performance test team to make the performance requirements explicit. We have also heard performance teams insist that “business should define the performance requirements” or “architects should clearly call out the SLAs”. While it is fair for the business to expect recommendations from the performance team, and vice versa, it needs to be noted that applications vary widely in their technology stack, anticipated user arrival rates, user distribution across geographies and so on, so past experience alone cannot substitute for explicit requirements.

The key to good performance of any application is nailing down the right performance requirements. At a basic level, performance requirements should cover the following (a sketch of how these might be recorded follows the list):

  • Response time, along with SLAs
  • Throughput
  • Acceptable resource usage (CPU, RAM, I/O) for the transactions under different arrival rates.
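To make this concrete, here is a minimal sketch, in Python, of how such basic requirements could be recorded as a structured, reviewable artefact rather than living only in a document. The transaction names, thresholds and arrival rates are purely hypothetical; the real numbers must come from the business and architecture teams.

    # A minimal sketch (hypothetical names and figures) of capturing basic
    # performance requirements so they can be reviewed and versioned
    # alongside the test plan.
    from dataclasses import dataclass

    @dataclass
    class TransactionRequirement:
        name: str                    # business transaction, e.g. "checkout"
        p90_response_time_s: float   # SLA on 90th-percentile response time
        throughput_tps: float        # expected transactions per second at peak
        max_cpu_pct: float           # acceptable CPU utilisation on the app tier
        max_memory_pct: float        # acceptable memory utilisation on the app tier
        arrival_rate_per_min: int    # anticipated user arrival rate

    # Illustrative values only.
    requirements = [
        TransactionRequirement("login", 2.0, 50.0, 70.0, 75.0, 300),
        TransactionRequirement("checkout", 3.0, 20.0, 70.0, 75.0, 120),
    ]

    for req in requirements:
        print(f"{req.name}: p90 <= {req.p90_response_time_s}s at {req.throughput_tps} TPS")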

In today’s scenario, apart from traditional desktops and laptops, we are seeing increasing use of mobiles and tablets to access portals. Different parts of the world connect to the internet through a wide range of devices and bandwidths, and experience different kinds of latency. It is therefore very important that performance requirements cover these network and connectivity requirements as well.
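One way to state such requirements is to enumerate the device and network profiles the application must support, as in the sketch below. The geographies, bandwidths and latencies shown are illustrative assumptions; most load-testing tools can emulate bandwidth and latency pairs like these during a run.

    # A small sketch (illustrative values only) of network and connectivity
    # profiles that performance requirements could enumerate per geography
    # and device class.
    network_profiles = {
        # (geo, device): (bandwidth_kbps, latency_ms)
        ("EU", "desktop_fibre"): (100_000, 20),
        ("US", "mobile_4g"): (12_000, 60),
        ("APAC", "mobile_3g"): (2_000, 200),
    }

    for (geo, device), (bandwidth_kbps, latency_ms) in network_profiles.items():
        print(f"{geo}/{device}: {bandwidth_kbps} kbps, {latency_ms} ms latency")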

From a workload modeling perspective, it is very important that performance requirements provide a holistic view covering the infrastructure, the key business transactions in scope, the data volumes required for different conditions, the data retention period, workload arrival rates, user distribution across geographies, usage patterns and the associated peak business hours.
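The sketch below shows, with hypothetical figures, how a few of these inputs (arrival rates, peak-hour multiplier, user-geo distribution and transaction mix) combine into the per-geography and per-transaction load a test scenario would actually need to generate. It is an assumption-laden illustration, not a prescribed model.

    # An illustrative workload model (hypothetical figures) combining
    # arrival rates, peak-hour behaviour, user-geo distribution and
    # transaction mix into the load to be generated.
    workload = {
        "base_arrival_rate_per_min": 500,        # steady-state user arrivals
        "peak_hour_multiplier": 3.0,             # peak hour vs. steady state
        "geo_distribution": {"US": 0.5, "EU": 0.3, "APAC": 0.2},
        "transaction_mix": {"browse": 0.6, "search": 0.3, "checkout": 0.1},
    }

    peak_rate = workload["base_arrival_rate_per_min"] * workload["peak_hour_multiplier"]
    for geo, share in workload["geo_distribution"].items():
        print(f"{geo}: ~{peak_rate * share:.0f} user arrivals/min at peak")
    for txn, share in workload["transaction_mix"].items():
        print(f"{txn}: ~{peak_rate * share:.0f} transactions/min at peak")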

This is where all the relevant stakeholders need to mature and understand that everyone (business, architects and performance teams) has a key role to play in ensuring a successful go-live, and hence start providing crucial inputs such as the technical architecture, from both a logical and a deployment perspective, so that these can be factored in by the performance test personnel as part of their test strategy definition.

It should also be noted that performance requirements should call out the KPIs to be reported, such as the average, minimum, maximum and standard deviation of the response time for the business operations in scope, possibly split across the different layers (web, application, database). Performance test teams should recommend which performance parameters will be monitored and reported, and these should be vetted by the business and architecture teams to ensure that all expectations are addressed appropriately.
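As a simple illustration of that kind of reporting, the sketch below computes the average, minimum, maximum and standard deviation of response time per layer from raw samples. The layer names and sample values are made up for the example; in practice these would come from the test tool or monitoring data.

    # A minimal sketch of reporting the KPIs mentioned above, split by layer,
    # from illustrative response-time samples (in seconds).
    from statistics import mean, stdev

    samples_by_layer = {
        "web":         [0.12, 0.15, 0.11, 0.18, 0.14],
        "application": [0.45, 0.52, 0.48, 0.61, 0.50],
        "database":    [0.20, 0.22, 0.19, 0.35, 0.21],
    }

    for layer, samples in samples_by_layer.items():
        print(f"{layer:12s} avg={mean(samples):.3f}s min={min(samples):.3f}s "
              f"max={max(samples):.3f}s stdev={stdev(samples):.3f}s")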

In short, it is very important that the architecture, business and performance testing teams work in a consultative mode to capture the right performance requirements.

We will look at infrastructure in our next blog.

Meanwhile, if you have any thoughts, suggestions or comments on this blog, kindly post them.