FD Solutions started by obtaining a general overview of the application architecture. The TVO QA team guided us through the application's many games and use cases, giving us an understanding of how students use the application. This was followed by a more detailed navigation through the application using different student credentials. We then examined the data transferred in detail, including (but not limited to) response codes, HTML behavior, data requests, error code handling, information storage patterns, and session management while browsing the application. The final step was to understand how information is manipulated and to look for patterns in database calls. Once we had a firm understanding of the application, we turned to the performance testing requirements.
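As a simple illustration of this kind of exploratory traffic inspection (the base URL, paths, and credentials below are hypothetical, not mPower's actual endpoints), a short script can surface response codes, payload characteristics, and session-cookie behavior:

```python
import requests

BASE_URL = "https://mpower.example.org"  # hypothetical endpoint for illustration

# Use a session so cookies, and therefore session management, can be observed.
session = requests.Session()

# Log in with a sample student account and inspect the response.
resp = session.post(f"{BASE_URL}/login",
                    data={"username": "student01", "password": "secret"})
print("login status:", resp.status_code)                # response code handling
print("session cookies:", session.cookies.get_dict())   # session management

# Request a game page and note what the client actually pulls down.
resp = session.get(f"{BASE_URL}/games/math-1")
print("game page status:", resp.status_code)
print("content type:", resp.headers.get("Content-Type"))
print("payload size (bytes):", len(resp.content))
```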
We tested the application under a number of scenarios to ensure that mPower was capable of meeting its performance and load requirements: the application had to process 1,000 concurrent logins for 30 minutes; it had to sustain the maximum number of concurrent games for 30 minutes; its response time under load could not degrade to the point where logins or game play failed; and the servers had to support 1,000 concurrent users on the site for an extended period of time (7 hours).
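These requirements translate naturally into pass/fail criteria for each run. A minimal sketch of how they might be encoded (the names and thresholds here are our own illustration, not NeoLoad configuration):

```python
# Pass/fail criteria derived from the mPower requirements above.
# Structure and threshold values are illustrative only.
REQUIREMENTS = {
    "concurrent_logins": {"users": 1000, "duration_min": 30},
    "concurrent_games":  {"users": "max", "duration_min": 30},
    "endurance":         {"users": 1000, "duration_min": 7 * 60},
}

def run_passed(error_rate: float, p95_response_s: float,
               max_response_s: float = 10.0) -> bool:
    """A run fails if any login or game-play request errors out, or if
    response time degrades beyond an acceptable threshold (assumed here)."""
    return error_rate == 0.0 and p95_response_s <= max_response_s
```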
We generated a number of test types, including a Load User Test, a Stress Test, and a Simultaneous User Test. Test scripts were developed in NeoLoad by first recording the desired set of operations and then manually editing the generated code as needed for legibility and repeatability. This manual editing introduced a degree of modularity, allowing common operations to be reused across scripts and improving maintainability.
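NeoLoad stores scripts in its own project format, so the following Python sketch is only an analogy for the modularity we aimed for: common operations factored into reusable functions that each test type composes differently (all URLs and names are hypothetical):

```python
import requests

BASE_URL = "https://mpower.example.org"  # hypothetical

def login(session: requests.Session, username: str, password: str) -> None:
    """Shared login step, reused by every test type."""
    session.post(f"{BASE_URL}/login",
                 data={"username": username, "password": password})

def play_game(session: requests.Session, game_id: str) -> None:
    """Shared game-play step: load the game, then post a result."""
    session.get(f"{BASE_URL}/games/{game_id}")
    session.post(f"{BASE_URL}/games/{game_id}/score", data={"score": 100})

def load_user_scenario(username: str, password: str) -> None:
    """One virtual user's journey, composed from the shared steps above."""
    with requests.Session() as s:
        login(s, username, password)
        play_game(s, "math-1")
```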
Data creation began with generating a population of students and educators that matched or exceeded the expected number of users. Next, gaming scenarios were recorded and stored in NeoLoad for further analysis. Values for credentials, population, think time, wait time, load variation, test duration, and URL filters were predefined as parameters for each test on the application.
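A minimal sketch of this kind of data preparation, with illustrative counts, field names, and parameter values:

```python
import csv
import random

# Generate a user population at least as large as the expected load.
NUM_STUDENTS, NUM_EDUCATORS = 1000, 100

with open("users.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["username", "password", "role"])
    for i in range(NUM_STUDENTS):
        writer.writerow([f"student{i:04d}", "Passw0rd!", "student"])
    for i in range(NUM_EDUCATORS):
        writer.writerow([f"educator{i:03d}", "Passw0rd!", "educator"])

# Predefined test parameters, analogous to what each test run carried.
PARAMS = {
    "think_time_s": lambda: random.uniform(2, 8),  # pause between actions
    "wait_time_s": 1.0,
    "load_variation": "ramp-up",
    "test_duration_min": 30,
    "url_filters": ["/static/", "/analytics/"],    # requests to exclude
}
```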
Each load test iteration required active monitoring of the performance metrics for each component, since these highlight how a change has impacted the application. A successful test became the established baseline for the following iteration. The plan called for repeating load test iterations until either the application stopped handling the load or the maximum number of users was reached.
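The iteration logic amounts to a ramp-until-failure-or-target loop. The sketch below captures it; run_load_test is a hypothetical stub standing in for an actual instrumented run:

```python
from dataclasses import dataclass

@dataclass
class RunResult:
    users: int
    passed: bool

def run_load_test(users: int) -> RunResult:
    """Hypothetical stub: launch a run at this user count, collect metrics,
    and decide pass/fail against the established criteria."""
    ...
    return RunResult(users=users, passed=True)

MAX_USERS = 1000   # target concurrency from the requirements
STEP = 100         # users added per iteration

def iterate_load_tests() -> int:
    """Ramp users upward, treating each passing run as the new baseline,
    until the application fails or the target user count is reached."""
    baseline = 0
    for users in range(STEP, MAX_USERS + STEP, STEP):
        if not run_load_test(users).passed:   # app stopped handling the load
            return baseline                   # last good run stays the baseline
        baseline = users                      # successful run becomes baseline
    return baseline
```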
Once the tests were executed, each run was analyzed to pinpoint problems that needed to be addressed to improve system performance. Using graphs and reports from the analysis sessions, we were able to identify bottlenecks in the application and determine whether changes were required to improve application performance.
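As an illustration of this kind of post-run analysis (the CSV layout below is hypothetical; NeoLoad's own reports present equivalent data), per-endpoint response-time percentiles and error counts are a common way to expose bottlenecks:

```python
import csv
import statistics
from collections import defaultdict

# Hypothetical per-request results export: url, response_time_s, status.
by_url = defaultdict(list)
errors = defaultdict(int)

with open("results.csv") as f:
    for row in csv.DictReader(f):
        by_url[row["url"]].append(float(row["response_time_s"]))
        if int(row["status"]) >= 400:
            errors[row["url"]] += 1

# Endpoints with the worst 95th-percentile times are bottleneck candidates.
p95 = {url: statistics.quantiles(times, n=20)[18]
       for url, times in by_url.items() if len(times) >= 2}

for url in sorted(p95, key=p95.get, reverse=True):
    print(f"{url}: p95={p95[url]:.2f}s errors={errors[url]}")
```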
As a result of the stress testing, we were able to tune the mPower application so that it performed reliably under every load scenario tested.