Case Study: TVO

Toronto, Ontario
Government

Improved performance for smooth game play

As the technological extension of Ontario's public education system, TVO provides learning opportunities for Ontarians through innovative educational products both inside and outside the classroom.

TVO developed mPower, a new learning product that immerses children in a virtual world of math, where they learn key concepts through fun and engaging game experiences. The application needed to support a projected load of 1,000 concurrent students.

TVO asked FD Solutions to performance-test mPower and demonstrate that the application could handle its projected concurrent use and maximum load.


Challenge accepted

Performance-test TVO's new learning product mPower to ensure that the application behaves properly under heavy user load.

Solution delivered

Through detailed analysis of the application architecture, game play and use cases, the FD QA Team gained an understanding of how mPower was to be used. With this knowledge, we created a series of load tests that stressed the application and allowed us to tune it for superior results.

Technologies

  • NeoLoad Load Test Generation
  • Apache Webserver Logs
  • MySQL DB Logs
  • Zabbix Monitoring Agent Component

Benefits of deliverable

Eliminated an early breaking point under application load.

Analysis of graphs and reports allowed us to highlight and identify bottlenecks in the application.

As a result of the analyses, some modifications were made to different components of the application to achieve the desired performance.

Analysis of the current infrastructure helped identify the hardware requirements for supporting the projected future user base of 4,000.

Improved the testing cycle, since the test scenarios were recorded and can be reused on future releases of the application.

Validated the expected application response time while the application was under load.

Identified potential overuse of the network.

Created confidence across all business areas involved that the application and infrastructure would work under pressure.

Solution details

FD Solutions started by obtaining a general overview of the application architecture. The TVO QA team guided us through the many games and use cases the application supports, which gave us an understanding of how students use it. We then navigated the application in more detail with different student credentials and obtained a detailed view of the data transferred, including response codes, HTML behavior, data requests, error code handling, information storage patterns, and session management while browsing the application. The final step was to understand how information is manipulated and to look at patterns in DB calls. Once we had a firm understanding of the application, we proceeded to the performance testing requirements.

We tested the application under a number of scenarios to ensure that mPower was capable of meeting its requirements for performance and load:

  • The application had to process 1,000 concurrent logins for 30 minutes.
  • The application had to process a maximum number of concurrent games for 30 minutes.
  • The application response time under load could not degrade to the point where logins and/or game play failed.
  • The servers had to support 1,000 concurrent users using the site for an extended period of time (7 hours).
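To make these requirements concrete, they can be captured as a small set of named load scenarios. The sketch below is purely illustrative (plain Python, not the actual NeoLoad configuration); the scenario names are invented, and the figures come from the requirements above.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class LoadScenario:
        """One performance-test scenario: a target user count held for a fixed duration."""
        name: str
        concurrent_users: Optional[int]   # None = the maximum is found during testing
        duration_minutes: int

    # Illustrative declarations mirroring the mPower requirements above.
    SCENARIOS = [
        LoadScenario("concurrent-logins", concurrent_users=1000, duration_minutes=30),
        LoadScenario("max-concurrent-games", concurrent_users=None, duration_minutes=30),
        LoadScenario("endurance", concurrent_users=1000, duration_minutes=7 * 60),
    ]

    for s in SCENARIOS:
        print(s.name, s.concurrent_users, s.duration_minutes)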

We generated a number of test types, including a Load User Test, a Stress Test, and a Simultaneous User Test. Test scripts were developed in NeoLoad by first recording the desired set of operations and then manually editing the generated code as needed for legibility and repeatability. The manual editing also introduced some modularity and reuse of common operations, which increased maintainability.
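As an illustration only (the actual scripts were recorded and edited in NeoLoad itself), the sketch below shows the same idea of modular, reusable operations in plain Python; the base URL, endpoints, and credentials are hypothetical.

    import requests

    BASE_URL = "https://mpower.example.org"   # hypothetical endpoint

    def login(session: requests.Session, username: str, password: str) -> None:
        """Reusable operation: authenticate one virtual student."""
        resp = session.post(f"{BASE_URL}/login", data={"user": username, "pass": password})
        resp.raise_for_status()

    def play_game(session: requests.Session, game_id: str) -> None:
        """Reusable operation: open a game and submit one answer."""
        session.get(f"{BASE_URL}/games/{game_id}").raise_for_status()
        session.post(f"{BASE_URL}/games/{game_id}/answer", json={"value": 42}).raise_for_status()

    def student_journey(username: str, password: str) -> None:
        """One virtual user: the common operations composed into a single scenario."""
        with requests.Session() as session:
            login(session, username, password)
            play_game(session, "sample-game")

Keeping login and play_game as separate operations is what lets the same recorded steps be reused across the Load User, Stress, and Simultaneous User tests.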

Data creation started with building a population of students and educators that represented or exceeded the expected number of users. Next, gaming scenarios were recorded and stored in NeoLoad for further analysis. Values for credentials, population, think time, wait time, load variation, test duration, and URL filters were predefined as parameters for each test run against the application.
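A minimal sketch of this kind of test-data preparation, assuming a simple CSV layout for the virtual-user population (the field names, counts, and parameter values are examples, not TVO's actual data):

    import csv

    # Illustrative population sizes; the test population had to meet or exceed 1,000 students.
    NUM_STUDENTS = 1200
    NUM_EDUCATORS = 50

    # Parameters of the kind predefined for each test run (example values only).
    TEST_PARAMETERS = {
        "think_time_seconds": 5,
        "wait_time_seconds": 2,
        "load_variation": "ramp-up",
        "test_duration_minutes": 30,
    }

    with open("population.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["username", "password", "role"])
        for i in range(NUM_STUDENTS):
            writer.writerow([f"student{i:04d}", "test-password", "student"])
        for i in range(NUM_EDUCATORS):
            writer.writerow([f"educator{i:03d}", "test-password", "educator"])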

Running each load test iteration required active monitoring of the performance metrics for each component, since these metrics highlight how a change has impacted the application. A successful test became the established baseline for the following iteration. The plan called for load test iterations to continue until either the application stopped handling the load or the maximum number of users was reached.
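Conceptually, the iteration strategy looks like the following sketch; run_load_test is a hypothetical placeholder for launching a load-test run and checking its results against the monitoring data.

    # Increase load step by step, keep the last successful run as the baseline,
    # and stop at failure or at the target maximum number of users.
    MAX_USERS = 1000
    STEP = 100

    def run_load_test(users: int) -> bool:
        """Placeholder for launching a run with `users` virtual users and checking
        that response times and error rates stayed within acceptable limits."""
        # In the real project this drove NeoLoad; here every run simply passes.
        return True

    baseline = None
    users = STEP
    while users <= MAX_USERS:
        if run_load_test(users):
            baseline = users      # this run becomes the baseline for the next iteration
            users += STEP
        else:
            break                 # the application stopped handling the load
    print(f"Highest load handled successfully: {baseline} users")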

[Screenshot: graphs and analyses from the mPower load-test runs]

Once the tests were executed, each run was analyzed to pinpoint the problems that needed to be addressed to improve system performance. Using graphs and reports from the analysis sessions, we were able to highlight and identify bottlenecks in the application and determine whether changes were required to improve application performance.
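As an illustration of this kind of analysis (the CSV layout is an assumption, not NeoLoad's actual export format), per-request response-time percentiles make slow transactions and bottlenecks easy to spot:

    import csv
    from collections import defaultdict
    from statistics import quantiles

    # Assumed export format: one row per request, with its name and response time in ms.
    times_by_request = defaultdict(list)
    with open("results.csv", newline="") as f:
        for row in csv.DictReader(f):
            times_by_request[row["request"]].append(float(row["response_time_ms"]))

    # Report the median and 95th percentile per request; outliers point at bottlenecks.
    for name, times in sorted(times_by_request.items()):
        cuts = quantiles(times, n=20)       # 5% steps; cuts[9] = p50, cuts[18] = p95
        print(f"{name}: p50={cuts[9]:.0f} ms  p95={cuts[18]:.0f} ms  samples={len(times)}")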

As a result of the stress testing, we were able to tune the mPower application so that it performed with superior results under any circumstance.

A few words about the project

Feedback

Your QA team brought us ease of mind knowing that our mPower application could handle the expected load with superior performance. The process of load testing allowed us to tune our application to handle even the most strenuous situations with grace and finesse.

— Manager of Web & Mobile Software Development at TVO
