Introduction

The purpose of model concurrency testing is to validate that the model's key user journeys perform optimally at the customer's expected level of concurrent user interaction. If issues are identified, action can be taken to address them and ensure optimal performance.

Model functionality and performance for a single user, or at very low concurrency (2-3 users), must already have been verified and optimized before model concurrency testing. Performance issues observable for a single user or at very low concurrency will be significantly amplified as concurrent usage increases.

Overview

Model concurrency testing requires detailed information from the customer and supporting project implementation team to enable the effective simulation of the expected concurrent model interactions. It is vital that the team has a clear understanding of the required information so that this can be factored into the project schedule and model design.

The model concurrency team must be engaged at the start of the implementation project, and model concurrency testing should be completed prior to the UAT phase so that users' first impression of model performance is optimal. The Model delivery lifecycle section below shows where model concurrency testing fits in.

The model concurrency testing is undertaken in a performance test environment that is completely isolated from, but representative of, the production environment that customers use.

Model Concurrency Testing is a chargeable service and the available Packages are defined here (internal Anaplan only). Pricing is available in the SOW pricing model.


The model concurrency process is summarized below:

Engagement

Responsibility: Implementation Team

Request model concurrency testing at the very start of the implementation project so that the milestones can be understood and agreed, and the model concurrency information/inputs can be planned into the project schedule.

Contact the Customer Success Business Partner on the prospect or customer account to engage the model concurrency team.


High-level requirements

Responsibility: Customer/implementation team

Completing the model concurrency high-level requirements document provides the initial information required to understand the requirements and plan model concurrency testing.

It covers:

  • Model details
  • User information
  • Key milestones
  • Key contacts

The document template is the Model Concurrency Requirements article here (internal Anaplan only).

Model interaction specification

Responsibility: Implementation team

The Model Interaction Specification document captures detailed information describing how each type of user will interact with the model as part of their regular duties. This information enables the model concurrency team to accurately simulate how the user population will concurrently interact with the model and the frequency of the various interactions. The concurrency scenarios are also captured here, based on the required Test Types, User Journeys, concurrent users, user geographical locations, and frequency of User Journey completion.

The document template is the Model Concurrency Requirements article here (internal Anaplan only).
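
For illustration only, one concurrency scenario drawn from the specification could be captured in a structure like the following sketch; the field names and values are hypothetical, not a prescribed format:

    # Hypothetical sketch of one concurrency scenario (illustrative only).
    scenario = {
        "test_type": "Peak load",        # see the Model Concurrency Test Types article
        "user_journeys": {
            # journey name: (concurrent users, completions per user per hour)
            "Submit forecast": (80, 4),
            "Review dashboard": (40, 10),
        },
        "user_locations": {"EMEA": 0.5, "US": 0.3, "APAC": 0.2},  # share per region
        "target_response_time_s": 5,     # per-transaction target used in reporting
    }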

Model sanitization

Responsibility: Implementation team 

The Anaplan Security Policy requires all models to be sanitized before model concurrency testing can be undertaken. Although the performance environment is secure, this policy exists to protect customer data and all parties involved.

Model sanitization involves obfuscating model labels to remove the data context. For instance, a module could hold a value of 50000 for the salary (line item) of an employee (list item); by changing the line item and list item names to abc and xyz, the value loses its context and becomes meaningless.

During sanitization, it may be tempting to replace every data value with 0 or 1; however, this could greatly distort the performance results, because uniform values can change the model's sparsity and hence its calculation behavior.
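
As a minimal sketch of that labels-not-values principle, assuming hypothetical names, the snippet below obfuscates labels while leaving the numeric data untouched; follow the model sanitization guide (linked below) for the actual procedure:

    import itertools

    def sanitize_labels(labels):
        """Map real labels (list items, line items) to neutral tokens,
        one-to-one, so model structure is preserved."""
        counter = itertools.count(1)
        return {label: f"Item {next(counter)}" for label in labels}

    # Labels lose their business context...
    print(sanitize_labels(["Jane Smith", "Salary", "Bonus"]))
    # {'Jane Smith': 'Item 1', 'Salary': 'Item 2', 'Bonus': 'Item 3'}

    # ...but data values such as 50000 are left unchanged, so calculation
    # cost and sparsity still match the production model.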

The model(s) should be sanitized in the production environment before access is given to the concurrency team to import the sanitized model(s) into the performance environment. If there is insufficient workspace capacity to make a model copy, L3 Support can assist.

Written approval to copy the sanitized model is required before a copy of the model can be taken into the performance environment.

A copy of the sanitized model(s) must exist in the performance environment (isolated from production) before work can commence to generate the concurrency test scripts.

The model sanitization guide can be found here.

Model performance can be influenced by data volume, so it is important that the data volume in the model used for concurrency testing is representative of the expected production data volume.

Model configuration

Responsibility: Implementation Team

The sanitized model(s) will be imported into the performance test environment, so any cross-model or file import actions exercised within the model interaction specification will require some remediation work to make them function correctly.

Each user journey must be exercised by the implementation team on the model within the performance test environment, to validate that the defined steps are correct and that the model is configured and working correctly for the relevant user role/selective access.

Any issues encountered during test creation result in wasted effort and could impact the project milestones.

Creating tests

Responsibility: Anaplan

The concurrency testing simulates the actions that users take in the model by sending to the Anaplan server the same requests that would be sent if a real user were performing that operation in a web browser.

Each user journey will be performed according to the steps and user role/selective access detailed in the model interaction specification document on the model(s) in the performance test environment.

The test scripts are created by capturing the network traffic requests and responses for each user action using JMeter.

At this stage, the test scripts require significant modification to make them functional, realistic, and suitable for simulating concurrent user interactions, i.e. 100 users performing the same steps but without all selecting the same items from drop-downs or entering data in the same cells.
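
The sketch below illustrates the idea behind that parameterization: each simulated user is seeded with its own identity, so different users pick different list items and enter different values rather than colliding on the same cells. It is a minimal sketch with hypothetical names; a real script replays the captured JMeter requests instead of printing.

    import random
    import threading

    LIST_ITEMS = [f"Item {i}" for i in range(1, 1001)]  # sanitized list items

    def run_journey(user_id):
        """One virtual user performing a journey with its own data."""
        rng = random.Random(user_id)        # seeded per user: repeatable runs,
        item = rng.choice(LIST_ITEMS)       # but each user picks different items
        value = rng.randint(1_000, 99_999)  # varied values, never all 0 or 1
        # A real script would replay the captured requests here, e.g.
        # session.post(cell_write_url, json={"item": item, "value": value})
        print(f"user {user_id}: writes {value} to {item}")

    threads = [threading.Thread(target=run_journey, args=(u,)) for u in range(100)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()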

The test scripts will then be configured to meet the Concurrency Scenario requirements for test type, user concurrency, user geographical location, and the frequency with which each user will complete each journey.
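
The arithmetic behind that configuration can be made concrete with an illustrative pacing calculation (all numbers are made up for the example):

    # If each of 100 concurrent users must complete a journey 4 times per
    # hour, a journey starts every 15 minutes per user; the idle "pacing"
    # between iterations is that interval minus the journey's own duration.
    users = 100
    journeys_per_user_per_hour = 4
    journey_duration_s = 120                        # measured single-user journey time

    interval_s = 3600 / journeys_per_user_per_hour  # 900 s between journey starts
    pacing_s = interval_s - journey_duration_s      # 780 s idle per iteration
    total_journeys_per_hour = users * journeys_per_user_per_hour  # 400 per hour

    print(interval_s, pacing_s, total_journeys_per_hour)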


Running tests

Responsibility: Anaplan

When the test scripts are complete, the tests will be run in accordance with the high-level requirements and model interaction specification. During each run, the tests and the test environment will be closely monitored and the performance data captured. Each test is run in isolation to prevent the results from being affected by other activity in the performance test environment.

Reporting

Responsibility: Anaplan

The results of the model concurrency testing will be presented in a report which includes a comparison of the response time for each transaction with the target response time (specified in the model interaction specification) for each load profile. The report will also provide analysis of the results and conclusions to aid the implementation team in decision making. An example of the reporting is available here. 
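
To illustrate the kind of comparison the report makes, the sketch below summarizes one transaction's measured response times against its target; the 95th-percentile check and the sample data are illustrative assumptions, not the report's actual method.

    import statistics

    def compare_to_target(samples_s, target_s):
        """Summarize one transaction's response times against its target."""
        p95 = statistics.quantiles(samples_s, n=20)[18]  # 95th percentile
        return {
            "median_s": statistics.median(samples_s),
            "p95_s": round(p95, 2),
            "target_s": target_s,
            "meets_target": p95 <= target_s,
        }

    # e.g. response times (seconds) captured for one transaction under load
    samples = [1.8, 2.1, 2.4, 2.0, 3.9, 2.2, 2.6, 2.3, 2.1, 4.8,
               2.0, 2.5, 2.2, 1.9, 2.7, 2.4, 2.8, 2.3, 2.1, 2.6]
    print(compare_to_target(samples, target_s=5.0))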

Model optimization

Responsibility: Implementation Team / Anaplan

If model concurrency testing identifies any performance concerns, the Model Analysis team will conduct an analysis of the model(s) to identify opportunities for optimization. This activity does not load the application with multiple users. Instead, the modules, formulae, and line items that may be contributing to poor performance are identified and recommendations are provided to optimize model performance.


When the implementation team has carried out the recommended changes, the concurrency testing can be re-run to validate that the concurrent model performance has been optimized.

Model delivery lifecycle

The diagram below is an illustration of the key events that relate to Model Concurrency Testing and where they fit into the broad phases of a model delivery lifecycle.

Including sufficient time in the plan for each preparation activity and for model concurrency testing to be completed (including allowance for model analysis/changes and retesting) before UAT commences will set the project up for success. The milestones in the high-level requirements will be validated and agreed by the Concurrency Team.

[Diagram: Anaplan Model Concurrency - Model Delivery lifecycle]
