OEG Best Practice: HyperModel best practices

HyperModels™ extend the capabilities of the Hyperblock calculation engine, allowing users to significantly increase scale: expand into new use cases, manage more historical and external data, forecast further into the future, and model more scenarios. The following HyperModel best practices are designed to ensure you can easily leverage and maximize these benefits during configuration and implementation.

Analyze existing models

First and foremost, take stock of your current models. Are they well built? Do they perform well today? These are important questions to answer before considering a move to HyperModels.

HyperModels are built on the same underlying technology as standard models (there are no new features or changes to functionality), so all of the existing best practices and Planual rules still apply. If your current models aren’t performing to your liking, don’t expect anything to change simply because of increased scale. An existing poorly performing model, when scaled with HyperModel capability, will still perform poorly. If this is your current scenario, you may want to revisit some of your existing models first.

HyperModels let you take fuller advantage of the inherent multi-dimensional, in-memory architecture of Anaplan. For experienced Anaplanners, this means you no longer need to combine structures just to save space. You can also improve formula efficiency by splitting formulas up into separate line items. Read the Truth about sparsity blog series for a good review of the different considerations around sparsity. With HyperModels, these considerations are even more relevant, especially Part 2 of the series.

In addition, text line items are memory intensive compared to other data types, and joining strings across line items can significantly increase memory consumption. To keep formulas as efficient as possible, apply the best practices for joining strings as you scale your model. See Formula optimization for more details and instructions.

Before transitioning to HyperModels, factor in some time to review and, where appropriate, re-engineer the model and its structures using PLANS and D.I.S.C.O. techniques. “Don’t add more floors to your house without ensuring the foundations are solid.”

Combining models

One of the big advantages of HyperModels is the ability to combine models. Yet, before combining models, think about why they were split in the first place. If the models were split purely to manage scale, then combining them will simplify modeling and reduce the overall scale requirement, because repetitive common structures are no longer duplicated.

However, if the models were split for other reasons, such as different process steps, a different user base, different levels of granularity, or the ability to manage updates independently, consider the effect combining them would have on flexibility and responsiveness. You might want to leave the split models as they are. You can still, of course, increase the scale of each model to include more timescales, scenarios, and so on.

HyperModel structures vs. dimensions

The size and performance characteristics of HyperModels depend on two key attributes: structures and dimensions. HyperModels can be described as “structurally large” or “dimensionally large”.

Models that are structurally large have some very large lists and few modules with more than two or three dimensions. These are typically data hubs or models with large detailed “transactional” data sets. 

Dimensionally large models don’t typically have large lists but have many modules with four or five dimensions.

The Hyperblock engine is optimized for dimensionally large models. So, when scaling up your model, consider its structure. This may also be an opportunity to re-engineer the model to be dimensionally large rather than structurally large.
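
To make the distinction concrete, here is a minimal Python sketch (the list and dimension sizes below are invented for illustration and are not taken from the article) showing how the two shapes differ: a structurally large design concentrates scale in one huge list, while a dimensionally large design reaches a comparable footprint through several moderate dimensions multiplied together.

```python
# Illustrative only: compare the shape of a "structurally large" design
# with a "dimensionally large" one. All sizes below are hypothetical.

def cell_count(*dimension_sizes: int) -> int:
    """Cells in a line item are the product of its dimension sizes."""
    total = 1
    for size in dimension_sizes:
        total *= size
    return total

# Structurally large: one flat 50-million-member transaction list,
# with line items dimensioned only by that list.
structural = cell_count(50_000_000)

# Dimensionally large: no huge lists, but four moderate dimensions
# (e.g. 1,000 products x 500 customers x 50 regions x 36 months).
dimensional = cell_count(1_000, 500, 50, 36)

print(f"Structurally large line item: {structural:,} cells")
print(f"Dimensionally large line item: {dimensional:,} cells")
```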

Large lists

Referring to the point above, be cautious about scaling your lists into the millions. Whilst this is now possible, there are implications.

As you may know, lists are structural and take up approximately 500 bytes per list member. This means that a list of 100 million members consumes approximately 50 GB before any other modules or data are added. Lists of this size will increase model open times, as the structures need to be retrieved from the model store. This effect is amplified if there are line items dimensioned by the large list that contain text or text joins. So, review the need for lists of millions of items and aim to split them into a more dimensional structure.

This article explains this concept with an example.
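
As a quick sanity check on the arithmetic above, the short Python sketch below applies the approximate 500 bytes-per-member figure to a few list sizes (the helper function and the sample sizes are illustrative only):

```python
# Rough structural-memory estimate for large lists, using the approximate
# 500 bytes per list member quoted above.

BYTES_PER_LIST_MEMBER = 500  # approximate structural cost per member

def list_structure_gb(members: int) -> float:
    """Approximate structural memory consumed by a list, in GB."""
    return members * BYTES_PER_LIST_MEMBER / 1_000_000_000

for members in (10_000_000, 50_000_000, 100_000_000):
    print(f"{members:>12,} members ~ {list_structure_gb(members):5.1f} GB of structure")

# 100,000,000 members ~ 50.0 GB before any module data is added.
```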

Model constraints

As you scale your model, understand some of the constraints and mitigation techniques around modeling.

First, let’s talk about the cell count for a line item. A line item without Time or Versions (such as the lowest level of a composite hierarchy with summaries turned off) allows you to manage up to approximately 2.1 billion cells. As you add more dimensions to the line item or increase the size of its lists, the potential cell count can quickly grow beyond this limit.
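
As a way of sanity-checking dimensionality before you build, here is a small, hypothetical Python helper (the line item names, list names, and sizes are invented for illustration) that multiplies proposed dimension sizes and flags any combination exceeding the approximate 2.1 billion cell figure quoted above:

```python
from math import prod

# Approximate per-line-item cell figure quoted above (no Time or Versions).
CELL_LIMIT = 2_100_000_000

def check_line_item(name: str, dimension_sizes: dict[str, int]) -> None:
    """Print the planned cell count and flag it if it exceeds the threshold."""
    cells = prod(dimension_sizes.values())
    status = "OK" if cells <= CELL_LIMIT else "over the limit - rethink dimensionality"
    print(f"{name}: {cells:,} cells -> {status}")

# Hypothetical sizes, for illustration only.
check_line_item("Revenue detail", {"SKU": 200_000, "Store": 5_000})
check_line_item("Revenue detail by channel", {"SKU": 200_000, "Store": 5_000, "Channel": 4})
```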

Mitigation techniques:

  • Consider the dimensionality of the line item. Following PLANS, ask yourself: Are all of the dimensions appropriate or necessary for the calculation?
  • Can you split the line item into smaller elements?
  • Can you use subsets to reduce the scale of the line item?
  • Remember the Planual rules for Subsidiary views (2.01-06), though: try to keep line items with common dimensions in the same module.

Second, let’s discuss scale for calculation cells (excluding Time, Versions, and Summaries) for the following functions:

  • ISFIRSTOCCURRENCE
  • RANK
  • RANKCUMULATE

These functions are now five times more scalable, but modules dimensioned by large lists can still hit these calculation thresholds more quickly in a HyperModel.

Workspaces and model management

HyperModels enable customers to consolidate models as well as workspaces.

If you are using Application Lifecycle Management (ALM), we recommend that development, test/QA, and production models are kept in separate workspaces to help with segregation of duties and user management. We also recommend that development models are kept small. Consider a standard workspace and model for development, and HyperModel(s) and workspace(s) for production models. Depending on your testing requirements, you might need HyperModel(s) and workspace(s) for test/QA models as well.

Within your Anaplan footprint, you may have many models and different user profiles. In this scenario, consider single development and test/QA workspaces, along with one or more production workspaces.

Performance

The performance of models with HyperModel capability still depends on modeling best practices and how the model is constructed. With HyperModels, we have made enhancements so that users can expect similar or improved model performance. However, best practices around minimizing user-driven imports, setting up efficient filters for exports, exporting only what you need, managing text strings, and so on still apply.

Conclusion

We hope you’ll find these best practices useful in helping to maximize the value of your HyperModels and successfully extend Anaplan Hyperblock capabilities at HyperScale. Refer to the Planual and the Truth about sparsity blogs for more tips on how to set up your existing models for scale.

Author: David Smith.