Implementing Polaris: Lessons learned
Author: Philipp Erkinger, Certified Master Anaplanner and Principal Solution Architect at Bedford Consulting.
There are days in your professional career when you get to observe something special. October 20, 2022 was such a day, when I received access to a Polaris workspace to build a proof-of-concept model. Having worked with Anaplan for 8 years, I thought I knew it inside out. A few hours into the adventure, I realized that the world of Polaris is very different from Classic, but also that I was about to experience something exciting and fascinating. I felt like an explorer, discovering the Amazon for the first time. This article is a summary of lessons learned from the unique experience of implementing Polaris at one of the early-adopting customers.
Background
Let’s start with the basics. How did I even get here? A Swedish Anaplan customer that I worked with for a long time was actively investigating different solutions to improve their existing Anaplan model for reporting and analysis. Their key challenges:
- More than eight operational lists required in operational reporting (Accounts / P&L, Cost Center, Business Region, Legal Entity, Production Entity, Products, Currency, Version).
- Highly sparse data.
- Many combinations needed to analyze data from different angles.
- Complex consolidation logic for financial plan / actuals including eliminations, allocations, currency translations and much more.
- Many reporting and analysis add-ons that have been built over the years.
Why Polaris?
The choice ultimately fell on Polaris, due to its unique calculation engine built to handle highly sparse data sets and large dimensionality. The expected value:
- Better user experience with more dimensions available to the end users.
- Simplification of the calculation logic to ease data reviews.
- Leaner model administration processes.
- A springboard for new use cases.
Solution architecture
We came up with a solution architecture that would split the heavy calculation logic and admin processes from the end user reports by building two Polaris models instead of one.
- The Reporting Hub: A model designed to digest actuals and plan data through a complex consolidation logic that includes cost allocations, eliminations of inter-company accounts and a currency translation from all local currencies into the three main corporate currencies. All of this, of course, across hundreds of accounts, cost centers, products and more.
- Reporting and analysis: A simple model that can be completely refreshed at any moment, containing only reporting modules designed to bring an exceptional user experience to end users. Calculation is limited to aggregation, variance reporting and KPIs.
The project focused only on the replacement of the existing Reporting Model and didn't include any major changes to other models. However, the Reporting Hub is designed as a platform that will eventually replace the Data Hub and enable several new use cases currently under discussion, such as workforce planning, investment planning and more.
Learnings from the Implementation
We had a new experience with Polaris almost every day during the implementation. The following points are a summary of learnings that I consider critical knowledge, because Polaris will demand a whole new set of skills from model builders.
- It’s all about the ‘Fan-Out’.
This is probably the most important question you can ask yourself when building an Anaplan model in Polaris. As a basic principle, the Polaris engine calculates only when it "sees" data, so a 0 or BLANK requires no calculation effort. Yet changing the value of a single cell from 0 to 1 can cause an explosion of real-time calculations. The "fan-out" describes how many cells are impacted by a single value change, as illustrated below. A large fan-out can grow a tiny 10 MB model into a 10 GB model through a single cell change.
New skill: Learn to predict how end users, in combination with your formula and module design, can impact the performance of the entire model with a single cell change.
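To make the fan-out concrete, here is a hedged illustration; the module and line item names (Pricing, Price, Volume, Revenue) are hypothetical, not taken from the actual build. Assume Price lives in a small Pricing module dimensioned by Product and Month, while Volume and Revenue sit in a module dimensioned by Product, Month and a 500-item Cost Center list:

   Revenue = Volume * Pricing.Price

Editing a single Price cell can force up to 500 Revenue cells to calculate (one per Cost Center where Volume holds data), a fan-out of 1:500. Add a further 1,000-item list to the Revenue module and that same single edit can touch up to 500,000 cells, which is exactly how a tiny model can balloon after one cell change.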
- Understanding calculation complexity.
A new feature in Polaris is "calculation complexity". Shown in the blueprint, it tells you the impact relation of each line item, in other words its fan-out ratio. "One-to-one" is the simplest complexity: one cell change in the source line item impacts one cell in the calculating line item. The higher the number, the more complex the calculation logic becomes, with a direct impact on performance. It is recommended to keep calculations as close to one-to-one as possible. Anything above that should be reviewed and potentially remodeled by splitting line items into several steps, as sketched below.
New skill: Learn to read the calculation complexity of each line item, review it regularly and optimize towards "one-to-one".
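As a hedged sketch of what such a remodel can look like (all module and line item names hypothetical): suppose an Allocated Cost line item sits in a large module dimensioned by Product, Cost Center and Month, while the drivers live in a small Drivers module dimensioned by Product and Month only.

   One step, higher complexity:
   Allocated Cost = Drivers.'Total Cost' / Drivers.'Total Volume' * Volume

   Split into two steps:
   Cost per Unit = 'Total Cost' / 'Total Volume'        (in Drivers: Product, Month)
   Allocated Cost = Drivers.'Cost per Unit' * Volume    (in the large module)

The division now runs at one-to-one complexity in the small module, and only the lean final multiplication keeps a one-to-many ratio into the large dimensionality.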
- Flat lists — A new beginning.
Hierarchical depth stands in direct relation to the fan-out ratio. Calculation modules built on many hierarchical lists can therefore become very performance-heavy when aggregation is turned on (SUM, FORMULA, etc.). This can be solved by creating flat replicas of the lists, used purely for calculation, with a single "Total" item as parent. The great advantage is that turning on SUM then adds little calculation overhead, while it remains easy to move results between modules with many dimensions and modules with far fewer dimensions, as in the sketch below.
New skill: Build flexible calculation modules that easily allow moving between flat lists and hierarchies.
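A hedged sketch of the pattern, with hypothetical names throughout: keep a flat "Cost Center Flat" list whose items all roll up to a single Total, run the heavy calculations against it in a Calc module, and map the results back into the hierarchical Cost Center list only in the reporting modules, for example via a list-formatted mapping line item:

   Reported Cost = Calc.'Allocated Cost'[SUM: 'CC Properties'.'Hierarchy CC']

Here 'CC Properties'.'Hierarchy CC' is assumed to be a line item on the flat list that points each flat item to its counterpart in the hierarchical list, so the expensive hierarchical aggregation only happens where results are actually reported.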
- The Master Switch-Board: Utilizing dynamic calculation performance.
The unique calculation method behind Polaris allows for something very special: calculation effort can simply be turned off. Imagine you have a year of actuals that is not actively used in the model and consumes unnecessary calculation performance (workspace size). In Classic, you would have to remove it from all modules by implementing time ranges, updating the model calendar and so on. In Polaris, this can be done by changing a single Boolean from TRUE to FALSE, as sketched further below. Even better, you can bring the data back at any time without updating the core model settings. The associated calculation performance can be managed dynamically. This can be very helpful in scenarios like:
- Optimizing model-to-model imports and exports.
- Testing new features with a limited subset of data.
- Turning on / off individual areas of the model, like heavy reports that are only used during month closure.
New skill: Learn to optimize calculation performance by administering data utilization throughout the model. Test the data impact well!
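A hedged sketch of such a switchboard, with hypothetical module and line item names: a small Switchboard module holds Boolean line items such as 'Include Month-End Reports', and the heavy line items gate themselves on it:

   Month-End Value = IF Switchboard.'Include Month-End Reports' THEN Consolidation.'Reported Value' ELSE 0

While the Boolean is FALSE, the gated cells sit at their default value, so Polaris neither stores nor recalculates them; switching it back to TRUE restores the data and its calculation cost without touching time ranges or the model calendar. A Time-dimensioned Boolean can gate individual years of actuals in the same way.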
- IF THEN ELSE… So different?
It has always been essential in Classic to optimize IF THEN ELSE statements towards early exits, meaning the most common rule must be applied first. Polaris can now partially detect this by itself. More important is to optimize towards sparsity and default values (zero, BLANK) to reduce calculation effort, as in the sketch below.
New skill: Optimize your IF THEN ELSE statements for data utilization instead of early exits.
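A hedged before/after sketch with hypothetical line item names. In Classic, the advice was mostly about branch order (most common case first); in Polaris, what the branches return matters more, because any branch that returns a non-default value densifies the result:

   Unfriendly to Polaris (every empty intersection now stores a calculated value):
   Unit Price = IF Volume = 0 THEN 'Default Price' ELSE Revenue / Volume

   Friendlier (empty intersections stay at the default and are never materialized):
   Unit Price = IF Volume = 0 THEN 0 ELSE Revenue / Volume

If a fallback value is genuinely needed, it is usually cheaper to apply it in a later, less dimensioned step than to bake it into a highly dimensioned line item.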
- Impacting user experience with over-dimensioning.
Polaris allows for a great number of dimensions in a single module / line item. This is fantastic for simplifying key calculations and providing more information to the end user. The new Zero Suppression feature also makes it easy to filter the visible grid down to only the columns and rows that have data, even when multiple dimensions are used. However, for an end user such modules can become overwhelming and difficult to navigate. That's why it is even more important in Polaris to assess the implications of dimensionality on the end-user experience.
New skill: Pay attention to end-user requirements, limit modules with many dimensions, and optimize (limit) views to their needs.
- Rethink filtering. Cancel your calculation.
Polaris is not just a new engine; it also comes with a couple of new features that provide key improvements for model builders and end users. One of them is Zero Suppression, which allows the end user to filter the visible grid down to active values only. Another is the magical Cancel button, which stops an ongoing model calculation mid-flight.
New skill: Train end users in applying Zero Suppression to avoid filtering rows and columns by hand.
- Learn about the technical differences.
Ultimately, Polaris is a fantastic product, but it works differently from the ground up compared to the Classic calculation engine. There are best practices in Classic that can get you into trouble as a model builder in Polaris. Therefore, it's essential to familiarize yourself with the fundamental technical differences between the two engines before embarking on a Polaris endeavor.
Conclusion
It has been an incredible learning experience so far with Polaris. I hope that the knowledge we built up during the project will help accelerate future implementations and guide model builders in the transition from Classic to Polaris.
Finally, I would also like to emphasize the great work of Niko Vilkko (@nvilkko) during this implementation. He has built a fantastic Polaris model and contributed a lot to the above learnings.
Questions? Leave a comment!
Comments
-
Really good overview, thanks Philipp!
-
@PhilippErkinger, I greatly appreciated your leadership, collaboration and energy throughout this implementation. Thanks for summarizing and sharing your reflections from this experience. @nvilkko, thank you too for your hard and great work this spring. Thanks also to @nikolay_denisov, Dave Harding and Seb McMillan for your solution guidance throughout the build.
-
Great work Philipp and Niko.
What a great article! I like your very structured way of describing it. I agree that we had a lot of discoveries, and indeed we needed to rethink the solution design to migrate to the new engine.
I believe it was beneficial to rethink the solution given the new engine and new features, and it was the right time for the customer as well.
-
Well put, Philipp, it was a pleasure working with you as always.
-
Thank you @PhilippErkinger for all your efforts with that project and using Polaris. It is also thrilling for those of us involved in delivering Polaris to market to read this kind of post from an expert model builder utilising the new capabilities. Your (and other early users') feedback continues to be essential to us. Thank you.
Polaris will continue to evolve, with both new features and ways of making some of these lessons more intuitive and easier to adopt.
-
Great to share your learnings and I am sure this will be useful to so many starting their Polaris journey.
-
The consideration of 'likelihood of data occurrence' is interesting. I remember similar design considerations back in the day with a tool where the builder could choose dynamic calc and store options. It adds a layer to consider when testing as well.
-
Amazing article, thank you for sharing.
-
Inspiring stuff!
-
How does Polaris manage exports when it has multiple dimensions? Are you still limited to 3 dimensions in rows?
-
Required reading for all SAs!!
Excellent work.
-
Hi @kaitchura, great question.
First, on the grid side we are still currently limited to 3 dimensions per axis, as in Classic, but this is a key area that the product team is working on, and it's well understood that this is a big part of unleashing the benefits of Polaris. On exports there is again a lot of development activity, but currently the tabular single-column export is the optimal option. Feel free to reach out if you need more detail or have other questions.
-
Thanks, @seb_mcmillan! I'm very excited by the progress of Polaris!
-
Thank you, great write-up!
Has anyone done a deep dive on Model to Model import performance? We are seeing higher numbers than in Classic, and refreshing data quickly is a core part of the user experience in a large reporting hub. For example, we see that a couple of million cells might take 2-3 seconds between Classic modules but 30+ seconds into a multi-dimensional Polaris module (even with Summaries off).
Any best practices or benchmarking here would be very helpful for our work. Thanks for any tips!
-
@gheiler Model-to-model imports into Polaris can behave fundamentally differently than into Classic. This has many reasons, but the key one is that model recalculation after data changes works differently in Polaris than in Classic. A driving force behind it is the calculation complexity associated with the import. I can recommend discussing this with your Anaplan BP/CS.
-
Hey @PhilippErkinger - we are on the implementation side and have lots of really seasoned architects on this, but I am just curious whether anyone has resources with real benchmarking against Classic and exactly how those M2M imports are fundamentally different and benchmarked.
For example, we are seeing different load times with no downstream connected formulas and no aggregations, just raw data flow. But I think Polaris is still early enough that there may not be super well documented tests and benchmarks, so we are sort of doing this on our own right now.
-
Hi @gheiler - let me take that one as I think it is more of a question for Anaplan. There are some known issues at the moment with Model to Model import performance in Polaris. This is because Model to Model import currently stores all the cell values (including default/unpopulated/zero values). We hope to release an update to Polaris in the next month that will improve this situation by only moving the populated cells under the covers. This won't change anything in terms of import semantics and should just be a performance improvement.
-
Hello, I would be very interested in Polaris! Is it true that it is an extra 250K USD? Also, are there any additional costs I need to keep in mind for the actual model import? Thanks
-
Hi @Frank Pau. I suggest that you connect with your Anaplan Account Executive on pricing; the people in this post are focused on implementation and best practices rather than license considerations. Hope that is OK for you.
-
This is for all of you who are interested in learning more. Take the chance and sign up! :)
-
https://events.anaplan.com/ACEImplementingPolaris
Is there any recording for this event?
Unfortunately I missed it and am having a lot of challenges implementing Polaris…