Five lessons I learned in 2022
Author: Manuela Apostol, Certified Master Anaplanner and Anaplan & BI Solutions Architect at Logicor
Anaplan is a resilient and very flexible tool and has demonstrated it can be used to adapt plans in days or weeks to current market trends — or rather — shocks! I have found that I’ve also needed to adapt my work strategy to be flexible — shifting from the consulting industry to a customer Solution Architect role during the first year of the pandemic when everything started to change.
The lessons below are the result of the past few years, but they crystallized for me in 2022. I highlight what I believe are best practices for managing the Anaplan ecosystem and formalizing the concept of the Center of Excellence (CoE) as a customer. I wish I had known all this when I started; instead, it was a trial-and-error approach that got me to this point, thanks to mentors, colleagues, and partners.
- Your Center of Excellence (CoE) structure may already be in place
“You may not realize it, but you already have a CoE,” my Anaplan business partner said during one of our catchups. This made me pause and think, so I went back to the CoE materials and started identifying roles and elements. Executive sponsors were easy to identify (simply put, whoever approves our budget); subject matter experts are the people from the business we turn to when we need advice or confirmation of certain processes; then there are model builders, recurring meetings, etc. (basically my day-to-day team and work). Depending on the available resources, the same person can wear different hats at the same time (for example, the solution architect, model builder, and project manager can be the same person).
Our naming convention was a bit different from what's in the CoE recommendations (like Anaplan Council or CoE Lead), but I realized we had the same responsibilities and structure. I had unconsciously adapted to my company's ways of managing systems (Anaplan is just one piece of the puzzle among CRMs, the data warehouse, payment systems, expense systems, etc.), and because all these systems needed to be managed in a centralized way, our CoE had slowly developed on its own.
Once you’ve identified the CoE components within your Anaplan practice, it's okay if it doesn't look the same as everyone else — it still counts!
- Continue applying The Anaplan Way (TAW) after go-live to stay agile
Continue being agile by regularly maintaining and improving models; there is no way that a model, once released, will stay as-is! If it never changes, either it is not being used or the business never changes, which is not the case for most companies trying to innovate and adapt to a fast-paced environment.
One of Anaplan’s most attractive components, besides its powerful calculation engine, is the flexibility and ease with which existing logic can be changed and extended. To manage ongoing changes, why not use a small-scale TAW? Manage buckets, prepare frequent releases, define change requests as user stories, and ask users to test, as you would during UAT. And the cherry on top: you can keep using the project management Anaplan app from model development for ongoing improvements. The benefit of doing so is that you can review the initial user stories and compare them with the new requirements, or keep track of changes over time. Learn more or complete The Anaplan Way (TAW) training here.
- Release management strategy
Oftentimes, because you can so easily make changes to a model, the change is made in the moment, which can get out of hand and increase risk. As Anaplan's Catherine Hubbard and Pete Dixon put it in their recent article, The difference between Business Intelligence and Corporate Performance Management systems, for stakeholders to see Anaplan as a trusted corporate performance management system, you must establish and follow change management and testing governance guidelines. This is especially important for logic changes and for more impactful user interface changes.
Releases work best when they are frequent and in small batches. My preference, and the option that works best for us, is monthly releases, but the cadence of meetings and releases will vary based on your organization's needs.
Below are the steps I follow and recommend for a successful release:
- Continuously collect feedback and be transparent about its purpose with your end users/stakeholders. Collect any improvements as a user story.
- On a monthly basis, discuss the prioritization of the feedback with the business owners and reach an agreement/approval on the next steps.
- Once approved, make the technical assessment of the improvement and write the technical user story. Identify how other Anaplan modules or connected models will be impacted; additionally, if data is exported out of Anaplan, add a step in your process to assess the impact on other systems.
- In the monthly meetings with the project team, agree on development, test, and release dates.
- Build the changes and perform unit testing.
- Peer review the development once transferred to a test model.
- Release to testing for the SMEs/end users (usually the requester of the improvement).
- Test the impact the change will have on the live model: regression testing (which I’ve summarized as a lesson of its own, below!).
- Optional: Once all the above steps are completed, most established companies would require the release to be approved during a CAB meeting (change approval board). This is where the change is analyzed in relationship to the entire IT ecosystem and the impact the change might have on other systems; basically, checking if all steps above have been completed.
- Inform end users of model downtime for release and state the new improvements that will be released (e.g., a release newsletter).
- This will help set you up for a worry-free release in the indicated period (but as a double caution I also make a copy of the live environment before and after releases).
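The gate sequence above can be sketched as a lightweight checklist for tracking each user story through a release cycle. This is purely illustrative: the stage names and the `UserStory` structure are my assumptions, not an Anaplan feature.

```python
from dataclasses import dataclass

# Ordered release gates, mirroring the steps above (names are illustrative).
STAGES = [
    "collected",         # feedback captured as a user story
    "prioritized",       # approved by business owners
    "assessed",          # technical user story and impact analysis written
    "scheduled",         # development, test, and release dates agreed
    "developed",         # built and unit-tested
    "peer_reviewed",     # reviewed after transfer to the test model
    "uat_passed",        # tested by SMEs / end users
    "regression_passed", # impact on live data verified
    "cab_approved",      # optional change approval board sign-off
    "released",          # deployed, end users informed
]

@dataclass
class UserStory:
    title: str
    stage: str = "collected"

    def advance(self) -> str:
        """Move the story to the next gate; gates may not be skipped."""
        i = STAGES.index(self.stage)
        if i == len(STAGES) - 1:
            raise ValueError(f"'{self.title}' is already released")
        self.stage = STAGES[i + 1]
        return self.stage

story = UserStory("Add currency override to the P&L module")
for _ in range(3):
    story.advance()
print(story.stage)  # scheduled
```

The point of forcing stories through every gate in order is exactly the CAB idea from the list above: nothing reaches "released" without having passed review, UAT, and regression testing first.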
- Regression testing for controlling impact on your model
Once UAT is complete, make an impact analysis of the changes using production data. Having concurrent model builders make overlapping changes in the same modules at testing time is not good practice; therefore, the testing needs to cover all the changes included in a monthly release.
To test the changes on live data without releasing them to production, the workaround is to copy the production model into the test workspace and select the output modules most likely to be affected by the change. Although this may appear to be a laborious process, it reassures model builders and end users that changes to a live environment are controlled and that only the expected impact is released. This is especially useful with complex models, where it avoids data loss or unwanted side effects.
As for the best way to perform this test, I have a very simple approach. I use an Excel add-in to extract the outputs and create three tabs: before, after, and a delta comparison to identify differences (one test per output). I have not yet found a way to make regression testing more efficient, but I am opening an invitation/challenge for anyone facing the same struggle to propose a solution! It's not uncommon to identify a delta that needs to be explained or corrected. When this occurs, you need to compare the test model against the live model; Anaplan's drill-down functionality can help you find the source of the change.
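The before/after/delta comparison can also be scripted. Assuming the before and after outputs have been exported to CSV (the column names `account` and `amount` below are my assumptions, not Anaplan defaults), a minimal sketch in Python might look like this:

```python
import csv
from io import StringIO

def regression_delta(before_rows, after_rows, key, value, tol=1e-6):
    """Compare a 'before' and 'after' export keyed on a line item,
    returning only the rows whose values differ beyond the tolerance
    (the equivalent of the 'delta' tab)."""
    before = {r[key]: float(r[value]) for r in before_rows}
    deltas = {}
    for r in after_rows:
        k, v = r[key], float(r[value])
        d = v - before.get(k, 0.0)
        if abs(d) > tol:
            deltas[k] = d
    return deltas

# Hypothetical output-module exports (inline here; in practice these
# would be the CSV files extracted before and after the change).
BEFORE = "account,amount\nRevenue,100.0\nCOGS,40.0\nOpex,25.0\n"
AFTER = "account,amount\nRevenue,100.0\nCOGS,42.0\nOpex,25.0\n"

before_rows = list(csv.DictReader(StringIO(BEFORE)))
after_rows = list(csv.DictReader(StringIO(AFTER)))
print(regression_delta(before_rows, after_rows, "account", "amount"))
# {'COGS': 2.0}
```

An empty result means the change had no unexpected impact on that output; any non-empty delta is a row to explain or correct via drill-down, as described above.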
- The value of business partner meetings
Once established in my new role as a Solution Architect, I was introduced to our company's Anaplan business partner, who acts as an anchor between a customer and Anaplan. This gives me a direct channel to the software provider, and this person is not only there for me to check in with; they also help (on a quarterly basis) make sure we're aware of the Anaplan road map and new releases, and plan for developments, like an intensive 1:1 course. I've never missed a single quarterly update meeting, and I consider all my business partners great mentors on my Anaplan journey. They provide the most up-to-date information and offer guidance on our strategy and on putting the pillars of our CoE in place.
Interesting read @Manuela_Apostol. You made me realise how with the right events, team and tools in place one can quickly adapt to the VUCA world with meaningful data and plans while keeping the systems, processes and teams under control (without unexpected issues or without siloed knowledge of functions).
Thanks for sharing!
Thank you for taking the time to read it @AlejandroGomez. It is not 100% bulletproof (mostly with complex models there might still be issues), but any issues come with minimal impact and much less frequency, as with this method we can catch them before releasing them to production.
Great post @Manuela_Apostol, lots of wisdom shared here! I especially enjoyed "Once you’ve identified the CoE components within your Anaplan practice, it's okay if it doesn't look the same as everyone else — it still counts!" You're absolutely right!
Thanks for sharing @Manuela_Apostol, an interesting read; I will be looking to apply the learnings 🙂
Great post @Manuela_Apostol, thank you for sharing your valuable experience in this article. As I recently switched roles to become a model builder at a customer company, I find your points (especially point 2, "...there is no way that once a model is released it will stay as-is," which is very true) really helpful for maintaining the Anaplan ecosystem and seeking continuous improvement while adapting to the current corporate culture.
Good post @Manuela_Apostol 👍️
The testing aspect of change management and of The Anaplan Way is crucial, especially when extending a model's functionality, where added complexity could impact performance. Continually testing and gathering feedback is essential in stopping costly performance problems from building up...
Thank you for sharing these lessons, @Manuela_Apostol. On the topic of regression testing: I also find myself hoping there is a more efficient method of testing model outputs. I was inspired by your challenge, and while I haven't found an easier method outside of Excel, I did create an Excel template of outputs for one of my models. That way, all output modules for the model already have a 'before', 'after', and 'delta' tab. I can make a copy of the template and remap the connections to the current test models. I find remapping to be speedier than generating new connections at the time of testing. While I'm just starting out with the template, I anticipate that having the template ready to go for a model will make the testing task easier/more convenient.
For very large models, I suspect it would be better to split the templates up by output category, since the add-in can bog down Excel if there is too much going on.
Thank you again for sharing the lessons. It motivated me to share the importance of regression testing with the team of model builders that I work on.