This article describes how to use a health check audit on Anaplan models to identify gaps, improve data quality, manage user experience, connect models, and drive user adoption. It is based on experience in maintaining and deploying new models across FP&A and operational business functions as the Enterprise Connected Planning architecture grows across the business with new use cases.
Anaplan Usage & Growth:
Anaplan provides customers with Excel-like functionality, which enables any business user to use the cloud enterprise platform for business planning and analysis without requiring a technical programming skillset. From my observation across different organizations, the connected planning ecosystem grows within a business relatively quickly, as operational units realize the value and want to jump onto the platform. The central team soon runs into a challenging situation: as the number of models increases, it must also deal with predictable issues such as bugs, data quality, business changes, adaptability, and change management.
Typically, an Anaplan solution extends beyond financial planning into other areas of business operations: sales and marketing, operational cost functions, workforce planning, supply chain, inventory planning, and operations planning, through to consolidation of costs, enabling real-time planning and decision making. I strongly believe that to achieve the true potential of a connected planning ecosystem, each use case should talk to the others. However, I have found time and again that teams operate a model as a standalone function, without looking at the full architecture or utilizing core data points from other models. In some instances there is a lack of commitment from the business to adoption, which could be due to various reasons such as lack of time, resistance to change, or team changes. This has a negative impact on delivering value from enterprise-wide connected planning.
How Models Get Tangled:
Secondly, we have learned that "just because you can doesn't mean you should" (J.S. David, 2018). The Anaplan community has many resources on how to model for performance, keeping models logical, simple, and auditable, not duplicating calculations, and designing scalable solutions. Likewise, the community has excellent articles on best practices for model building standards, DISCO and PLANUAL, which set clear standards for model building and governance and, most importantly, explain why it's done the Anaplan Way.
Furthermore, as new use cases grow organically across business functions, I have noticed that maturity on the external aspects that make a model business ready gets blurred. In most organizations, the Center of Excellence (CoE) team is responsible for operationalising the model as a business-as-usual function. Usually, new models are built to 100% PLANUAL standards driven by the CoE, with a defined SOW, and are built to deliver the agreed requirements. But from an internal CoE management standpoint, backlog lists keep building up due to the above-mentioned external aspects such as business adoption, changing business needs, data availability, data quality, and business process changes. For example, updating of master or transactional data is sometimes not automated via API or any other integration means. This could be for various reasons: the data simply does not exist, or the project time-frame could not wait for the data to be defined, leaving the model with manual data uploads. In another example, the model could be a standalone forecast for one business function, not connected to the wider connected planning enterprise. Again, this could be due to a new use case, unaligned hierarchies, or a lack of business understanding.
Therefore, what I have found useful is taking a step back to evaluate the models and asking simple questions such as:
Does the individual model meet current business requirements?
Is the model connected to the wider connected planning ecosystem?
Is data integration fully automated (via API or other integration), or are manual data functions being performed?
Does the model have a UX app built, or have classic dashboards been migrated to the UX?
Has end-user training been performed, and is the model fully used in the forecasting process?
Hence, performing this reflective exercise as a model health check helps identify what is going on with the model. It lets you look at the model through an entirely different lens and focus on the areas that determine whether it meets the health check standards defined by the CoE. The check assesses the model against set criteria, making the process binary (True or False) so that necessary steps can be taken, which can then be managed through the product backlog.
When should you conduct Connected Planning Health Check:
From the CoE perspective there is no set schedule for the health check, but it is good to plan time to perform it at a regular interval. More importantly, keep an eye out for user and business feedback, and monitor the planning and month-end processes to identify disparities.
Finally, the five steps to a connected planning health check:
MODEL MEETS SOW: No pending backlog items affect the core function or calculation logic of the model. All the required core calculation logic is built and tested. The model is fully functional as defined in the SOW.
CONNECTED MODEL: The model is connected to the connected planning ecosystem rather than being a standalone model. Connected models accelerate decision-making.
INTEGRATED MODEL: All master and transaction data from the enterprise source applications is fully integrated and scheduled via the Data Hub: a single source of truth, with data that is live and governed.
UX READY MODEL: Users hate using a platform that is not easy to navigate and interact with, and user experience is key for user adoption. The model's UX delivers user needs and provides a positive experience for planning and reporting. Also keep a record of classic dashboards still to be migrated to the new UX. (This does not affect organisations that started their Anaplan journey after the UX was released.)
OPERATIONAL MODEL: The business does not run any other manual process, Excel workbook, or other application model in parallel. Ultimately, the model is used operationally, and no offline worksheets are used by the business.
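Because each of the five checks is binary, the assessment is straightforward to track programmatically as part of CoE governance. A minimal sketch in Python, assuming hypothetical criterion names and an example model record (none of this is part of any Anaplan API), showing how each True/False result could feed candidate items into the product backlog:

```python
# The five health-check criteria from the article, as hypothetical labels.
HEALTH_CHECK_CRITERIA = [
    "model_meets_sow",
    "connected_model",
    "integrated_model",
    "ux_ready_model",
    "operational_model",
]

def assess_model(results: dict) -> list:
    """Return a backlog candidate for every criterion that is False."""
    return [
        f"Backlog: remediate '{criterion}'"
        for criterion in HEALTH_CHECK_CRITERIA
        if not results.get(criterion, False)
    ]

# Example assessment for a hypothetical standalone forecasting model
# that still relies on manual data uploads.
results = {
    "model_meets_sow": True,
    "connected_model": False,   # not yet linked to the wider ecosystem
    "integrated_model": False,  # manual data uploads remain
    "ux_ready_model": True,
    "operational_model": True,
}
print(assess_model(results))
```

Keeping the output as a plain list makes it easy to paste the failing criteria straight into whatever backlog tool the CoE already uses.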
Any technology roll-out tends to face the above challenges, and any application designed and developed tends to have bugs. However, when these types of gaps occur, it is important to take proactive steps to understand them and ensure that continued development meets the needs of the business and the criteria defined by the CoE. If these gaps are left in operation and not identified early, they could have a disproportionate effect in the long run on achieving data transparency, increasing efficiency, producing reliable results for positive real-time decisions, and delivering real connected ROI.
Asslam is the Principal of Performance Management, leading the enterprise planning program, including the Centre of Excellence, at Fortescue Metals Group. He is passionate about the core concept of connected planning and specializes in business-led transformation that has a direct positive impact across business functions, not just on operating margins. He strongly believes in breaking down silos and empowering businesses with federated delivery to unlock the full potential of connected planning.
He helps solve business problems and provides solutions that connect people, processes, and data with technology, so that businesses can improve their decision-making with a single source of truth and close the loop from strategy through to execution.