In my recent blog post titled Scenario Planning in Anaplan, Part I: The Power of 'What If', I introduced the key steps needed for effective scenario planning in Anaplan and focused on the mechanisms to develop a forecast ('what') and evaluate the impact of forecast events ('so what'). In this second part, we move beyond vision and insight to evaluate actions ('do what').
Like many modeling use cases in Anaplan, there are multiple approaches to take. Below I describe what I have found to be the best way to leverage effective model building techniques and best business practices.
Key Elements: Drivers, Actions, and Playbooks
The base model should be driver-based and include multiple scenarios, as described more fully in Part I. This allows us to use actions to adjust driver values and reevaluate results dynamically.
Each action can be used independently to incrementally adjust driver values (Figures 1, 2, and 3).
Figure 1: Adjusting Capital Expenditures
Figure 2: Adjusting Productivity
Figure 3: Adjusting Driver Values
These impacts can be of various types, including but not limited to:
Productivity changes: adjusting the volume of activity that can be handled by a single resource.
Business or activity volume changes: adjusting the number of business units and/or activity (effort) units. For example, a business unit might be the number of units sold while an activity unit might be the number of cases shipped or calls taken.
Capital expenditure changes and the impact on cash and depreciation.
Changes in non-volume-sensitive operating costs or headcount.
Ideally, actions are combined into playbooks. Our model uses a separate list of actions and then assigns actions to playbooks. In the new UX, we use forms to allow users to create and assign actions on the fly (Figure 4).
Figure 4: Enable playbooks and actions; evaluate impact on P&L
Playbooks and actions can be enabled or disabled through a simple Boolean flag.
The model should have the ability to evaluate the resulting model elements, both including and excluding the impact of playbooks and actions.
I have generally chosen NOT to dimension the entire model and its calculations by playbook or action. In a large model, this would significantly increase model size, particularly since these dimensions would apply across multiple scenarios and other dimensions (products, customers, etc.). Instead, I adjust the driver values by the combined impact of the selected changes.
A consequence of this approach is that driver-value changes are additive. In a more complex model, the modeler might choose to allow for automated interaction between drivers; for example, one action could affect another in a non-linear way. With multiple actions, the analyst can adjust for this manually by customizing actions and enabling them as needed.
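The additive treatment of driver adjustments can be sketched outside Anaplan. Below is a minimal Python illustration; the driver names, actions, and values are hypothetical, not taken from the model described above:

```python
# Sketch: combine enabled actions into a single additive driver adjustment.
# All driver names, actions, and values here are hypothetical.

base_drivers = {"productivity": 100.0, "capex": 500.0}

# Each action adjusts one driver by an increment; playbooks group actions
# and a simple Boolean controls whether an action is applied.
actions = [
    {"driver": "productivity", "delta": 5.0,  "enabled": True},
    {"driver": "productivity", "delta": -2.0, "enabled": False},  # disabled: ignored
    {"driver": "capex",        "delta": 50.0, "enabled": True},
]

def adjusted_drivers(base, actions):
    """Apply the combined (additive) impact of all enabled actions."""
    result = dict(base)
    for a in actions:
        if a["enabled"]:
            result[a["driver"]] += a["delta"]
    return result

print(adjusted_drivers(base_drivers, actions))
# {'productivity': 105.0, 'capex': 550.0}
```

Because the adjustments are simply summed, any interaction between actions has to be modeled explicitly, which mirrors the trade-off described above.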
In very complex Anaplan models, interaction between variables and over time would tend to create circular calculations. We have overcome this through repurposing of the time dimension, or one could leverage external capabilities to find optimal solutions.
The modeler should be able to evaluate the sensitivity of outputs to changes in inputs. For example, what is the impact on profitability of a 5 percent increase in sales, compared to a 5 percent increase in productivity? This helps the modeler identify the actions with the greatest leverage.
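As a rough illustration of this kind of sensitivity comparison, here is a toy Python P&L; the profit formula and all figures are hypothetical assumptions for illustration only:

```python
# Sketch: compare output sensitivity to equal percentage changes in two inputs.
# The toy P&L below is hypothetical, not the article's model.

def profit(sales_volume, price, productivity, cost_per_head):
    revenue = sales_volume * price
    heads = sales_volume / productivity   # heads needed to handle this volume
    return revenue - heads * cost_per_head

base = profit(10_000, 20.0, 50.0, 800.0)

# Impact of a 5% increase in sales vs. a 5% increase in productivity
sales_up = profit(10_500, 20.0, 50.0, 800.0) - base
prod_up  = profit(10_000, 20.0, 52.5, 800.0) - base
print(sales_up, prod_up)  # 2000.0 vs. ~7619.05: productivity has more leverage here
```

In this invented example the productivity lever dominates, but the point is the comparison itself: running equal-percentage shocks through the model reveals which drivers deserve actions.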
The building of these models, and the testing of scenarios, is valuable in and of itself: well-thought-out models help us understand both the likely range and volatility of potential outcomes and the actions the organization can take to prepare, anticipate, and react.
Ensure that all model elements can be adjusted in a driver-based way and that every action is linked to all of the drivers it impacts. For example, changing sales volumes (or product/customer mix) affects not only sales but also any post-sales, volume-driven activity, such as shipping, invoicing, and customer contact. Proper stewardship of the model is also essential. Since the model is likely to be used for quick decision-making, it is critical that corporate subject matter experts are engaged with the model and action assumptions, both in advance and on a regular basis.
In a football playbook, for example, a given play includes actions for each member of the team. In our Anaplan model, multiple actions are connected to a single playbook, so that there is a coordinated set of activity around a desired outcome.
Playbooks are evaluated across each scenario. Consider whether or not driver values would be different for each scenario and over time.
Leverage other analytics in developing actions. For example, in our model we leverage activity-based costing information to identify low-profitability products and customers and target those for action under adverse scenarios (Figure 5).
Figure 5: Profitability analysis can inform potential actions
Consider whether a given action would create positive outcomes under every scenario. In that case, the business may choose to implement the action immediately. Similarly, evaluate whether taking a given action now would insulate the business from any downturn scenario. Like buying insurance, the cost of extra capacity may be worth carrying if it produces sufficient optionality for the business.
Anaplan is an incredible tool for scenario-based planning. Use the approaches outlined here to help boost your ability to forecast the future and evaluate alternative action plans that will enable your business to survive and thrive during changing times.
Please share your comments or reach out if you would like to discuss any specific issues in your business.
When adding a new list member via a Form in NUX, it is not currently possible to choose the parent by selecting it from the list in another grid on the worksheet. This would be similar to the CREATE action in classic dashboards. While you can prompt the user to select the parent (or hard-code it), in many cases the lists are long and multi-leveled, so this is not practical. We use this function extensively in classic, and its absence is impeding our ability to leverage NUX.
The term VUCA—an acronym for Volatility, Uncertainty, Complexity, and Ambiguity—was first used in the business environment in the late 1980s, but rarely has the term been put to the test as it was in 2020. Organizations around the world are united in finding ways to meet the challenge of VUCA, and many have turned to scenario planning as a new tool in their FP&A arsenal. While there is a lot of information on the theory of scenario planning, I would like to present and illustrate some practical considerations and guidelines that I have found useful and effective in building scenario planning capabilities within Anaplan.
Beyond Budgeting and Forecasting
This year, changes beyond our imagination happened: barriers and business operating models that were previously resisted or inconceivable were broken, organizations disappeared and were acquired, and businesses that had never been imagined were created overnight. Very early on in 2020, most organizations realized that the usefulness of their most recently completed budgets was limited, at best. They turned, instead, to more frequent forecasts. Scenario planning does not replace budgeting (if you still do that) or forecasting. Rather, it is the first step to help establish a single plan or forecast based on the agreed set of assumptions and course of action to be taken.
Consider the “Three Whats”
Following basic principles, I often consider the role of the CFO in asking three key questions:
What? What happened or might happen?
So What? What is the impact of those events?
Do What? What options do we have after the fact, and what can we do now, in advance?
It is important not to focus only on identifying possible events and their impact; it is even more important to focus on action. FP&A’s task is to guide the organization in making effective decisions. Those decisions need to incorporate not only the changes in inputs and their impact but also an evaluation of the actions the organization could take after events occur. More importantly, it is critical to consider what actions we can take now to mitigate risk or position the business to exploit opportunity. In this light, some CFOs now talk about scenario management [i] instead of scenario planning.
Key Steps for Scenario Planning
The core steps to be followed in developing and executing scenario plans [ii] include:
Create a base forecasting model, ideally driver-based.
Develop a wide range of potential scenarios (probable and even improbable) leveraging sets of drivers and assumptions.
Narrow the scenarios to a shortlist, including a base scenario. Ideally, this should be done using statistical techniques.
Develop a set of potential actions that could be taken (a “playbook”) and then selectively apply them to each scenario.
Assess risk and identify leading indicators.
Identify positioning actions and their cost and value that can be taken in advance, and potentially identify “no risk moves” that can be taken under all scenarios to mitigate risk or position for growth. Establish detailed operational playbooks for the actions that are most likely and/or consistently most useful.
Incorporate scenario planning into your ongoing financial planning activities.
Translating This to Anaplan
The "Three Whats" provide a framework for developing an effective approach in Anaplan:
Develop a driver-based model.
Where possible, start from your existing forecast model. I have found it generally helpful to build a higher-level model: for example, product details could be rolled up to a line-of-business level, organizational structures and customer groups could be aggregated, and modeling could be done on a quarterly basis instead of monthly (though the horizon should extend longer—ideally two to three years).
Consider the drivers of revenue and build inputs with overrides to allow for scenarios to be rapidly constructed. Ensure that the revenue analysis is appropriately dimensioned to handle required inputs.
Use of Driver Rates and Overrides to project Business Volumes
One of the most impactful changes is often headcount-based. Consider those costs which vary with headcount and develop cost-per-head ratios. Then, measure productivity rates per head based on key drivers in each area (for example, new customers, orders, suppliers, etc.).
Identify those costs that are volume sensitive versus those items that are not sensitive to changes in business volume. Note that this is different from costs that are "fixed." For example, the cost of finance headcount is “variable” in the classic accounting sense but is unlikely to vary much with business volume.
Use of Volume Sensitivity, Variability and Fixed Cost Growth Rates to Forecast Headcount and Operating Expenses
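The mechanics of forecasting headcount and operating expense from drivers can be sketched as follows; the productivity ratios, volumes, and growth rates are hypothetical assumptions for illustration:

```python
# Sketch: forecast headcount and operating expense from business-volume drivers.
# All ratios and volumes below are hypothetical.
import math

def forecast_headcount(volume, productivity_per_head):
    """Heads required to handle a driver volume, rounded up to whole heads."""
    return math.ceil(volume / productivity_per_head)

def forecast_opex(volume, base_volume, volume_sensitive_cost,
                  fixed_cost, fixed_growth_rate):
    """Volume-sensitive costs scale with the driver; 'fixed' costs grow at a rate."""
    variable = volume_sensitive_cost * (volume / base_volume)
    fixed = fixed_cost * (1 + fixed_growth_rate)
    return variable + fixed

heads = forecast_headcount(volume=12_500, productivity_per_head=400)  # e.g. orders per head
opex = forecast_opex(volume=12_500, base_volume=10_000,
                     volume_sensitive_cost=2_000_000, fixed_cost=500_000,
                     fixed_growth_rate=0.02)
print(heads, opex)  # 32 heads, roughly $3.01M
```

Separating the volume-sensitive portion from the growth-rate-driven portion is what lets scenario changes in business volume flow through to headcount and expense automatically.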
If you have an activity-based cost (ABC) model, use it to improve your understanding of drivers and their associated costs.
If you don’t have an ABC model (or in addition to it), look to other analytic tools such as regression analysis to confirm the relationship between and amongst drivers and revenues and costs.
Where appropriate, include a high-level capability for forecasting critical balance sheet items (cash, receivables, investments, inventory, and debt) and ensure that you have the capability for incorporating capital expenditures.
In Anaplan, I have found it preferable, for this use case, to treat scenarios as a numbered list, and not to use “Versions.” In this way, users can rapidly add new scenarios, copy assumptions from one scenario to another and easily compare across scenarios. In place of switchover, use a Time Settings module to indicate actual and forecast periods, and use Dynamic Cell Access (DCA) to restrict input to forecast periods. Leave input “hooks” for actions and plays to be incorporated.
Add "scenarios" as list dimensions to your forecast modules. We will build in action controls to add the action variables to the scenarios in the second part of this blog series.
Identify a broad range of scenarios and their inputs. (This is the “What.”) Consider only what might happen, not what you would do (actions) in response. For example, what if revenue in a certain product line, geography, or customer segment had an order-of-magnitude change—either positive or negative? Follow D.I.S.C.O. principles by isolating input variables, dimensioned by scenario, from calculations and outputs. Consider generating an extensive set of inputs for each variable to construct a large set of scenarios [iii] using the power of the Hyperblock.
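Generating a large scenario set from discrete levels of each input variable amounts to a cross-product; here is a small sketch of the idea, with hypothetical variable names and levels:

```python
# Sketch: generate a large scenario set by crossing discrete levels of each
# input variable. Variable names and levels are hypothetical.
from itertools import product

driver_levels = {
    "sales_growth": [-0.30, -0.10, 0.0, 0.10, 0.30],  # includes large swings
    "productivity": [-0.05, 0.0, 0.05],
    "input_costs":  [-0.10, 0.0, 0.10, 0.25],
}

scenarios = [
    dict(zip(driver_levels, combo))
    for combo in product(*driver_levels.values())
]
print(len(scenarios))  # 5 * 3 * 4 = 60 scenarios
```

Even a handful of levels per driver multiplies quickly, which is why isolating inputs from calculations (the D.I.S.C.O. idea) matters: each generated combination becomes one member of the scenario list.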
Measure the outcomes of each scenario. What is the impact of that change in the short and long term? (This is the “So What.”) Identify the probability of each scenario and its impact. For example, under what scenarios could the business become insolvent? Note that it is not always appropriate (although tempting) to multiply the outcome by the probability. Consider 2020—few predicted the events but the outcomes were extraordinary by most measures.
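A small numeric sketch of why simple probability-weighting can mislead; the probabilities and impacts below are invented for illustration:

```python
# Sketch: expected value vs. tail outcome. Probabilities and impacts are hypothetical.
scenarios = [
    {"name": "base",     "prob": 0.70, "cash_impact":   5.0},  # $M
    {"name": "downturn", "prob": 0.25, "cash_impact": -10.0},
    {"name": "severe",   "prob": 0.05, "cash_impact": -80.0},  # possible insolvency
]

expected = sum(s["prob"] * s["cash_impact"] for s in scenarios)
worst = min(s["cash_impact"] for s in scenarios)
print(expected, worst)  # expected is only about -3, but the worst case is -80
```

The probability-weighted figure looks manageable while hiding an outcome the business could not survive, which is the point of evaluating scenarios individually rather than only in aggregate.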
Develop plays and actions.
Having developed a set of scenarios and potential outcomes, the analyst can then proceed to apply different “plays” (the “Do What”), either in advance or during the scenario. For example, building redundant assets has a cost but the NPV of that investment could be positive under any scenario (in which case it should be adopted now).
In the second part of this blog series, we will look at an effective way to develop plays and apply them to scenarios in Anaplan.
[i] See, for example, an interview with Coca-Cola CFO John Murphy.
[ii] This incorporates research from an upcoming AFP Guide on Scenario Planning.
[iii] For an interesting perspective on this, see this article from McKinsey.
I am looking to use the mobile app to easily capture input data (for example, inventory count). When a user touches the grid, the grid automatically maximizes. I would like the option to activate for input without maximizing, so that other items on the screen are visible. For example, selecting a list member in another grid to synchronize to the data entry grid.
There are a large number of folks working to complete certifications before Anaplan's Dec 31 deadline. Some of us might be trying to do that on weekends. It might be really helpful if the downtime could be postponed until January. Just my $0.02.
I am attempting to build a data entry board in NUX to capture user input (an inventory count) on a mobile device. When I click on the module to enter data, it maximizes that module on screen. I would like it to remain active for input so that the other cards remain visible. What am I missing?
That's correct @Austinv, but it assumes that both periods are within the Time Range in the Time module. This may not always be the case - for example, when computing the length of a contract or relationship whose start time precedes the Time Range or extends beyond it. Mitch
Are there any further developments on this? It looks like there is different cell functionality on boards vs. worksheets, but still no drill-down. We could use worksheets with insight panels to direct to other sources, but it's difficult to predict the types of drill down analysis that may be required.
FWIW, we have found that if you set the "Use Top Level as Default" box in the list configure tab, the selector does default to that top level. Not the same as picking a default but that seems to help for some use cases.
It would be very helpful to be able to re-order items, as in classic. We have also found that when the top level in a list is de-selected, the sort order shows the totals at the top, and we have found no other way to control this.
Thanks, @usman.zia. The use case here is capturing data for strategic initiatives (building and consolidating business cases). Ease of use for rapid modeling is critical. Users can always fine-tune items at lower entry periods (quarters) after entering data by year. Since there are a significant number of lines to be entered, reducing the number of periods for entry is helpful.
I was pleasantly surprised to see that "basic" breakback persists into the New UX. This was particularly helpful on a board with two grids: one using years and another, synced grid using quarters or months. Entering data by year into a strategic input grid and then allowing it to be spread across quarters or months in the child grid makes for a great UX. It will be nice to see a fuller set of breakback features (display, hold, etc.) implemented in the future.
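For readers curious about the mechanics, a basic proportional spread of this kind can be sketched in a few lines of Python. This illustrates the concept only; it is not Anaplan's actual breakback algorithm:

```python
# Sketch of a basic "breakback" spread: a value entered at the year level is
# distributed to quarters in proportion to the existing quarterly profile.
# The quarterly profile below is hypothetical.

def spread(total, profile):
    """Allocate `total` across periods proportionally to `profile`.
    Falls back to an even split when the profile sums to zero."""
    base = sum(profile)
    if base == 0:
        return [total / len(profile)] * len(profile)
    return [total * p / base for p in profile]

quarters = [100.0, 150.0, 150.0, 100.0]  # existing quarterly values
new_year_total = 600.0                    # user enters a new annual figure
print(spread(new_year_total, quarters))   # [120.0, 180.0, 180.0, 120.0]
```

The even-split fallback matters in practice: when a year is entered against quarters that are all zero, there is no profile to scale, so the value has to be divided equally.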
Most organizations, at some point, will need to find ways to significantly reduce spending. We are all familiar with general exhortations to spend less, as if managers willingly commit to frivolous spending and can easily "cut a little fat." Sure, there are always some small things that can be reduced, although in practice this generally turns into deferral—pushing out spending to a future quarter in the hope that good times will return. Cutting or deferring so-called "discretionary spending" has consequences. For instance, deferred training has an impact on achieving objectives on a timely basis, and reducing marketing spend or travel may affect sales or retention.
One technique enjoying a "re-birth" due to the use of new technologies is Zero-Based Budgeting (ZBB). The main goal of ZBB is to create transparency around "what" is spent, and more importantly, "why." It seeks to answer questions surrounding which spending initiatives or items may be deferred or reduced without a net negative impact on business objectives (ideally on a discounted basis). Identifying costs for targeted reduction and realizing the savings are not the same thing. While the re-birth of ZBB is exciting and will help to navigate the tricky waters of cost planning, it is still important to be aware of the two main "myths" of cost reduction programs before jumping into your next planning session. Keep reading to find out what should be avoided the next time cost reduction planning comes up.
Myth 1: Costs Are What They Appear to Be
The most insidious problem with "go away" costs is understanding the allocated costs that are often embedded in a cost structure. Consider an IT department that plans to migrate an on-premises application to a cloud application. What costs might go away? There may be ongoing license or maintenance costs that would cease (and likely be offset by new subscription costs). However, would the server (or mainframe) go away? Only when the entire hardware platform and supporting infrastructure components are removed would those costs be eliminated (remember that most server applications are virtualized). This is an example of a reverse-step-function consideration.
In our example, let's assume an entire server could be removed from the platform. IT shows the operating costs of the server are $1,000 per month and estimates a savings of $12,000 per year. But wait, the $1,000 isn't one amount, it comprises a number of allocated items—a bit of network, a piece of the data center, some portion of staff costs, and then some depreciation. Which costs would actually go away? In practice, only a small portion of costs are directly traced to the application, and costs only go away when a major step can be addressed—shutting down an entire data center or a server farm, migrating to an outsourced facility, etc.
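The arithmetic of this example can be made concrete with a short sketch. The component breakdown and the "avoidable" flags below are hypothetical assumptions:

```python
# Sketch: decompose an allocated monthly server cost into components and flag
# which would actually be avoided if the server were removed.
# The breakdown and avoidability flags are hypothetical.

components = [
    {"item": "network share",      "monthly": 150.0, "avoidable": False},
    {"item": "data center share",  "monthly": 250.0, "avoidable": False},
    {"item": "staff allocation",   "monthly": 300.0, "avoidable": False},
    {"item": "depreciation",       "monthly": 200.0, "avoidable": False},  # sunk
    {"item": "direct power/port",  "monthly": 100.0, "avoidable": True},
]

allocated = sum(c["monthly"] for c in components)                    # the "$1,000" view
true_savings = sum(c["monthly"] for c in components if c["avoidable"])
print(allocated * 12, true_savings * 12)  # $12,000 apparent vs. $1,200 real
```

In this invented breakdown, only a tenth of the "savings" is real until a larger step (an entire data center or server farm) can be addressed, which is exactly the reverse-step-function point above.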
Myth 2: You Can “Will” Costs to Go Away
Organizations need to ensure that costs to be reduced are in fact costs that will "go away." In many cases, costs don't really go away; they just go on vacation. And like a good vacation, when they return, they often come home with friends, and restarting programs often costs more than if things had continued. There is lost momentum, or perhaps personnel have moved on.
A frequent target for go-away costs is personnel, since those costs are often an organization’s largest line item. However, reducing those costs, whether through deferring hiring or aggressive downsizing, is often difficult, particularly in tight labor markets. As tempting as it may be to reduce the size of one or more teams, unless your organization is grossly overstaffed or you are improving the automation of the process itself, reducing personnel costs is unlikely to be a long-term solution and may actually hinder your business output.
It is critical to identify the actual spending items that have changed if efforts are to be successful. Willing costs to “go away” during a planning exercise is not the same as tracking the achievement of cost reduction efforts. While organizations can measure cost changes in aggregate, it is often difficult, if not impossible, to identify the individual costs that were actually reduced; different cost items may move in different directions, leaving only a “net” indicator. The real enemy of sustained change is the absence of measurement. Without a sustained program to identify and then measure real go-away costs, efforts to reduce costs will be fleeting, at best.
So, What Can an Organization Do to Plan and Control Costs?
The answer lies in creating a transparent mechanism for planning and measuring costs based on the activities that an organization consumes to sell and support its products, services, and customers. Some organizations have developed Activity-Based Cost (ABC) models that allocate costs. This is an important first step, but a more comprehensive approach includes three additional elements:
Strong traceback to understand the original costs that make up an allocated amount and the variability of those original cost items. This can then be aligned with the business drivers to understand the potential cost sensitivity to business volume.
Beyond cost allocation, the ability to plan for costs using this same ABC model.
The ability to track actual cost spend against ZBB “lines,” so that achievement of cost reduction initiatives can actually be measured.
More About Connected Planning:
Connected Planning in the Age of Artificial Intelligence
The State of Connected Planning Trends Review
Transforming IT Project Planning for CIOs
While this is by no means a comprehensive guide to cost reduction, it should provide some starting parameters to prevent you from going off track early in the process. In future articles, we will be exploring the ways that these capabilities can be developed and successfully employed so that costs can be successfully planned, measured, and managed.
Mitch Max is the founder of BetterVu and has over 25 years of experience in guiding organizations in measuring and managing performance across a variety of industries, with a deep focus on activity-based costing and planning. Follow his blogs at www.BetterVu.com . He is always interested in learning and exchanging views and can be reached here.
David, we're having a very enjoyable time taking advantage of this feature on some of our larger client models. Well done! We've noted that when using Time Periods to format a line item (i.e., for a user to select a time period), the entire superset is presented. This can be awkward when the superset is large (i.e., contains multiple years of history, etc.). Have you considered an enhancement where the Time Period format could apply either the Model Calendar or a specified Time Range to a formatted line item?