Share your model design best practices – November 2024 Community Challenge
We’re excited to announce our second Best Practices Challenge of the year! This Challenge series is open for Community members to share their expertise and tips on a specific topic. Join us to spotlight your knowledge and explore fresh ideas from other members in Community!
For our second Best Practices challenge, we’re focusing on model design. How do you plan successful model layouts, what steps do you go through, what do's and don'ts do you consider? How do you ensure minimal redesign throughout the project, and plan a model with the flexibility and scalability to expand in the future? This Challenge is a great opportunity for you to share your insights with the community and learn from other Anaplan pros!
How to participate
- The Best Practices Challenge around model design kicks off today, November 7, and concludes on November 27.
- Share your best practices related to model design in Anaplan on this post. Whether it’s a detailed write-up, a short tip, or even a video, we welcome all formats!
- Explore model design tips shared by your fellow Community members.
What’s in it for you?
- Recognition: Showcase your model design expertise and stand out as a Community thought leader!
- Learn: Check out contributions from newer and seasoned professionals in the Anaplan ecosystem.
- Earn a Badge: As a thank you for your participation, everyone who shares their best practices will receive an exclusive Community Challenge badge. It’s a fun way to show off your contribution!
- Earn a shout-out in our upcoming event: on December 4, we’ll be hosting a Challenge recap event discussing model design best practices. Participants' responses will be highlighted at this event.
Participate Today
This is a great opportunity to exchange insights, tips, and innovative ideas with fellow Anaplan professionals. Join the model design Best Practices Challenge to contribute your expertise and learn from others in the Community!
Comments
-
Anaplan Champions!
So, what do you think about model design best practices? Tough question, right?
Listen, I can go on all day about this topic, but I'll keep it salient. Honestly, if you REALLY want to make a good investment in model design, start with the Level 3 coursework. There's no better way to start. Great, that means I can skip covering process design, user stories, UI wireframing, success criteria, testing, and model schemas (the foundations of good model design) because the coursework already covers them.
(Image Credit: ChatGPT, DALL-E 2024)
So, here are my top suggestions. I realize some of these will be hard to take in, but coming from 9 years of implementation experience I can vouch for all of them. Hope you don't find them too controversial.
- Intimately know the industry and planning processes you are building for. Do not rely on customers to describe the technical nuances of the process. If you can anticipate the challenges likely to be faced, you're on the right track!
- Invest in developing a productive relationship with IT or with the stewards of the source data. I've found food helps, a lot!
- Truly understand how Anaplan's bulk APIs work.
- You will only get limited help from your customer on how to properly design the data hub and to synchronize the data hub to the applications. So make these user stories non-negotiable.
- Do not underestimate the Planual. This document is not dogma for dogma's sake. Have an incredibly good reason before breaking its rules (which, btw, should be rare).
- Read and follow @rob_marshall Data Hub Peak Performance.
- Do not use Anaplan as an ETL tool. Bad idea. If you don't have access to one, then get familiar with Anaplan Data Orchestrator (ADO) or find an open-source solution.
- Own the design completely. Think DevOps and take it seriously.
- Let your customers change their minds, a lot. It happens and frankly, that's why they bought Anaplan, so it can adapt properly to their process.
- Document everything you do because most of us cannot remember what we did 24 hours ago, let alone, 3 months ago. If you have to add formulas to your model schema, go for it!
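On the bulk API point above: most of "truly understanding" it comes down to knowing that uploads are chunked and imports run asynchronously as tasks. Below is a minimal Python sketch of chunked file upload in the shape of Anaplan's Integration API v2; the endpoint path, `AnaplanAuthToken` header format, and 10 MB chunk size reflect my reading of the v2 docs, and `workspace_id`/`model_id`/`file_id` are placeholder parameters, so verify everything against the current API reference before relying on it.

```python
import urllib.request

API_BASE = "https://api.anaplan.com/2/0"  # Integration API v2 base (verify against current docs)

def chunk_bytes(data: bytes, chunk_size: int = 10 * 1024 * 1024):
    """Split an upload into ordered chunks; the bulk API accepts files as numbered chunks."""
    for offset in range(0, len(data), chunk_size):
        yield data[offset:offset + chunk_size]

def upload_file(token: str, workspace_id: str, model_id: str, file_id: str, data: bytes):
    """PUT each chunk to the file's chunk endpoint; afterwards an import action
    can be run as an asynchronous task against the uploaded file (not shown)."""
    for n, chunk in enumerate(chunk_bytes(data)):
        url = (f"{API_BASE}/workspaces/{workspace_id}/models/{model_id}"
               f"/files/{file_id}/chunks/{n}")
        req = urllib.request.Request(
            url, data=chunk, method="PUT",
            headers={"Authorization": f"AnaplanAuthToken {token}",
                     "Content-Type": "application/octet-stream"})
        urllib.request.urlopen(req)  # raises on HTTP errors
```

Once you internalize this chunk-then-run-task pattern, designing the data hub sync and debugging integration failures both get much easier.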
9 -
I agree with everything @JaredDolich laid out, and want to just dive a little bit deeper into one of the areas he highlighted from L3: UI Wireframing.
I think this is a critical step in the launch of a new planning model, because if you build the world's greatest model and follow all of the technical best practices, but no one uses it because they hate the UI… you didn't build the world's greatest planning model! And all your hard work is for nothing.
UI Wireframing is a great tool for change management. It helps to give the user an idea of what something is going to look like before it is built. It also allows them to provide feedback and then see their inputs come to fruition. Seeing their input in the final product helps with buy in, which is contagious and helps build excitement with the user base as opposed to fear.
Putting in the effort up front to build good wireframes will also inform how you design the model from the back end. Understanding what the end users are going to see helps you know how to architect the model. Last-second changes for usability usually come at the cost of performance, so eliminating those as much as possible is advisable!
Building onto the 1st point from Jared, this is also a great opportunity to “flex your knowledge muscles” a little bit, and provide examples of how this has been achieved before, and why you would advise doing it that way. This is an excellent way to build early trust and also ensure the implementation is successful.
I can tell you from experience, spending a little time up front, saves you a LOT of time/pain/frustration in the end.
5 -
I agree that wireframing UX pages is key to the success of an Anaplan implementation. I have also learned that connecting UX wireframes to drawn-out user journeys helps a lot in designing good usability. In my user journey maps I connect the different UX wireframes along the intended journey of the user through the model. From these maps it's easy to spot whether the wireframes are missing something or the user journey is at risk of becoming confusing.
3 -
When designing models in Anaplan, it’s easy to get caught up in technical details and best practices, but the most successful models start with a clear understanding of the user’s needs and a focus on simplicity. Over the years, I’ve learned that a few foundational principles can make all the difference in building a model that is not only scalable but also intuitive and adaptable. Here’s what I’ve found works best:
Prioritize Simplicity and Usability
One of the biggest mistakes you can make is overcomplicating the model. While it might be tempting to build complex workflows and calculations, it’s critical to design with the user in mind. Think about how easy it will be for end-users to navigate and update the model. A clean, simple interface with clear navigation will always outperform something that’s overly complex, even if it’s technically sophisticated.
Embrace Flexibility—The Business Will Evolve
One of the reasons clients choose Anaplan is because they need a model that can adapt to change quickly. So, while it’s important to design for the current state, you have to bake flexibility into your solution. Your model needs to handle evolving business requirements without requiring a full overhaul. Build in the ability to modify assumptions, add new dimensions, or shift processes as the business grows.
Get the Data Flowing Smoothly
Data integration is often where things go wrong, especially if you’re pulling data from multiple sources. A well-designed data flow between your Data Hub and applications is essential for maintaining model accuracy and performance. Spend time upfront defining how data will be synced, transformed, and loaded, and ensure that the process is automated as much as possible. If you get this part right, your model will run smoother and scale more easily.
Think About Long-Term Maintenance
Design your model with future maintenance in mind. This means building in the right level of documentation, modularizing your model so it can be easily updated, and leaving room for improvements. Models should not be static—they need to evolve with new features or changes in business processes. Keeping things organized and flexible will save you (and your team) a lot of headaches when it’s time for updates or tweaks down the road.
Focus on Performance from the Start
Performance is often an afterthought, but it shouldn’t be. Test performance early and often to ensure that the model remains responsive, even as the volume of data grows. Simple things like minimizing the use of volatile formulas, reducing unnecessary calculations, and optimizing your modules can make a big difference in speed and efficiency, especially when working with large datasets.
Involve Stakeholders Early and Often
Getting stakeholder buy-in isn’t just about showing them the final product—it’s about involving them in the design process from day one. Encourage feedback early and often, especially when it comes to defining business rules, reporting needs, and UI preferences. The more you involve them in the process, the more likely they’ll be invested in the success of the model, and you’ll get their feedback when it’s easier to make adjustments.
Data Governance Is Key
Data integrity and governance can’t be overlooked. Establish clear data ownership and define rules for data entry and updates. Ensure that your model has appropriate controls and validation rules in place to keep data accurate and consistent. A model built on unreliable data is doomed to fail, no matter how well-designed the rest of it is.
Balance Standardization and Customization
Standardization is important to maintain consistency and scalability, but don’t be afraid to customize when necessary. Some departments or processes may require unique functionality. The key is striking the right balance between standardizing core processes and allowing for customization where it truly adds value to the business.
Document Everything—Even the Small Stuff
One of the most underrated best practices is documentation. As much as you think you’ll remember how you designed a module or why a specific formula was implemented a certain way, it’s easy to forget details down the road. Documenting not just the logic but also why certain decisions were made will save you and your team a lot of time, especially when the model needs to be updated, debugged, or handed off.
Don’t Be Afraid to Iterate
No model is perfect from the start. You’ll likely need to make adjustments after initial testing, especially as you get real-world feedback from users. Don’t be afraid to iterate on your design. A model that can evolve and improve over time will be more effective than one that’s set in stone. Keep refining it based on feedback, performance testing, and changes in business needs.
In Conclusion
Great model design is all about understanding user needs, anticipating change, and creating a system that’s flexible, scalable, and easy to maintain. Focus on simplicity, performance, and clear data governance while keeping the lines of communication open with stakeholders. And always remember: the best models are the ones that can evolve as quickly as the business does.
By investing time upfront to get the design right—and by being open to adjustments—you’ll build a solution that’s not only functional but also future-proof.
5 -
The foundation on which Anaplan models are built is lists and their associated hierarchies. If you get the foundation wrong, the cost of remediation increases exponentially the later a problem is discovered in the project life cycle.
Understanding your customer's master and transactional data, checking data quality and validating data assumptions as early as possible is an important exercise.
Basic master data checks for your list hierarchy design (such as uniqueness of the codes being used, and whether a list item is missing a parent or has multiple parents) establish a framework for determining if your list design is on the right track and whether the quality of the customer's data is ready for Anaplanised planning. I've found on several projects that part of the customer's Anaplan journey is a need to remediate their data in order to truly reap the benefits of using Anaplan. Unfortunately, customers remediating their data while the model is being built tends to occur more often than desired, necessitating a lot of rework. Generally, customers tend to tell you their master data is clean at the start of an engagement, an assumption which later fails during testing. It is for this reason that we now have a data profiling tool that does basic data quality checks at the start of an engagement.
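To make those checks concrete, here's a minimal, dependency-free Python sketch of the kind of profiling such a tool does. The record shape (flat rows with `code` and `parent` keys) is a hypothetical extract of the hierarchy, not any particular tool's format.

```python
from collections import Counter

def profile_hierarchy(records):
    """Basic master data checks before committing to a list design:
    duplicate codes, items with no parent, and items claiming multiple parents."""
    codes = [r["code"] for r in records]
    duplicate_codes = [c for c, n in Counter(codes).items() if n > 1]
    orphans = [r["code"] for r in records if not r.get("parent")]
    seen_parent = {}
    multiple_parents = set()
    for r in records:
        # First parent seen for a code is the baseline; any different parent flags it.
        prev = seen_parent.setdefault(r["code"], r.get("parent"))
        if prev != r.get("parent"):
            multiple_parents.add(r["code"])
    return {"duplicate_codes": duplicate_codes,
            "orphans": orphans,
            "multiple_parents": sorted(multiple_parents)}
```

Running a report like this against the customer's extract in week one, rather than discovering the failures in UAT, is the whole point.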
On profiling transactional data, I've recently been loading transactional data into a Polaris environment to measure the level of sparsity, to determine which types of datasets are best suited to Polaris vs. Classic. It's good that the most recent licensing model includes Polaris in its Enterprise license, as it was sometimes difficult to get permission from customers to load their transactional data into our Polaris environment for sparsity profiling.
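As a rough sketch of what that profiling measures: sparsity is one minus the share of the dense dimensional space that actually holds data. The helper below is illustrative (the record keys and the `(dimension, member_count)` input shape are assumptions, not a real tool's API):

```python
from math import prod

def measure_sparsity(transactions, dims):
    """Estimate sparsity of a transactional dataset against its dense
    dimensional space. `dims` is a list of (field_name, member_count) pairs;
    result near 1.0 = very sparse (Polaris territory), near 0.0 = dense (Classic)."""
    # Distinct populated intersections across the chosen dimensions.
    populated = {tuple(t[name] for name, _ in dims) for t in transactions}
    dense_cells = prod(count for _, count in dims)
    return 1 - len(populated) / dense_cells
```

For example, 1,000 populated SKU/store/week combinations against a 100 × 100 × 10 dense space is 99% sparse, which is the kind of number that makes the Polaris-vs-Classic conversation easy.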
Besides list design and verifying platform choice, another key design consideration is performance. Supply and demand planning applications tend to have a higher risk of performance degradation due to the size of the datasets and the complexity of the business processes they support. One of the things to look at is ensuring that calculations are only "connected" where they need to be. For example, in a trade promotions application you have the following processes:
- Promotion creation
- Promotion review and approval
- Promotion execution and tracking
- Promotion financial accruals
- Promotion claims
- Promotion closure
In a fully connected model, at the time you insert a new promotion list item, the model would be calculating what its approval status is, how much in sales it has made, whether accruals need to be posted and how much, and whether it is ready to be closed. All these calculations, and you've only created a new promotion list item! This slows down performance (as I've personally experienced): data entry for a new promotion would take several seconds just to update the start and end dates. Therefore, look at connecting calculations only where required. When creating a new promotion, you don't yet know whether it will be approved, so there's no value in calculating what its financial accruals and claims should be. It therefore makes sense not to connect the promotion creation process to the rest of the processes until the promotion has been approved. There are several design patterns that can achieve this. One is to create a "temporary" promotions list and, after the promotion has been approved, copy the item across to the permanent (persisted) promotions list, which is connected to the other processes.
4 -
Hi All,
Below are the few suggestions from my side:
1: How to restrict a new user to one specific model when adding them to the workspace
When adding a user to a model, avoid assigning them workspace admin rights right away.
Instead, add them as a standard user with "full access" to the specific model. Once they are added, you can then update their role to workspace admin if needed.
This approach ensures that the user will only have access to the model where they have been granted "full access."
2: Use flat lists to store metadata
Use flat lists to store metadata (in system modules) and avoid having hierarchies in the data hub. Hierarchies should only exist in the main planning models.
3: Never delete and reload lists as a daily occurrence
Use a unique key to update values rather than deleting and reloading. Also consider adding a Boolean (TRUE) field to the data source so you know which records have been imported.
Clearing and reloading a list increases the structural changes within a model and will increase the likelihood of a model save, which in turn increases the import time.
4: Avoid using emojis
Emojis can cause issues with integrations. If you're going to use emojis, restrict them to pages that aren't going to be used in an integration process.
5: Time Range Naming
Keep the naming short using the FYxx-FYyy format. This format allows the administrator to see the name of the Time Range in the module blueprint without referring to the Time Range itself.
6: Workspace Separation
Keep development and production models in different workspaces to avoid user-specific data loss in the production model.
7: Revision Tag naming
Major.Minor.HistoryID (e.g. 2.1.123456) or Date.Version_HistoryID
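Suggestion 3 above (update via a unique key rather than delete-and-reload) can be sketched as a simple diff of the source extract against the existing list. The `code` key and helper name are illustrative, not an Anaplan API:

```python
def plan_incremental_load(source_rows, existing_codes):
    """Split a source extract into inserts vs. updates keyed on a unique code,
    so the list is never cleared. Codes no longer present in the source are
    returned as 'stale' (to be flagged, e.g. via a Boolean, not bulk-deleted)."""
    to_insert, to_update = [], []
    for row in source_rows:
        (to_update if row["code"] in existing_codes else to_insert).append(row)
    stale = set(existing_codes) - {r["code"] for r in source_rows}
    return to_insert, to_update, stale
```

Only the insert bucket causes structural change in the model; the update bucket is pure data, which is exactly why this pattern keeps daily loads fast.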
I hope this helps someone who is starting to learn Anaplan.
4 -
Mixed module (multi-dimensional vs. combined hierarchy list) design:
Multi-dimensional:
Anaplan is designed to be simple to use, so we should always focus on keeping the modelling as simple as possible for flexibility and simplicity. We should aim to use native multi-dimensional modules wherever possible.
Combined hierarchy:
For exceptional scenarios where data sparsity is huge, the combined hierarchy list approach can be considered, but only for specific datasets. As it increases the effort and complexity of the model (integration, data reconciliation, master data maintenance, etc.), we should aim to have as few combined-hierarchy modules as possible.
Balance between NUX and add-ins (Anaplan XL / Excel Add-in) for easy user adoption:
NUX:
The NUX can be considered for analyzing and processing large data volumes. NUX pages provide better collaboration, security, and cloud capabilities, and can efficiently process large datasets. Because data is accessed and processed through the web client, no setup or installation is required on the client system, and extending model data access is quick with pre-defined authorizations.
Excel Add-in:
Excel Add-in components can be considered for low-volume data reconciliation tasks. They're also the best option when the requirement is to perform additional reconciliations by leveraging native Excel features to compare against data already stored in Excel. Some end users prefer to view data in Excel and are more comfortable processing and analyzing data in the Excel interface.
Currently we have the following add-ins to integrate with Anaplan model data:
- Anaplan XL (previously FluenceXL)
- Excel Add-in
- Anaplan for Microsoft 365
- Anaplan Add-on for Google Sheets
4 -
These are the learnings I want to share from my experience so far, although I'm still learning a lot in Anaplan, as every time I find I'm a newbie again.
1. Analyze your inbound and outbound data to decide the time setup for the model; discuss with stakeholders up front to decide on the model calendar type and range. Take only the future and past years you actually need, to save model size.
2. Build dev and prod models in different workspaces to avoid any user-specific data loss in production.
3. Whenever you create a dev model from scratch through a revision tag, be ready to load data into the dev model: the revision tag copies the structure, but the data needs to be loaded again.
4. Avoid complex formulas that slow performance, e.g. where the end user waits a long time for a page to open.
5. Regularly clean up your list subsets to save space.
6. Make your actions and modules auditable, and set unused line items to the No Data format.
7. Prefer model-to-model imports over file-based imports to improve performance.
8. Use DISCO principles while designing any model: separate Data, Input, System, Calculation, and Output modules to reduce model design complexity.
9. Avoid hardcoding. If anything needs to change, the design should allow changing it in one place, with that line item/filter referenced everywhere else so it updates automatically.
Hope this helps some of the folks who are starting to learn Anaplan.
4 -
A few small observations:
1. Take a model backup every day and archive it, so that if any data mismatch issue comes up we can compare against the existing data.
2. If you have a UAT model for testing, never change the mode from Deployed to Standard. That will break the ALM chain and we won't be able to use that test model any longer.
3. Use multiple time ranges instead of a longer model calendar.
4. Don't use Versions in your modules if it's not necessary; this increases the cell count of the module.
5. For lists, always use the delete-and-load method. This helps to keep the list items, and the data, accurate.
1 -
-
Wow, so many great tips!
Many of you have mentioned iterative design and that customers will change their minds over time, absolutely spot on! For those of us focused on transformation♻️, we are frequently building MVP solutions around evolving processes. In these circumstances it is important to know what you need today (aka the MVP) and where you are headed tomorrow. If there are critical components that will be needed to support the future-state vision, these should be assessed within the initial design. For example, if the MVP doesn't address the long-term foundational needs, then you may find yourself in rebuild mode later. This isn't inherently a bad thing, but it can be incredibly frustrating if that path isn't well understood from the outset 🗺️.
Eyes wide open is my philosophy, as much as you can try to predict the future🔮. Where is your organization headed? What are the core capabilities needed that they can't support today? Are there decisions being made because of upstream dependencies that will get resolved over time? In short, ask questions, lots of them. There may be a need to balance speed vs. completeness ⏰; be sure to understand where the external pressures to deliver "something" are creeping in, and align the roadmap accordingly. The vision will change over time, without a doubt, but by helping your leaders understand the potential tradeoffs and iterative design considerations, you will be seen as an invaluable partner.
3 -
I've been mulling my thoughts on this, and thankfully, by coming in late in the day, lots of people have already made comments similar to what I've been thinking.
But sometimes you've just got to start. Like writing a novel (of which I have no experience, but my partner is currently writing one), you think you know where you're going, but over time areas evolve.
Yes you want to make things flow logically but occasionally you need to do things slightly off-grid which you hadn't envisaged. Being multi-dimensional is great but it soon eats space and not everyone is blessed with Hypermodel workspaces/Polaris.
Also, what problem is being solved here? Does the end user really need an all-singing, all-dancing model with automated data, management reporting, workflow, etc., or just the ability to load a CSV or two and then utilise the add-ins/export function for onward reporting? Just because you think it's cool doesn't mean it adds value for the people using the model.
Finally, don't be afraid to cut your losses and start again; obviously, if you're a long way down the path that can be more problematic. This is especially true of the MVP/POC: you've built something to get people interested, but then for longevity purposes you need to start again.
And don't be afraid to ask: "You said you want requirement X, but how settled is that? What's the likelihood of it changing over the length of the project?" If high, then park it as a box to be designed later, like a highway that just stops at an interchange to nowhere, to be continued at a later date. (Anyone who's done anything on any of the recently changed international accounting standards will know that joy!)
4 -
Hi Everyone,
The first thing I think about when I hear "best practices" in model building is how to solve the problem statement in the most optimized way. There is a lot of information about which functions to use, best practices to follow during model building, and the DISCO methodology on which Anaplan greatly relies.
Model builders should always look for a solution that makes the model more feasible, sustainable, and efficient to change. To consistently build models in these ways, there are a few methods that can make the solution stand out.
Firstly, for any problem statement, information gathering is a fundamental step in designing a solution. Gather all the information available, including an understanding of the business process, the output required by the users, the availability of data, model size, performance, and how your solution integrates with the model's existing functionality. After information gathering, the next step is preparing a layout of the problem statement and solution, which helps us understand the gaps and get an overview of the solution, and in turn helps us avoid gaps in the final design.
Secondly, consider which functions are available to provide a sustainable product without affecting model size or performance. As mentioned earlier, we can always look to the DISCO methodology and the PLANUAL as standard approaches to a solution. It is always advisable to look for any possible threats to the existing functionality in the model. The solution provided should not confuse end users or raise questions; keep it focused on answering the required problem statement, and on top of that look to add value by making it easily understandable and accessible.
Change methodologies: https://community.anaplan.com/discussion/156723/impressive-community-q-a-challenge-recap
PLANUAL ref: https://support.anaplan.com/planual-5731dc37-317a-49fa-a5ff-7fc3926972de
Furthermore, an added step while laying out the plan for a particular piece of functionality is to look at it from the end user's perspective to make it more appealing. This can be done by understanding the capabilities of the Anaplan NUX. Self-questioning has always helped me in these situations. Some of the questions are: How would the end user want to look at this particular data? What steps does the user need to carry out before seeing the final answer to their problem statement, if any? Is that sufficient? Is my solution adding value for the customer? And so on.
There is always the possibility of different approaches to the same problem. We need to analyse the possible solutions, document all of them, and build the most efficient one. Do not rush to a solution, making it vulnerable to redesign. Understanding the complete capability of the tool helps to a great extent in providing a better solution.
Thanks,
Puneeth HP
Success is the intersection of dreams and hard work!
3 -
Late to the party here! This is a huge topic in itself, as a project's success or failure revolves around the design. So here is my take on it: to me, design is iterative, and with Anaplan, fortunately, it is very easy to change. A changing world brings changing demands and changing requirements, so it is important that we design the model in such a way that it can be pivoted quickly. To make this happen, I would strongly recommend using the four cornerstones of TAW (The Anaplan Way): Data, Model, Process, and Deployment.
- Thorough understanding of the functional use case. It may sound obvious, but it is very important to understand what you are trying to design. The starting point is to check with the BA and understand the process/use case being implemented. Understand the pain points and see how Anaplan can help optimize them.
- Data is another point that is extremely crucial, as this is make or break. It is also a prerequisite for a very good model design. If the data does not land in Anaplan the way it should, you will end up using Anaplan as an ETL tool, which will start messing up the whole model design and Anaplan's performance.
- You've got to be good at what you are building. There are many ways a problem can be solved in Anaplan, but you have to weigh your options and see what works best in your scenario.
- Last but not least: start building with the end user in mind. Your end goal has to be the dashboards that the end users interact with (keeping the Anaplan engine's performance in mind).
Miz
2