-
Source-side lineage view
In as much detail as possible, describe the problem or experience related to your idea. Please provide the context of what you were trying to do and include specific examples or workarounds:
I was trying to identify which models consume our Data Hub as a source in a model-to-model setup. Anaplan supports using one model as a central Data Hub for multiple target models, but the visibility I found is mainly target-side, not source-side.
Today, the workaround is to go into each spoke model and check Source Models / Import Data Sources / Actions one by one. I also found a community answer saying they were not aware of a direct report or process to see this from the Data Hub side.
This makes impact analysis, cleanup, troubleshooting, and governance harder than it should be.
How often is this impacting your users?
Whenever we need to change, clean up, remap, or troubleshoot Data Hub integrations across multiple models.
Who is this impacting?
* Workspace admins
* Model builders
* Solution architects
* Integration teams
* Support teams
What would your ideal solution be? How would it add value to your current experience?
A source-side dependency / lineage view for model-to-model imports.
From a Data Hub, I want to see:
* all target models using it (at the very least, a basic view)
* which saved views/modules/lists they consume
* which import/process uses them
* last run / status
* exportable results
This would make impact analysis, support, cleanup, and governance much easier. The existing Source Models information already exposes some relationship metadata, so extending that into a reverse-lineage view would be a logical improvement.
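Until a native view exists, the manual workaround can at least be scripted. As a minimal sketch, suppose you have already pulled the list of import data sources from each spoke model (for example via Anaplan's Integration API); the record shape and field names below are illustrative assumptions, not the real API schema:

```python
# Hypothetical sketch: given per-spoke import-source records (field names
# "sourceModel" and "name" are assumptions, not Anaplan's actual schema),
# list which spoke imports consume the Data Hub.

def hub_consumers(spoke_imports, hub_model_name):
    """Return (spoke_model, import_name) pairs whose source is the hub."""
    hits = []
    for spoke_model, imports in spoke_imports.items():
        for imp in imports:
            if imp.get("sourceModel") == hub_model_name:
                hits.append((spoke_model, imp["name"]))
    return hits

# Example: two spoke models, only one of which imports from the hub.
spokes = {
    "FP&A Model": [{"name": "Load Actuals", "sourceModel": "Data Hub"}],
    "Sales Model": [{"name": "Load Quota", "sourceModel": "HR Model"}],
}
print(hub_consumers(spokes, "Data Hub"))
```

This is exactly the reverse-lineage question the idea asks the platform to answer natively, without scripting.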
-
Update Subsets dynamically - Formula driven Boolean
It would be nice to have subsets updated dynamically via a formula-driven Boolean linked to the subset, just as we can link a Display Name to a formula in a module.
This would eliminate the effort of running an action every time the subset needs updating. Offering this option alongside the current action-based update would make model building more efficient, and model builders could use dynamically updated, formula-driven subsets to automate and scale their models.
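Conceptually, the request is for subset membership that is evaluated from a Boolean predicate on read rather than staged by an import action. A minimal sketch of that behavior, with made-up list items and properties:

```python
# Illustrative sketch of a "formula-driven" subset: membership is computed
# from a Boolean predicate whenever the subset is read, so it is never
# stale. Item names and properties below are invented for the example.

products = {
    "P1": {"Active?": True,  "Revenue": 1200},
    "P2": {"Active?": False, "Revenue": 300},
    "P3": {"Active?": True,  "Revenue": 0},
}

def subset(items, predicate):
    """Return the list items whose properties satisfy the predicate."""
    return [name for name, props in items.items() if predicate(props)]

# e.g. the subset "Active products with revenue", always current:
print(subset(products, lambda p: p["Active?"] and p["Revenue"] > 0))
```

Today the equivalent in Anaplan is a Boolean line item plus an import action into the subset; the idea is to skip the action step entirely.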
-
Enable Moving or Copying MyPages Between Models
Description:
Currently, MyPages are tied to a specific model, and there is no native functionality to move or copy them to another model. This limitation creates friction for users who:
* Build personal pages in a development or sandbox model and want to migrate them to production.
* Need to replicate similar MyPages across models with the same structure (e.g., regional clones, business units).
* Want to reuse personal dashboards without rebuilding them from scratch when models are replaced by new models year over year.
Proposed Enhancement:
Introduce a feature that allows users to export, copy, or move MyPages from one model to another, provided the target model has a compatible structure (e.g., same modules, dimensions, and lists). This could be implemented via:
* A “Move to Model” or “Copy to Model” option in the MyPages menu.
* A way to export/import MyPages as templates.
* Integration with Application Lifecycle Management (ALM) for personal pages.
Benefits:
* Saves time and effort for users and model builders.
* Encourages more adoption of MyPages for prototyping and self-service reporting.
* Aligns with existing ALM and UX portability goals.
-
Level2 sprint 3
My figures in Inv 01 Ordering do not match the ones given in the training module.
I have attached both the figures and the formulas used. Please guide me on what I am doing wrong.
-
Reorder flag
In Module 2, Sprint 3, it says to keep the Reorder Flag referencing Suggested Order Amount, while my formula for the Reorder Flag is:
OFFSET(Ending Inventory, Shipping Time Weeks, 0) < 0
I can't get it to work.
Please clarify.
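For anyone puzzling over what that formula tests: OFFSET looks a given number of periods ahead in a time series, falling back to the third argument (here 0) past the end. A plain-Python sketch of the semantics, with made-up inventory numbers:

```python
# Sketch of OFFSET(Ending Inventory, Shipping Time Weeks, 0) < 0 semantics.
# The inventory figures are invented purely to illustrate the mechanics.

def offset(series, t, n, default=0):
    """Value of `series` n periods after index t, or `default` out of range."""
    i = t + n
    return series[i] if 0 <= i < len(series) else default

ending_inventory = [50, 20, -10, 40]  # weeks 0..3
shipping_time_weeks = 2

# The flag asks: will ending inventory be negative by the time an order
# placed this week would arrive?
flags = [offset(ending_inventory, t, shipping_time_weeks) < 0
         for t in range(len(ending_inventory))]
print(flags)  # only week 0 flags, because it "sees" week 2's -10
```

So the flag fires in the week an order must be placed, not the week the shortfall occurs, which is why the training exercise pairs it with Suggested Order Amount.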
-
Historical Snapshotting
Estimated Level of Effort:
<4 Hours of Model Building
Level of Difficulty:
Beginner
Recommended Training:
L1 Model Builder Training
Persona:
Casual Model Builder
Potential ROI:
Decreased response time to address risks
Increased forecast accuracy
You Might Also Like:
* Tops-Down Allocations
Historical Snapshotting
One of Anaplan’s greatest strengths is its in-memory storage, which means that data is always accurate and real-time. However, sometimes it is useful to see what the data was at a specific point in time, whether to back up data, to compare changing data over time, or for any other reason that comes with the true diversity in planning empowered by the platform. Historical Snapshotting refers to the ability within Anaplan to capture real-time data values and store them in a way that they will never change.
This seemingly simple idea requires a few more steps than you might expect and cannot be handled by the calculation engine alone, but by following these simple steps (including one or two shortcuts), you can have Historical Snapshotting up and running in your model in no time. Sometimes called "Time Stamping" or "Version Stamping," this process covers any case where real-time data must be captured at a specific moment and saved in a way that can be recalled and compared later, whether that relates to a specific point in time, the specific version of a plan as calculated at that moment, or any other variation.
This functionality is especially powerful when matched with Crediting Rules where you can see which transactions were assigned to which individuals at any point in time, regardless of staffing changes. This is also useful for historical reporting to compare different versions of data over time to show trends and changes in dynamic data in real time.
Ingredients
* Time
* Versions
* Standard vs. False Versions
* Import Actions
* Module Saved Views for Importing
Instructions
* Identify the module that has source data that you’d like to see over time. There are often many calculations in this data, and they are most often NOT dimensionalized by Time.
* Create a Saved View of the data you would like to have timestamped, usually including some mention that it is a “Snapshot” in the name.
* In the Module Settings, Copy this module. You have flexibility over the name you choose, most frequently it is some combination of the name of the original module with the addition of "Snapshot.” It is not necessary to keep the "Include cell data in module copy" setting selected.
* In the new module, remove all formulas and clear out all existing data. You can easily accomplish this by setting the formulas to “0” or “BLANK” or “FALSE,” depending on the Line Item Formatting. Just make sure to remove these formulas before moving on to the next step. You can certainly also complete this step manually if you can easily see all data on one screen.
* Update the dimension of the module to add your Snapshot dimension (whether it's Time, Versions, or a list). If using a false time dimension, after clicking the "…" in the Applies To setting make sure to keep the current dimension selection and just add the new one by holding “Command” on Macintosh keyboards or "Control" on Windows keyboards to add the new dimension without losing the current module dimension settings.
* Build an import into this new module from the saved view you created earlier. All dimensions and line items should map automatically; you will need to fill in the mapping for your time dimension, so set it to “Ask Every Time.” One thing to note: it is best to set your import to Clear All Data so that, if you run the import action multiple times against the same Time item, numeric values are replaced from the current source data rather than summed.
* Alternatively, you can include a line item in your model where you set the timestamp value, and reference it in a formula in your source data module's Saved View for use in mapping. This advanced modification is useful when running multiple Time Stamping procedures or when ensuring model-wide consistency, but it is not required to get started with your first Time Stamping exercise.
* Run your import and select the mapping you would like to use for this snapshot of the data.
* Validate that, with the module's snapshot dimension set to the item you just stamped, the data matches the current live data in your source module.
* Add your newly-created import action to a Process Action, and publish this Process Action to a relevant administrative Page (or Classic Dashboard, if you have not yet migrated to the UX).
* And that’s it! Whenever you have new data that should be captured, you can run your Process Action to stamp the data to a specific Time item of your choosing.
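The steps above can be sketched as a minimal model: live (non-timed) values are copied into a store keyed by snapshot label, and each run first clears that label (the "Clear All Data" setting) so re-running the import replaces values rather than summing them. The data structures and figures below are illustrative only:

```python
# Minimal sketch of the snapshot process: the source module holds live,
# non-timed values; the Snapshot module stores a copy per Time item.
# Module names and figures are invented for the example.

live_data = {"Revenue": 100, "Units": 8}   # source module (not timed)
snapshots = {}                              # Snapshot module (keyed by Time)

def run_snapshot_import(label):
    """Stamp the current live values under `label`, replacing any prior run."""
    snapshots[label] = {}              # "Clear All Data" before the load
    snapshots[label].update(live_data)  # copy the current real-time values

run_snapshot_import("Jan 24")
live_data["Revenue"] = 120              # live data keeps moving...
run_snapshot_import("Feb 24")

# ...but the January stamp never changes.
print(snapshots["Jan 24"]["Revenue"], snapshots["Feb 24"]["Revenue"])
```

Note how re-running `run_snapshot_import("Jan 24")` would overwrite, not add to, the January figures; that is exactly what the Clear All Data setting on the import guarantees.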