Best Of
Comparing SAP × Anaplan integration: Why ADO is the right fit for planning scenarios
Author: Miki Sato, Product Manager, Product Management Team (Data Management) at Anaplan.
To drive accurate and agile planning, it’s critical to integrate actuals from SAP ERP systems (such as S/4HANA) into Anaplan in a structure aligned with its model framework.
This article compares three integration approaches — SAP Integration Suite, SAP Datasphere, and Anaplan Data Orchestrator (ADO) — explores the strengths and limitations of each, and explains why ADO may be the best fit for planning-focused integration scenarios.
Note: This article reflects the perspective of the Anaplan product team and highlights why ADO is uniquely suited for planning-driven scenarios.
ADO (SAP Connector): Purpose-built for Anaplan planning integration
ADO is a no-code integration tool built with deep awareness of Anaplan's model structure — modules, lists, and hierarchies. It connects to SAP (e.g., S/4HANA) using OData v4 API and is optimized to transform incoming data into formats that Anaplan can directly consume. ADO enables business users to push data into Anaplan models directly from the UI.
Instead of building flows from scratch, users simply complete guided setup screens. Each step — from data connection to mapping — is configured in a structured form, and the visual flow is automatically rendered for reference. Only the necessary fields are shown, making it easy even for non-technical users to complete integration tasks.
Figure 1: ADO architecture and integration role
Figure 2: ADO’s flow visualization and parameter input UI
Only minimum parameters are required, and the configured flow is visualized automatically.
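ADO handles the extraction for you, but it can help to picture the kind of OData v4 query involved. The sketch below builds such a query URL in Python; the host, service path, entity set, and field names are illustrative placeholders, not a real S/4HANA service.

```python
from urllib.parse import urlencode

# Illustrative S/4HANA OData v4 service root (placeholder names only).
BASE = "https://s4hana.example.com/sap/opu/odata4/sap/zfi_actuals/srvd_a2x/sap/actuals/0001"

def build_actuals_url(fiscal_year: str, top: int = 100) -> str:
    """Build an OData v4 query URL selecting a few GL actuals fields."""
    query = urlencode({
        "$select": "CompanyCode,GLAccount,AmountInCompanyCodeCurrency",
        "$filter": f"FiscalYear eq '{fiscal_year}'",
        "$top": top,
    })
    return f"{BASE}/GLAccountLineItem?{query}"

url = build_actuals_url("2025")
```

In ADO, the equivalent of `$select`, `$filter`, and scheduling is configured entirely through the guided setup screens rather than hand-built URLs.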
Key capabilities
- No-code GUI experience
→ Intuitive interface, with no need for custom scripts or complex configuration
- Connects to SAP via OData v4
→ No-code configuration for data extraction directly from the GUI
- Supports scheduling, delta loads, and visual lineage
→ Makes monitoring and operations easier and more transparent
- Built-in data catalog
→ Enables storage of transaction and master data as reusable assets for planning and consolidation purposes
What’s next
The following gaps are on our ADO roadmap for enhancing Anaplan–SAP integration:
- No native support for on-premises sources outside SAP (e.g., file servers)
→ Cloud options like S3 or Azure Blob are recommended
- No support for real-time events or bidirectional sync
→ Cannot be triggered automatically by external events or systems
SAP Integration Suite: System-to-System API Hub
SAP Integration Suite is a robust iPaaS offering that supports cross-system integration, including non-SAP platforms. When integrating with Anaplan, the Anaplan Receiver Adapter enables external triggering of Import/Export Actions via Anaplan’s REST and Bulk APIs.
The tool allows for flexible flow design using its GUI, supporting event-based, scheduled, and conditional execution. While not optimized for Anaplan-specific structures, it excels in generalized data orchestration.
Note: The Anaplan Receiver Adapter is available only with an active SAP Integration Suite license.
Figure 3: SAP Integration Suite capabilities overview
Source: SAP SE, “What is SAP Integration Suite?” © SAP SE. Image used for comparative explanation purposes.
Figure 4: Minimum flow setup and adapter parameters
Even a simple import requires manual setup of Action IDs and flow logic — making this approach suited for technical users only.
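By way of comparison, here is a sketch of the Integration API v2 request that the Anaplan Receiver Adapter issues on your behalf to start an Import Action. The workspace, model, and import IDs and the token are placeholders; confirm the exact paths against Anaplan's API documentation.

```python
import json
import urllib.request

API = "https://api.anaplan.com/2/0"

def build_import_task_request(workspace_id, model_id, import_id, token):
    """Prepare (but do not send) the POST that starts an Import Action task."""
    url = f"{API}/workspaces/{workspace_id}/models/{model_id}/imports/{import_id}/tasks"
    return urllib.request.Request(
        url,
        data=json.dumps({"localeName": "en_US"}).encode(),
        headers={
            "Authorization": f"AnaplanAuthToken {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_import_task_request("WS_ID", "MODEL_ID", "112000000001", "TOKEN")
# urllib.request.urlopen(req) would start the task; the JSON response
# includes a taskId that you then poll for completion status.
```

This is the Action-ID plumbing the adapter asks you to configure manually, and the part ADO's guided setup abstracts away.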
Key capabilities
- Connects to Anaplan APIs via the official adapter
→ Supports both Import and Export actions through Anaplan APIs
- Supports triggers, schedules, and complex flows
→ Enables complex orchestration logic and automation
- Rich connector catalog for SAP and non-SAP systems
→ Suitable for hybrid, multi-system integration environments
Limitations
- Requires technical configuration
→ Flow setup and maintenance are best handled by integration specialists
- No awareness of Anaplan model structure
→ Mapping is not aligned with Anaplan modules or hierarchies
- No built-in visual mapping
→ Data transformations require custom logic
- Separate license required
→ Adds cost for Anaplan-only integration scenarios
- More complex setup process
→ Multi-step design can slow down initial implementation
By contrast, Anaplan Data Orchestrator (ADO) delivers the same Anaplan-bound data loading through a fully no-code UI, making it significantly more accessible to business users.
SAP Datasphere: Persistent data layer for analytics
SAP Datasphere (successor to SAP Data Warehouse Cloud) is a cloud-native data fabric platform designed to centralize and harmonize SAP and external data for analytics and AI. It provides a persistent layer for modeling and enriching data—ideal for analytics, but not for real-time operational integration.
Unlike ADO or SAP Integration Suite, which focus on orchestrating and automating data movement and workflows across systems, Datasphere is optimized for batch and near-real-time analytical scenarios.
Figure 5: SAP Datasphere feature stack
Source: SAP whitepaper “Unleash the Power of Business Data with SAP Datasphere” © SAP SE. Image excerpted for product comparison purposes.
Figure 6: Visual modeling in SAP Datasphere
Source: SAP Developers Tutorial – “Create a Graphical View” © SAP SE. Image excerpted for illustrative/reference purposes.
Key capabilities
- Stores and transforms data from SAP and external sources
→ Centralized DWH with persistent storage
- Visual or SQL-based data flow builder
→ Build reusable transformation views
- Supports OData API and file-based access
→ Readable from external BI tools or systems
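Because Datasphere has no native Anaplan connector, a common pattern is to pull rows from a consumption view (via OData or file export) and land them as CSV for an Anaplan import. A minimal sketch of the CSV-shaping step, with illustrative field names rather than a real view definition:

```python
import csv
import io

def rows_to_csv(rows, fieldnames):
    """Flatten dict rows (e.g., from a Datasphere OData view) into CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Hypothetical view output; real column names come from your Datasphere model.
sample = [
    {"CostCenter": "CC100", "Period": "2025-01", "Actuals": 1250.0},
    {"CostCenter": "CC200", "Period": "2025-01", "Actuals": 980.5},
]
csv_text = rows_to_csv(sample, ["CostCenter", "Period", "Actuals"])
```

The resulting file can then be fed to an Anaplan import via SAP Integration Suite or a separate export workflow, as the comparison below notes.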
Limitations
- Not directly compatible with Anaplan actions or push
- Primarily designed for BI, not operational integration
→ Write-back and bidirectional sync require custom design
Feature comparison summary
The following table summarizes key differences across the three integration approaches. It provides a side-by-side view of capabilities, limitations, and ideal usage patterns — helping you assess which option best fits your planning, integration, or analytics needs.
Feature Category | ADO SAP Connector | SAP Integration Suite + Anaplan Receiver Adapter | SAP Datasphere |
---|---|---|---|
Primary Purpose | Planning-centric integration to Anaplan | System-to-system orchestration | Persistent data foundation for analytics |
UI & Operation | No-code (GUI-based interface) | Low code: GUI-based flow builder + scripting | GUI + SQL-based (Graphical View or SQL Editor) |
Connector Mechanism | ADO SAP Connector (OData v4); triggers Anaplan Import Actions only | Anaplan Receiver Adapter on Cloud Integration; supports Import & Export via REST/Bulk API | ✖ No native connector to Anaplan; requires iPaaS or export-file workflow |
Supported SAP Sources | S/4HANA, ECC, BW (via OData v4); optional CSV (e.g., S3) | S/4HANA, ECC, BW, and non-SAP via adapter catalog | S/4HANA, BW/4HANA, external DBs, SaaS (via federation/replication); no Anaplan write-back |
Direction | Inbound only (to Anaplan) | Bidirectional (to/from Anaplan) | Inbound only (Anaplan not supported as target/source) |
Data Transformation | GUI-based mapping optimized for Anaplan model (modules, lists, hierarchies) | Script-based logic; not Anaplan-aware | SQL or GUI transformation for analytics; no Anaplan model alignment |
Data Lineage | Visual traceability (source-to-target) via UI | Flow-based with conditional steps | SQL or graphical lineage (limited Anaplan traceability) |
Reusability | Dataset catalog (future: search/control) | Flow-level reuse | Analytical view reuse |
Execution Control | Scheduled / Manual / Delta updates via ADO UI | Scheduled / Event-triggered / Conditional | Scheduled / External trigger; limited internal logic |
Execution Granularity | Module/List level execution with workflow control (stop, pause on error) | Step-level orchestration with retry/branching; requires manual mapping of Action IDs | View-level execution only (no conditional logic) |
Data Persistence | ✔ Persistent (catalogs/logs in ADO; imported planning data in Anaplan) | ✖ Transient (middleware passthrough) | ✔ Persistent (data stored in Spaces) |
Primary Users | IT integrators, data engineers | IT integrators, enterprise architects | BI teams, data engineers, analysts |
Note: This comparison is based on current product capabilities as of July 2025.
Choosing the right tool based on use case
Each solution plays a different role, and in many cases, they complement one another. Here’s a high-level guide for selecting the right option:
Use Case | Recommended Tool | Reason |
---|---|---|
Efficient Anaplan data loading | ADO | No-code GUI; optimized for Anaplan model structure via OData |
API-based bidirectional sync | SAP Integration Suite | Rich adapters; flexible orchestration logic |
BI/analytics data foundation | SAP Datasphere (via Integration Suite or export) | Strong persistence, modeling, and connectivity to BI tools*; indirect connection to Anaplan |
*Note: SAP Datasphere does not connect directly with Anaplan. For scenarios involving Anaplan integration, it should be used in combination with SAP Integration Suite or a separate export workflow for API and metadata management.
Real-world integration examples: S/4HANA, SAC, and Anaplan
When organizations aim to visualize Anaplan data in SAP Analytics Cloud (SAC), here are three common integration scenarios:
- S/4HANA → ADO → Anaplan
Designed for structured data loads into Anaplan; ideal for planning scenarios. (Write-back may be possible in the future.)
- S/4HANA → SAP Integration Suite (Anaplan Receiver Adapter) → Anaplan
Best for bidirectional use cases with complex logic and retry steps.
- Anaplan → SAP Integration Suite → Datasphere → SAC
Recommended for analytical reporting in SAC. Datasphere allows semantic modeling and metadata enrichment.
Figure 7: End-to-End Integration Across Models, Systems, Semantics, and Analytics
Notes:
- ADO enables GUI-based integration with a single connector, whereas SAP Integration Suite and Datasphere typically require adapters on both the source and target systems.
- ADO cannot connect to Integration Suite via its SAP connector, as Integration Suite does not accept OData v4 requests from external systems.
The future of ADO: Evolving for planning integration
ADO will not only continue to evolve as a planning connector but is also positioned to become a central hub for managing external data connections across the enterprise.
Stay tuned as ADO continues to grow with your business.
Learn more
Anaplan Data Orchestrator
- Official Product Page
- How to Import Data from SAP (Anapedia)
- Example: SAP Connection Setup (Community)
SAP Integration Suite
- Official Product Page
- Feature Scope Description - SAP Integration Suite
- Anaplan Receiver Adapter (Official SAP Help)
SAP Datasphere
- Official Product Page
- Developer Mission: Get Started with SAP Datasphere
- Feature Scope Description - SAP Datasphere
……………
Other articles from Miki:

Embedding accessible images
Author: Dave Waller, Principal Product Manager at Anaplan.
Using certain modeling and page build techniques, it is possible to embed images across your UX apps in a way that provides alternate text for users who may be visually impaired or who may be using assistive technologies to announce on-screen elements.
This article will guide you through the techniques needed to provide images with ALT text (or aria-labels), follow model building best practices, and deliver end user experiences more closely aligned with the WCAG 2.1 accessibility requirements.
Defining a system module
Anaplan system modules are centralized modules typically used to store static values that other modules, line items, and calculations can refer to. This is considered a best practice when building modules and provides a single location for the management and maintenance of global variables and values.
(For a micro-lesson on creating system modules, see this video.)
We can use a system module to define core properties of the images our UX pages will use.
Being sure to use any agreed naming conventions, create a new list to hold the images you wish to store, and a new module dimensioned by your images list with two line items: one to hold a URL (formatted as text/link) and another to hold default ALT text (formatted as text/general).
This module can be simplified, at the cost of the ability to add custom ALT text later, by giving it just a single text/link formatted line item to hold both URL and ALT text. Or, we can add a third line item to make things more sophisticated and store some baseline ALT text in our system module too. The following example shows all these options.
Once in place, the system module can be used to both embed images directly into UX pages or can be used as a base to refer to when adding images into other modules using formulas such as:
IF forecast value <= threshold THEN
SYS MOD images.url[SELECT: SYS LIST images.red light]
ELSE
SYS MOD images.url[SELECT: SYS LIST images.green light]
Using images from the asset library
Anaplan provides an internal asset store, offering centralized and secure storage for images used in Anaplan UX pages. The easiest way to use these images inside pages is to link them directly from the asset library; however, this approach does not provide adequate ALT text when pages are published.
Instead, builders should expose the URL of the images in the asset library and use these to populate their system modules.
Once added to the asset library, images can be selected to expose a “Copy URL” button. This can be used to retrieve the URL for the uploaded image, ready to be added to the system module.
https://us1a.app.anaplan.com/a/mms-mms/media/customers/8a81b01166eee1af0166f7bde931314e/images/~
6ceb7ccf7c6549cf92fb36ec1208baac.png
Generating dynamic ALT text
Line items defined as text/link use a simple markdown format to define both display text and URL within a single data point.
[Display text]https://url.to.use/for-the-image.png
We can use simple concatenation formulas to build this value dynamically, crafting ALT text for the image that is aware of other model values and therefore more assistive to end users.
(Resource: Anapedia - Operators and constants)
If we want to display a red light image if a forecast value is less than the defined threshold, and a green light image if it’s higher, but also provide guidance on why the value is invalid, we can use a concatenating formula like:
IF forecast <= threshold THEN
"[A red light image indicating that your forecast of " &
forecast & " is invalid as it must be at least " &
threshold & ".]" & SYS MOD images.url[SELECT: SYS LIST images.red light]
ELSE
"[A green light image indicating that your forecast of " &
forecast & " is acceptable.]" & SYS MOD images.url[SELECT: SYS LIST images.green light]
This would, assuming the forecast ($900) is lower than the threshold ($1,000), generate a text/link formatted line item value of:
[A red light image indicating that your forecast of 900 is invalid as it must be at least 1000.]https://.../red-light.png
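To make the format concrete: the combined value is simply the ALT text in square brackets followed immediately by the URL. A small sketch (outside Anaplan, purely illustrative) that splits such a value back into its parts:

```python
def split_combined(value: str):
    """Split a text/link value of the form '[ALT text]URL' into its parts.

    Returns (None, value) when no bracketed ALT text is present.
    """
    if not (value.startswith("[") and "]" in value):
        return None, value  # plain URL, no ALT text
    alt, _, url = value[1:].partition("]")
    return alt, url

alt, url = split_combined("[A red light image]https://example.com/red-light.png")
```

This is the structure the Anaplan UX reads when it renders the image and announces the ALT text to assistive technologies.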
This approach can also be abstracted a little by using an additional line item, here in the forecast module, to define our ALT text but only if we need it. If no custom ALT text is provided, we fall back to the text defined in the system module:
IF Forecast.forecast <= Forecast.threshold THEN
IF ISBLANK(Forecast.ALT text) THEN
SYS MOD images.combined[SELECT: SYS LIST images.red light]
ELSE
"[" & Forecast.ALT text & "]" & SYS MOD images.url[SELECT: SYS LIST images.red light]
ELSE
...
This simple addition checks to see if a custom ALT text value exists in the forecast module. If it does, it gets combined with the image URL to create a custom text/link line item value. If it’s blank, then we fall back to the “combined” value defined in the system module.
In complex cases, care should be taken to try to minimize the number of cells used when concatenating values into a single text string. The use of “&” can introduce slowness to some calculations if multiple cells, modules and line items are used to generate a single value.
You can view an indication of the calculation complexity of a concatenated line item using the “modules” > “line items” view and the “calculation effort” value for your line items.
Adding images to UX pages
Line item driven image URLs can be added to UX pages with the use of image cards. When adding an image card, be sure to select “line item” as the image source.
Select the module and the line item to use to define the image — this can be your system module if your images are static, such as logos or horizontal rules.
Depending on how your system list is defined, you may initially get an error message that your image is not valid. This is because we are yet to specify which image we want to reference. We do this using the “context” tab in the right-hand panel.
By opening the section for “SYS LIST images” we need to make a few changes:
- We need to disable “sync with page” to ensure that the page builder defines the image
- We need to select the correct image for the “page context” value
- We need to make sure that “show on card” is set to “off” to ensure that end users can’t change the image that’s displayed
These changes will display the line item driven image inside the image card and will implement the correct ALT text for the image in both the builder and published modes.
Adding images to UX grids
When adding grids of data to a UX page, they will display all text/link formatted line items as text values by default.
Using the “line item image settings” section at the bottom of the right-hand panel, you can decide which line items should render as images rather than text. In our system module, the only line item defined as text/link is the “combined” line item that defines both the URL and ALT text.
Enabling this line item to display as an image has an immediate effect on the preview of the grid, now showing an image in the cell rather than text.
Once enabled as image display, we can also tailor the size of the image if needed.
The published page now includes the grid images, at the specified size, along with appropriate ALT text.
Questions? Leave a comment!
……………
Also by Dave: End user triggering of workflow processes.
End user triggering of workflow processes
Author: Dave Waller, Principal Product Manager at Anaplan.
With the release of action button workflow triggers, customers have been able to apply Workflow to a raft of new use cases and business processes. End users are now able to instigate processes directly from their dashboards and UX pages, opening up the possibility for processes like:
- Creating budget requests and submitting them for approval
- Identifying incorrect commission statements and raising disputes
- Logging CAPEX projects and submitting into a formalised review process
- Creating a new headcount request and passing it to HR for action
- And more
In many of these scenarios, a user will create a new item (or collection of items) that will become the “subject” of the workflow process — the specific budget request to be approved, for instance. This helps all stakeholders in the process clearly identify the things to be reviewed or approved and helps ensure accuracy by logging views of data and visualizations down to the specific item.
Depending on the type of process to be supported, the number of submissions expected, and the level of customization an implementation team is happy to build, there are a range of approaches that can be taken to deliver these scenarios with Workflow.
✅ This article will cover the creation and submission of a single item into a workflow process.
How-to Guide
The most straightforward approach, and with the most native support built into the Workflow tools, allows for the creation of individual items and their submission into a formal process. It combines UX board pages, UX forms, UX action buttons, and standard workflow steps to create a new item and pass it through a series of approval steps.
We’ll look at a scenario where end users are asked to create new investment opportunities before submitting them into a two-tier approval process. We’ll make one level of approval conditional on the data the user has input, and we’ll circle back round at the end and alert the user when their opportunity is approved.
In addition to the list we’ll use to store our opportunities, we need a few extra items built in our model.
The investment opportunities list:
The investment approvers module:
Plus any additional modules to hold opportunity data. In my example I have a range of properties, including a list formatted line item to hold the opportunity type, which can be used to further refine our approvers, and even a time-based forecast of income vs. expenditure. The key aspect, for our example at least, is a line item to hold the investment amount, plus the data needed for the workflow to execute.
The Investment data module with line items to be used by Workflow:
It is critical that this module is dimensioned by the list that holds our investment opportunities, as we’ll need this to look up the correct line items our process will need. It contains a number of line items:
Line item | Use | Formula |
---|---|---|
Total investment required | Currency – the value of the investment input using a form | - |
Investment owner | User list – the user who created or owns the opportunity | - |
Submitted | Boolean – used to track which opportunities have been submitted for review | - |
L1 approver | User list – the user to complete the first approval | |
L1 approved | Boolean – used to track when L1 approval is complete | - |
Requires L2 approval | Boolean – we only require L2 approval for investments greater than $1M | |
L2 approver | User list – the user to complete the second approval | |
L2 approved | Boolean – used to track when L2 approval is complete | - |
Scheduled for fund release | Boolean – used to identify when an opportunity is completely approved | - |
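Of the line items above, Requires L2 approval is a natural candidate for model logic rather than manual input. A sketch of the Boolean formula, assuming the $1M threshold from the table (line item names are illustrative):

Requires L2 approval =
Total investment required > 1000000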
An additional forecast module used to enhance the visual aspect of the demo:
With the model components in place, we now need two UX pages to support our workflow: a page for end users to create and populate investment opportunities, and a second page for approvers to review the submitted data.
The key thing to remember with both pages is that our opportunities list needs to be used as a page-level context selector.
Doing this enables Workflow to do three things:
- When the approval process is triggered by the end user, the selected item in this list is detected and passed into the process as the “subject” for later steps.
- Workflow uses this item when looking up data in connected modules (assignees, approvers, etc.) and when writing to connected Boolean line items.
- When later steps in the process are loaded for approvers (or other users in general), the page selector is automatically set to the correct item, ensuring all downstream users see exactly the right slice of model data.
The UX page used by end users to add new investment opportunities:
The page for end users contains a form to support the creation of new investment opportunities.
The UX form used to create a new opportunity:
One of the great features with UX forms, especially when the list being added to is a page-level context selector, is that when the new item is inserted into the list it is automatically selected at the page level. We can leverage this when we trigger our approval workflow, as we can have this selected item picked up automatically as the “subject” of our process.
The UX page used by reviewers to approve opportunities:
Note that both pages have the opportunities list as a page-level context selector, but it has been set to “label”, so it shows the user which item is selected while removing the ability to select a different one. When our approver lands on the page to complete their review, there will be no question that they are viewing the correct item.
Page-level context selector settings to prevent users from changing the opportunity their task is based on:
When new opportunities are submitted, our workflow will perform a number of tasks:
- Mark the opportunity as submitted
- Issue the L1 approval
- If L1 approval is successful, issue the L2 approval
- If the L2 approval is successful, mark the opportunity as ready to have funds released
- Notify the original submitter
The overall workflow template:
Step 1: a machine task to mark the opportunity as “submitted”:
The first step in the workflow updates our Submitted line item to indicate the opportunity has been submitted into an approval flow. This uses a standard Data Write Machine task with the Investment Data module selected and the Submitted line item identified.
You will also note that the setting for the Investment Opportunities dimension is set to Synced to workflow. This means that this value will be synced to the value passed into the workflow from the triggering page.
The Investment Data module with the targeted intersection highlighted:
If “The Old School Hall” is selected on the page, this will be used to identify the specific data point to update — the intersection of The Old School Hall and Submitted in the Investment Data module.
Step 2: a decision task for the L1 approval
The first approval step is a decision task with our approver page selected and some appropriate instructions for the approver provided in the task details. As we want the approver to be specified using our Investment Data module, we’ve selected to Assign to users from a line item in a module.
The configuration of Step 2, showing how it connects to the underlying module:
By editing the line item selection, we can specify the exact intersection in our module that we want workflow to look at to determine who our approver should be. The Investment Data module is selected, and the L1 approver line item is chosen as the user list formatted line item to define our first approver. We can also select the L1 approved Boolean line item to be automatically ticked when the item is approved. We can optionally set a L1 rejected Boolean line item should the item be rejected — I have chosen to omit this for now.
Again, to specify the exact data intersections to use, we have configured the Investment Opportunities to Sync to workflow so whichever selected opportunity is passed into the workflow will be used to identify our intersection.
The configuration of Step 2, showing available approval options:
The final part of configuring the L1 approval is to define how we want to handle rejections. We know that, no matter what, all opportunities need some form of L1 approval, so we have switched off the option to Skip on blank approver. If, for whatever reason, our line item to define our approver is blank, the workflow will now pause and alert an administrator.
We know that we want to stop our approval process if the L1 approver rejects the opportunity, so we have Continue workflow on rejection switched off, and we always want the reject option to show to our approvers, so we also have Hide reject switched off.
Step 3: a decision task for L2 approval
Our L2 approval is very similar to the L1 approval — it links to our approval UX page and has some appropriate instructions added for our L2 approvers.
It’s also configured to Assign to users from a line item in a module with the Investment Data module selected. This time we’ve chosen the L2 approver line item to define the approver and the L2 approved Boolean line item to be set when the approval is made. We’ve also set the Investment Opportunities dimension to Sync to workflow, as with our other steps, to allow Workflow to identify the correct intersections to use in the module.
Where this step differs is with its final configuration options. We want to provide approvers the ability to reject, and we want to stop the workflow should a rejection take place, so we have Continue workflow on rejection and Hide reject both switched off.
We know, however, that only certain opportunities are going to require an L2 approval with those that don’t being given a BLANK value for the L2 approver line item. This means we can set Skip on blank approver to on so that if Workflow gets to this stage and encounters a blank line item, the step is skipped, and the process continues down the “approved” path.
‼️ Note that the L2 approved Boolean will not be set in this instance as no explicit approval took place.
Step 4: a machine task to mark the opportunity as “approved”
Based on how we’ve configured previous steps in the workflow, the configuration of the machine task that updates an opportunity as “fully approved” is relatively simple. We’ve used a Data Write machine task to update the Scheduled for fund release Boolean line item in the Investment Data module, with the specific Investment Opportunity set to Synced to workflow.
Step 5: a notification step to send a final confirmation to the opportunity owner
The final step in the process is to close the loop and inform the submitter (or the owner of the investment opportunity) that their submission has been fully approved. We can do that with a simple notification step at the end of our template.
A message is provided, and the step is configured to Assign to users from a line item in a module.
As these steps don’t use a UX page, where we would normally identify the workspace and model, we need to provide that data in the configuration of the step. Once a workspace and model are set, we can define the line item to use.
The configuration of step 5, showing how it connects to the underlying module:
We’ve again selected the Investment data module and the Investment owner user list formatted line item. As with our other steps, we’ve also set the Investment Opportunities dimension to Synced to workflow so that Workflow can identify the specific data intersection needed to get the details of the user to be notified.
Summary
This simple workflow template combines the flexibility of the UX with structured, defined business processes to support end-user-led planning. It removes the need for end users to tick Boolean checkboxes, trigger email links, or run model actions. It provides clear notifications and calls-to-action for approvers and uses model logic to influence the path of the process. Task handover is fast and automatic, with the potential to drive down the time it takes to submit data and pass it through an approval flow.
However, this process (and the workflow functionality that supports it) is designed for the submission and approval of single items. Users wishing to create and submit multiple opportunities must do each one in turn — create → submit, create → submit, etc.
It’s also worth considering the approver experience when looking at supporting a process like this. Each new submission will generate a new notification for approvers — if there are 50 new opportunities raised each day, and only a single L1 approver, then that approver will receive 50 emails a day that need to be managed and actioned.
Questions? Leave a comment!
CloudWorks: Additional details on failures/error logs in the automated email alerts
CloudWorks has definitely made it easy to set up routine scheduled refreshes from model to model, or simply to update daily dates in the models, but I had a query regarding the notifications sent out, the level of detail they hold, and whether anyone has explored this further.
1. Success scenario – the scheduled integration completes successfully and we get an email. [Short and sweet, just as expected]
[EXTERNAL] Success. Your CloudWorks process integration is complete.
2. Warnings/failure scenario –
Although the notification does highlight that there are errors and that we need to investigate, it doesn't give sufficient insight, such as the number of rows impacted or the IDs of the rows for debugging, i.e., the general log details that would help debug further.
We know the required details can be clicked into and checked on the CloudWorks-specific page by the Integration Admin or the Restricted Integration Admin, but having a flavor of these details in the automated email would help to a great extent.
E.g., if we know a known issue with one or two rows has been generating the error on every run, seeing those records in the email is enough for analysis and avoids any further login to the CloudWorks interface.

UX source model auto updater
Author: Abhishek Roy is a Certified Solution Architect and Sr. Platform Architect at Equinix and a member of their Anaplan CoE team.
This article covers a less-explored API functionality that the Anaplan UX is built on, adding automation and easy configuration to the otherwise manual, time-consuming, and error-prone activity of updating source models for UX apps.
Problem statement
Anaplan UX pages point to models that supply the data and visualizations set up by the page builders.
In scenarios where a UX application needs to be duplicated and pointed to another model (generally a deployed variant or copy of the standard model), there are typically only two ways to achieve this:
- Navigate to the Page settings and change the source model definition.
- Open Manage Models from the App default landing page and configure on a page-by-page basis.
Both options carry the risk of missing a few pages when repointing, and the process becomes very cumbersome for apps that have evolved over time and have 50+ pages pointed to a variety of models (e.g., Datahub, Spoke 1, Spoke 2).
This solution was created to automate the activity, which most teams face today, and reduce the possibility of error.
Solution overview
The Anaplan Springboard API is leveraged to automate the page updates for the UX.
This API functionality enables us to directly update parameters of UX pages, such as the modelId and workspaceId they point to.
Sample cURL commands for the Springboard API
Here are a few sample cURL commands that use the Springboard functionality.
- API GET request to get a list of all the Pages and their details associated to an App
curl --location 'https://us1a.app.anaplan.com/a/springboard-definition-service/apps/{AppID}' \
--header 'Authorization: ••••••'
The AppID can be easily retrieved from the URL while we are on the default landing page of any UX App. Response: the Page Identifier (PageID) can be retrieved from this request.
- API GET request to get the Model details of any page using the PageID
curl --location 'https://us1a.app.anaplan.com/a/springboard-definition-service/pagemodels/{PageID}' \
--header 'Authorization: ••••••'
The PageID can be obtained via the API request in step 1, or taken from the URL when on the desired page.
- API PUT request to point the Pages to any intended Model
curl --location --request PUT 'https://us1a.app.anaplan.com/a/springboard-definition-service/pagemodels/{PageID}' \
--header 'Content-Type: application/json' \
--header 'Authorization: ••••••' \
--data '{
  "pageModels": [
    {
      "modelId": "ModelID",
      "workspaceId": "WorkspaceID"
    }
  ],
  "isAlmEnabled": true
}'
As showcased in the steps above, it is possible to repoint pages via these API calls, but only on a page-by-page basis, and parameters like PageID, ModelID, and WorkspaceID need to be dynamic, since these identifiers change whenever models or applications are copied. Additionally, the details of the target model need to be supplied to the API to complete the request.
This solution takes that dynamic nature into account: the API calls are generated with variables declared on the fly, and a loop ensures the same process is carried out for every page in the given app that needs to be redirected to the target model.
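The looping approach can be sketched in Python. This is a minimal illustration rather than the packaged solution from the guide: it uses the Springboard endpoints shown in the cURL samples, but the token placeholder, the `AnaplanAuthToken` header scheme, and the assumed shape of the app response (a `pages` collection with `id` fields) should be verified against your own API responses.

```python
import requests

BASE = "https://us1a.app.anaplan.com/a/springboard-definition-service"

def build_payload(model_id, workspace_id, alm_enabled=True):
    """Body for the PUT /pagemodels/{PageID} call, as shown in the cURL sample."""
    return {
        "pageModels": [{"modelId": model_id, "workspaceId": workspace_id}],
        "isAlmEnabled": alm_enabled,
    }

def repoint_app_pages(token, app_id, model_id, workspace_id):
    """List every page in the app, then PUT each one to the target model."""
    headers = {
        "Authorization": f"AnaplanAuthToken {token}",  # assumed auth scheme
        "Content-Type": "application/json",
    }
    # Step 1: GET /apps/{AppID} to list the pages in the app
    app = requests.get(f"{BASE}/apps/{app_id}", headers=headers)
    app.raise_for_status()
    # Assumption: page identifiers sit under a "pages" key in the response
    for page in app.json().get("pages", []):
        # Step 3: PUT /pagemodels/{PageID} to repoint this page
        resp = requests.put(
            f"{BASE}/pagemodels/{page['id']}",
            headers=headers,
            json=build_payload(model_id, workspace_id),
        )
        resp.raise_for_status()
```

Because the page list is fetched fresh on each run, the loop picks up every page in the app, which is exactly what removes the risk of missing a few pages during a manual repoint.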
Due to the size of this guide, we've created a demo video and a downloadable PDF with step-by-step instructions.
Please download the guide here:
And the video can be seen here:
Questions? Leave a comment!
Acknowledgements
I would like to take this opportunity to thank a few key contributors who have been instrumental in ideating, implementing, and documenting this solution:
- Equinix Center of Excellence members: Sandeep Veeturi (@veeturi.sandeep), Manu Mathur (@manu.mathur), and Brett Harn (@brettharn1115).
- Anaplan: Rob Marshall (@rob_marshall) and Jon Ferneau (@JonFerneau) for their valuable time and insightful feedback throughout the solutioning. And Ginger Anderson (@GingerAnderson) for her constant support and helping us get this article to the finish line!

Bug when creating a delete list action
Hi there, not sure if this is just me, but I've noticed a possible bug when creating a delete-from-list-using-selection action for a while now. Specifically, when I try to choose the specific list to delete from using the list dropdown and start typing the first few letters of the list name, it keeps jumping to other lists and showing lists that are irrelevant. It doesn't bother me too much, but I wanted to post it here to see if anyone else is experiencing the same issue.
2025 Certificate Maintenance for Certified Master Anaplanners
The 2025 Certified Master Anaplanner Program has begun and there's incredible opportunity for 2025 Certified Master Anaplanners (CMAs) to demonstrate their thought leadership, technical expertise, evangelism, and mentorship capabilities across the entire Anaplan Ecosystem!
At their core, a Certified Master Anaplanner is someone who elevates others in the Anaplan ecosystem through the many ways they share their expertise: mentorship, sharing their community perspective and technical-architecture thought leadership, demonstrating innovative solutions, providing product feature insight, ideating and inspiring others in the Anaplan Community to define the future of the platform, leading CoE development, developing innovative roadmaps to scale the platform for business decision-making, and championing Anaplan in the competitive market.
Engagement Zones
The 2025 Contribution Activities list is attached to this post, accessible for CMAs to download:
Each contribution activity is mapped to four engagement zones to understand and communicate how Certified Master Anaplanners are driving impact within the Anaplan ecosystem. As mentors, thought leaders, Connected Planning evangelists, and technical experts, Certified Master Anaplanners are critical to elevating the broad Anaplan ecosystem and driving immense value through the many ways they share their expertise!
Each engagement zone is important to the Certified Master Anaplanner Community, the entire Anaplan Ecosystem, and the Anaplan Community! Our goal in highlighting these zones is to give each of you, as Certified Master Anaplanners, the opportunity to easily align your experience and personality with activities that are best suited for you to make a significant impact in a particular area of the Anaplan ecosystem. To that end, we ask that you select an engagement zone to focus on for your primary contribution activities or choose the unique blend of engagement zone activities you want to be recognized for. Will you aim to be one of the few Certified Master Anaplanners who complete activities in all four engagement zones during 2025?
We will be taking an agile approach towards contributions in 2025; therefore, we will review this list at least quarterly and share new contribution activities throughout the year as we determine a need.
Certification Maintenance Requirements
Certified Master Anaplanners must meet two key requirements during 2025 to renew their certifications. Unsurprisingly, the 2025 requirements mirror those in place during 2024.
- Complete Contribution Activities for 400 points (due December 15th, 2025)
- Questions about the Contribution Activity Requirement or Certified Master Anaplanner recertification status should be directed to MasterAnaplanners@Anaplan.com
- Complete Technical Requirement
- Pass a recertification exam by December 31st, 2025 - no extensions will be approved.
- The study guide and the recertification exam will become accessible on July 8th, 2025 (months earlier than the technical requirement has been accessible in prior years, giving you more time to complete this requirement).
- Information on the recertification exam is available here: 2025 Recertification Information
- Questions about the recertification exam should be directed to certification@anaplan.com
Also included again this year is the mid-year check-in requirement. The goal for this requirement is to create greater visibility into where and how Certified Master Anaplanners are contributing, as well as to assist Certified Master Anaplanners in proactively planning how they will attain contributions throughout the year.
There will be a mid-year check-in requirement for Certified Master Anaplanners to complete half (200 points) of the annual points requirement by July 31st, 2025, or ensure they have activities totaling 200 points lined up and confirmed by that date.
Please also read through the 2025 Certified Master Anaplanner Program Terms and Conditions.
If you are a Certified Master Anaplanner who has any questions about the requirements for annual certification maintenance, your status, or how to engage, please email MasterAnaplanners@Anaplan.com.
If you are interested in the Certified Master Anaplanner Program and wish to learn more about how to become a Certified Master Anaplanner, please review the resources here and reach out to Certificate@Anaplan.com with any questions.

Re: Anaplan & MS Access Transactional API using Python
Hi,
In your Python code, since you are using the json option of the requests library, you need to post the list-formatted data there, not the string-formatted json_dump. From your format-Anaplan-payload function you could simply return your updates list and send that as the json parameter of your requests.post.
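In other words, something like the sketch below. The function and field names here are hypothetical stand-ins for the original poster's code, and the `AnaplanAuthToken` header scheme is an assumption; the point is only the contrast between `json=updates` and `json=json.dumps(updates)`.

```python
import requests

def format_anaplan_payload(rows):
    # Hypothetical stand-in for the poster's formatting routine:
    # return the list of dicts itself, NOT json.dumps(...) of it.
    return [{"lineItem": r["item"], "value": r["value"]} for r in rows]

def post_updates(url, token, rows):
    updates = format_anaplan_payload(rows)
    # requests serializes `updates` to JSON itself when it is passed via
    # the json= parameter; passing json.dumps(updates) instead would
    # double-encode the payload into a JSON string literal.
    return requests.post(
        url,
        headers={"Authorization": f"AnaplanAuthToken {token}"},
        json=updates,
    )
```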
Re: Dark / Light Mode for Model Building
Wow. I didn't anticipate that my post would create such a high response rate. 😅
Dark Mode has been a personal preference of mine for quite some time, but I agree that the more pressing issue is the complete lack of contrast and visual highlights. The new design makes it particularly unclear where key elements, such as navigation, are located or how different areas within an app are intended to be used. This can be especially challenging for new users or visually impaired people, who may struggle to navigate and work efficiently within Anaplan. I can't understand how this change passed all the approvals at Anaplan.
Re: Dark / Light Mode for Model Building
And/or give us the ability to change the banner color to an appropriate brand color or to differentiate between different models (DEV/UAT/PROD).