Best Of
Hybrid approach to planning and execution: Anaplan for Salesforce and ADO
Author: Miki Sato is a Product Manager, Product Management Team (Data Management) at Anaplan.
In my previous post, I outlined the different ways Salesforce and Anaplan integrate — from the Embedded UX experience (Anaplan Tab) to data-level integrations such as the Anaplan Data Orchestrator (ADO) Salesforce Connector.
Since October 2025, we have added a new option — Anaplan for Salesforce, a native managed package designed for customers using Revenue Performance Management (RPM) applications.
With three different integration choices now available, many customers ask:
- How is Anaplan for Salesforce different from the Anaplan Tab?
- How does it differ from the ADO Salesforce Connector?
- When should each be used?
The answer is simple: each addresses a different part of the planning-to-execution workflow, and together they enable a complete closed loop.
Two ways to connect Salesforce and Anaplan
1️⃣ Anaplan for Salesforce
A native Salesforce managed package developed by Anaplan and available to RPM customers for the initial release.
Purpose-built for execution alignment:
- Real-time CDC streaming from Salesforce to Anaplan
- Scheduled bidirectional sync (as frequently as every 5 minutes)
- Deep integration with RPM workflows (Territory & Quota, Sales Forecasting)
- Native UX for sales reps and managers
This solution keeps day-to-day operational data synchronized so that forecasting and territory activities always reflect the most current opportunity and account details.
Figure 1: Salesforce-native interface of Anaplan for Salesforce, used to configure bi-directional data synchronization and real-time event-based updates.
2️⃣ ADO Salesforce Connector
Part of ADO, our enterprise data foundation layer.
Purpose-built for planning data management:
- Pulls Salesforce data alongside ERP, finance, supply chain, segmentation, and product data
- Offers transformation, validation, mapping, filtering, and governance
- Supports large historical datasets and lineage requirements
- Designed for stable, repeatable planning pipelines
ADO is also evolving toward bidirectional and near-real-time data flows, powered by future write-back capabilities and streaming support, strengthening its role as the enterprise planning data foundation.
Real‑world perspective: Combined approach
When I worked as a CRM product manager supporting territory and quota planning, as well as sales pipeline management across multiple organizations, I learned that no two teams ever operated on the same cadence.
- Some refreshed territories annually, while others adjusted mid-year during reorganizations
- Many revised quotas quarterly or monthly depending on performance
- Strategic or high-touch segments required weekly, sometimes ad-hoc, refinements
- And during quarter-end, pipeline updates continued until the final hours
This variability becomes especially critical in forecasting, where leaders cannot rely on yesterday’s batch data — they need current information.
Dan Koellhofer, head of our GTM Applications Group, emphasized this at Anaplan Connect London:
“Sales forecasting requires systems to be perfectly in sync at all times — especially in the final week of the quarter.”
— "Driving Growth with Anaplan Revenue Performance Applications", Anaplan Connect London 2025
Planning and execution run at fundamentally different speeds.
This is why many organizations adopt a hybrid model:
- ADO provides structured, multi-source data for planning cycles
- Anaplan for Salesforce keeps operational execution synchronized throughout the day
Together they minimize performance loss caused by cadence misalignment.
The following diagram shows how ADO and Anaplan for Salesforce work together to form a continuous planning-to-execution loop, with governed enterprise data feeding planning cycles and operational updates flowing back through Salesforce.
Figure 2: The Integrated Closed-Loop Data Flow using Anaplan for Salesforce and ADO Connector.
Figure 3: Summary comparison: ADO Salesforce Connector vs. Anaplan for Salesforce.
Key considerations for a combined approach
These considerations ensure that both systems operate without conflict and remain aligned as the integration landscape evolves.
1️⃣ Ownership and required skills
- ADO Salesforce Connector
  - Operated by data/integration specialists
  - Transformation, scheduling, and monitoring occur in ADO
- Anaplan for Salesforce
  - Implemented within Salesforce
  - Requires Salesforce Admin expertise (permission sets, package management, API usage)
  - Requires coordination between the Salesforce Admin and Anaplan model builder
Clear ownership prevents configuration drift and ensures reliable operations.
2️⃣ Field mapping and data responsibilities
- ADO Salesforce Connector mappings are configured in ADO
- Anaplan for Salesforce mappings can be created in both Anaplan and Salesforce
To avoid unintended overwrites:
- Define which system owns each object/field
- Prevent both tools from writing to the same field concurrently
- Establish a process for updating mappings when new Salesforce fields are introduced
3️⃣ Refresh cadence and scheduling alignment
While both ADO and Anaplan for Salesforce support frequent updates, their refresh mechanisms operate differently and must be aligned:
- ADO: Flexible scheduled orchestration, supporting high-frequency runs (as frequently as every 15 minutes) and parallel flows
- Anaplan for Salesforce: Salesforce-native event-driven updates (real-time CDC) plus frequent scheduled sync
As ADO evolves toward bi-directional integration, defining clear refresh and overwrite rules becomes even more critical:
- When should planning changes overwrite Salesforce?
- When should Salesforce updates take precedence?
- How should schedules be aligned once both systems can write back?
4️⃣ Access control and security
Because authentication and integration users differ:
- Use separate integration users for ADO and Anaplan for Salesforce
- Ensure consistency between Salesforce field-level security and Anaplan access rights
- Review permissions whenever mappings or data flows change
5️⃣ Conflict resolution and governance
Define the rules for:
- How to handle conflicting edits
- Whether CRM edits or Anaplan-approved changes take precedence
- Who approves mapping or field ownership changes
- How exceptions are logged and reviewed
This prevents silent overwrites and ensures long-term reliability.
Looking ahead
ADO and Anaplan for Salesforce are not substitutes — they are complementary:
- ADO → multi-source, governed data foundation for planning
- Anaplan for Salesforce → high-frequency execution alignment
- Together → A frictionless planning → execution → performance cycle
We will continue to deepen the alignment between both solutions, providing customers with a more unified experience across data orchestration, RPM workflows, and Salesforce integration.
Further reading
Concept and strategy: Planning, execution, and RPM
- Connected Planning with Salesforce × Anaplan (previous post)
- Revenue Performance Management vs. Traditional SPM
- 2025 ISG Research Revenue Performance Management Buyers Guide
ADO: Planning data foundation and modern MDM
- Why Planning Needs a Different Kind of MDM — and How ADO Delivers
- Anapedia: Anaplan Data Orchestrator Documentation
Anaplan for Salesforce: Execution alignment
Sessions and talks referenced in this article
MikiS
Re: Accessible by design: Our new accommodations for certification exams
Hi,
I have completed the recertification exam, and yes! Additional resources such as the Planual, Community, and Anapedia are available to use even at a Kryterion test center.
Thanks,
Pujitha
PujithaB
Re: Building a Center of Excellence
Hi Anaplan Community Team -
The link to the workbook is broken again (still?) - How else may I access this content?
Thanks!
Re: Anaplan Way - The Four cornerstones
According to the latest information from the Anaplan Academy, the on-demand course for The Anaplan Way consists of five phases and four cornerstones.
Thank you!
Getting started with Anaplan APIs: What you need to know
Author: Dmitri Tverdomed is a Data Architect and Director at Zooss Consulting.
Following on from my recent article on best practices in data integration, today I take a closer look at Anaplan’s APIs: what they are, why they matter, and how you can get started to unlock efficiencies for your business.
In this article:
- Learn the basics of Anaplan APIs — including Bulk, Transactional, SCIM, and CloudWorks — and how they power automation, integration, and real-time data exchange.
- Explore practical use cases for each API type — from batch data uploads to real-time model updates and automated user provisioning.
- Pick up implementation tips — such as authentication methods, rate limits, chunking files, and managing access controls.
- Find out how to use Postman — and start experimenting quickly with API endpoints before scaling to production environments.
- Access region-specific Anaplan API URLs — including dedicated endpoints for Australia and global instances.
What are APIs?
API stands for Application Programming Interface. It is a set of rules and protocols that allow different software systems to communicate with each other. In simple terms, an API is like a bridge that lets two applications talk to each other.
Anaplan offers several types of APIs that facilitate automation of data integration, log management, user management, deployments, and more.
Let’s unpack the main types of Anaplan REST APIs and some real-world use cases where they shine.
Bulk APIs: The reliable workhorse for batch file imports and exports
Bulk APIs are designed for batch-based data transfers. If your use case involves large volumes of structured data moving in and out of Anaplan on a scheduled basis, this is your go-to.
Use cases
- Batch style uploads of transactional or master data.
- Best suited for asynchronous processing of large datasets.
- Trigger model import/export actions as part of a broader ETL flow.
Implementation considerations
- Authentication: Requires secure OAuth 2.0 or basic authentication setup
- File size limits: Typically handles files up to 1GB. Large datasets must be transferred in chunks: 1MB to 50MB per chunk for imports and 10MB chunks for exports.
- Action configuration: Import/export actions must be predefined in Anaplan before calling them via the API.
- Error handling: Implement retry logic and detailed logging to handle failures and job status checks gracefully.
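To make these considerations concrete, here is a minimal Python sketch of the chunked-upload-and-import pattern. It assumes a file definition and an import action already exist in the target model; the workspace, model, file, and import IDs are placeholders, and endpoint paths and payloads should be verified against the current Anapedia API documentation before use.

```python
# Minimal sketch (placeholder IDs throughout): upload a CSV in chunks, then trigger a
# predefined import action via the Bulk API. Verify endpoint paths against Anapedia.
import math
import requests

AUTH_URL = "https://auth.anaplan.com/token/authenticate"
API_BASE = "https://api.anaplan.com/2/0"

WORKSPACE_ID = "YOUR_WORKSPACE_ID"   # placeholder
MODEL_ID = "YOUR_MODEL_ID"           # placeholder
FILE_ID = "113000000000"             # placeholder: file already defined in the model
IMPORT_ID = "112000000000"           # placeholder: predefined import action
CHUNK_SIZE = 10 * 1024 * 1024        # 10MB, within the 1MB-50MB import chunk range


def get_token(user: str, password: str) -> str:
    """Exchange basic credentials for a session token."""
    resp = requests.post(AUTH_URL, auth=(user, password))
    resp.raise_for_status()
    return resp.json()["tokenInfo"]["tokenValue"]


def upload_and_import(token: str, csv_path: str) -> None:
    headers = {"Authorization": f"AnaplanAuthToken {token}"}
    model_base = f"{API_BASE}/workspaces/{WORKSPACE_ID}/models/{MODEL_ID}"

    with open(csv_path, "rb") as f:
        data = f.read()
    chunk_count = math.ceil(len(data) / CHUNK_SIZE)

    # Declare how many chunks will be uploaded for this file.
    requests.post(
        f"{model_base}/files/{FILE_ID}",
        headers={**headers, "Content-Type": "application/json"},
        json={"chunkCount": chunk_count},
    ).raise_for_status()

    # Upload each chunk as raw bytes.
    for i in range(chunk_count):
        chunk = data[i * CHUNK_SIZE:(i + 1) * CHUNK_SIZE]
        requests.put(
            f"{model_base}/files/{FILE_ID}/chunks/{i}",
            headers={**headers, "Content-Type": "application/octet-stream"},
            data=chunk,
        ).raise_for_status()

    # Trigger the predefined import action; keep the task ID for status polling.
    task = requests.post(
        f"{model_base}/imports/{IMPORT_ID}/tasks",
        headers={**headers, "Content-Type": "application/json"},
        json={"localeName": "en_US"},
    )
    task.raise_for_status()
    print("Import task started:", task.json()["task"]["taskId"])
```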
Transactional APIs: Real-time, direct cell updates
When the business demands up-to-the-minute updates, Transactional APIs offer a finer level of control. These APIs allow you to read and write individual cell-level values directly – no need to trigger import actions.
Use cases
- Real-time updates: pricing, stock levels, activation.
- Instant feedback loops between Anaplan and customer-facing systems.
- Best suited for synchronous processing of small datasets.
Implementation considerations
- Use for real-time, low-volume operations: Ideal for record-level updates, queries, or small data changes – avoid for bulk data tasks.
- Watch model performance: Frequent or excessive API calls can impact model responsiveness – batch operations where possible.
- Secure with OAuth 2.0: Implement secure authentication and enforce role-based access to control data exposure.
- Mind rate limits: Monitor API usage and use retry/backoff strategies to handle throttling or transient errors. Anaplan’s default rate limit is 600 calls per minute.
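Because transactional calls are subject to the rate limit above, a simple retry/backoff wrapper is worth having early on. The helper below is an illustrative sketch: it wraps any HTTP call and backs off when the platform responds with HTTP 429 or a transient server error.

```python
# Illustrative retry/backoff wrapper for low-volume transactional calls, assuming the
# default limit of roughly 600 calls per minute mentioned above.
import time
import requests


def call_with_backoff(method: str, url: str, max_retries: int = 5, **kwargs) -> requests.Response:
    """Retry a request with exponential backoff on throttling (HTTP 429) or transient
    server errors (HTTP 5xx); any other response is returned to the caller immediately."""
    delay = 1.0
    for _ in range(max_retries):
        resp = requests.request(method, url, **kwargs)
        if resp.status_code not in (429, 500, 502, 503, 504):
            return resp
        time.sleep(delay)
        delay *= 2  # 1s, 2s, 4s, ...
    return resp  # still failing after max_retries; let the caller inspect or raise


# Hypothetical usage: wrap any transactional endpoint call (URL is a placeholder).
# resp = call_with_backoff("GET", "https://api.anaplan.com/2/0/models/MODEL_ID/...", headers=headers)
```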
SCIM APIs: Manage users at scale
System for Cross-domain Identity Management (SCIM) APIs are essential for organisations needing to integrate user access with Identity & Access Management (IAM) platforms like Okta or Azure AD.
Use cases
- Automated user provisioning and de-provisioning.
- Syncing user roles and access between systems.
- Audit-ready compliance reporting.
Implementation considerations
- Attribute mapping: Ensure correct mapping of SCIM fields (e.g. userName, givenName, familyName, emails) to Anaplan’s user schema to prevent sync issues.
- Role and workspace assignment: SCIM can assign users to workspaces, but model role and selective access must be managed separately (e.g. via Anaplan UI or Bulk APIs).
- Audit and monitoring: Implement logging and monitor sync processes to detect failures, mismatches, or permission issues early.
- De-provisioning safeguards: Exercise caution when implementing user de-provisioning functionality to avoid unwanted loss of access. Implement exception users and backups as safeguarding controls.
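As a rough illustration of the attribute mapping point above, here is a hypothetical Python sketch that provisions a single user via SCIM. The attribute names follow the standard SCIM 2.0 core schema; the endpoint URL is an assumption and should be confirmed against Anaplan's SCIM documentation.

```python
# Hypothetical sketch: provision a user via SCIM. The base URL is an assumption;
# confirm the exact endpoint and supported attributes in the SCIM documentation.
import requests

SCIM_USERS_URL = "https://api.anaplan.com/scim/1/0/v2/Users"  # assumed path: verify in docs


def provision_user(token: str, user_name: str, given: str, family: str, email: str) -> dict:
    payload = {
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
        "userName": user_name,                                   # maps to the Anaplan login
        "name": {"givenName": given, "familyName": family},
        "emails": [{"value": email, "primary": True}],
        "active": True,                                          # set False to de-provision
    }
    resp = requests.post(
        SCIM_USERS_URL,
        headers={"Authorization": f"AnaplanAuthToken {token}",
                 "Content-Type": "application/scim+json"},
        json=payload,
    )
    resp.raise_for_status()
    return resp.json()
```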
CloudWorks APIs: Next-level scheduling and integration
CloudWorks is Anaplan’s no-code integration engine that facilitates integrations to and from cloud storage providers such as AWS S3, Azure Blob Storage, and Google Cloud Storage. With CloudWorks APIs, you can remotely control integrations and schedules without needing to manually log into the Anaplan UI.
Use cases
- Trigger CloudWorks jobs programmatically.
- Coordinate processes across multiple models.
- Build dashboard buttons that trigger integrations.
Implementation considerations
- Integration setup in UI first: All integrations (e.g. AWS S3 connections, model actions) must be configured in the CloudWorks UI before they can be triggered via API.
- Manage role-based access carefully. An Integration Admin role is required.
- OAuth 2.0 authentication required: CloudWorks APIs use Anaplan’s OAuth 2.0 flow, which ensures secure token management.
- Monitor execution and errors: Use the API to check job status, retrieve run history, and handle errors for auditability and robust automation.
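The snippet below sketches what triggering and monitoring might look like in Python. The CloudWorks base URL and paths are assumptions, and the integration must already exist in the CloudWorks UI; confirm the endpoints and the Integration Admin requirement against the CloudWorks API documentation before relying on them.

```python
# Hypothetical sketch: trigger a CloudWorks integration that was already configured in
# the CloudWorks UI, then fetch its run history. Base URL and paths are assumptions.
import requests

CW_BASE = "https://api.cloudworks.anaplan.com/2/0"  # assumed base URL: verify in docs


def run_integration(token: str, integration_id: str) -> dict:
    headers = {"Authorization": f"AnaplanAuthToken {token}"}

    # Kick off the integration (requires an Integration Admin role, per the notes above).
    run = requests.post(f"{CW_BASE}/integrations/{integration_id}/run", headers=headers)
    run.raise_for_status()

    # Retrieve recent runs so status can be logged for monitoring and auditability.
    history = requests.get(f"{CW_BASE}/integrations/runs/{integration_id}", headers=headers)
    history.raise_for_status()
    return history.json()
```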
Getting hands-on: Using Postman to explore Anaplan APIs
Postman is a powerful (and free) tool to start exploring Anaplan’s REST APIs. It runs in any web browser and doesn’t require additional installations or extensions.
Here’s a simple way to get started:
- If you are new to Postman, simply register for a free account with your email.
- Visit Postman’s Anaplan workspace.
- Fork the official Anaplan collection into your workspace.
- Set variables (like your base URLs and workspace ID).
- Choose your authentication method: Basic Auth, OAuth, or CA certificate.
- Start testing endpoints, such as retrieving workspace and model metadata, running an action, checking task status.
Using Postman is a great way to prototype before building out full automations in Python, Java, or C#.
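Once your Postman experiments work, the same first calls translate directly into a few lines of Python. This sketch authenticates with basic auth and retrieves workspace and model metadata; the credentials are placeholders.

```python
# The same first calls you might try in Postman, translated to a short Python script.
import requests

# Authenticate with basic auth (placeholders) and extract the session token.
auth = requests.post("https://auth.anaplan.com/token/authenticate",
                     auth=("user@example.com", "your-password"))
auth.raise_for_status()
headers = {"Authorization": f"AnaplanAuthToken {auth.json()['tokenInfo']['tokenValue']}"}

# Retrieve workspace and model metadata, the same endpoints listed in the collection.
workspaces = requests.get("https://api.anaplan.com/2/0/workspaces", headers=headers).json()
models = requests.get("https://api.anaplan.com/2/0/models", headers=headers).json()
print(workspaces, models)
```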
I will be elaborating on how to productionize an API-based application in your environment in the next article.
Region-specific URLs of your Anaplan instance
Based on the region of your Anaplan instance, you’ll need to select the appropriate URL for each of the API services.
This article on Anaplan’s help page goes into further detail: Uniform Resource Locators (URLs). The table below lists the Global and Australia-based URLs:
| API Type | Global | Australia |
|---|---|---|
| Authentication | | |
| Bulk | | |
| Transactional | | |
| CloudWorks | | |
| SCIM | | |
| Audit | | |

Above: Global and Australia-based URLs for Anaplan APIs (see the linked help page for the current endpoint values).
Real-world use cases for Anaplan API implementation
My team and I have implemented Anaplan APIs to solve a variety of real-world challenges:
- User access management — Automating user lifecycle processes with SCIM and reducing admin overhead.
- Integrations management — User-friendly interface to manage Bulk API integrations with error management, notifications, and monitoring.
- Model health checks — Extracting metadata to assess build quality across workspaces.
- Activity tracking — Analysing user engagement and licence utilisation.
- Real-time triggers — Enabling UX buttons that export data to downstream ERP systems in seconds.
- ALM Automation — Running model syncs and revision tasks without manual intervention.
Final thoughts: Unlocking efficiency with Anaplan APIs
Getting started with Anaplan APIs doesn’t need to be overwhelming. With a bit of setup and the right use case, APIs can unlock huge efficiency gains and enable a level of flexibility that frontend UI or manual processes simply can’t match.
Whether you’re enhancing data integrations, automating user management, or creating custom experiences powered by real-time data, Anaplan’s APIs are a powerful part of your toolkit.
Questions? Leave a comment!
DmitriT
How I Built It: Codebreaker game in Anaplan
Author: Chris Stauffer is the Director, Platform Adoption Specialists at Anaplan.
My kids and I enjoy code-breaking games, and so I wondered if I could build one in Anaplan. Over the years, there have been many versions of code breaker games: Cows & Bulls, Mastermind, etc.
The object of this two-player game is to solve your opponent’s code in fewer turns than it takes your opponent to solve your code. A code maker sets the code and a code breaker tries to determine the code based on responses from the code maker.
I came up with a basic working version last year during an Anaplan fun build challenge. Here's a short demo:
If you would like to build it yourself, the instructions on how the game works and how to build it are below.
Game start: Code maker input
Player 1, the code maker, starts the game by going to the code maker page and choosing a secret four-color code sequence using a grid drop-down list. Code makers can use any combination of colors, including two or more of the same color. You could set up a model role and restrict access to this page using page settings to ensure the code breaker cannot see it.
Player 2 code breaker input
The code breaker chooses four colors in the first row, attempting to duplicate the exact colors and positions of the secret code.
The code breaker simply clicks the first row in the grid, uses a list drop-down to make color guesses, and then clicks the Submit? button. A data write action writes a TRUE Boolean into the code breaker module to turn on the module logic checks, turns on DCA to lock the submitted row, and provides an automatically calculated code maker response. Unlike the real board game, the code maker does not have to think about the response nor drop those tiny black and white pins into the tiny holes in the board — the calc module does the work automatically and correctly every time! I’ve been guilty of not providing the correct response to a code breaker attempt, which can upset the game and the code breaker.
First guess
Below, the code breaker selected blue-red-yellow-blue and clicked the Submit button. Since two guesses are the right colors AND in the right column positions, the calculated response is “Black Black”, telling the code breaker that two guesses are the correct color in the correct position.
A black color indicates a codebreaker has positioned the correct color in the correct position. A white color indicates a codebreaker has positioned the correct color in an incorrect position. No response indicates a color was not used in the code.
Second guess
In the second row, the code breaker input green in Spot 1 but kept red-yellow-blue. Now only one color is in the correct position (yellow in Spot 3), but the code breaker gained the insight that there are three correct colors, two of them in the wrong position, hence the black-white-white response.
Third guess
The code breaker selected blue-yellow-green-yellow and now has all the correct colors with only two out of position.
Fourth guess
You have to be a little lucky to break the code on the fourth guess, but hopefully by now you get the idea of how the game works in Anaplan.
The winner of the game is the player that solves the code in fewer guesses than the other player, so each player takes a turn as code maker and code breaker.
Conclusion
The model uses a lot of conditional formulas, text functions (&, MID, ISBLANK, FIND), and custom matching logic to identify right-color, wrong-position matches (white) and right-color, right-position matches (black). A couple of simple UX pages make it easy to support a two-player game. You’ll have to create the model roles for code breaker and code maker and set up the page security settings.
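For readers curious how the matching works outside of Anaplan, here is an illustrative Python version of the scoring logic (not the model's actual formulas): exact color-and-position matches count as black, and remaining color overlaps count as white.

```python
# Illustrative scoring logic for the code breaker game (not the Anaplan formulas).
from collections import Counter


def score(code: list[str], guess: list[str]) -> tuple[int, int]:
    # Black pegs: right color in the right position.
    black = sum(c == g for c, g in zip(code, guess))

    # Total color overlap, counting each color at most as often as it appears in the code;
    # whatever remains after removing the exact matches becomes the white pegs.
    code_counts, guess_counts = Counter(code), Counter(guess)
    overlap = sum(min(code_counts[color], guess_counts[color]) for color in code_counts)
    return black, overlap - black


# Hypothetical example: one exact match (yellow in spot 3) plus two colors out of position.
print(score(["green", "blue", "yellow", "red"], ["blue", "red", "yellow", "blue"]))  # (1, 2)
```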
Attached is the line item export in case you are curious or want to build it yourself. There are probably multiple ways to build the logic or make the formulas more elegant, but it was for me a fun diversion.
Enjoy!
Re: Are there any plans to support actions to rename or delete native Versions?
I have dug around and haven't seen any idea exchange requests that seem related.
Just to make sure I am up to date on my understanding of the current state: versions can be created via an import action, which can be saved and run again to import more versions. However, the only way to rename or delete a version is via the Versions settings menu; there is no way to do it from a UX page or via the API.
(As a side note, from my testing it seems that only a WSA can run the action to create Versions. Even if a non-WSA has full access or is marked with access to the action, they will get the message "You do not have access to run this action". This is similar to User list imports.)
You are interested in methods that would allow renaming and/or deleting Versions that could be performed from a UX page or triggered by the API, correct?
I assume that using fake versions is not an option in your case, as those could be controlled by an action.
All that to say, I think this would be a valid Idea Exchange submission! I imagine other model builders could benefit from the functionality as well. I know they have released improvements to the native User dimension recently.
Re: Boolean filters not showing up in Product Replenishment UX Page for L2 Sprint 3
It looks like the context at the top of the page is at a parent level within the list (Chocolates), rather than at the input level you are showing in the model (Nutzo Bar_EN). You should be able to select Nutzo at the top and view the booleans to input against.
tfearns
Re: Planning, pie, and people: What I’m thankful for in the Anaplan Community
I want to express my gratitude to @rob_marshall , @alexpavel , @DavidSmith ! Thank you so much for sharing your knowledge and for helping so many of us refine our skills!
Re: Planning, pie, and people: What I’m thankful for in the Anaplan Community
Well said, @MikhilA!
I am thankful to @DavidSmith; his ALM explanation blogs have helped me a lot.