Please note that this article is specific to an MS Windows Server implementation with a CA Certificate and uses Windows Batch Scripting file commands. A good reference for Windows Batch Scripting commands can be found here: https://en.wikibooks.org/wiki/Windows_Batch_Scripting
So, you’ve had your Anaplan environment up and running for a while and your end users are excited. They have asked for more and more functionality and you’ve given it to them—but all of the integrations are run manually on a daily, weekly, or monthly cadence. The users are asking for help streamlining the process and wondering whether automation is possible—automation of source data in, automation of model-to-model data sharing, and automation of budgeting and forecasting data out (for external reporting or even back to your GL system).
Good news! It is possible, and you have been tasked with setting up these integrations! Your organization does not have Cloud options for which Anaplan has connectors (Dell Boomi, Informatica, Mulesoft, etc.), but you can work with your internal IT team members to accomplish your task using AnaplanConnect. You’ll need a server to run the processes, access to a scheduling tool, and some team members in IT to ensure data is staged between systems appropriately.
As you go through the documentation and create some test scripts in your UAT environment, you’ve got things working. You’ve followed the examples and created a parameter section at the top of each script for the integration user account/certificate credentials, the workspace ID, the model ID, etc.
The thing is, your Anaplan landscape is getting big. You’ve got Development, UAT, and Production environments and multiple live models. You’ve got ALM (mostly) implemented between UAT and Prod, but the new AnaplanConnect v1.4.x paradigm requires the 32-character workspace and model IDs to be present (and correct) within each script; you can no longer use just the workspace and model names (as in AnaplanConnect v1.3). So even if the Prod and UAT models are named the same, the scripts must uniquely identify each one by its 32-character ID.
How do you manage this? How do you successfully migrate these scripts between UAT and Prod without typos or copy/paste issues? Could you even tell what workspace or model the script is actually running against without some kind of lookup key? What to do?
Just as most applications have a file (or two) of common parameters used across the platform, you can create a configuration file specific to the AnaplanConnect parameters used across your organization’s Anaplan platform!
Any commonly used setting in your scripts can be centrally defined and managed in your Config file: the integration user account/certificate credentials, the workspace and model IDs, and common file paths, for example.
Please see the file attachment for this article.
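To make this concrete, here is a minimal sketch of what such a Config file might contain. Every value below is an illustrative placeholder (made-up IDs and paths), and the variable names are my own choices, not AnaplanConnect requirements:

```batch
@echo off
rem ===== AnaplanConnect common parameters (one file per workspace) =====
rem All values below are illustrative placeholders.

rem Integration user certificate credentials
set CertificatePath=C:\AnaplanConnect\certs\integration_user.cer

rem Workspace ID (32-character ID for this environment)
set WorkspaceId=8a81b09d5e8c6f2a015e8d7c4b3a0001

rem Model IDs, one variable per model in this workspace
set HUB=8a81b09d5e8c6f2a015e8d7c4b3a0002
set SECURITY=8a81b09d5e8c6f2a015e8d7c4b3a0003

rem Server-level paths that may stay the same across Prod/UAT/Dev
set ImportPath=D:\AnaplanData\Imports
set ExportPath=D:\AnaplanData\Exports
```

The one-variable-per-model approach is what lets the operation scripts stay identical between environments: only the Config file changes.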
Depending on how you organize, nest, or schedule your scripts, be sure to include a command to call the configuration file (where all the common parameters are defined) prior to defining or running the AnaplanConnect operation in the script.
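For example, a `call` near the top of each operation script loads the shared parameters before anything else runs (the file name and path here are assumptions; use whatever you named yours):

```batch
rem Load the shared parameters before any AnaplanConnect settings are defined
call C:\AnaplanConnect\Scripts\Config.bat
```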
In each script, where you would have explicitly defined the parameter (like ModelID), simply reference the parameter name from your Config file. For example, I have a model called “HUB”. Previously, my scripts just defined ModelID="HUB" for each operation. Since that no longer works, and 32-character alphanumeric IDs are not my thing, I reference the parameter that I named “HUB” in my Config file.
In the Config file:
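A sketch of that entry (the 32-character ID shown is a made-up placeholder):

```batch
rem "HUB" model ID (placeholder value shown)
set HUB=8a81b09d5e8c6f2a015e8d7c4b3a0002
```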
In the AnaplanConnect script:
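A sketch of the reference (the variable name `ModelId` is an assumption, mirroring the parameter section of the example scripts):

```batch
rem Reference the parameter loaded from the Config file
set ModelId=%HUB%
```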
This %HUB% parameter is read into memory when the Config file is executed and is available to the AnaplanConnect script for use when needed.
You’ll need to have a configuration file for each workspace that contains information specific to that workspace but can include common server-level information (like extract file path or import file path) that may remain the same throughout the Prod/UAT/Dev instances. Just organize the scripts based on where the model exists and make sure the appropriate configuration file version is called as part of the execution process.
As part of the default example scripts provided in the manual, there is a “Pause” command at the end of the execution section (below the “Do not edit anything below this line” marker). By replacing that “Pause” command with an ‘if’ statement, and defining in your Config file whether the pause should take effect, you can easily switch your scripts between “debug” mode and “production run” mode.
First, replace that “pause” in the execution section of the script with a line that says,
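One way to write that line (the variable name `PauseMode` is my choice; any name works as long as the Config file matches it):

```batch
if "%PauseMode%"=="TRUE" pause
```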
Then, you can control this behavior from the configuration file instead of within each script. Add a couple of lines to the configuration file and comment out the state you don’t want (below shows “debug mode” – pause set TRUE).
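In the configuration file, that could look like the following sketch, with debug mode active and the production-run line commented out via `rem`:

```batch
rem Debug mode: pause at the end of each script run
set PauseMode=TRUE

rem Production-run mode: no pause (uncomment this line and comment out the one above)
rem set PauseMode=FALSE
```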
Now I’m sure you’re wondering, “Are there more things I can do to streamline my scripts?” The answer is YES. But, those are for you to noodle on and figure out, depending on the needs of your organization. Happy noodling!
What do you think about AnaplanConnect? Let us know in the comments below.
Master Anaplanner Stacey Gibbens, PMP is a Senior Financial Systems Analyst with Simon Property Group. In this role, Stacey manages a suite of financial applications (including Anaplan), wearing any one of several hats depending on the situation and customer in question. This includes project management, application development, systems integrations, and expert-level troubleshooting.
Stacey understands that any information system is only as good as its user interfaces and data flows. Since implementation in 2015, the footprint of Simon’s Anaplan environment has grown from one production workspace, comprising a Data HUB, a Security model, and 5 user models, to FOUR production workspaces, comprising two Data HUBs, a Security model, and 14 user models.
She and her team have been able to leverage the straightforward Anaplan user interfaces, along with AnaplanConnect, scheduling tools, and a data warehouse, to reduce errors and eliminate duplicative data entry between multiple related (but stand-alone) applications as well as produce external, highly formatted Cognos reporting.