Anaplan CloudWorks API and Azure Logic Apps

You’ve already had a taste of using Logic Apps with the Anaplan API in this article, thanks to my colleague Sean Culligan.
In this post, we will do something slightly different by using the Anaplan CloudWorks API with the Azure platform.

One option would have been to adapt the previous articles (which used AWS or GCP) and reuse the code we wrote for them in the equivalent Azure components, such as Azure Functions.
With some tweaks, the code we delivered there is still valid and could be used for that purpose.

In our previous examples, we wrote some code (Python, Kotlin) that had to be deployed on the respective platforms. This time, I wanted to explore a no-code approach using Azure Logic Apps.

This functionality is quite promising: without a single line of code (though not without formulas), we should be able to build production-ready data flows. Manipulating visual components and a few variables should suffice.

This is quite a claim! Let's see how it went.

Note: I've only explored Logic Apps briefly, so I might have missed some Azure best practices. Apologies in advance, and I encourage you to point them out in the comments. All contributions are helpful.

Also, we will not go through how to trigger an integration via the API: that has already been explained in this article and is also described in the apiary.

Use case: event-based triggering

The purpose is to illustrate triggering a CloudWorks (CW) flow following an event on the Azure platform. Event-driven integration triggering is not supported by CloudWorks today. Nevertheless, as we've already shown in our previous examples (using Amazon Web Services, AWS, or Google Cloud Platform, GCP), this can be achieved by combining customization with the CW APIs.

To achieve that, we need to address these main objectives:


Azure Logic Apps & Anaplan main concepts

Building a Logic App

Technical flow

This is what we want to achieve:



To keep the flow this simple when many processing steps are involved, we nest some Logic Apps inside others, or call individual processes from within a more general one.
For instance, the getIntegrationId box is a Logic App on its own, and the Until process involves three individual subprocesses.

The diagram gives an idea of how I've implemented the nested apps and how the output of one app is used by another.
App3 is our final Logic App (the same flow as the one above).


Nested Azure Logic App


The video below shows how I've built such a flow and demonstrates how it works.



After watching this video, you can continue with some further technical details on the different pieces of that data flow.

Special focus

How to handle the token expiration time?

Objective: Getting connected to Anaplan

If you have been following us for a while, you know that the first step in using Anaplan APIs is to generate an authentication token.
Each API call must include that token, and a token is only valid for 35 minutes. A best practice is to plan for longer-running integrations, where a single token is not enough to get through all the steps.

One option would have been to integrate the token generation call within each step: that would work, but it would hurt the readability of the flow, and debugging in production would become harder.

To address that, as described above, I decided to create a dedicated Logic App just for that purpose: a recurring API call regularly updates a file located in a blob storage container, so the file always holds a valid token.

Subsequent Logic Apps that require a token then simply read the content of that file via a nested Logic App call.
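To sketch this "token cache" pattern in Python (using a local file as a stand-in for the blob; the real flow uses Azure Blob Storage actions, and all names here are illustrative):

```python
from pathlib import Path

# Stand-in for the blob in the storage container; in Azure the scheduled
# Logic App overwrites the blob, and consumer apps read it back.
TOKEN_FILE = Path("anaplan_token.txt")  # hypothetical file name

def publish_token(token: str) -> None:
    # Called by the recurring "get token" flow every 15 minutes.
    TOKEN_FILE.write_text(token)

def read_token() -> str:
    # Called by any nested Logic App that needs a currently valid token.
    return TOKEN_FILE.read_text()
```

The point of the design is that producers and consumers are decoupled: no downstream flow ever generates a token itself, it only reads the latest one.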


Getting token flow


The first box is a recurring schedule: every 15 minutes, an HTTP request is performed and the Anaplan API server should respond with a valid token.

We've chosen 15 minutes because a token lasts 35 minutes: if the first call fails for whatever reason, there is still time for a second attempt before the current token expires.

getClientId and getRefreshToken are secret values stored in a Key Vault. We will use the refresh token to get an access token; this could also be achieved with basic credentials or certificate information.


Key Vault storing secrets


As displayed below, we use HTTPS requests to get the vault secrets:



The authentication type is Managed Identity, so make sure this feature is enabled on your Azure instance.

A similar step is then performed in parallel to retrieve the refresh token.

With those two values, you can then perform your request to the API server.


Get Access Token: POST request
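As a rough Python equivalent of that POST request (a minimal sketch: the OAuth host and field names below are assumptions based on Anaplan's OAuth refresh-token flow, so check the apiary for your region; clientId and refreshToken correspond to the Key Vault secrets in the flow):

```python
import json
import urllib.request

# Assumed regional OAuth host; verify the correct host for your tenant.
TOKEN_URL = "https://us1a.app.anaplan.com/oauth/token"

def build_refresh_payload(client_id: str, refresh_token: str) -> dict:
    # Body of the POST that trades a refresh token for a short-lived access token.
    return {
        "grant_type": "refresh_token",
        "client_id": client_id,
        "refresh_token": refresh_token,
    }

def get_access_token(client_id: str, refresh_token: str) -> str:
    req = urllib.request.Request(
        TOKEN_URL,
        data=json.dumps(build_refresh_payload(client_id, refresh_token)).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # "access_token" is the assumed name of the field in the response.
        return json.load(resp)["access_token"]
```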


Getting the information needed

Objective: Information lookup

Now that we have an always-valid token, we will focus on getting the IDs of the objects we want to use.
As with all Anaplan REST API calls, a great deal of the work is about gathering that information and building the URLs needed for the POST call.

As in the previous step, I'll create a dedicated Logic App for this purpose. It can then be reused from another Logic App by simply providing the name of the integration whose ID we want to find.

In my example, the following action getIntegrationId (called via an HTTP trigger)


Action getIntegrationId


actually refers to this Logic App:


getIntegrationId Logic App


After authenticating via getBlobToken, we get the full list of CW integrations:


Getting the list of available integrations


This list is then converted into an array:


Conversion of the JSON data into an array


on which I apply a filter based on the value of a parameter (the name of the integration I want to execute).


Filtering the array by the integration name


As we expect only one occurrence in the filtered array, we can take the last (or first) element of the output and initialize a variable with that value.


Formula for Value: last(body('filteredList'))['integrationId']


The communication between this underlying logic app and the parent one will be done through the Response action:


Sending the output to the parent Logic App


Note that I send the tokenValue as well so that in the parent logic app, I don't need to add another action to call for that token.
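The whole lookup can be sketched in a few lines of Python (hedged: the CloudWorks base URL, the response field names, and the auth header scheme are assumptions to verify against the apiary):

```python
import json
import urllib.request

# Assumed CloudWorks base URL; check the apiary for your tenant.
CW_BASE = "https://api.cloudworks.anaplan.com/2/0"

def list_integrations(token: str) -> list[dict]:
    # GET the whole list of CW integrations. Anaplan APIs generally accept
    # "AnaplanAuthToken <token>" (or "Bearer <token>" for OAuth access tokens).
    req = urllib.request.Request(
        f"{CW_BASE}/integrations",
        headers={"Authorization": f"AnaplanAuthToken {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["integrations"]  # response shape assumed

def find_integration_id(integrations: list[dict], name: str) -> str:
    # Same steps as the Logic App: filter the array on the name parameter,
    # then take the last match (formula: last(body('filteredList'))['integrationId']).
    matches = [i for i in integrations if i.get("name") == name]
    if not matches:
        raise ValueError(f"no CloudWorks integration named {name!r}")
    return matches[-1]["integrationId"]
```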

Triggering Anaplan CW integration

Triggering the integration is quite simple: you perform a POST request based on the information gathered in the previous steps.



Notice the formulas.
The important fields are:

  • the token value (formula: body('getIntegrationId')['tokenValue']) and
  • the integration ID (formula: body('getIntegrationId')['integrationId'])
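In Python terms, the trigger request could look like this (a sketch: the `/run` path and the auth header scheme are assumptions based on the apiary, so verify them for your tenant):

```python
import urllib.request

# Assumed CloudWorks base URL; check the apiary for your tenant.
CW_BASE = "https://api.cloudworks.anaplan.com/2/0"

def build_run_request(integration_id: str, token: str) -> urllib.request.Request:
    # The POST that triggers the CW integration; its two inputs are exactly
    # the fields highlighted above: the token value and the integration ID.
    # Anaplan APIs generally accept "AnaplanAuthToken <token>"
    # (or "Bearer <token>" for OAuth access tokens).
    return urllib.request.Request(
        f"{CW_BASE}/integrations/{integration_id}/run",
        headers={"Authorization": f"AnaplanAuthToken {token}"},
        method="POST",
    )
```

A caller would simply pass the request to `urllib.request.urlopen` and read back the task ID from the response.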

Checking the progress of the CW integration

As you may know, every Anaplan action is asynchronous: once you've triggered it, you immediately get a task ID. Since we don't know how long the task will take, we need to poll its status using that task ID, potentially through numerous Anaplan API calls.

To achieve that, I've implemented a simple Until loop, which is native to Azure Logic Apps.


When expanded, this loop


Checking the progress of the CW integration


consists of checking the message property of the JSON response when querying the progress of the task.

The main steps are:

  1. Getting a valid tokenValue
  2. Getting the latest status of the CW integration
  3. Updating the endDate variable

This loop continues until message is different from "queued".
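The Until loop boils down to this polling pattern (a generic sketch: `get_status` stands in for the HTTP call that queries the task's progress, and the interval and cap are illustrative, not the Logic App's defaults):

```python
import time

def wait_until_not_queued(get_status, poll_seconds: float = 10.0, max_polls: int = 60) -> str:
    # Python equivalent of the Until loop: keep polling the task status
    # until the "message" property is no longer "queued".
    # get_status is a callable returning the JSON body of the progress call.
    for _ in range(max_polls):
        message = get_status()["message"]
        if message != "queued":
            return message
        time.sleep(poll_seconds)
    raise TimeoutError("CloudWorks task still queued after max_polls")
```

Capping the number of iterations mirrors the loop limits Logic Apps applies to Until, so a stuck task cannot poll forever.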

Sharing the final status

We use the Response action to provide information on our CW triggering:


Sharing information for future needs

Thanks to that information, this Logic App can be integrated into another Logic App or into Power Apps, providing good insight into how the job went.


This immersion in the Logic Apps environment is really promising: very advanced data processing can be achieved in this no-code environment.

You benefit from logging tools available out of the box and from connectors suggested in context, which gives you a more secure environment when moving to production.

In many situations, it is much better to use this kind of solution than to write code that does not leverage what the platform offers.


Got feedback on this content? Let us know in the comments below.

Contributing authors: Joey Morisette and Christophe Keomanivong.