A primer on Anaplan integration using Azure Logic Apps

sean_culligan
edited December 2022 in Best Practices

Introduction

With the release of Anaplan Cloudworks, this flow may be better addressed via a blob storage connection in Cloudworks, or with a combination of a Cloudworks flow and a post request to the Cloudworks API. My colleague @christophe_keom published a guide covering this topic.

In my case, I wanted my flows to run end to end based on a trigger in my source system, and I wanted more control over the flow with enhanced error reporting, so I went down this route. Cloudworks is likely a much simpler option if you have less complex requirements, but this approach does give a lot more flexibility and control.

To set the scene, for this example I am getting data from a variety of cloud applications and staging the data in SQL. When the SQL tables are updated, I run a data factory pipeline to copy the SQL data to storage blobs. The documentation from Microsoft explains how to set up the SQL to blob copy action. When the storage blob is updated, it triggers the run of my Anaplan imports. The flow below shows how I get the data from my blob containers and then load it into Anaplan via the API. My example is specific to me but the building blocks here are generic.

Trigger and flow summary

When a blob in my data folder is updated or modified, this flow will run:

sean_culligan_0-1643847134423.png

After the trigger, the flow does the following:

  1. Sets common variables. In this example, I initialize a tokenValue variable, which I will use to authenticate, and I also set workspace and model variables. This is useful if I want to point this flow at a different model or workspace.
  2. Checks that the blob file updated or modified is the one relevant to my flow
  3. If YES > get the blob content
  4. Authenticates with Anaplan and gets a tokenValue
  5. Posts the blob data to Anaplan
  6. Runs the Anaplan process to consume the data
  7. Checks the status of the Anaplan process and emails updates to myself on a loop until the process completes
  8. If NO > does nothing

Triggers and variables

Here we can see the trigger and the variable-setting steps. In these steps, I am setting variables that I use later in the flow. The last step, at the bottom of the screenshot, is a condition.

sean_culligan_1-1643847134443.png

The condition checks the updated file and, depending on the name of the file, I run the import. If the file name is not valid for this flow, I do nothing.

Here you can see I am getting the blob file name and checking if it is equal to the string "AvazaTimesheetsAP".

sean_culligan_2-1643847134449.png

The conditional shows that if the file name does not meet my criteria, I do nothing. If it does meet the criteria, I do the import process.

sean_culligan_3-1643847134454.png

Generic steps for importing data using Logic Apps 

In my example flow, the aim is to illustrate the steps needed to import data using Anaplan's Bulk API and Logic Apps. As such, I have not gone into detail in some areas and am not following some good practices, which I will call out now:

  1. I am using basic authentication. This means I need to reset the passwords in my flows every 90 days or so. It is much better to use certificate authentication, as it saves the task of updating passwords and reduces the risk of failures. Also, when using Logic Apps you can likely use a keystore to retain the authentication credentials and have this apply across all your flows, rather than setting the variable flow by flow.
  2. The data I am sending to Anaplan is small (under 5 MB); therefore I am sending the data as a single chunk. The correct way to send data is to chunk the file and send the chunks. This flow is described in the API documentation and may be a bit fiddly to do in Logic Apps; it may be that a function app in the middle of the flow is the way to go here. A rough sketch of the chunked approach follows this list.
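If you do need to chunk, the rough Python sketch below shows the shape of the calls as I read the Bulk API documentation: set the chunk count on the file, then PUT each chunk. This is not part of my Logic App; the chunk size, IDs, and function names are placeholders, so check the documentation before relying on it.

import requests

BASE_URL = "https://api.anaplan.com/2/0"
CHUNK_SIZE = 5 * 1024 * 1024  # placeholder chunk size

def upload_in_chunks(token, workspace_id, model_id, file_id, data):
    # Split the data, tell Anaplan how many chunks to expect,
    # then PUT each chunk in turn as application/octet-stream.
    headers = {"Authorization": f"AnaplanAuthToken {token}"}
    file_url = f"{BASE_URL}/workspaces/{workspace_id}/models/{model_id}/files/{file_id}"
    chunks = [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]
    requests.post(
        file_url,
        headers={**headers, "Content-Type": "application/json"},
        json={"chunkCount": len(chunks)},
    ).raise_for_status()
    for index, chunk in enumerate(chunks):
        requests.put(
            f"{file_url}/chunks/{index}",
            headers={**headers, "Content-Type": "application/octet-stream"},
            data=chunk,
        ).raise_for_status()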

With my two caveats above, let me describe the flow as it is pretty generic past this point.

sean_culligan_4-1643847134467.png

  1. Get the blob content
  2. Authenticate. As above, I am using basic authentication here. It is better to use certificate authentication. Also, if you deploy with a Logic App, you can likely do something smarter using a keystore. A Python sketch of this call follows the screenshot below.

sean_culligan_5-1643847134479.png
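For anyone who prefers to see the call outside of Logic Apps, here is a minimal Python sketch of the same basic-authentication request using the requests library. The credentials and function name are placeholders and, as noted above, certificate authentication is the better option in production.

import base64
import requests

AUTH_URL = "https://auth.anaplan.com/token/authenticate"

def get_token(user, password):
    # Basic authentication against the Anaplan authentication service.
    # Certificate authentication is the better option, as noted above.
    creds = base64.b64encode(f"{user}:{password}".encode()).decode()
    response = requests.post(AUTH_URL, headers={"Authorization": f"Basic {creds}"})
    response.raise_for_status()
    # tokenInfo.tokenValue is the same path used in the Logic Apps expression below.
    return response.json()["tokenInfo"]["tokenValue"]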

  3. Parse the result above to get the JSON array into a sensible format. You could likely avoid this step as you can reference the output array directly, but it does give you a nice object with fields that you can see for the next step.

sean_culligan_6-1643847134489.png

  4. Set my token value variable, using the result from above

sean_culligan_7-1643847134494.png

The function for the ‘TokenValue’ is below:

 

body('Parse_JSON_authentication_token')?['tokenInfo']?['tokenValue']

 

As mentioned above, I could have avoided the Parse JSON step and saved a step by doing this instead:

 

 

body('Get_authentication_token')?['tokenInfo']?['tokenValue']

 

  5. Post the data as a single chunk. You can see how I use my variables to build up the URL. I have hard-coded the FileID as it is unique to this step, so a variable didn’t make sense. In the headers, I set the Authorization header with the token value we received when we authenticated, and in the body I post the file content from the blob. Note that the content type needs to be application/octet-stream. Also note the method; we use a PUT. A common failure I see is where the method is not updated and is left as a GET. A Python sketch of this request follows the screenshot below.

sean_culligan_8-1643847134511.png
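The same single-chunk upload as a rough Python sketch, assuming the requests library. The workspace, model, and file IDs are placeholders, and the data is assumed to be small enough not to need chunking.

import requests

BASE_URL = "https://api.anaplan.com/2/0"

def upload_single_chunk(token, workspace_id, model_id, file_id, blob_bytes):
    # PUT the raw file content in a single request. The content type must be
    # application/octet-stream and the method must be PUT, not GET.
    url = f"{BASE_URL}/workspaces/{workspace_id}/models/{model_id}/files/{file_id}"
    headers = {
        "Authorization": f"AnaplanAuthToken {token}",
        "Content-Type": "application/octet-stream",
    }
    requests.put(url, headers=headers, data=blob_bytes).raise_for_status()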

  6. Run the Anaplan process. Rule 1 with automated Anaplan imports is to wrap your imports into a process and call the process. Although you can call the imports directly, it is a bad idea. If you call the process, then you have a generic container in Anaplan that you can adjust easily. You can tweak or create a new import to deal with new columns or format changes, or add additional model-to-model or model-to-list imports to create a multistep process. By calling a process, you remove dependencies and give yourself flexibility.

Off the soapbox now and let's carry on. As you can see below, I build up the URL with some of my global variables. Again, the processId is hard-coded here as this flow is unique to one processId. When we run the process, notice that I send a body parameter; this is a fixed value, as shown below. If you don’t include it, you will get a failure. A Python sketch of this call follows the screenshot.

sean_culligan_14-1643847906147.png
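The same call as a rough Python sketch. The IDs and function name are placeholders, and the fixed body in this sketch is the localeName object described in the Bulk API documentation.

import requests

BASE_URL = "https://api.anaplan.com/2/0"

def run_process(token, workspace_id, model_id, process_id):
    # POST a new task for the process. The body is the fixed localeName object;
    # omitting it causes the failure mentioned above.
    url = (f"{BASE_URL}/workspaces/{workspace_id}/models/{model_id}"
           f"/processes/{process_id}/tasks")
    headers = {
        "Authorization": f"AnaplanAuthToken {token}",
        "Content-Type": "application/json",
    }
    response = requests.post(url, headers=headers, json={"localeName": "en_US"})
    response.raise_for_status()
    # The taskId in the response is what step 7 uses to monitor the run.
    return response.json()["task"]["taskId"]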

  7. Check the status of the process. When you run the import process, Anaplan returns a taskId. You can use this taskId to monitor the process run. Here I parse the result from running the process to get the taskId.

sean_culligan_15-1643847940266.png

  8. Loop to monitor the run. This loop checks the taskState of the import process and repeats until the taskState is complete.

sean_culligan_16-1643847967805.png

In this loop I am doing the following:

8.1. I pause for 10 seconds. This keeps my email notifications down.

8.2. Get the task status. Using the taskId returned when running the process, I query this endpoint and get the details of the task.

sean_culligan_17-1643847996057.png

8.3. I parse the return from above and use the elements of the JSON to email myself the details. A Python sketch pulling steps 7 and 8 together follows the screenshot below.

sean_culligan_18-1643848020850.png
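Pulling steps 7 and 8 together, here is a rough Python sketch of the polling pattern. The 10-second pause mirrors the delay above, and the print call is only a stand-in for the email notification, which is specific to my flow.

import time
import requests

BASE_URL = "https://api.anaplan.com/2/0"

def wait_for_task(token, workspace_id, model_id, process_id, task_id):
    # Poll the task endpoint until the taskState reports COMPLETE.
    url = (f"{BASE_URL}/workspaces/{workspace_id}/models/{model_id}"
           f"/processes/{process_id}/tasks/{task_id}")
    headers = {"Authorization": f"AnaplanAuthToken {token}"}
    while True:
        time.sleep(10)  # pause so the status notifications stay manageable
        response = requests.get(url, headers=headers)
        response.raise_for_status()
        task = response.json()["task"]
        print(f"Task {task_id} state: {task['taskState']}")  # stand-in for the email step
        if task["taskState"] == "COMPLETE":
            return task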

Conclusion

Although there are improvements to be made, these building blocks show how it is relatively simple to build complex data integrations using Microsoft Logic Apps (or Power Automate). The logical flow of this process and the API calls are generic across other platforms and technologies, even if the syntax and deployment will differ. As a result, consider reviewing this guide even if you are planning to use a different platform. It should give you a starting point for investigating custom integrations and hopefully allay some of the fears around the complexity of developing a custom REST API integration.

Certainly, from my experience, I find direct REST API integration to be less complex in many ways than Anaplan Connect, as I don't need to deal with the complexities and maintenance associated with provisioning a virtual machine and maintaining the infrastructure. I also like being in control of the integration behavior end to end.

 

Got feedback on this content? Let us know in the comments below.

Contributing author Anne-Julie Balsamo.
