RESTful API for automating recurring data loads
I am trying to understand whether the RESTful API can help with a use case of automating a recurring monthly data load to a Data Hub, from a file that is reloaded on a daily basis with up-to-date data.
When I review the documentation, I see that a manual UI step is needed to create the action so that the file is uploaded into Anaplan. What I am trying to avoid is having to upload the file again manually every time it is refreshed, before pushing the import via the API afterwards.
Is there any way to specify a file to be loaded when running an import action in the API?
Thank you in advance for your cooperation, and let me know if I can provide any additional information about my case.
As it stands now, you only have to create the import manually once, but you do need to set up the import as "Shared" or "Admin" so your CA Certificate authentication or Basic Authentication ID can access the file and action. The RESTful API cannot "push" data without an action at the target model, although I believe I saw on the roadmap that some of those limitations will change. I use VS Code to run the APIs that give me the action IDs, but you can use just about any tool that can run an API, including Postman.
Personally, I like to use Python, but if I had a license for Informatica, Mulesoft, or Dell Boomi I would use one of those. Just keep in mind that ALL imports and exports use the API. It doesn't matter whether the tool you use is Anaplan Connect, Python, the Excel Add-in, or Informatica.
What method are you using to automate this load? Maybe I can give you an example.
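For instance, here is a minimal Python sketch of the kind of call that returns the import definitions and their IDs. The workspace/model IDs and the auth token are placeholders, and the endpoint path is my reading of the v2 REST API pattern, so verify it against the current API reference:

```python
import json
import urllib.request

# v2 REST API root -- confirm this against the current Anaplan API docs.
API_BASE = "https://api.anaplan.com/2/0"

def imports_url(workspace_id: str, model_id: str) -> str:
    """Build the URL that lists a model's import definitions."""
    return f"{API_BASE}/workspaces/{workspace_id}/models/{model_id}/imports"

def list_imports(workspace_id: str, model_id: str, token: str) -> list:
    """Return the import definitions; each carries the 'id' needed to run it."""
    req = urllib.request.Request(
        imports_url(workspace_id, model_id),
        headers={"Authorization": f"AnaplanAuthToken {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("imports", [])

if __name__ == "__main__":
    # Placeholder IDs -- substitute your own workspace, model, and token.
    for imp in list_imports("WORKSPACE_ID", "MODEL_ID", "TOKEN"):
        print(imp["id"], imp["name"])
```

The same pattern (swap `imports` for `exports`, `actions`, or `files` in the path) gives you the other ID lists you need before triggering anything.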
The means of automation I use is a Qlik Sense application with the QV REST connector. Using the API is not an issue at this point, as the QV connector has no restrictions when interacting with the RESTful API. I've already built a full export flow with this approach that serves our purposes, so I'd aim to be able to create an import process as well.
I've created the import action initially; the file I used then is hosted and reloaded on a daily basis in a corporate server location.
The problem I have is that I cannot understand how I can specify (if it is even possible) the location the file currently resides in. I might be missing something in the documentation, but the only place I see a specification of the upload file is in the attached screenshot, marked in yellow. I'd appreciate your elaboration if possible.
Thank you in advance!
Have you tried putting the path with the file name in the parameter marked in yellow in your screenshot, e.g. c/doc/upload.txt?
That might work.
Hi Abhay, I'm not sure which part of the request that is. Looking at the curl script, it doesn't look like a header; I tried applying it as a parameter, and it returns a Bad Request response. Any advice on how to add the path?
> I've created the import action initially; the file I used then is hosted and reloaded on a daily basis in a corporate server location.
> The problem I have is that I cannot understand how I can specify (if it is even possible) the location the file currently resides in.
This is not possible using just curl, as you can't point to a location outside the machine where the curl command is running.
You would need some kind of processing script on the server where the file resides to be able to upload the file into Anaplan.
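To illustrate the point, a rough Python sketch of such a server-side script: push the local file's bytes into the Anaplan file slot, then trigger the import that reads it. All IDs are placeholders, and both the single-PUT upload and the import-task endpoint are my reading of the v2 API docs (large files must be uploaded in chunks instead), so verify before relying on this:

```python
import json
import urllib.request

API_BASE = "https://api.anaplan.com/2/0"  # verify against current API docs

def file_url(ws: str, model: str, file_id: str) -> str:
    return f"{API_BASE}/workspaces/{ws}/models/{model}/files/{file_id}"

def import_task_url(ws: str, model: str, import_id: str) -> str:
    return f"{API_BASE}/workspaces/{ws}/models/{model}/imports/{import_id}/tasks"

def upload_and_import(ws, model, file_id, import_id, path, token):
    auth = {"Authorization": f"AnaplanAuthToken {token}"}
    # 1. Upload the local file's bytes to the Anaplan-side file slot.
    #    (Assumed single-PUT upload; large files need the chunked endpoints.)
    with open(path, "rb") as fh:
        put = urllib.request.Request(
            file_url(ws, model, file_id),
            data=fh.read(),
            method="PUT",
            headers={**auth, "Content-Type": "application/octet-stream"},
        )
        urllib.request.urlopen(put).close()
    # 2. Kick off the import action that reads that file.
    post = urllib.request.Request(
        import_task_url(ws, model, import_id),
        data=json.dumps({"localeName": "en_US"}).encode(),
        method="POST",
        headers={**auth, "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(post) as resp:
        return json.load(resp)  # response should carry a task ID to poll
```

A cron job (or any scheduler) on the corporate server can then run this right after the daily file refresh, which removes the manual UI upload step entirely.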
Hi Anirudh, is this a curl-only restriction, or a general requirement for automating imports? Would I be able to avoid the need for a script if I did it another way?
I am also not aware of another way to add a path in curl, but if your use case is to automate file upload jobs, you can try using Anaplan Connect; it's much easier to configure than the APIs. Is there any restriction stopping you from using Anaplan Connect?
Hi Abhay, we are currently using Anaplan Connect, but we need to be able to trace the import logs during loads. Anaplan Connect does not have that capability (not to my knowledge, at least), so we're exploring alternatives.
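For what it's worth, the REST API does expose the status and result of an import task, which may cover the log-tracing need. A minimal sketch, assuming the v2 task endpoint (IDs and the exact path/response shape are placeholders to check against the API reference):

```python
import json
import urllib.request

API_BASE = "https://api.anaplan.com/2/0"  # verify against current API docs

def task_url(ws: str, model: str, import_id: str, task_id: str) -> str:
    return (f"{API_BASE}/workspaces/{ws}/models/{model}"
            f"/imports/{import_id}/tasks/{task_id}")

def get_task_status(ws, model, import_id, task_id, token):
    """Fetch one import task's status; the response should include the
    task state and, on completion, result details such as failure counts."""
    req = urllib.request.Request(
        task_url(ws, model, import_id, task_id),
        headers={"Authorization": f"AnaplanAuthToken {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Polling this after triggering the import, and pulling the failure dump file if the result reports errors, would give you the per-load traceability that Anaplan Connect's output lacks.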