On-demand event-based triggers

 

Is there a recommended way to utilize "callbacks" or "webhooks" with Anaplan?  

 

I have a use case where Anaplan needs to update Salesforce in real time via an event-based trigger.

 

Use case summary: an end user inputs data into a dashboard, then clicks a process button to "submit" the data to an export-ready saved view. The data in the saved view needs to be transferred to Salesforce. Our go-to integration tool is MuleSoft; however, I haven't found a way to update Salesforce in real time with it. Every 5 minutes is the best MuleSoft can do.

 

Related post from June:

https://community.anaplan.com/t5/Anaplan-Platform/Updating-Salesforce-with-Anaplan-data-in-real-time/td-p/139892

@rob_marshall @Misbah 

Best Answer

  • Michael,

     

    Yes, it's doable. I've seen custom Anaplan data integrations built on AWS.

     

    For example,

     

    An Anaplan end user clicks a button (a Boolean data write); the 'true' value is exported (via the transactional API) to a location in an S3 bucket, which is monitored by AWS CloudWatch and Lambda every 30 or 60 seconds. If the value is true, AWS then launches a process: a large-volume export of Anaplan data, the exported data run through a Python forecast engine, and the results loaded back into Anaplan. Worked great.

     

    Chris
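    The Lambda half of Chris's flow can be sketched as below. This is a minimal illustration, not Anaplan's or AWS's actual code: the trigger key, bucket layout, and the "launch" step are all placeholder assumptions.

```python
# Hypothetical S3 key the Anaplan export action writes to (assumption).
TRIGGER_KEY = "anaplan/exports/trigger.json"

def should_launch(event):
    """Return True if the S3 event notification refers to a new version
    of the trigger file written by the Anaplan export."""
    for record in event.get("Records", []):
        if record["s3"]["object"]["key"] == TRIGGER_KEY:
            return True
    return False

def lambda_handler(event, context):
    # In a real deployment you would also read the exported file from S3
    # (e.g. with boto3) and confirm the Boolean value is true before
    # starting the heavy export/forecast process.
    if should_launch(event):
        return {"status": "launch"}
    return {"status": "ignore"}
```

    In practice the function would be wired to S3 event notifications (or a CloudWatch schedule, as described above) rather than invoked directly.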

Answers

  • @michael.chefer 

     

    No, as Anaplan is not built that way, since everything is "stored" within the Hyperblock. First, why does it have to be in real time, given that it will be an action which can "lock" the model? Can this be run once a day? Also, I am not an integrations person, but you might want to try the Transactional API, though I believe the user invoking/calling the API has to be a WSA. More information can be found here:  https://help.anaplan.com/cc1c1e91-39fc-4272-a4b5-16bc91e9c313-Use-the-transactional-APIs

     

    If that doesn't work, then you can create an export where the data gets exported from the view, then you have a "listener" that looks for the file in a certain directory.  If it finds it, then it uses the bulk API to upload the data to SFDC.  Again, this is not my area, so please take this with a grain of salt.

     

    Rob
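    The "listener" idea Rob describes could be sketched in Python like this. The directory path, polling interval, and the Salesforce upload step are all assumptions, and the upload itself is left as a stub:

```python
import time
from pathlib import Path

WATCH_DIR = Path("/data/anaplan_exports")  # hypothetical directory the export lands in
POLL_SECONDS = 30

def upload_to_sfdc(path):
    # Stub: push the file to Salesforce via the Bulk API here,
    # e.g. with the simple_salesforce library or a MuleSoft endpoint.
    print(f"uploading {path.name} to Salesforce")

def listen_once(watch_dir):
    """Check the directory once; upload and remove the first export found."""
    files = sorted(watch_dir.glob("*.csv"))
    if files:
        upload_to_sfdc(files[0])
        files[0].unlink()  # delete so the same file is not processed twice
        return True
    return False

def run_listener(watch_dir=WATCH_DIR, delay=POLL_SECONDS):
    while True:  # simple poll loop; a cron job or scheduler works equally well
        listen_once(watch_dir)
        time.sleep(delay)
```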

  • @michael.chefer 

    Wish that existed. Unfortunately, not yet. The best you can do is set up a listener (I prefer Python) and put it on a scheduler to look for the "request". I did this for a big implementation last year and it worked fine. Just remember that to avoid having the planner hit the "request" button again, you may need to use DCA to disable things until the results come back.

  • @michael.chefer Although Anaplan supports an event-based mechanism, the auth/invoke call has to come from outside Anaplan, as Anaplan doesn't and can't invoke auth or request calls itself. Also, when events are triggered, Anaplan puts everything in a queue, meaning concurrent calls are queued until the first call is finished, which in turn adds to the waiting time. Hence event-based integration is not recommended at all, even with the Anaplan Transactional APIs. Scheduling is the best thing you can do. I think the way you have set it up is OK as long as the volumes are not that high, but as soon as you have too many requests it's going to be blocking in nature.
    Hope that helps

    Misbah

    Miz Logix

  • @Misbah 

     

     

    I hear what you're saying about putting everything on a scheduler. However, for theoretical discussion and to geek it up a bit, have you heard of AWS Lambda ever being used?

    I found it in this article: https://community.anaplan.com/t5/How-To/Building-a-Chat-Bot-Part-1-using-AWS/ta-p/115641

     

  • @michael.chefer 

    Interesting article. Still looks like something you have to schedule though.

    Come to think of it, during the recent Anaplan Live! event I bumped into a Master Anaplanner who was toying with the idea of using the notification feature to trigger an event. You'd have to have a service waiting for the email to show up, but it's theoretically possible to trigger an event that way. Never tried it myself, but worth a few experiments.
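    For what it's worth, the email-listener idea could be sketched like this. The IMAP details and the notification subject line are assumptions, and this is untested against Anaplan's actual notification emails:

```python
import email
import imaplib

TRIGGER_SUBJECT = "Anaplan notification"  # assumed subject text; adjust to the real email

def is_trigger(subject):
    """Decide whether an email subject looks like the notification we wait for."""
    return subject is not None and TRIGGER_SUBJECT.lower() in subject.lower()

def poll_inbox(host, user, password):
    """Connect over IMAP and return True if a trigger notification is waiting."""
    with imaplib.IMAP4_SSL(host) as conn:
        conn.login(user, password)
        conn.select("INBOX")
        _, data = conn.search(None, "UNSEEN")  # unread messages only
        for num in data[0].split():
            _, msg_data = conn.fetch(num, "(RFC822)")
            msg = email.message_from_bytes(msg_data[0][1])
            if is_trigger(msg.get("Subject")):
                return True
    return False
```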

  • @michael.chefer 

     

    I have heard about it but never used it. Nevertheless, it doesn't change the way Anaplan processes the requests i.e., all the requests will be processed sequentially. 

     

     

    To be honest, you could use any combination of all the above. I personally prefer the trigger, as @JaredDolich mentioned (a Boolean input cell), in combination with a microservice using bulk API calls. You can do it on an on-premise server even without a scheduler by putting an infinite while loop in your script with, for example, a few seconds' delay.

    The idea would be as follows:

    1. have an export action with the boolean trigger prepared in Anaplan

    2. have a script running that runs an export, checks if the output is true and does everything else you need

    As mentioned, you can also do it using any cloud provider. Basically, the options are there.

     

    This gives you a lot of flexibility if you have some scripting skills. You could potentially allow users to trigger data refreshes in Anaplan for data that is sourced from external systems. Obviously the question is when it is worth it, because on the other hand you lose control over the integrations. But I'll let you decide.

    If you want, I can put together a simple Python code example.
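    A minimal sketch of the Boolean-trigger loop described in steps 1 and 2 above. The export script name, output file name, and the 'Trigger' column are all assumptions:

```python
import csv
import subprocess
import time
from pathlib import Path

EXPORT_FILE = Path("trigger_export.csv")  # output of the Anaplan export action (assumption)

def run_export():
    # Stub for step 1: run the export action, e.g. via Anaplan Connect
    # or a bulk-API script that writes EXPORT_FILE.
    subprocess.run(["./run_anaplan_export.sh"], check=True)

def trigger_is_set(export_file):
    """Step 2: read the exported view and check the Boolean trigger column."""
    with open(export_file, newline="") as f:
        for row in csv.DictReader(f):
            if row.get("Trigger", "").strip().lower() == "true":
                return True
    return False

def run_loop(delay=10):
    while True:  # infinite while loop instead of a scheduler, as described above
        run_export()
        if trigger_is_set(EXPORT_FILE):
            print("trigger set - running downstream integration")
        time.sleep(delay)
```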

  • How does Anaplan support events?

    My understanding was that Anaplan can neither produce and send events nor consume them...

    I would be interested in where we can find the information on event-based Anaplan mechanisms!

  • If you have an AWS-savvy person around you, you can do the following:

    - As suggested above: export to a file in AWS S3 storage.

    - Build a process in AWS (we use Step Functions) that triggers each time a new version of the file is detected.

    - The process then executes an API call to MuleSoft to launch the integration process that sends the file to Salesforce (provided MuleSoft has an API to do that...).

    So you don't need a scheduler there; it would be event-based on a new version of the file being detected (it's quite standard behaviour in AWS).
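    The S3-to-MuleSoft hand-off above could be sketched as a small Lambda. The MuleSoft URL and payload shape are hypothetical; a real MuleSoft endpoint and its auth will differ:

```python
import json
import urllib.request

# Hypothetical MuleSoft endpoint that kicks off the Salesforce integration.
MULESOFT_URL = "https://example.mulesoft.invalid/api/run-integration"

def build_request(bucket, key):
    """Build the POST request telling MuleSoft which file version to pick up."""
    body = json.dumps({"bucket": bucket, "key": key}).encode()
    return urllib.request.Request(
        MULESOFT_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def lambda_handler(event, context):
    # Invoked whenever a new version of the export file is detected;
    # forwards the file's location to MuleSoft (no retry logic shown).
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        urllib.request.urlopen(build_request(bucket, key))
    return {"forwarded": len(event.get("Records", []))}
```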