Continuously polling and downloading data (every minute) from an operational model


Hi All,


I have a requirement to continually poll (every 1 min) and export data from an operational Anaplan model.

I expressed my concerns about locking/slowing down the model, so to test it out I wrote a script that runs every minute and downloads data from an operational model. The data to be downloaded is relatively small: 20 rows and 2 columns. The script mimics what a middleware would do, i.e. authenticate, get IDs, run the export, download the data. I had the script running every minute for an entire day in the production model while users were logged in and going about their day-to-day activities. The script has been tested and confirmed working.
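For illustration, the per-minute cycle the script performs can be sketched roughly as below. This is a minimal sketch, not the actual script: the endpoint paths follow Anaplan's v2 Integration API, but all URLs, IDs, and field names should be treated as assumptions to verify against your tenant's API documentation, and the task-status polling and file-download steps are omitted.

```python
# Rough sketch of the per-minute middleware cycle:
# authenticate -> get IDs -> run export -> download data.
# URLs, IDs, and field names are assumptions; check them against
# Anaplan's Integration API documentation before use.
import json
import time
import urllib.request

API_BASE = "https://api.anaplan.com/2/0"  # assumed v2 base URL


def export_task_url(workspace_id: str, model_id: str, export_id: str) -> str:
    """Build the URL that kicks off an export action as a task."""
    return (f"{API_BASE}/workspaces/{workspace_id}"
            f"/models/{model_id}/exports/{export_id}/tasks")


def run_once(token: str, workspace_id: str, model_id: str,
             export_id: str) -> dict:
    """One polling cycle: trigger the export task.

    Task-status polling and the subsequent file download are omitted
    for brevity; they use the tasks and files endpoints of the same API.
    """
    request = urllib.request.Request(
        export_task_url(workspace_id, model_id, export_id),
        data=json.dumps({"localeName": "en_US"}).encode(),
        headers={"Authorization": f"AnaplanAuthToken {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)


def poll_forever(token: str, workspace_id: str, model_id: str,
                 export_id: str) -> None:
    """Repeat the cycle every minute, as in the day-long test above."""
    while True:
        run_once(token, workspace_id, model_id, export_id)
        time.sleep(60)
```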


To my surprise, no one reported any degradation or slowness of the model, and no one reported the unexpected blue pop-up stating something is running. I didn't notice or feel any slowness either, nor did I see the blue pop-up.


Question: is anyone out there doing this kind of frequent polling? Am I missing something here?

Personally, I am not a big fan of polling an operational model every minute, but I need a reason to back up my reluctance.


Thoughts?




  • @karank 


    Well, I had posted a similar query a few weeks ago where I wanted to use keep-alive scripts that would have basically kept my model alive and prevented it from hibernating after more than 60 minutes of inactivity. But that approach was turned down by David Smith, who stated it is wise to let the model hibernate like any other computer.


    Although you are not doing it explicitly to keep your model alive, indirectly you are keeping your model up and running all the time, which might have consequences in the long run (I don't have any facts to back that up).




  • @Misbah Thanks for pointing that out, Misbah.

    My model can hibernate during non-operational hours, no issues. But during operational time I need it polled every minute to get up-to-date (near real-time) information from a module.
  • @karank 




    Although the data set being exported is pretty small, running the script every minute is sort of aggressive. I would test it thoroughly with as many users in the system as the business could ever have concurrently and see if there are any issues.





  • @karank 

    We do see this in the field.


    I will say 1 min is quite aggressive! But if the data is that small, as you've found, there shouldn't be an impact.

    As a reminder, the blocking for data entry only lasts while the export "scope" is gathered; this ensures the consistency of the export. Once the scope is collected, the blocking is released and users can continue as normal. So for incremental exports on large data sets, we do try to limit the scope of the export each time:


    To make it as efficient as possible, include some kind of "ready to export" flag set by the users to say that they are ready for the data to be exported.

    You then couple this with an "already exported" flag to see if it has already gone.

    Then the export checks for "ready to export" AND NOT "already exported" and filters accordingly.


    So, if nothing has changed since the last export, nothing blocks as nothing needs to be sent.


    I also always include a "reset" flag/process to override the settings should the data need to be sent again.
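    Put together, the three flags amount to a simple Boolean rule. Here is a plain-Python sketch of it (in Anaplan this would be Boolean line items driving a saved-view filter; the function name is just for illustration):

```python
def in_export_scope(ready_to_export: bool, already_exported: bool,
                    reset: bool) -> bool:
    """Decide whether a row belongs in the next export's scope.

    Rows go out when users have flagged them ready and they have not
    already been sent; the reset flag overrides "already exported" so
    the data can be sent again.
    """
    if reset:
        already_exported = False  # reset puts the row back in scope
    return ready_to_export and not already_exported
```

    If no rows satisfy the rule, the export scope is empty, so nothing blocks and nothing is sent.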


    I hope this helps


  • Thanks, David.


    This module is the "Ready to export" module. If it's ready and not locked, as you outlined, the middleware will trigger the final export.