Hi,
How does an incremental load work from a data source (other than Anaplan) into a Data Hub in Anaplan?
In short, it doesn't
You will need to produce a file from the data source that contains only the new records.
You should include a column that contains TRUE and import it as part of the data load to the hub.
Then you can use that flag to filter the data when sending it down to spoke models.
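A minimal sketch of this approach, assuming the incremental extract arrives as a list of records (the column names `TxnID`, `Amount`, and the flag name `LOAD` are hypothetical): append a TRUE flag column to every new record before producing the CSV for the hub load.

```python
import csv
import io

def add_load_flag(rows, flag_column="LOAD"):
    """Append a TRUE flag column to each new record before loading to the hub.

    `rows` is a list of dicts representing the incremental extract.
    """
    return [{**row, flag_column: "TRUE"} for row in rows]

def to_csv(rows):
    """Serialise the flagged records to CSV text ready for an Anaplan file import."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Hypothetical extract containing only the new transactions
new_records = [{"TxnID": "T001", "Amount": "120.50"}]
flagged = add_load_flag(new_records)
csv_text = to_csv(flagged)
```

The spoke-model import can then filter on `LOAD = TRUE` so only the freshly landed records flow downstream.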
David
Can you elaborate on your question? I assume by incremental load you mean collecting what is newly created or what has changed since the last data collection. ERPs and other source applications have a creation_date and last_modified_date on their transaction tables, so on the source side they can identify the incremental set of records and provide it to us. On our side, if a record is new it gets created; if it already exists, it gets updated with the latest data. Anaplan's integration handles a complete collection (load) and an incremental collection (load) the same way as other applications do.
It makes sense to collect only incremental data rather than the complete data set every day.
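The extract-and-upsert logic described above can be sketched as follows. The field names `last_modified_date` and `id` follow the source-table convention mentioned; everything else is illustrative, assuming ISO-8601 timestamp strings so plain string comparison works.

```python
def incremental_extract(source_rows, last_run):
    """Keep only records created or modified since the previous collection.

    Timestamps are ISO-8601 strings, so lexicographic comparison is safe.
    """
    return [r for r in source_rows if r["last_modified_date"] > last_run]

def upsert(target, incremental_rows, key="id"):
    """If a record is new it gets created; if it exists, it is updated."""
    for row in incremental_rows:
        target[row[key]] = row  # insert-or-overwrite keyed by the record id
    return target

# Hypothetical source extract, with the previous collection run on 2024-01-01
source = [
    {"id": "T001", "last_modified_date": "2024-01-05", "amount": 100},
    {"id": "T002", "last_modified_date": "2023-11-20", "amount": 50},
]
delta = incremental_extract(source, "2024-01-01")  # only T001 qualifies
hub = upsert({}, delta)
```

The same `upsert` step covers both cases in one pass: new keys are inserted, existing keys are overwritten with the latest data.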
Thanks
Arun
Hey everyone, I wanted to share the Anaplan Python SDK with you. It's a Python library that wraps all Anaplan APIs and makes it easier to interact with Anaplan programmatically. It is mostly designed with data integration scenarios in mind, but it does support all APIs, including ALM, SCIM, Audit, and CloudWorks. Please…
Hello team, I am logging into Anaplan through SSO. I would like to create a batch script to import a file into Anaplan. For the spoke model, I would like to run a process via a batch script to import data from the Data Hub. Can anyone provide a script for both conditions?
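No full script is given in this thread, but the two calls a batch script would make can be sketched against the documented Anaplan Integration API v2 URL shapes. This is a hedged sketch only: all IDs are placeholders, and authentication (SSO users typically need certificate auth or a device token), chunked file upload, and task polling are omitted.

```python
API_BASE = "https://api.anaplan.com/2/0"

def upload_file_url(workspace_id, model_id, file_id):
    """Endpoint for uploading an import source file (PUT for small files)."""
    return f"{API_BASE}/workspaces/{workspace_id}/models/{model_id}/files/{file_id}"

def run_process_url(workspace_id, model_id, process_id):
    """Endpoint that starts a process, e.g. the Data Hub -> spoke import."""
    return f"{API_BASE}/workspaces/{workspace_id}/models/{model_id}/processes/{process_id}/tasks"

# A script would typically POST/PUT to these with an auth token, e.g.:
# requests.put(upload_file_url(ws, m, f), data=open("extract.csv", "rb"),
#              headers={"Authorization": f"AnaplanAuthToken {token}",
#                       "Content-Type": "application/octet-stream"})
# requests.post(run_process_url(ws, m, p),
#               headers={"Authorization": f"AnaplanAuthToken {token}"},
#               json={"localeName": "en_US"})
```

Anaplan Connect is the usual off-the-shelf alternative for batch-file automation; the above only shows the raw API shape behind it.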
Hello, I am receiving an 'Anaplan Upload Failed' status description when testing my integration with a BigQuery dataset. The integration imports data from BQ into our Anaplan model. No other details are given in the error log. I suspect that CloudWorks is not even picking up the file, but I am not sure what we did wrong on the set…