We have integrated SAP with Anaplan, and data loads are scheduled from SAP into our Anaplan data hub. However, we keep getting workspace memory warnings, which I am fairly sure relate to workspace memory limits. From an architecture standpoint, how does Anaplan store data in the data hub? When we send 2-3 MB of data from SAP, how does it multiply inside the data hub? Are there ways to optimize the dimensionality of the data hub?
Also, from a training perspective for Anaplan customers, what is the best training to attend from a technical architecture, data hub, and modelling standpoint? When I log into the training academy I see several courses such as Model Builder Level 1, Model Builder Level 2, and so on. I would appreciate guidance on which training gives the best handle on the data hub and architecture, so that future rollouts are done optimally without consuming more memory.
Can you tell me the size of your data hub model?
The screenshot above helps in understanding the size of the model.
There are several ways to optimize it, but seeing the model helps (to check whether it follows best practices). Alternatively, provide some additional details on what you specifically want to optimize: identify the list or module taking up the most space, show the data linked to that module or list, and then I can give better input.
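On the "how does 2-3 MB multiply" part: Anaplan allocates space for every cell in a module (roughly 8 bytes per cell, per Anaplan's sizing guidance), whether or not the cell holds data. So a module's footprint is driven by the product of its dimension sizes, not by the size of the source file. A minimal sketch of that arithmetic, with made-up dimension sizes for illustration:

```python
# Rough illustration of why a small flat file can grow inside Anaplan:
# workspace usage is driven by cell count, and sparse cells still count.

BYTES_PER_CELL = 8  # approximate per-cell allocation per Anaplan sizing guidance

def module_size_bytes(dimension_sizes, line_items):
    """Footprint of one module: product of its dimension sizes,
    times the number of line items, times ~8 bytes per cell."""
    cells = line_items
    for size in dimension_sizes:
        cells *= size
    return cells * BYTES_PER_CELL

# Hypothetical example: a module dimensioned by 50,000 SKUs x
# 500 cost centres x 24 months, with 4 line items.
size = module_size_bytes([50_000, 500, 24], line_items=4)
print(round(size / 1024**3, 1), "GiB")
```

This is why flat, transactional "list + properties" or single-dimension staging modules are the usual data hub best practice: they keep the cell count close to the actual record count instead of a dense cross-product.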
For training, if you are a customer, I would start with Anaplan Essentials for Customers, then move on to Level 1 Model Builder, Level 2 Model Builder, and so on.
The data hub itself is covered in Level 2 Model Builder: an entire sprint is dedicated to the data hub, so you can refer to that.
How do concurrent jobs load into the data hub during integration? Is it an issue if concurrent jobs load into the data hub (into the same table or into different tables), and how does it behave? We have several objects integrating into the Anaplan data hub, and I want to make sure I schedule them correctly. Since some jobs may run concurrently, is that an issue, or is Anaplan capable of handling concurrent loads into the data hub model?
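While waiting for a definitive answer on Anaplan's locking behaviour, one conservative scheduling pattern is to serialize loads that target the same module while letting loads into different modules overlap. A minimal sketch of that idea (the `run_import` callable is a hypothetical stand-in for whatever triggers your SAP-to-Anaplan import, e.g. Anaplan Connect or the Integration API; this is not Anaplan's documented behaviour):

```python
# Conservative scheduler sketch: jobs targeting the same data hub module
# run strictly one after another; jobs targeting different modules run
# in parallel. `run_import` is a hypothetical stand-in for your loader.

from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

def schedule(jobs, run_import):
    """jobs: list of (target_module, job_name) tuples.
    Returns the results of run_import(job_name) for every job."""
    by_target = defaultdict(list)
    for target, name in jobs:
        by_target[target].append(name)

    def run_chain(names):
        # Same target module: execute sequentially to avoid clashes.
        return [run_import(name) for name in names]

    # Different target modules: chains may run concurrently.
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(run_chain, names) for names in by_target.values()]
        return [result for f in futures for result in f.result()]
```

The grouping key here is the target module; if Anaplan turns out to serialize all imports into one model anyway, the same structure works with a single group.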