Hi Anaplan Team
I have built a data hub model to store all my data, which I later transfer to my main model through Anaplan Connect.
I am facing an issue: loading a single 200 MB TXT file takes approximately 20 minutes, and I have multiple files of the same size and format to load into the system.
Could you suggest the best alternative for loading these files more quickly?
Hope this is of some help to you.
I am afraid the link you provided is not working and the article on Building a Data Hub is not available anymore. Do you by any chance have a new link?
Have you tried converting the TXT files to CSV flat files and loading them that way?
Also, have you followed the best practice of creating a list from the file you are loading, rather than loading all the properties directly into list properties, and then using that list as a dimension in a module into which you load the complete data set?
It would be good to get an explanation of your current process for loading the data.
Thanks for the prompt response! We are not able to follow that practice, since we are not using a code on the list; instead we use a numbered list with a combination of all the properties to identify the unique combinations of data.
Did you try to convert the TXT file to a CSV?
I believe the long load time is due to the fact that you're loading into a list rather than a module.
How many rows of data are you loading in?
Why are you not using a code? It is best practice to always use one. When you use a combination of properties, the system takes longer to load the data because it has to work out which row is unique; if you already have a unique code, the system saves that time during the load.

Since you are using a combination of properties, how many properties are you using? Are time and the transactional data part of the uniqueness? If so, this is making your list longer than it needs to be. You need to look at the data, study it, understand what makes a row unique, and have the source create a unique key to be used as the code.

Are you deleting the list on every load and then reloading it? If so, that is not best practice, and it can be remedied with a code on the list; the transactional data then gets loaded into a SYS Transactional module dimensionalized by Time. Any changes get updated, so there is no need to delete.
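As a rough illustration of having the source build that key, here is a sketch that prepends a Code column made from whichever fields make a row unique. The column names are placeholders, an assumption for the example; you would substitute the actual fields you identify from studying your data:

```python
import csv

def add_code_column(src_path, dest_path, key_fields, sep="_"):
    """Prepend a 'Code' column built by joining the key fields.

    `key_fields` names the columns that together make a row unique
    (an assumption here -- study your own data to choose them).
    """
    with open(src_path, newline="") as src, \
         open(dest_path, "w", newline="") as dest:
        reader = csv.DictReader(src)
        fieldnames = ["Code"] + reader.fieldnames
        writer = csv.DictWriter(dest, fieldnames=fieldnames)
        writer.writeheader()
        for row in reader:
            # Build the code from the chosen key fields, e.g. "A_East"
            row["Code"] = sep.join(row[f] for f in key_fields)
            writer.writerow(row)

# Example usage (placeholder columns):
# add_code_column("transactions.csv", "transactions_coded.csv",
#                 ["Product", "Region"])
```

With that column in place, the list import can map Code directly instead of matching on a combination of properties.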
Can you post a portion of the list with properties?