How do you do an incremental load for transactional data?
There are actually two parts to a delta load: one on the source system, and one from the Data Hub to the spoke model. First, on the source system, you will need to use a system column such as last modified date, which most ERP/EDW systems have. Additionally, have the source system populate an extra column (New or Delta) with either True/False or, better yet, 1/0 (1 = True, 0 = False).
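As a rough illustration of the source-side step, here is a minimal Python sketch; the column names (`last_modified`, `new_or_delta`) are hypothetical and stand in for whatever your ERP/EDW exposes, not any Anaplan construct:

```python
from datetime import datetime

def extract_delta(rows, last_run):
    """Mark rows changed since the last run with a 1/0 delta flag.

    `rows` is a list of dicts with a hypothetical 'last_modified'
    timestamp column; field names are illustrative only.
    """
    for row in rows:
        row["new_or_delta"] = 1 if row["last_modified"] > last_run else 0
    return rows

# Example: only the second row changed since the last run.
rows = [
    {"id": "T1", "last_modified": datetime(2023, 1, 1)},
    {"id": "T2", "last_modified": datetime(2023, 3, 1)},
]
flagged = extract_delta(rows, last_run=datetime(2023, 2, 1))
# flagged[0]["new_or_delta"] == 0, flagged[1]["new_or_delta"] == 1
```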
On the list, create a property called New or Delta, formatted as a Boolean. This technically goes against DISCO because you are creating a list property, but since it will not be used anywhere else and will be cleared later, it is acceptable here: you only do one import (to the list, versus importing to the list as well as to a SYS Properties module). When the import is done, only a subset of list members will be flagged True; these are your delta members.
When you set up the views to import into your spoke model, use this property to filter for only the records where the flag is True.
Now you will need to set up a process in the Data Hub to clear the New or Delta flag and then load the data. To set up the clear, create a module "SYS Clear Delta" with two line items, "New" and "Clear Import Flag", both formatted as Boolean. The formula for the New line item will point to the list property. Clear Import Flag can be either blank or hardcoded to False. Create a view filtered on New = True, but hide New from the view (so only Clear Import Flag is shown). Create an action that imports "Clear Import Flag" into the New or Delta list property. When you set up the process to import the data into the Data Hub, the first action will be this "clear" action, the second action will insert the new items into the list, and the third action will import the transactional data into the transactional module.
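The three-step process can be sketched in plain Python, purely to show the order of operations; the dictionaries stand in for the Data Hub list and module (all names are illustrative), and in Anaplan each step would be an import action inside the process:

```python
def run_delta_process(hub_list, hub_module, incoming):
    """Simulate the three-step Data Hub process.

    hub_list: dict of member id -> {"new_or_delta": bool} (the list + property)
    hub_module: dict of member id -> transactional record (the module)
    incoming: records with "id", "new_or_delta" (1/0), and data fields
    """
    # Step 1: clear the New or Delta flag on all existing members.
    for member in hub_list.values():
        member["new_or_delta"] = False
    # Step 2: insert the new items into the list, carrying the flag.
    for rec in incoming:
        hub_list.setdefault(rec["id"], {})["new_or_delta"] = bool(rec["new_or_delta"])
    # Step 3: import the transactional data into the module.
    for rec in incoming:
        hub_module[rec["id"]] = {"amount": rec["amount"]}
    return hub_list, hub_module

# Example: T1 was loaded last time; T2 arrives in this delta.
hub_list = {"T1": {"new_or_delta": True}}
hub_module = {"T1": {"amount": 50}}
incoming = [{"id": "T2", "new_or_delta": 1, "amount": 200}]
hub_list, hub_module = run_delta_process(hub_list, hub_module, incoming)
# T1's flag is now cleared; only T2 is flagged True for the spoke import.
```

The spoke-side import would then filter on the flag, picking up only T2.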
Hope this helps,
There are many ways to do this, but it would be good to understand the type of data you are trying to load:
How can you uniquely identify the rows of transactional data? Is there a code or unique reference?
What is the expected volume of data (how many rows)?
How regularly do you need to load it?
Do you need to load different versions?
Is it going to be loaded via a flat file (CSV) or an API integration?
Once you have answered some of these questions, I'll be able to help you more.
Rows can be referenced uniquely,
the volume is millions of rows,
two different versions need to be loaded,
and it is via an API integration.
OK, that's great @PujithaB.
I would create a list keyed on the unique identifier.
First, I would import the items into this list.
Next, I would create a module dimensioned by this list to bring in the data with all the relevant line items (this can also be dimensioned by version).
Before adding this to the API, you need to create the import actions first; then you can link them to the API.
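As a rough sketch of that structure (all field names are illustrative, not part of any Anaplan API), building the unique-identifier list and then a module keyed by item and version might look like:

```python
def build_versioned_module(records):
    """Build a list of unique ids, then a module keyed by (id, version).

    `records` carry a hypothetical "id", "version", and "amount" field;
    the returned dict stands in for a module dimensioned by list x version.
    """
    id_list = sorted({r["id"] for r in records})  # the Anaplan list
    module = {(r["id"], r["version"]): {"amount": r["amount"]} for r in records}
    return id_list, module

# Example: the same transaction id loaded against two versions.
records = [
    {"id": "T1", "version": "Actual", "amount": 100},
    {"id": "T1", "version": "Forecast", "amount": 120},
]
ids, module = build_versioned_module(records)
# ids == ["T1"]; module holds one cell per (id, version) pair
```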
The link below also highlights some best practices to follow:
I hope this helps!