
Import Transactional Data into DataHub

jagalindo
Contributor


Dear Community, 

 

I have a question about importing transactional data (Net Sales) into the Data Hub, with the following requirements:

  • The Net Sales and quantity volumes are grouped by more than 10 attributes required for sales planning purposes. The Anaplan best practice is to concatenate the attributes that form the key for the line items, but with more than 10 attributes I don't think that is a good choice. 
  • This transactional module will store 3 years of data and will be loaded incrementally, meaning a delta will be uploaded each month. 

 

My main concerns are:

  • What should the key for the line items be: a concatenation of more than 10 attributes, or a sequential number (taking into account that the import is delta-based)? 
  • Should Time be a dimension or a line item in the module?

Thanks for your support. 

 

 

Anilkumarch
Certified Master Anaplanner

Re: Import Transactional Data into DataHub

Hi @jagalindo, load the data into a staging list (numbered list) using a combination of properties. Create a module dimensioned by the staging list, and in that module build a unique key (based on the dimensions in the data) to create the base list. Then use the base list to store the data.

 

Regarding whether Time should be a dimension or a line item: it is better to use it as a line item if there are not many records varying by time for the unique key you created.

jagalindo
Contributor

Re: Import Transactional Data into DataHub

Thanks @Anilkumarch 

 

The issue with using a staging list is that the combination of properties would involve more than 10 properties; this transactional data has a lot of granularity that the business requires to be imported into Anaplan. 

 

Best regards

Anilkumarch
Certified Master Anaplanner

Re: Import Transactional Data into DataHub

Do all 10 properties vary for each record? That is, do you only get a unique record by combining all 10 properties?

 

If yes, then I think there is no option other than importing with a combination of all 10 properties.

If no, then try to find the properties that make each record unique and use only those to form the key. This will make the import perform better than using all 10 properties.
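The idea of finding the smallest combination of properties that uniquely identifies each record can be sketched outside Anaplan, for example while profiling the source extract. This is a minimal illustration; the attribute names and sample rows are assumptions, not from the thread.

```python
from itertools import combinations

def is_unique_key(rows, attrs):
    """True if the combination of `attrs` is distinct for every row."""
    seen = set()
    for row in rows:
        key = tuple(row[a] for a in attrs)
        if key in seen:
            return False
        seen.add(key)
    return True

# Illustrative records with hypothetical attribute names.
rows = [
    {"product": "P1", "customer": "C1", "region": "R1"},
    {"product": "P1", "customer": "C2", "region": "R1"},
    {"product": "P2", "customer": "C1", "region": "R1"},
]

# Try smaller attribute combinations first; stop at the first unique one.
attrs = ["product", "customer", "region"]
found = None
for size in range(1, len(attrs) + 1):
    for combo in combinations(attrs, size):
        if is_unique_key(rows, combo):
            found = combo
            break
    if found:
        break
print(found)   # ('product', 'customer')
```

Running this against a real extract would show whether fewer than 10 properties are enough to form the key.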

rob_marshall
Moderator

Re: Import Transactional Data into DataHub

@jagalindo 

 

Please don't use a combination of properties, as it adds unnecessary overhead to the process. Whether the list is numbered or not doesn't matter, because you shouldn't be using properties on that list, and combination of properties requires you to use properties. Please read this article, which explains exactly why and how you should load your data.

 

Data Hubs: Purpose and Peak Performance 

 

The concatenation of ten properties is fine; in fact, that is what you are supposed to do. Just be careful to use the codes of those attributes/properties, because you have limited space available (60 characters). This will be your transactional list. Now, create two modules:

  • A transactional data module dimensionalized by the transactional list and Time. This is where your transactional data (Net Sales) gets loaded. 
  • An Attributes (or Properties) module where, using formulas, you decipher the attributes from the code.
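The two directions of this pattern, building one code from the attribute codes and deciphering the attributes back out of it, can be sketched as follows. The delimiter and attribute codes are illustrative assumptions; inside Anaplan the parsing would be done with formulas in the Properties module rather than Python.

```python
DELIM = "_"

def build_code(*attr_codes):
    """Concatenate short attribute codes into a single list-item code."""
    code = DELIM.join(attr_codes)
    if len(code) > 60:
        # Anaplan list-item codes are limited to 60 characters.
        raise ValueError("code exceeds Anaplan's 60-character limit")
    return code

def parse_code(code):
    """Recover the individual attribute codes from the combined key."""
    return code.split(DELIM)

key = build_code("P01", "C042", "R3", "CH1")
print(key)              # P01_C042_R3_CH1
print(parse_code(key))  # ['P01', 'C042', 'R3', 'CH1']
```

The length check mirrors the 60-character constraint Rob mentions: the shorter the individual attribute codes, the more attributes fit into one key.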

 

That will get your data loaded into the hub correctly. Since you are doing an incremental load from the source, which is PERFECT, you now need to figure out which records have already been loaded so that you only load the changed data to your spoke model. Please watch this video.

 

Delta Loading 
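The core of the delta idea is to compare each keyed value against what was last loaded and send only the records that are new or changed. A minimal sketch, with made-up keys and values:

```python
# Values as of the previous load, keyed by the concatenated code.
previous_load = {"P01_C042": 100.0, "P02_C042": 250.0}

# Values in the current source extract.
current_data = {"P01_C042": 100.0,   # unchanged -> skip
                "P02_C042": 300.0,   # changed   -> send
                "P03_C099": 75.0}    # new       -> send

# Keep only records whose value differs from the last load (or is new).
delta = {k: v for k, v in current_data.items()
         if previous_load.get(k) != v}
print(delta)   # {'P02_C042': 300.0, 'P03_C099': 75.0}
```

In an Anaplan hub this comparison would typically be done with line items and a Boolean filter rather than a dictionary, but the logic is the same.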

 

Rob

kmoss
New Contributor

Re: Import Transactional Data into DataHub

Hi @rob_marshall 

I am in a similar situation to the original poster, but when I concatenate the codes for each of my 10 attributes from the source system, the result exceeds 60 characters. Is there any best-practice way to handle that?

Thanks.

rob_marshall
Moderator

Re: Import Transactional Data into DataHub

@kmoss 

 

Are you using codes for the 10 attributes? Can you reply with what your code looks like?

 

Rob

kmoss
New Contributor

Re: Import Transactional Data into DataHub

Yes, they are all "codes", but the problem is that the source system was designed so that the code values for many of the attributes are effectively the names of the items. One example "code" is BUSINESS_ACTIVITY_FACILITIES_ELECTRICAL, which is 39 characters by itself, and that is only one of the 10 attributes being concatenated.

rob_marshall
Moderator

Re: Import Transactional Data into DataHub

@kmoss 

 

Ok, one last question... Is there a way on the source to create a mapping so your code isn't so long? For example, for business activity, have a table where the first item is 001, the second is 002, and so on, and do the same for the rest. Then, when you create the "code" with the delimiters (if you are doing this in SQL, you could do a "lookup" or join to these "Anaplan" code tables), you stay within the limit. It sounds like a lot of work, but it will solve the 60-character problem. Just make sure your "Anaplan code" on the source system is long enough, meaning if you have 4,000 businesses, make your code 4 digits long, or add an extra digit to be safe. If you have 5 digits for every "piece", plus the delimiters, then you should be within the 60-character limit.
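The mapping Rob describes can be sketched as follows: each distinct long source code gets a short, fixed-width surrogate so the concatenated key stays under 60 characters. The function name and sample values are illustrative assumptions.

```python
def build_mapping(values, width):
    """Map each distinct source value to a fixed-width numeric code."""
    return {v: str(i + 1).zfill(width)
            for i, v in enumerate(sorted(set(values)))}

# Two of the long "codes" from the source system (illustrative).
activities = ["BUSINESS_ACTIVITY_FACILITIES_ELECTRICAL",
              "BUSINESS_ACTIVITY_FACILITIES_PLUMBING"]
activity_map = build_mapping(activities, width=3)

short = activity_map["BUSINESS_ACTIVITY_FACILITIES_ELECTRICAL"]
print(short)   # 001
```

A 3-digit width covers up to 999 distinct values per attribute; as Rob notes, size the width to the number of distinct items, plus one digit for safety.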

 

Hope that helps,

 

Rob

kmoss
New Contributor

Re: Import Transactional Data into DataHub

Unfortunately, this is not something I can do in the source system.  Is there any alternative best practice for this situation?