Imported data automatically multiplied by 100

Dear everyone,

My company is in the first stage of building out Anaplan. Recently, we automated our data integration process, which imports data from our server (Microsoft AX 2012) into Anaplan. Put simply, there are two steps: first, a pre-scheduled batch job downloads data from the server to a local file; then a script uploads that data to Anaplan. I have double-checked the output file, and it is correct, with no x100.
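For anyone curious, a minimal sketch of such an upload step might look like the following (Python, using Anaplan's v2 Integration REST API as publicly documented; all IDs, credentials, and file names below are placeholder values, so please verify against the current API reference):

```python
# Hedged sketch of an "upload to Anaplan" step using the v2 Integration
# REST API. Workspace/model/file/import IDs, credentials, and the file
# name are placeholders; check the current Anaplan API docs before use.
import requests

BASE = "https://api.anaplan.com/2/0"
WORKSPACE_ID = "yourWorkspaceId"
MODEL_ID = "yourModelId"
FILE_ID = "yourFileId"      # the import's data source file
IMPORT_ID = "yourImportId"  # the import action to run

# 1) Authenticate with basic auth to obtain a short-lived token.
resp = requests.post(
    "https://auth.anaplan.com/token/authenticate",
    auth=("user@example.com", "password"),
)
token = resp.json()["tokenInfo"]["tokenValue"]
headers = {"Authorization": f"AnaplanAuthToken {token}"}

# 2) Upload the local file produced by the batch job (single chunk).
with open("ax2012_export.csv", "rb") as f:
    requests.put(
        f"{BASE}/workspaces/{WORKSPACE_ID}/models/{MODEL_ID}/files/{FILE_ID}",
        headers={**headers, "Content-Type": "application/octet-stream"},
        data=f,
    )

# 3) Trigger the import action.
requests.post(
    f"{BASE}/workspaces/{WORKSPACE_ID}/models/{MODEL_ID}/imports/{IMPORT_ID}/tasks",
    headers={**headers, "Content-Type": "application/json"},
    json={"localeName": "en_US"},
)
```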

In Anaplan, there are two actions: the first extracts the transaction IDs into a list, and the second imports the data into a module. Here is the issue: the numerical values (money amounts) get multiplied by 100, which I do not want.

We have been trying to find the cause, but so far without success.

Does anyone know how to solve this?

Thank you so much!

Best regards,

Mike

Best Answer

  • Misbah

    @mikeng

    Alright! In that case, can you please check what your decimal separator is set to in the Import Action. If it is Comma (instead of Dot) and your source file has two decimal places, Anaplan treats the dot as a grouping character and folds those two decimal digits into the whole number, multiplying the value by 100.

     

    [Screenshots: Import Action settings showing the decimal separator option]
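    To see why the factor is exactly 100, here is a tiny sketch (plain Python with a made-up value, not Anaplan's actual parser): when the import expects a comma as the decimal separator, the dot in the source is read as a thousands separator and stripped, so a value with two decimal places grows by a factor of 100.

    ```python
    # Plain-Python illustration (made-up value, not Anaplan code) of a
    # comma/dot decimal-separator mismatch.
    raw = "1234.56"  # source file uses a dot and two decimal places

    correct = float(raw)                 # dot parsed as decimal: 1234.56

    # If the import expects a comma separator, the dot is treated as a
    # thousands separator and dropped:
    wrong = float(raw.replace(".", ""))  # 123456.0

    print(wrong / correct)               # 100.0
    ```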

    Hope this helps

    Misbah

Answers

  • @mikeng

    That can happen because of duplicate entries in your transaction keys: when you upload the values into the module, Anaplan sums all the numbers that share the same key.

    Let's say this is your file, where you have duplicate transaction keys:

    [Screenshot: sample file with duplicate transaction keys]

    First, take the transaction keys from this file into the list. You will get some kickouts because of the duplicates, and the list will end up with only 7 entries.

    Second, when you import the data into the module, it adds up the numbers for the corresponding keys. See the snip below:

    [Screenshot: module values summed per transaction key]
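    As a plain-Python sketch of both effects (the numbers here are made up): the list import keeps only the unique keys, while the module import sums values that share a key.

    ```python
    # Plain-Python sketch (made-up numbers) of the two imports:
    # 9 source rows, but only 7 unique transaction keys.
    rows = [
        ("TX001", 100), ("TX002", 200), ("TX002", 50),
        ("TX003", 300), ("TX004", 400), ("TX005", 500),
        ("TX006", 600), ("TX007", 700), ("TX007", 30),
    ]

    # List import: duplicate keys are kicked out.
    unique_keys = list(dict.fromkeys(key for key, _ in rows))
    print(len(unique_keys))    # 7

    # Module import: values are summed per key.
    totals = {}
    for key, value in rows:
        totals[key] = totals.get(key, 0) + value
    print(totals["TX002"])     # 250 (200 + 50 combined)
    ```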

    Based on what you have described, this could be the reason. Kindly check and let us know.

    Thanks,

    Misbah

  • Thanks @Misbah for your very quick response.

    In my case, the transaction ID is unique, so there is no chance of duplicates.

    I have already tested this many times: for a randomly chosen date, I uploaded all the data for that date again without deleting the old data, and the results were still the same. The new data with the same transaction IDs simply replaced the old ones, so there are no duplicates.

    Best regards,

    Minh

  • It works! Thank you!