How to find the Delta value between the current period record and the most recent period record
Hello All,
I need your help in calculating the credit for a sales rep, month by month, for the orders he is working on.
Order# | Order Status | Initial Order Date | Order Change Date | Net Value | Delta | Payout | Credit Month
111 | Initial Order | 01-Jan-18 | | 100 | 100 | 100 | Jan Credit
111 | Changed | | 01-Feb-18 | 150 | 150-100 | 50 | Feb Credit
111 | Changed | | 01-Apr-18 | 300 | 300-150 | 150 | April Credit
222 | Initial Order | 01-Feb-18 | | 20 | 20 | 20 | Feb Credit
222 | Changed | | 01-Mar-18 | 40 | 40-20 | 20 | March Credit
222 | Changed | | 01-Apr-18 | 40 | 40-40 | 0 | April Credit
I have created some sample data above for your reference.
Order #111:
-> The initial order came in January with a net value of 100. The full 100 is considered for the sales rep's commission calculation.
-> There was a change to the order (the Order Status column is the reference) in February, and the new net value is 150. In this case the delta between the current net value and the previous net value (150 - 100 = 50) is considered for the commission calculation.
-> Another changed record arrived in April, so the delta between the April value and the most recent previous period (Feb) is used for the commission calculation.
Could you please help me derive this delta value between the current period and the most recent period record of the same order?
The current period's delta calculation should not impact any previous period's crediting and payout.
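Just to make the expected result concrete, here is the logic I am after, sketched in Python (purely illustrative; the records below are the sample data above, and I need this built in Anaplan):

```python
# Illustrative sketch of the delta logic described above (not Anaplan syntax).
# Each record is (order, credit period, net value), already sorted by date per order.
records = [
    ("111", "Jan Credit", 100),
    ("111", "Feb Credit", 150),
    ("111", "April Credit", 300),
    ("222", "Feb Credit", 20),
    ("222", "March Credit", 40),
    ("222", "April Credit", 40),
]

last_net = {}  # most recent net value seen per order
for order, period, net in records:
    delta = net - last_net.get(order, 0)   # initial order: delta is the full net value
    last_net[order] = net
    print(f"Order {order}, {period}: {delta}")
# Expected output: 100, 50, 150 for order 111 and 20, 20, 0 for order 222
```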
Thanks,
Ankit
Best Answers
-
Hi Ankit,
There is one way to do it that is very robust even if the list you are loading is not sorted. Here is what you will need to do:
- You probably already have a list of Orders and an import list for your data import to Anaplan. If not - you will need both. In my example they are called Orders and # Import Order Data respectively.
- A new list of sequential list items from 1 to the maximum number of changes theoretically possible per Order. In my example it is called 1-10 List. You can include as many items in this list as you want, but this will directly impact the amount of space this solution takes up.
- You should already have a module that you use to store imported data. In my example it is called M 01.
- A module where the payout is calculated. It will have Orders and 1-10 List as dimensions. In my example it is called M 02.
Below I will walk you through the logic. Please also check the screenshots attached.
- In M 01 you will rank all lines of each order by date. The date used for the rank is either the Initial or the Change date, and the ranking is grouped by Order.
- In M 02 you will summarize Payouts for all your orders by 1-10 List item. After this is done it is very easy to LOOKUP the item from the previous rank (in this case the same as the previous change date) and find the Delta.
- Once the Delta is calculated in M 02, it is very easy to LOOKUP in M 01. (A quick sketch of this rank-then-lookup logic follows below.)
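Outside Anaplan, the rank-then-lookup logic looks like this in Python (purely illustrative: the names follow my example, and in the model the ranking would be done with RANK and the previous-rank value fetched with LOOKUP):

```python
# Illustrative Python sketch of the logic in M 01 / M 02 (not Anaplan syntax).
from collections import defaultdict

# Imported lines: (order, date, net value). Deliberately unsorted, to show that
# ranking by date per order makes the solution robust to unsorted loads.
import_lines = [
    ("111", "2018-01-01", 100),
    ("111", "2018-04-01", 300),
    ("111", "2018-02-01", 150),
    ("222", "2018-02-01", 20),
    ("222", "2018-03-01", 40),
    ("222", "2018-04-01", 40),
]

# Step 1 (as in M 01): group the lines by order, then rank them by date.
by_order = defaultdict(list)
for order, date, net in import_lines:
    by_order[order].append((date, net))

# Step 2 (as in M 02): the Delta at each rank is that rank's net value minus
# the net value at the previous rank (0 for the initial order).
for order, lines in sorted(by_order.items()):
    ranked = sorted(lines)                      # rank 1 = earliest date
    for rank, (date, net) in enumerate(ranked, start=1):
        prev_net = ranked[rank - 2][1] if rank > 1 else 0
        print(f"Order {order}, rank {rank} ({date}): delta = {net - prev_net}")
```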
I hope this is helpful. The only risk you are running here is that you might have more changes per order than items in your ranking list, so make sure you include a check for this.
Best regards,
Egor.
-
Unless I've missed something, there is a really simple solution:
1. Assuming you have a module that stores your Orders, I've added a line item for a date lookup with the following formula:
IF ISNOTBLANK(Order Change Date) THEN Order Change Date ELSE Initial Order Date
Order is a list-formatted line item pointing to your Orders list.
2. Create a "Commission Payout" module dimensioned by Orders and by month, and add three line items:
a. Changed Value: Order Details.Net Value[SUM: Order Details.Order, SUM: Order Details.Date Lookup]
b. Cumulate Changed: IF Changed Value = 0 THEN Changed Value + PREVIOUS(Cumulate Changed) ELSE Changed Value
c. Payout: Cumulate Changed - PREVIOUS(Cumulate Changed)
Voila!
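If it helps to sanity-check the logic, here is a trace of line items a-c in Python against order 111 from the sample data (illustrative only; the real implementation is just the three line items above):

```python
# Trace of the Commission Payout line items for order 111 (not Anaplan syntax).
months = ["Jan 18", "Feb 18", "Mar 18", "Apr 18"]

# a. Changed Value: Net Value summed into the month of the Date Lookup.
changed_value = {"Jan 18": 100, "Feb 18": 150, "Mar 18": 0, "Apr 18": 300}

cumulate_changed, payout, prev = {}, {}, 0
for m in months:
    # b. Cumulate Changed: IF Changed Value = 0 THEN carry the previous value ELSE Changed Value.
    cumulate_changed[m] = prev if changed_value[m] == 0 else changed_value[m]
    # c. Payout: Cumulate Changed - PREVIOUS(Cumulate Changed).
    payout[m] = cumulate_changed[m] - prev
    prev = cumulate_changed[m]

print(payout)   # {'Jan 18': 100, 'Feb 18': 50, 'Mar 18': 0, 'Apr 18': 150}
```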
David
Answers
-
Hello Egor,
Thank you so much for your help and the detailed explanation.
I have implemented this and it works wonderfully.
One last question: will this impact the overall performance of the model, given that we have used the RANK function and we will have a large number of records?
Thanks Again
Ankit
-
Hi Ankit,
There should be no issues with performance, but you could run into a problem with size: if you have 100,000 Orders and each can be changed up to 100 times, the M 02 module will instantly grow to 10 million cells. Hope this will not be an issue for your model.
Best regards,
Egor.
-
I forgot about DECUMULATE.
You can set the Payout formula to DECUMULATE(Cumulate Changed), which is even simpler.
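DECUMULATE returns each period's movement versus the previous period (and the value itself in the first period), so it replaces the explicit subtraction. A quick Python analogue, illustrative only:

```python
# Python analogue of DECUMULATE(Cumulate Changed) for order 111, Jan-Apr.
cumulate_changed = [100, 150, 150, 300]
payout = [cur - prev for prev, cur in zip([0] + cumulate_changed[:-1], cumulate_changed)]
print(payout)   # [100, 50, 0, 150]
```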
David
-
I'll just add some detail around the RANK function to highlight that it would be better to build a solution without it where possible. The RANK function performs a lot more slowly because it has to be evaluated as a single calculation: the engine can't calculate separate parts of a dimension and combine the results, as they wouldn't come out in the same order. So on larger dimensions this performance hit becomes noticeable.
Here's the info we provide in Anapedia:
The RANK function could be slow to evaluate at large data volumes, due to performance characteristics of sorting an arbitrary set that takes O(n * ln(n)). An artificial limit of 10M cells is imposed to prevent ranking of large datasets that would slow down the server: if this is exceeded, the model is rolled back. Note that when used against datasets of more than 1 million cells, any change in source values could result in calculations that take a few seconds.
https://help.anaplan.com/anapedia/Content/Calculation_Functions/All/RANK.html
-
@Ankitjain, could you let us know if the newly proposed solution works for you?
-
Hello David,
Thank you for suggesting such a great solution.
I have tried to implement it and it is working great. The only possible issue I can see is that we are crossing the order list with months, which will increase sparsity, and this will get worse if we maintain more than one year of data.
Thanks,
Ankit
-
Hi @Ankitjain
I'm glad it's working for you.
In answer to your question: there is almost always a trade-off between performance, sparsity, complexity, etc. In terms of the calculation here, yes, it will increase sparsity, but not necessarily the calculation time. Sparse structures do take up more cell count, but that doesn't mean the calculation time will be worse; in fact, performance often improves with a sparse structure.
In this case, one simple step to mitigate the cell count is to turn off all of the summary options in the Payout module. This is a calculation module that doesn't need subtotals by time or the order hierarchy. That will help speed up the calculation and keep the size down.
You could also look at using Time Ranges to ensure that the calculation time frame applies only to the future years, not any past years.
I hope that helps
David