Forecasting adjustment based on Holiday statistics (sum on specific time items)
I have a challenge and I am searching for the best solution. I have (a lot of) sales data and I am using that data to forecast on a daily level for the coming weeks. National holidays have a huge impact on most products in the week before and the week after a holiday, and I would like that impact to be reflected in the forecast. I imagine this kind of adjustment has been done before (promotions?), so any suggestions are welcome!
I started with a new list (Holiday impact days) that counts backwards to -7 before the holiday and forwards to +7 after the holiday. I would then use that list to adjust the forecast before and after holidays.
So in an example:
- Monday 10 June is a Dutch holiday.
- Then Friday 7 June would be 3 days before the Holiday and be mapped to list item -3 of the Holiday impact days.
- The calculation Friday Actual / Friday Average would determine the impact of the 10 June holiday on Friday the 7th.
- The average impact across all Monday holidays, taken at position -3, would give me a nice figure for what a holiday on a Monday means for the sales on the preceding Friday.

However, I cannot seem to sum over the time dimension onto that list (https://community.anaplan.com/t5/Idea-Exchange/Support-aggregation-over-time-dimension/idi-p/35440). Creating a line item that combines time, the 15-item list AND all of the products is not an option due to the size increase. I am not too fond of creating a new day hierarchy just for this issue either, but it might be my next step.
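To make the idea concrete, here is a minimal sketch in plain Python (not Anaplan) of the mapping described above, using hypothetical actuals: each date is bucketed by its signed distance to the nearest holiday, and the impact is that day's actual divided by the average for that weekday.

```python
from datetime import date, timedelta
from collections import defaultdict

# Hypothetical data: flat daily actuals of 100, except the Friday before
# the holiday, which dips to 80.
holidays = [date(2019, 6, 10)]  # Monday 10 June, a Dutch holiday
actuals = {date(2019, 6, 10) + timedelta(days=d): 100.0 for d in range(-7, 8)}
actuals[date(2019, 6, 7)] = 80.0

def holiday_offset(day, holidays, window=7):
    """Return the signed day-count (-7..+7) to the nearest holiday, else None."""
    best = None
    for h in holidays:
        diff = (day - h).days
        if abs(diff) <= window and (best is None or abs(diff) < abs(best)):
            best = diff
    return best

# Baseline: average sales per weekday over the whole history.
weekday_values = defaultdict(list)
for day, value in actuals.items():
    weekday_values[day.weekday()].append(value)
weekday_avg = {wd: sum(v) / len(v) for wd, v in weekday_values.items()}

# Impact for Friday 7 June: maps to bucket -3, actual / Friday average.
friday = date(2019, 6, 7)
offset = holiday_offset(friday, holidays)                   # -3
impact = actuals[friday] / weekday_avg[friday.weekday()]
print(offset, round(impact, 2))
```

In a full model you would average these `impact` values per bucket across all historical holidays before applying them to the forecast.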
This is a nice-to-have, as updating the forecast manually for each holiday is certainly possible; any solution that requires too much model size is not an option.
Thanks for taking a look at my case,
Having tried to do this myself a few years ago, daily sales forecasting is tricky!
But in terms of the issue of summing out of time, the best bet is to map to and from a fake timescale.
If you set up the fake timescale to mirror the names of the real timescale, it is an easy automatic match using FINDITEM(Time, Fake Time Text), where Fake Time Text is NAME(ITEM(Fake Time)); similarly, FINDITEM(Fake Time, Time Text), where Time Text is NAME(ITEM(Time)).
You use a LOOKUP to bring the data in from real time, and then use a SUM to map the data back from fake to real.
You could build a time mapping module (by day) to map the Mondays to the Fridays, or vice versa, and then use a LOOKUP to map the values; or you could use an OFFSET, LAG or LEAD with a parameter of 3.
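This is not Anaplan syntax, but the name-matching idea behind the FINDITEM pattern, and a LAG with a parameter of 3, can be sketched in plain Python with hypothetical data:

```python
from datetime import date, timedelta

# Fake timescale members mirror the names of the real timescale, so a
# dictionary keyed on the name joins them (the FINDITEM analogue).
real_time = [date(2019, 6, 1) + timedelta(days=i) for i in range(30)]
fake_time = [d.isoformat() for d in real_time]          # names mirror real time
name_to_real = {d.isoformat(): d for d in real_time}

# "LOOKUP": bring a value keyed on real time into the fake dimension.
sales = {d: float(i) for i, d in enumerate(real_time)}
fake_sales = {name: sales[name_to_real[name]] for name in fake_time}

# An OFFSET/LAG with parameter 3: read the value from 3 days earlier.
def lag(day, periods, data):
    """Return the value `periods` days before `day`, or None if missing."""
    return data.get(day - timedelta(days=periods))

print(lag(date(2019, 6, 10), 3, sales))   # the value on 7 June
```

The fake dimension is not subject to Anaplan's time-aggregation restriction, which is what makes the round trip useful.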
You will need a few staging line items, but you shouldn't need a massive multi-dimensional module.
In terms of the calculations, I've built out an Event impact module before, where you calculate the effect of different types of event, using the average of event Fridays vs all of the non-event Fridays, for example, to calculate the % uplift. You can then apply that to the baseline Friday.
You can do the same for the event day itself, to uplift the Monday of the event.
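A hedged sketch of that uplift calculation, in plain Python with made-up figures: average the Fridays near an event, divide by the average of the non-event Fridays, and apply the ratio to the baseline.

```python
# Hypothetical historical values (not real data):
event_fridays = [80.0, 76.0, 84.0]     # Fridays 3 days before a Monday holiday
normal_fridays = [100.0, 98.0, 102.0]  # all other Fridays

# % uplift = average event Friday / average non-event Friday.
uplift = (sum(event_fridays) / len(event_fridays)) \
       / (sum(normal_fridays) / len(normal_fridays))

# Apply the uplift to the baseline Friday forecast.
baseline_friday_forecast = 110.0
adjusted = baseline_friday_forecast * uplift
print(round(uplift, 2), round(adjusted, 1))
```

With these numbers the event Fridays run at 80% of a normal Friday, so a 110.0 baseline is adjusted down to 88.0.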
What I would say though is be careful of exceptions. I tried to build in so many rules to prevent exceptions that the baseline ended up too sanitised and the true exceptions were not extreme enough. A better approach is to let exceptions happen, flag them, and have them reviewed by the forecast owners.
I hope that helps
Do you have historical data, i.e. how the sales curve looked pre- and post-holiday when there was a national holiday last year?
If you do, then you can implement quadratic regression in Anaplan, where your independent variables are days and holiday. The holiday variable can be true or false.
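As a rough illustration of the quadratic idea (plain Python, not Anaplan): with exactly three (day-offset, sales) points the quadratic fit reduces to interpolation; with a full history you would solve the least-squares system instead (e.g. numpy.polyfit, adding the holiday true/false dummy as a second variable). The data points here are hypothetical.

```python
# Sales at -3, 0 and +3 days from a holiday (made-up numbers).
points = [(-3, 84.0), (0, 60.0), (3, 90.0)]
(d1, y1), (d2, y2), (d3, y3) = points

# Fit y = a + b*d + c*d^2 through the three points (Newton divided differences).
c = ((y3 - y2) / (d3 - d2) - (y2 - y1) / (d2 - d1)) / (d3 - d1)
b = (y2 - y1) / (d2 - d1) - c * (d1 + d2)
a = y1 - b * d1 - c * d1 * d1

def curve(d):
    """Expected sales at d days from the holiday."""
    return a + b * d + c * d * d

print(curve(1))   # expected sales one day after the holiday
```

The fitted curve captures the dip at the holiday and the recovery on either side, which is the shape the regression is meant to learn from history.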
If you can share some sample data, I can make something in Excel/Anaplan.
In standard demand planning point solutions, these are called "causal factors". It could be a simple boolean to indicate which days are holidays in the past and future, or it could be a range of numbers. You then calculate a coefficient from the historical data to determine how much impact a holiday makes on the sales of that same day (google the formulas under linear regression). Determining the coefficient x days before and after the holiday is also possible. Then apply this coefficient to the resultant forecast to bump up the numbers for the causal factors defined in the future. Remember, causal factors are needed in both past and future.
It is an out of the box configuration in point solutions, but we need to build it from scratch.
I will explore a few modelling options and get back to you
Thanks all for taking the time to answer my request.
My first next step was indeed to build the fake time (alternate day hierarchy) and map it to that one. With regards to your other (very welcome) comments, my experience is the same; building forecasts on exceptions does not work.
It is always difficult to balance between detailed and general. In this case, for example, each holiday has a slightly different effect on the products, but treating each holiday separately would decrease the amount of data per holiday and lead to worse statistical results. Therefore I decided (for now) to treat all holidays the same, except when they fall on different days of the week. The planner then has the ability to overrule this standard adjustment for specific products. It should save the planner from having to adjust percentages for about 80% of the products.
I think this is plenty of information to work on for now. Big thanks for another complete and clear answer.
It is a forecast after all, and there will be so much other "noise" around the specifics of day by day actuals
The purpose of the forecast should be to guide and inform the forecast owners and give a best estimate for forecast owners to review and adjust
A very wise ex boss of mine once said, "if you are going to make assumptions for others, keep them as simple as possible"
Just to add, I have built out an example, and will be posting the solution shortly
Apologies dear community - I have been reminded that I never actually posted the solution
I will get onto it right away
As promised and sorry it's a bit late
I've attached the PDF with the steps - There are a few, but it does highlight some useful techniques and best practices
Thank you @DavidSmith