@Damian_Broj My client has a requirement to track model size and some other KPIs on a daily basis. I have an Anaplan model which keeps track of all other models' cell counts on a daily basis. I have tried it a couple of ways on 8 models across multiple workspaces:
1 - Using the suggested method, i.e., take each line item's cell count and data type, compute the bytes, and convert to GB.
Result: 7 out of 8 models are way off from the GB figure displayed in Anaplan. Only 1 model comes very close.
2 - Using the approximation method, i.e., 1,000,000,000 cells ≈ 9.3 GB.
Result: 7 out of 8 models are very close; only 1 model is way off. Interestingly, the one that is off is the one that works with approach 1.
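The two approaches above can be sketched roughly as follows. This is a minimal illustration of the arithmetic only: the 8-bytes-per-cell figure is an assumption for the sake of the example (in practice bytes vary by data type, which is the whole point of method 1), and the 9.3 GB per billion cells ratio is the empirical figure from the post, not an official Anaplan number.

```python
GB = 1024 ** 3  # bytes per GB

def method1_size_gb(line_item_cells, bytes_per_cell=8):
    """Method 1: bottom-up — sum cell counts per line item x bytes per cell.
    bytes_per_cell is a placeholder; real sizing depends on each line
    item's data type."""
    total_bytes = sum(cells * bytes_per_cell for cells in line_item_cells)
    return total_bytes / GB

def method2_size_gb(total_cells):
    """Method 2: flat empirical ratio of ~9.3 GB per billion cells."""
    return total_cells * 9.3 / 1_000_000_000

# Example usage on hypothetical counts:
print(method1_size_gb([500_000_000, 250_000_000]))
print(method2_size_gb(750_000_000))
```

As the thread notes, both are approximations; neither accounts for the space lists themselves consume.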
Maybe I need to convince my client to analyse model size on cell counts rather than GB, since there is no way from within Anaplan to find the gap in cells/space that comes from line items.
As we’ve said previously, you can’t convert cells straight to GB. That does not account for the GB that lists and their subsets take up. I’ve been sizing a few models recently, and when I use 500 bytes per list item I’m only slightly off. I found that if I used 93% of the 500 bytes for the lists, I was spot on. That worked for a 52 GB model just as it did for a 2 GB model. David
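David's list-sizing heuristic could be expressed like this. Both the 500 bytes per item and the 93% factor are his empirical figures from the post, not documented Anaplan constants, so treat this as a rough sizing aid:

```python
GB = 1024 ** 3  # bytes per GB

BYTES_PER_LIST_ITEM = 500   # empirical estimate from the thread
LIST_FACTOR = 0.93          # 93% adjustment that matched observed sizes

def list_size_gb(list_item_counts):
    """Estimate the GB taken up by lists (and subsets), given the item
    count of each list, using the 500-bytes-at-93% heuristic."""
    total_items = sum(list_item_counts)
    return total_items * BYTES_PER_LIST_ITEM * LIST_FACTOR / GB

# Example: three hypothetical lists
print(list_size_gb([2_000_000, 50_000, 1_200]))
```

Adding this list contribution on top of the per-line-item cell bytes is what closes the gap between the bottom-up estimate and the size Anaplan reports.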
@DavidSmith I get that, but cells → bytes → GB does not give me what Anaplan shows as the model size. For it to work, I have to apply some approximation, as you highlighted above. So if both methods are approximations, I am getting better results with the second one, with much simpler calculations.
Perhaps one day Anaplan could show the exact size by item, to take the guesswork out of it altogether. This would be handy, especially considering clients pay for Anaplan by space.
I'm not sure if this will help, but I've attached the sizing spreadsheet I use. As I mentioned, this is pretty accurate.
I tend to use the "error" % to get the totals to match what is shown in the model, and then use the increase row to flex the sizing for what-ifs. Certainly, that is pretty close to what I actually see when the lists are increased.
Also, the "error" was pretty consistent initially for different-sized models.
Yes, Time should be part of it, and if your timescale is daily then your model size will be more than 315 GB.
However, you are not adding the size that list items take up, i.e., (21,000 × 500 bytes) + (307 × 500 bytes) + (9 × 500 bytes). It won't add up to much, but it is definitely one of the reasons your model size grows when your lists grow.
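That list contribution works out as follows, using the ~500 bytes per item heuristic from earlier in the thread (an empirical figure, not an official constant):

```python
GB = 1024 ** 3  # bytes per GB

# Three lists of 21,000, 307 and 9 items at ~500 bytes each
item_counts = [21000, 307, 9]
list_bytes = sum(n * 500 for n in item_counts)

print(list_bytes)        # -> 10658000 bytes
print(list_bytes / GB)   # roughly 0.01 GB
```

So on a model of this scale the lists are a rounding error today, but the same arithmetic shows why size climbs as the lists grow.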