Hi everyone. Sorry to drag this topic back up to the top but I encountered a rather strange issue today.
My standard list hit the 1B list-item record limit even though there are only 12 million list items. So I checked all my standard lists and saw that the record limit is growing on all of them after each import. Did I completely miss this as a limitation of standard lists, or is this something that has changed? Maybe part of the HyperModel changes? I was able to reset the list back to 1, but I thought that was only possible for numbered lists. It doesn't seem like a sustainable process. This list unfortunately changes every week (it uses 30-minute increments, so 48 time intervals per day).
The 999,999,999 limit exists on all lists (numbered or standard), and to my knowledge it always has.
You have probably already thought about this, but is your list getting cleared and re-imported each time, or is it only updating the applicable data? If it is clearing and then rebuilding, it might be worth exploring options to only update changes, as that would help with this issue and probably greatly speed up any integrations.
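To illustrate the delta-load idea above: instead of wiping the list and re-importing everything, you can diff the incoming snapshot against the current list and import only the additions and deletions. This is a hypothetical sketch in Python; `plan_delta`, `current`, and `incoming` are placeholder names for your integration layer, not Anaplan API calls.

```python
# Sketch of a delta load: compare the incoming snapshot of list codes to the
# current list and only act on the differences, instead of clear-and-rebuild.
def plan_delta(current_codes: set[str], incoming_codes: set[str]):
    to_add = incoming_codes - current_codes     # new items to import
    to_delete = current_codes - incoming_codes  # items that dropped off
    return to_add, to_delete

# Example codes built from member + 30-minute interval (illustrative only)
current = {"A-0800", "A-0830", "B-0800"}
incoming = {"A-0830", "B-0800", "B-0830"}
adds, deletes = plan_delta(current, incoming)
print(sorted(adds))     # ['B-0830']
print(sorted(deletes))  # ['A-0800']
```

Only the items in `adds` consume new index values, so the index grows by the true churn rather than by the full list size on every run.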
If it is already set up to only update changes, then it sounds like you are up against a limitation, and as part of a monthly process you might have to have an admin reset the index.
I am not 100% sure about HyperModels, but I am pretty sure the answer is that this limitation still exists.
Listen to @jasonblinn, he has been taught well. That 1 billion minus 1 limit is on all lists, not just numbered lists. As Jason stated, the first question I would ask is: why are you wiping out the list and rebuilding it? Is there a way to only load delta records from the source? What does the member/code look like? Does it have time in it? Is there a way to load the data dimensionally to cut down the size of the list?
If you want to get on a call, ping me and we can set one up.
I always thought standard lists reset as you load new list items. Good to know.
It is true that this is being wiped and rebuilt, but it's a function of the process. Since we don't have a time dimension below day, the UID requires a concatenation of this time interval, and since it is a rolling 7 months snapped to the beginning and end of the months, for 500 members (which are regularly changing too) it's a constant build and drop. I suppose with some creativity I can slow it down by dropping a month and adding a month, but it will inevitably reach that 1B limit again.
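For a rough sense of how fast weekly clear-and-rebuild cycles burn through the index, here is a back-of-the-envelope calculation using the figures from this thread (500 members, 48 intervals per day, a rolling 7-month window approximated as 30-day months, and the reported ~12 million items per build). The exact numbers are assumptions for illustration.

```python
# Rough arithmetic: how quickly weekly full rebuilds exhaust the
# 999,999,999 list index. All inputs are illustrative, taken from the thread.
INDEX_LIMIT = 999_999_999

members = 500           # roster members (regularly changing)
intervals_per_day = 48  # 30-minute increments
days = 7 * 30           # rolling 7-month window, ~30-day months

items_per_rebuild = members * intervals_per_day * days
print(f"items per rebuild (these assumptions): {items_per_rebuild:,}")  # 5,040,000

# The actual list holds ~12 million items, so use that observed figure:
observed_items = 12_000_000
weeks_until_limit = INDEX_LIMIT // observed_items
print(f"weekly full rebuilds before hitting the limit: ~{weeks_until_limit}")  # ~83
```

At roughly 12 million new index values per weekly rebuild, the 1B ceiling arrives in well under two years, which is why a delta approach (or periodic admin resets) matters here.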
Thanks @rob_marshall I may take you up on that chat. I'm sure there's a better way!