Hi,
I'm importing OPEX data from a datahub to a spoke model with the same dimensions, but I am getting 4x the values in every cell of the target module. Does anyone know why this might be?
Best regards,
Johan
Hi @johan.marketoft
It could be because of a missing dimension (you mentioned that both the source and target module have the same dimensions), or it could be that one of the dimensions has a different level of granularity: for instance, perhaps the time dimension (if it applies) is in weekly buckets in the source and monthly buckets in the target?
Check the summary methods of the Line Items in the source module, too: set them to "none" and check the result of the data import.
This is what I can think of for now. Please feel free to share more details or a couple of screenshots and we will be able to find the root of the issue.
Cheers,
Alex.
When this has happened to me before, it's because I was misaligned on the time dimension: for example, the source data was at year level but the import targeted months. Given the 4x you mentioned, I believe it would be a quarter vs. year mismatch.
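To make the mismatch concrete, here is a minimal, hypothetical sketch (plain Python, outside Anaplan; the figures are made up) of what happens when a year-level value is written into every quarter of the target and the target's year summary then sums over quarters:

```python
# Hypothetical illustration of a time-granularity mismatch on import.
# The source holds OPEX at YEAR level; a misaligned import maps the same
# yearly value onto each of the target's four QUARTER cells.
source_opex_fy = 1000.0

quarters = ["Q1", "Q2", "Q3", "Q4"]
target = {q: source_opex_fy for q in quarters}  # each quarter gets the full-year value

# The target's year summary aggregates its quarters, so the total is 4x.
year_total = sum(target.values())
print(year_total)  # 4000.0 — four times the source value
```

The same arithmetic explains other multipliers: a year-level value landing in every month would show up as 12x.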
Hi, has anyone successfully found the correct way to adapt the REST API scripts to the new API v2 CA Certificate standards, i.e. adding the new parameter "encodedDataFormat": "v2"? I tried prefixing the encodedData with the timestamp (also UTC), using 8+92 or 8+100 bytes, but I did not find the right combination…
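For comparison, the widely used v1 CA-certificate flow is typically implemented like the sketch below (Python with the third-party `cryptography` package). `build_auth_payload` is an illustrative helper, not an Anaplan API name, and the PKCS#1 v1.5 + SHA-512 signing scheme is an assumption here. The exact byte layout required by "encodedDataFormat": "v2" isn't pinned down in this thread, so treat the nonce construction as the v1 baseline to adapt:

```python
import base64
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

def build_auth_payload(private_key, nonce_len=100):
    """Sign a random nonce and base64-encode both pieces (v1-style flow)."""
    data = os.urandom(nonce_len)                 # 100 random bytes (v1 convention)
    signature = private_key.sign(                # RSA PKCS#1 v1.5 + SHA-512 (assumed)
        data, padding.PKCS1v15(), hashes.SHA512()
    )
    return {
        "encodedData": base64.b64encode(data).decode("ascii"),
        "encodedSignedData": base64.b64encode(signature).decode("ascii"),
        # For v2 you would presumably also send "encodedDataFormat": "v2"
        # and build `data` per the new layout (e.g. a timestamp prefix).
    }

# Demo with a throwaway key; in practice, load the private key that
# matches your registered certificate.
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
payload = build_auth_payload(key)
print(sorted(payload))  # ['encodedData', 'encodedSignedData']
```

If you do crack the v2 byte layout, posting the working prefix format here would help others.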
For auditing purposes I need RTO and RPO details. Could you please share the details, along with the source URL?
A quick reminder of the Bulk Copy functionality. Bulk Copy allows you to copy large volumes of data from one slice of a model to another in a single, optimised operation, instead of using formulas or imports. Use case: copy a version (RF1) into a prior-year version (PY RF1) using a versions list to allow for year-on-year…