I'm exporting a file that ends up having over 100 chunks.
The first and last rows of the file are split across different chunks.
Is there a way to prevent this?
If you are exporting the files via the REST API (Python), you can configure your script to divide the files into chunks and change the chunk size to avoid the issue.
The API already returns 93 chunks; is there a way to set their size? And if so, how do I determine a chunk size that won't split rows again?
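One way to sidestep the chunk-size question entirely: a chunk boundary can fall anywhere, including mid-row, so rather than tuning the size, download every chunk in order and concatenate the raw bytes *before* parsing rows. A minimal sketch of the reassembly step (the chunk list here is hypothetical sample data, standing in for whatever your download loop returns):

```python
def reassemble_chunks(chunks):
    """Join raw chunk bytes before splitting into rows.

    Parsing each chunk separately yields broken first/last rows
    whenever a boundary falls mid-row; concatenating the bytes
    first restores the complete rows regardless of chunk size.
    """
    return b"".join(chunks).decode("utf-8").splitlines()

# Hypothetical example: the row "b,2" is split across two chunks.
chunks = [b"a,1\nb,", b"2\nc,3\n"]
rows = reassemble_chunks(chunks)
# rows == ["a,1", "b,2", "c,3"] -- the split row is whole again
```

With this approach the number and size of chunks the API chooses no longer matters, so there is no "appropriate size" to compute.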
I am attempting to connect to an Azure SQL database via the MSSQL connector in Data Orchestrator, but I'm having issues that I believe are due to hyphens in the database name (which is a fairly standard naming convention, I will note!). Example below (fake credentials). Has anyone else encountered something similar, and is there a way…
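I can't see the connector's config, but one possible culprit: in T-SQL, an identifier containing a hyphen must be delimited with square brackets (`[my-db]`), so if the connector builds statements from the raw name (e.g. `USE my-azure-db`), they will fail even though the connection string itself usually accepts hyphens in `Database=` directly. A minimal sketch of bracket-quoting, using the hypothetical name `my-azure-db`:

```python
def quote_sqlserver_identifier(name):
    """Delimit a T-SQL identifier with square brackets, doubling any
    ']' inside it, so hyphenated names are parsed as one identifier."""
    return "[" + name.replace("]", "]]") + "]"

# Hypothetical hyphenated database name:
db = "my-azure-db"
query = f"USE {quote_sqlserver_identifier(db)};"
# query == "USE [my-azure-db];"
```

If the connector exposes the generated SQL or lets you supply the name pre-quoted, that would be the place to check.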
Hi, for some reason my shipping costs are the same every month, and I don't know what to do to fix it.
Hello folks, does anyone know how to work around the Polaris limitation that prevents using the Formula or Ratio summary method alongside Closing Balance within the same line item? I have a Business Unit list where the total (All BUs) must display the Corporate (1 of 20 leaf items under the All BUs total) number of Unique Customers.…