I am trying to connect to Azure Blob; the credentials work in CloudWorks, but do not work in ADO. Has anyone else experienced this issue?
Hello @jstockwell ,
In CloudWorks, the Azure Blob connection uses a SAS token for credentials, whereas in ADO, Azure Blob uses an Azure secret key, which cannot be the SAS token. Are you able to create a secret key with the proper permissions to access the Azure Blob container and test your connection in ADO? For reference on the ADO Azure Blob connection: https://help.anaplan.com/import-data-from-azure-blob-storage-c1c54d4d-c645-4ffe-98e6-ee8675b1894a

Future state: the ADO Azure Blob connection will be enhanced with the ability to use OAuth 2.0 to connect to Azure Blob.
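One reason this trips people up is that the two credential types look nothing alike: a SAS token is a URL query string (with `sv` and `sig` parameters), while a storage account access key is a single base64 value. A minimal Python sketch of that structural difference, using only heuristic checks that are illustrative assumptions rather than any official Azure validation:

```python
import base64
from urllib.parse import parse_qs

def classify_azure_credential(value: str) -> str:
    """Heuristically classify an Azure Blob credential string.

    Assumption (illustrative only): a SAS token is a query string
    carrying 'sv' (signed version) and 'sig' (signature) parameters,
    while a storage account access key is one base64-encoded value.
    """
    params = parse_qs(value.lstrip("?"))
    if "sv" in params and "sig" in params:
        return "sas_token"    # accepted by CloudWorks, rejected by ADO
    try:
        base64.b64decode(value, validate=True)
        return "account_key"  # what the ADO secret-key field expects
    except Exception:
        return "unknown"

# A SAS token pasted into the ADO secret-key field is the wrong shape:
print(classify_azure_credential("sv=2022-11-02&ss=b&sig=abc123"))  # sas_token
print(classify_azure_credential("dGVzdGtleQ=="))                   # account_key
```

If a credential classifies as `sas_token`, it is the CloudWorks-style credential and will not work as the ADO secret key; a new access key would need to be generated from the storage account instead.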
@JonFerneau,
We have a client that can only connect to Azure Blob using either Secure Access via RBAC AuthN+Z or Secured Network Access via Secure Tunnel / IP Whitelisting, per internal governance guidelines.
Is either of these an option for establishing an ADO to Azure Blob connection, or will they need to find another route to get their data into ADO?
@DavidEdwards , Currently, Azure Blob does not support a secure tunnel connection. They can restrict access to their Azure Blob container to just the Anaplan IP addresses, but I am not sure whether this meets their requirements. Azure Blob also now supports OAuth within ADO, so the use of a SAS token is no longer required for the connection.
Is there any performance cost difference between a) SUM on a list line item in your data module, and b) dimensioning your data module by the list, then LOOKUP on a dimension? I am asking because I have a data module that is dimensioned by several large lists, creating sparsity. I can reduce sparsity by removing some of…
Hi, I'm looking to speak to people who have used, or are using, the DocuSign integration in their business. We are exploring the possibility, but we find the current setup quite limiting in terms of how an end user would interact with the integration, and we have also run into issues with concurrency and number field formats. Would appreciate…
A quick reminder of the Bulk Copy functionality. Bulk Copy allows you to copy large volumes of data from one slice of a model to another in a single, optimised operation, instead of using formulas or imports. Use case: copy a version (RF1) into a prior-year version (PY RF1) using a versions list to allow for year-on-year…