Best Of
[Polaris] Allow for the use of Suppress 0's functionality on saved views for Model to Model exports
In as much detail as possible, describe the problem or experience related to your idea. Please provide the context of what you were trying to do and include specific examples or workarounds: While we understand there are potential updates in the works, an alternative solution to Model to Model import challenges within Polaris could be to allow the use of Suppress 0's in saved views instead of native filtering. With the updates allowing more rows on the pivot, saved views with the Suppress 0's functionality could replace the vast majority of filtering. Currently you are not allowed to save a view with Suppress 0's enabled.
How often is this impacting your users? Frequent
Who is this impacting? All Polaris Users
What would your ideal solution be? How would it add value to your current experience?
This would minimize the need to make exports with many dimensions 'flat'. Building flat exports with many dimensions requires multiple staging modules and, at times, the use of text fields or item names.
Please include any images to help illustrate your experience.
Estimated size of deployed model when making changes to dev model
After making changes in dev, we could not synchronise them with the production model because the latter would have exceeded 130GB. This meant we had to turn off or change the newly developed functionality because we were not able to estimate the size impact of the changes. It would therefore be great to have a way to estimate, while making changes in dev, the size impact on the deployed model.
Benefit: save time, better space management
MagaliP
A reading list to broaden your abilities beyond Anaplan technical skills
Author: Tristan Colgate is a Certified Master Anaplanner and Managing Director at Fidenda.
It’s early January and, after a well-needed break over the holiday season, many of us will be returning to work with renewed energy and a desire to level-up our careers over the coming year. I hope this article gives some ideas for those of us who enjoy reading.
Whether you are working in-house at an organization implementing and supporting an Anaplan solution, or a consultant helping organizations do so from the outside, it’s important to develop skills outside of core Anaplan model building and solution architecture.
In my 25+ years of working with EPM technology, I’ve enjoyed reading around the subject to build up rounded skills that help me best serve my customers. In this blog, I’m glad to share some of the key resources that I’ve used in the past. I hope others enjoy some of these as much as I have.
By the way, the list is far from exhaustive and I’m always looking for new titles to read. Please share any recommendations in the comments.
Build a strong foundation of business domain knowledge
It’s important to have a sound grasp of the business process that your solution supports and be able to talk the same language as your business colleagues and customers. Topics like accounting and supply chain planning are vast and intimidating at first to those who haven’t either studied or lived and breathed them in business. The following are some great introductory texts to these large topics.
- Frank Wood’s Business Accounting by Wood and Sangster. I often wonder whether I should have taken an accounting qualification earlier in my career. Instead, I read this book, which gives a fantastic introduction to the topic for the uninitiated. If you’re delivering Anaplan solutions to finance teams there is a base level of accounting knowledge that you must have. This book gives you that and more. Amazon link
- Financial Planning and Analysis and Performance Management by Jack Alexander. Once you have the accounting fundamentals under your belt, it’s time to focus on one of the departments most likely to be able to benefit from Anaplan, FP&A. This book dives deep into the topic and gives a comprehensive overview of the key drivers of business performance, and how FP&A teams analyse those in support of business decision-making. Amazon link
- Group Accounts – a European Perspective by Pierce and Brennan. With Anaplan’s acquisition of Fluence and the availability of the FCR App, it is now important to understand the nuances of group reporting, otherwise referred to as financial consolidation. Consolidation projects are heavy on knowledge of technical accounting and it is essential to have the base knowledge this book provides before embarking on one. Note: it is written from a European perspective and US GAAP has some different concepts. Amazon link
- Supply Chain Management by Sunil Chopra. For anyone unfamiliar with this topic, this book provides a great introduction. Chopra takes the reader from basic principles and there are links to online resources including Excel models showing how the different concepts work. Reading this will give you the background knowledge to have meaningful conversations with supply chain practitioners in your business or customer. Amazon link
- It is not a book, but I can’t recommend highly enough subscribing to Secret CFO here. There’s a newsletter each week unpacking in nerdy detail a different topic relevant to the role of the CFO. I have found that this has helped me understand in better detail the CFO role and what keeps them up at night, so that I can better help them.
Sharpen your soft skills
I sometimes joke that the easy part of any Anaplan project is the bit where we get to sit down and build the solution. On a serious note, everything that happens up to that point is where the hard yards are, because it all involves working with other people; whether that's influencing a senior team to make the investment in Anaplan, or working with business stakeholders to understand their requirements and guide them through the design of a solution. This requires soft skills, and there are several books that are very helpful here.
- The Trusted Advisor by Maister, Green & Galford. To wield influence and get things done in an organization it is essential that you build a reputation as someone who can be trusted to impart advice. This book takes the reader through important soft skills such as active listening in that pursuit. Amazon link
- The Five Dysfunctions of a Team by Patrick Lencioni. Whether we’re building an Anaplan Center of Excellence or working in a project team alongside business stakeholders, it’s important to not take team dynamics for granted. This is a particularly compelling book because it uses a story of a team and how they evolve to introduce the author’s model of what makes an effective team. You don’t need to be a leader to read the book — it’s equally useful if you work as part of a team. Amazon link
- Good to Great by Jim Collins. This book is about how to build a great company, but its lessons scale down to building great teams. It focuses on the importance of having clarity of purpose, the right people in the team, a focus on identifying and facing down challenges, continual improvement, discipline, and tracking progress. Reading this book inspired how I built our Anaplan consulting firm, and it could equally apply if you're building an internal Anaplan COE. Amazon link
What would you add to my list? Leave a comment, and happy reading!
Multiple approvers in workflow when using, "select user from a line item in a module”
Currently we can assign only one approver when using "select user from a line item in a module". There is no option to assign the task to multiple approvers and have any one of them approve it. Even if we create two line items, one per approver, each is a drop-down that allows selecting only a single approver.
The expectation is that when one approves, the workflow should not ask for further approvals and it should move to approved state.
Re: How I Built It: Customizing Summary Methods
Polaris Improvement: You can use HIERARCHYLEVEL to accomplish the results of your SYS Hierarchy Levels module with only one line item, with a Summary method of Formula.
Inspired by the best: Master Anaplanner insights to close out the year
Hello, Anaplan Community!
What an amazing year it’s been! As we look back on 2025, I am absolutely blown away by the spirit of collaboration and support that defines our community. Every single day, you all show up for each other, sharing knowledge, solving complex challenges, and pushing the boundaries of what’s possible.
This year, you collectively posted over 1,500 questions and provided an incredible 4,000+ answers! That represents thousands of moments of connection, learning, and progress. Thank you for making this community the vibrant, helpful, and indispensable resource that it is.
⭐️ Community Champions of 2025!
I want to give a special shout-out to some of our most active and helpful members this year. Your contributions have not gone unnoticed, and your willingness to jump in and help others is the bedrock of this community. A huge round of applause for a few of our top commenters: @devrathahuja @Dikshant @rob_marshall @seymatas1 @Tiffany.Rice @Prajjwal88 @andrewtye @SriNitya @logikalyan and @alexpavel.
🏆️ Perspectives from Anaplan Certified Master Anaplanners
One of my favorite parts of this community is learning from the best of the best. To celebrate the year, I asked five of our brilliant Certified Master Anaplanners to share their biggest wins from 2025 and what they’re aiming for in 2026.
I hope their stories inspire you as much as they’ve inspired me!
Junqi Xue, Certified Master Anaplanner and Solution Architect at valantic
For Junqi, 2025 was all about building a deeper, more versatile skill set and giving back to the community. Here's what he had to say:
"Here are my learnings from 2025:
Built my foundational knowledge:
a. Completed trainings on ADO, Polaris, Workflow, and tried finding their use cases in our projects and proposals across functions, e.g. finance and supply chain.
b. Attended the workshop for the Anaplan SC App, getting to understand the capabilities of Anaplan Apps in more detail, and advocated and shared that knowledge with my internal colleagues.
Enhanced capability: I implemented two complicated use cases of Optimizer (production planning, resource allocation and planning), including the analysis of the results.
Community growth: Some interesting new functions were suggested and added to the valantic Anaplan extension this year, namely the visualization of the line items, list items, or anything else used as a filter in saved views, along with the visualization of actions/processes used on the UX. As a result, model clean-up is now even easier, solving the pain point of working blind."
Junqi's target for 2026: "Gather more hands-on experience on the Polaris engine."
Julie Ziemer, Business Solutions Architect at Royalty Pharma, LLC
Julie and her team drove a true transformation in how their organization approaches planning. In her own words:
"This past year has been a truly transitional step forward in our Anaplan journey, driven by myself and my team. What began as an effort to modernize financial modeling has evolved into a full-blown transformation in how we plan, connect data, and collaborate across departments.
We've successfully built and streamlined the foundation of our models, providing a clearer, real-time view of performance and forecasts. This means significantly fewer spreadsheets (win!) and far more confidence in our numbers. Along the way, we rigorously tested, refined our logic, and constantly reminded ourselves that “version control” is not a lifestyle choice, but a life-saving best practice. The progress hasn’t just been about automation; it’s been about building a sustainable framework that enables us to work smarter, not harder.
Looking ahead, our team's focus is on expanding Anaplan’s role in financial reporting, bringing greater automation, transparency, and storytelling to our data. To ensure long-term success, we’re laying the groundwork for a Center of Excellence (CoE) to help our models, processes, and people thrive as a connected Anaplan community."
Julie's goals for 2026:
- "To ensure 100% adoption of Anaplan by all finance team members, completely moving away from legacy systems to establish a single source of truth for planning and reporting.
- To lead the integration of Anaplan beyond the finance function into a new business area, such as research or HR planning, to foster true connected planning across the entire organization."
Wenwei Liu, Anaplan Systems Architect at Atlassian
Wenwei’s journey from consulting to an internal role gave her a powerful new perspective. She shares:
"Through my transition from Anaplan Consulting to the internal Anaplan team, I gained a fundamentally different perspective on how Anaplan supports business connected planning from the customer side. A key accomplishment was contributing to a large-scale model rebuild project that deepened my understanding of the platform's capabilities and identified critical customer needs. This shift from external consultant to internal team member positioned me to bridge consulting expertise with product strategy, uncovering pain points and opportunities that will drive more customer-centric solutions."
Wenwei's goal for next year: "Leverage emerging AI capabilities and new Anaplan features to enhance the connected planning experience for our users, while improving efficiency in model building and internal support processes."
Dmitry Sorokin, Senior Software Engineer at lululemon
Dmitry focused on the power of collaboration and technical excellence to elevate his company's Anaplan ecosystem. He reflects:
"2025 was a year of collaboration and learning. I focused on simplifying complex model logic to reduce calculation times and enhance the user experience across our Anaplan Connected Planning ecosystem. I also partnered with engineering to streamline integration pipelines, improving data refresh speed and overall reliability."
Dmitry's goal for 2026: "I’m excited to continue mentoring new model builders while expanding my architecture expertise. I plan to explore new ways to leverage AI in model building, automated testing, and planning workflows."
Ekaterina Garina, EPM Consultant at Keyrus
Ekaterina spent her year strengthening Connected Planning by bridging the gap between finance and procurement for her clients. She explains:
"This year, I worked closely with clients to strengthen their FP&A capabilities in Anaplan, helping leaders gain clearer insights and make more confident, data-driven decisions. A major highlight was delivering equipment-level variance analysis that improved visibility into cost drivers and simplified performance explanations for stakeholders. Procurement reporting was enhanced through the introduction of clear price and volume impact breakdowns, enabling a deeper understanding of spend movements. These initiatives strengthened Connected Planning by aligning finance and procurement around shared outcomes and consistent insights."
Ekaterina's goal for 2026: "I look forward to continuing to deliver scalable, high-impact solutions that help organizations plan faster and respond more effectively to change."
Now, it's your turn!
What an inspiring collection of achievements! Now, I want to hear from YOU.
What was your biggest win of 2025? What are you looking forward to tackling in 2026? Share your story in the comments below!
Happy planning,
Ginger Anderson
Sr. Manager, Community & Engagement Programs
Re: TEXTLIST() vs [TEXTLIST:]
Ankit,
The function TEXTLIST, no matter how it is used, is bad for performance. Why? Because it relies on text, and text is evil due to the amount of "real" memory it uses. In the UX, text shows as approximately 8 bytes, but in reality it is (2 × the number of characters in the string) + 48 bytes. So the string ABC is really 54 bytes. And when you use TEXTLIST, those bytes start adding up very quickly. Take a look at the below, which uses text concatenation on 100 million cells:

And that is only a single concatenation; with TEXTLIST you can have multiple, appending to the string one member at a time, which blows out the performance.
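As a rough illustration of the arithmetic above (a sketch of the sizing formula described here, not Anaplan's internal implementation):

```python
def estimated_text_bytes(s: str) -> int:
    """Estimate the 'real' memory of a text value using the formula above:
    (2 * number of characters) + 48 bytes of overhead."""
    return 2 * len(s) + 48

print(estimated_text_bytes("ABC"))  # 54

# Scale that across 100 million cells each holding "ABC":
total_gb = 100_000_000 * estimated_text_bytes("ABC") / 1024**3
print(f"{total_gb:.1f} GB")  # roughly 5 GB before any concatenation
```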
Does this help?
Please let me know.
Rob
[Start Here] Anaplan Connect 2.0 and JDBC Connectivity
- What is Anaplan Connect?
- What is covered in this article?
- What is NOT covered in this article?
- What pre-requisites do I need?
- Deep Dive on Integration
- Integrating with SAP HANA Cloud
- Integrating with Snowflake
- Integrating with Azure SQL Database
- Integrating with Google Cloud (GCP) BigQuery
- Integrating with Oracle database (On-Prem)
- Integrating with Oracle Cloud Database
- Summary
We are kicking off this series, focused on Anaplan Connect 2.0, with 3 articles:
- Start Here - all you need to know to get started on your Anaplan Connect 2.0 journey,
- Advanced Scripts - ready-to-use scripts to take your Anaplan Connect 2.0 further, and
- Integration with On-Prem integration platforms - connect your Anaplan data to virtually any on-prem or SaaS platform
What is Anaplan Connect?
Anaplan Connect is a Java-based command-line utility that automates integrations between Anaplan and external data sources. Data sources currently supported by Anaplan Connect include flat files and databases (on-premises and cloud). Database connectivity is established via Java Database Connectivity (JDBC).
The following resources will help you get started with Anaplan Connect installation and implementation.
- Data Integration – Part 1
- Data Integration – Part 2
- Introduction to Anaplan Connect
- Anaplan Connect – Data from a Flat File
- Anaplan Connect – Data from a JDBC Database
- Anaplan Connect - Guidelines
- How to leverage sample codes to generate authentication strings from CA Certs?
In this article, we will focus exclusively on integrating Anaplan with the following databases using Anaplan Connect and JDBC:
- SAP HANA Cloud
- Snowflake
- Azure SQLDB
- Google Cloud BigQuery
- Oracle (On-Prem)
- Oracle Cloud Database
What is covered in this article?
Connectivity to both on-premises and cloud databases via JDBC using Anaplan Connect is essentially the same. The differences lie in the required JDBC libraries (.jar files), how the JDBC connection string is constructed, and some nuances in the SQL queries. We will present the JDBC connection strings for the databases listed above, the required JDBC libraries, and any relevant tips.
What is NOT covered in this article?
We will not cover the basics of Anaplan Connect, scripting, or authentication.
Please refer to the resources above to get started with Anaplan Connect.
What pre-requisites do I need?
- Knowledge of building Anaplan Connect scripts
- Java 8 (or another supported JDK; refer to the Anaplan Connect documentation for Java compatibility) installed
- Anaplan Connect 2.0 installed on Windows, Linux, or MacOS
- Access and connectivity details for the database you wish to connect to
- JDBC Driver(s)
- Whitelisting of api.anaplan.com, auth.anaplan.com
- Port 443 open for bi-directional communication
- Anaplan account with either basic authentication (username/password) or CA Certificate.
- Text editor (ex: Sublime Text, UltraEdit, etc…)
Deep Dive on Integration
Integrating with SAP HANA Cloud
In this section, we will cover connectivity to SAP HANA Cloud via JDBC using Anaplan Connect 2.0. We will begin by establishing and testing JDBC connectivity outside of Anaplan Connect, followed by configuring Anaplan Connect scripts to import data from a SAP HANA Cloud table into Anaplan.
Establishing JDBC connectivity
- Download and copy the latest JDBC driver for SAP HANA Cloud to the <anaplan-connect>/lib directory.
- Download the root certificate 'DigiCert Global Root CA' from DigiCert. You can download this certificate in PEM format here. Additional JDBC information can be found on the SAP Help Portal here.
- Add the 'DigiCert Global Root CA' certificate to the Java VM keystore using the following command. Note: you may need to log in to Linux/MacOS as root to perform this step. Instructions to enable the root user on MacOS can be found here.
Linux/MacOs
keytool -import -trustcacerts -keystore $JAVA_HOME/jre/lib/security/cacerts -storepass <password> -alias DigiCertGlobalRootCA -file DigiCertGlobalRootCA.crt
Windows
keytool -import -trustcacerts -keystore "%JAVA_HOME%\jre\lib\security\cacerts" -storepass <password> -alias DigiCertGlobalRootCA -file DigiCertGlobalRootCA.crt
Ensure JAVA_HOME is set. You will need your Java VM keystore password; the default is 'changeit'.
- Once the certificate has been added to the keystore, we can build the JDBC connection string. The JDBC syntax for SAP HANA Cloud is:
jdbc:sap://<host endpoint>:<port>
- You can obtain the endpoint for your SAP HANA Cloud database in SAP HANA Database Explorer under the database properties. The port is the standard HTTPS port, 443.

- Based on the information in the picture above, the JDBC string would look similar to:
jdbc:sap://430f7d5a4c5866c.hana.trial-us10.hanacloud.ondemand.com:443
Test JDBC Connection
- Once we have created the JDBC connection string for our SAP HANA Cloud database, we will test the connection using Java on the command line. The syntax to test a JDBC connection is:
java -jar <jdbc driver> -u User1,Password123 -n 12345678-abcd-12ab-34cd-1234abcd.hana.hanacloud.ondemand.com:443 -o encrypt=true -c "SELECT 1 FROM SYS.DUMMY";
- Launch a command prompt (Windows) or Terminal (MacOS) and change directory to the <anaplan-connect>/lib directory. You should already have your JDBC driver (ex: ngdbc-2.8.12.jar) in this directory.
- Using your database information, construct a command similar to the one shown below; a successful test returns one row from the SQL query:

- Now that we have successfully established and tested the JDBC connection to the SAP HANA Cloud database, we are ready to configure the Anaplan Connect script.
Configure Anaplan Connect Script
In this scenario, I have a database table in SAP HANA Cloud named "Accounts". This table holds data that will be used to populate a list (Accounts) and a module (AccountDetails) in Anaplan. There is also an associated Anaplan process we will execute from the Anaplan Connect script. The script will connect to the SAP HANA Cloud table "Accounts" via JDBC and then execute the process procLoadAccountDetails.
Two files need to be configured for Anaplan & JDBC integration:
- jdbc-query.properties: This file contains the JDBC connectivity information as well as the SQL query. A sample, example-jdbc-query.properties, is available as a starter in the <anaplan-connect>/examples directory.
- Anaplan Connect script: This script references the JDBC properties file (instead of a flat file) to establish the database connection and execute the SQL query. The query output is then uploaded to a file (data source) on the Anaplan platform.
Configure JDBC Properties
The following steps outline how to configure the JDBC properties file and the Anaplan Connect script.
- Copy example-jdbc-query.properties file to <anaplan-connect> directory.
- Rename the file to something meaningful (ex: jdbc_SAPHANACloud_Select_query.properties)
- Depending on the type of authentication and OS being used, copy appropriate sample script from <anaplan-connect>/examples directory. For this article, I’ll be using sample_basic_auth_import.sh.
- Rename sample_basic_auth_import.sh to something meaningful (ex: jdbc_SAPHANACloud_accounts_select_basic_auth.sh).
- We will edit the JDBC properties file, providing the required JDBC connection details and SQL statement. Update the following variables in the file:
jdbc.connection.url, jdbc.username, jdbc.password, and jdbc.query

- Notice that the line for jdbc.params is commented out. This is because my SELECT statement does not have any parameters (ex: values for a WHERE clause).
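Since the screenshot of the properties file may not render here, a minimal text version might look like the following. This is a sketch reusing the example endpoint and credentials shown earlier in this article; the query against the "Accounts" table is illustrative:

```properties
# jdbc_SAPHANACloud_Select_query.properties - illustrative values only
jdbc.connection.url=jdbc:sap://430f7d5a4c5866c.hana.trial-us10.hanacloud.ondemand.com:443
jdbc.username=User1
jdbc.password=Password123
jdbc.query=SELECT * FROM "Accounts"
# jdbc.params stays commented out because the query takes no parameters
#jdbc.params=
```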
Anaplan Connect Script
- We’ll create an Anaplan Connect script now to reference the jdbc properties file.
- In a text editor, open your Anaplan Connect script you copied earlier to <anaplan-connect> directory.
- Add following variables
- ProcessName, jdbcproperties
- Optional: remove the variable "ImportName". If you choose to execute an Import Action instead of a Process, keep ImportName.
- Provide following information:
- AnaplanUser, WorkspaceId, ModelId, ProcessName, FileName, ErrorDump, jdbcproperties

- Modify the line “Operation =” to reference jdbcproperties variable and execute a process instead of an import action.

- Save your script and execute it from the command prompt.
- A successful run will show the database connection, SQL query execution, the number of records transferred from the database to Anaplan, and the results of the process execution.

NOTE: when it comes to exports, we strongly recommend using the CSV format. Other formats, such as XLS, may cause issues.
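Putting the script steps above together, the variable section and the modified "Operation =" line might look like the sketch below. This is a hedged illustration based on the sample scripts shipped with Anaplan Connect 2.0 (verify the exact flag names against the samples in your installed version); all values are placeholders from this article's scenario:

```shell
#!/bin/sh
# Illustrative values only - replace with your own
AnaplanUser="user@example.com"
WorkspaceId="<workspace id>"
ModelId="<model id>"
ProcessName="procLoadAccountDetails"
FileName="Accounts"
ErrorDump="errors.txt"
jdbcproperties="jdbc_SAPHANACloud_Select_query.properties"

# Reference the jdbc properties file and execute a process
# (instead of a single import action):
Operation="-debug -file ${FileName} -jdbcproperties ${jdbcproperties} -process ${ProcessName} -execute -output ${ErrorDump}"
```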
The next section covers Anaplan Connect connectivity to Snowflake.
Integrating with Snowflake
Setting up an Anaplan Connect script to integrate Anaplan with Snowflake is very similar to the steps outlined above for SAP HANA Cloud; only slight modifications to the JDBC properties file and the Anaplan Connect script are needed. We will examine the JDBC connection properties required to establish a connection to a Snowflake database.
In this example, my Snowflake database name is "AnaplanDB", the virtual warehouse is "COMPUTE_WH", and the schema is "Public". Similar to SAP HANA Cloud, I have a table named "Accounts" from which Anaplan Connect will extract data and import it into a model via a process.
JDBC Driver
- Download the latest JDBC driver for Snowflake by following the instructions provided here. At the time of writing, the current version of the JDBC driver is 3.13.4 (snowflake-jdbc-3.13.4.jar).
- Copy JDBC driver to <anaplan-connect>/lib directory.
JDBC Connection String
The JDBC connection string for Snowflake follows this pattern:
jdbc:snowflake://<account_name>.snowflakecomputing.com/?<connection_params>
Connection parameters include information such as database name, schema name, warehouse name, etc. A comprehensive list of connection parameters for JDBC can be found here.
You can obtain <account_name> from the URL of your Snowflake account.

Based on the information from the image, the JDBC string for my Snowflake database would be something like:
jdbc:snowflake://he34739.us-east-1.snowflakecomputing.com/?warehouse=COMPUTE_WH&db=AnaplanDB&schema=public
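Because a missing "?" or a stray space silently breaks the parameter list, it can help to assemble the string programmatically. A small sketch (the account, warehouse, database, and schema values are the ones used in this article):

```python
def snowflake_jdbc_url(account: str, **params: str) -> str:
    """Build a Snowflake JDBC URL of the form
    jdbc:snowflake://<account_name>.snowflakecomputing.com/?<connection_params>"""
    query = "&".join(f"{key}={value}" for key, value in params.items())
    return f"jdbc:snowflake://{account}.snowflakecomputing.com/?{query}"

print(snowflake_jdbc_url("he34739.us-east-1",
                         warehouse="COMPUTE_WH", db="AnaplanDB", schema="public"))
# jdbc:snowflake://he34739.us-east-1.snowflakecomputing.com/?warehouse=COMPUTE_WH&db=AnaplanDB&schema=public
```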
JDBC Properties
- Copy example-jdbc-query.properties file to <anaplan-connect> directory.
- Rename the file to something meaningful (ex: jdbc_Snowflake_Select_query.properties)
- Edit the JDBC properties file created above and update the values for
- jdbc.connection.url, jdbc.username, jdbc.password, and jdbc.query
- Sample JDBC properties file is shown below

Anaplan Connect Script
- We’ll create an Anaplan Connect script now to reference the JDBC properties file.
- In a text editor, open your Anaplan Connect script you copied earlier to <anaplan-connect> directory.
- Add following variables
- ProcessName, jdbcproperties
- Optional: remove the variable "ImportName". If you choose to execute an Import Action instead of a Process, keep ImportName.
- Provide following information:
- AnaplanUser, WorkspaceId, ModelId, ProcessName, FileName, ErrorDump, jdbcproperties

- Modify the line “Operation =” to reference jdbcproperties variable and execute a process instead of an import action.

- Save your script and execute it from the command prompt.
- A successful run will show the database connection, SQL query execution, the number of records transferred from the database to Anaplan, and the results of the process execution. You may notice additional verbose output; to suppress it, experiment with the JDBC connection parameter tracing.

Integrating with Azure SQL Database
Setting up an Anaplan Connect script to integrate Anaplan with Azure SQL Database is very similar to the steps outlined above for SAP HANA Cloud; only slight modifications to the JDBC properties file and the Anaplan Connect script are needed. We will examine the JDBC connection properties required to establish a connection to a SQL database on Azure.
In this example, my Azure SQL database name is “AnaplanDB”, on a server labeled “anaplandi”, and schema is “dbo”.

Similar to SAP HANA Cloud & Snowflake, I have a table named “Accounts” from which Anaplan Connect will extract data and import into a model via a process.

JDBC Driver
- Log in to the Azure Portal, select your database, and choose Connection strings from the left-hand pane.
- Select JDBC on the right-hand pane. You will see the JDBC connection string under JDBC (SQL authentication) and a link to download the JDBC driver for SQL Server.

- Alternatively, you may download the JDBC driver for SQL Server from the link here.
- Once you download and unzip the driver, select the .jar file that corresponds to your Java version. I have Java 8 installed on my system, therefore I will choose mssql-jdbc-9.2.1.jre8.jar.

- Copy the JDBC driver (ex: mssql-jdbc-9.2.1.jre8.jar) to the <anaplan-connect>/lib directory.

JDBC Connection String
Constructing the JDBC string for an Azure SQL database is very simple, because the connection string for your database is provided in the Azure portal. Copy the JDBC connection string from Settings > Connection strings > JDBC.

JDBC Properties
- Copy example-jdbc-query.properties file to <anaplan-connect> directory.
- Rename the file to something meaningful (ex: jdbc_AzureSQL_Select_query.properties)
- Edit the JDBC properties file created above and update the values for
- jdbc.connection.url, jdbc.username, jdbc.password, and jdbc.query
- Copy JDBC connection string from Azure portal and replace the value for jdbc.connection.url variable.
- Sample JDBC properties file is shown below

Anaplan Connect Script
- We’ll create an Anaplan Connect script now to reference the JDBC properties file.
- In a text editor, open your Anaplan Connect script you copied earlier to <anaplan-connect> directory.
- Add following variables
- ProcessName, jdbcproperties
- Optional: remove the variable "ImportName". If you choose to execute an Import Action instead of a Process, keep ImportName.
- Provide following information:
- AnaplanUser, WorkspaceId, ModelId, ProcessName, FileName, ErrorDump, jdbcproperties

- Modify the line “Operation =” to reference jdbcproperties variable and execute a process instead of an import action.

- Save your script and execute it from the command prompt.
- A successful run will show the database connection, SQL query execution, the number of records transferred from the database to Anaplan, and the results of the process execution.

Integrating with Google Cloud (GCP) BigQuery
Setting up an Anaplan Connect script to integrate Anaplan with Google Cloud Platform (GCP) BigQuery is similar to the steps outlined above for SAP HANA Cloud, though creating the JDBC connection string involves some additional steps. We will present the steps necessary to capture the information required to create a JDBC connection to GCP BigQuery. Once the JDBC connection string is defined, the rest of the Anaplan Connect steps are the same as for the other JDBC data sources.
Additional information on Anaplan and GCP BigQuery integration can be found on Community here.
In this example, my GCP BigQuery dataset name is “anaplandi”, on a project labeled “celtic-spider-206221”, and the table is “Accounts”.

Anaplan Connect will extract data (using SQL SELECT) from Accounts table and import into a model via a process.
GCP Administration
Before we create the JDBC connection and Anaplan Connect scripts, we must perform a couple of administration tasks on Google Cloud Platform: create a service account and enable the BigQuery API. Creating a service account allows us to authenticate to GCP via a downloadable private key (JSON file).

- In GCP, go to API & Services > Credentials > Service Accounts > Manage Service Accounts to create a service account.


- Once you create the service account, you will need to generate a private key; the key file contains the client_email, project_id, and private key.
- Under "Actions" for the service account, select "Manage Keys", then from the drop-down list select "Create new key".

- Select the downloadable file type (JSON/P12) and store the file in a safe place. You will need it as part of the JDBC connection string.

- Next, we need to enable the GCP BigQuery API.
- In GCP, go to APIs & Services > Dashboard and click Enable APIs and Services.

- Search for and enable the following APIs:
- BigQuery API, Google Cloud Storage JSON API


- You should see both APIs enabled in your dashboard.

JDBC Driver
- Download the latest GCP BigQuery JDBC driver by Magnitude Simba from Google’s website.
- Copy the JDBC driver and related .jar files to the <anaplan-connect>/lib directory.
JDBC Connection String
The GCP BigQuery JDBC driver by Magnitude Simba uses the OAuth 2.0 protocol for authentication and authorization (via OAuth APIs). In our example, we will use the Google Service Account method. Instructions to set up a Google Service Account can be found here. Additional information on constructing the JDBC connection string can be found in the Simba documentation here.
You will need the following information from the GCP console to construct the JDBC connection string to BigQuery:
- ProjectId, OAuthServiceAcctEmail, OAuthPvtKeyPath (the JSON/P12 file you downloaded earlier).
- ProjectId can be found in the list of projects in the GCP console.

- OAuthServiceAcctEmail is listed under the service accounts you created earlier.

- OAuthPvtKeyPath is the location of the JSON/P12 file you created earlier.
Once you have the above information, you can construct the JDBC connection string using the following syntax:
jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;ProjectId=<project_id>;OAuthType=0;OAuthServiceAcctEmail=<email_address>;OAuthPvtKeyPath=<path_to_json/p12>;IgnoreTransactions=1;
A sample JDBC connection string, built from the project in our example (the service-account email and key path are placeholders), may look similar to:
jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;ProjectId=celtic-spider-206221;OAuthType=0;OAuthServiceAcctEmail=<service-account-email>;OAuthPvtKeyPath=<path-to-json-key>;IgnoreTransactions=1;
We will use this JDBC connection string in the JDBC properties file next.
JDBC Properties
- Copy the example-jdbc-query.properties file to the <anaplan-connect> directory.
- Rename the file to something meaningful (e.g., gcpbigquery-jdbc-query.properties).
- Edit the JDBC properties file created above and update the values for:
- jdbc.connect.url & jdbc.query
- Since we’re using a service account and OAuth credentials, we don’t need to provide a username and password. Comment out the variables jdbc.username and jdbc.password.
- Using the example shown above, construct the JDBC connection string and paste it as the value of jdbc.connect.url.
- A sample JDBC properties file is shown below.
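As a sketch, the properties file might look like the following. The property names follow the example-jdbc-query.properties file shipped with Anaplan Connect; the service-account email, key path, and query are illustrative placeholders.

```properties
# Sketch of gcpbigquery-jdbc-query.properties (values are placeholders).
jdbc.connect.url=jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;ProjectId=celtic-spider-206221;OAuthType=0;OAuthServiceAcctEmail=<service-account-email>;OAuthPvtKeyPath=<path-to-json-key>;IgnoreTransactions=1;
# Service-account OAuth is used, so database credentials are commented out:
#jdbc.username=
#jdbc.password=
jdbc.fetch.size=5
jdbc.isStoredProcedure=false
jdbc.query=SELECT * FROM anaplandi.Accounts
```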

Anaplan Connect Script
- We’ll now create an Anaplan Connect script that references the JDBC properties file.
- In a text editor, open the Anaplan Connect script you copied earlier to the <anaplan-connect> directory.
- Add the following variables:
- ProcessName, jdbcproperties
- Optional: Remove the “ImportName” variable. If you prefer to execute an Import Action instead of a Process, keep ImportName.
- Provide the following information:
- AnaplanUser, WorkspaceId, ModelId, ProcessName, FileName, ErrorDump, jdbcproperties

- Modify the “Operation =” line to reference the jdbcproperties variable and execute a Process instead of an Import Action.

- Save your script and execute it from the command prompt.
- A successful run will show the database connection, the SQL query execution, the number of records transferred from the database to Anaplan, and the results of the Process execution.
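Putting the steps above together, a minimal sketch of such a script might look like the following. Every ID, name, and path below is an illustrative placeholder, not a real value, and the variable names mirror the example scripts shipped with Anaplan Connect.

```shell
#!/bin/sh
# Sketch of an Anaplan Connect script for the BigQuery process run.
# All values below are illustrative placeholders.
AnaplanUser="integration.user@example.com"
WorkspaceId="<workspace_id>"
ModelId="<model_id>"
ProcessName="Import Accounts from BigQuery"
FileName="Accounts.csv"
ErrorDump="errors/accounts_errors.txt"
JdbcProperties="gcpbigquery-jdbc-query.properties"

# Reference the jdbc properties file and run a Process (not an Import Action).
Operation="-service https://api.anaplan.com -auth https://auth.anaplan.com \
 -workspace ${WorkspaceId} -model ${ModelId} \
 -file \"${FileName}\" -jdbcproperties \"${JdbcProperties}\" \
 -process \"${ProcessName}\" -execute \
 -output \"${ErrorDump}\""

echo "Assembled operation: ${Operation}"
```

The assembled `Operation` string is then passed to the AnaplanClient launcher, as in the example scripts bundled with Anaplan Connect.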

Integrating with Oracle database (On-Prem)
Setting up an Anaplan Connect script to integrate Anaplan with an on-premises Oracle database is similar to the steps outlined for the other JDBC sources above, so we will not repeat that information here. We will, however, provide the JDBC driver information and the JDBC connection syntax required when connecting to an Oracle database (on-prem).
JDBC Driver
- Download the latest JDBC driver (e.g., ojdbc8.jar) for your Oracle database version from Oracle here.
- Copy the JDBC driver to the <anaplan-connect>/lib directory.
JDBC Connection String
The JDBC connection string for an on-premises Oracle database is straightforward. You will need the following information: the Oracle server name or IP address, the port (generally 1521), and the service name. The JDBC syntax is as follows:
jdbc:oracle:thin:<server/ip>:<port>:<servicename>
Example: jdbc:oracle:thin:192.168.196.128:1521:xe
Integrating with Oracle Cloud Database
Setting up an Anaplan Connect script to integrate Anaplan with Oracle Cloud Database is similar to the steps outlined for the other JDBC sources above. For this scenario, we’ll use an Oracle Autonomous Database. In this section we will cover the following:
- Download Client Credentials (Oracle Wallet) for authentication
- Required JDBC Drivers
- Constructing JDBC connection string
Download Client Credentials
- Log in to Oracle Cloud. Go to Overview > Autonomous Database > <database>.
- Under database details, click on DB Connection.

- Under Wallet type, select Instance Wallet and click Download Wallet. You may be prompted for a password. Download the wallet to the system that runs Anaplan Connect.

- The wallet follows the naming convention Wallet_<dbname>.zip (e.g., Wallet_anaplandb.zip).
- Unzip the wallet to a directory. The path to this directory will be used in your JDBC connection string. As a best practice, restrict access to this directory to the “integration user” only, as it contains private key information for authentication and access.
JDBC Driver
- You will require the following JDBC drivers to connect to Oracle Cloud Autonomous Database:
- ojdbc8.jar, ucp.jar, oraclepki.jar, osdt_core.jar, osdt_cert.jar
- You may download these drivers from Oracle here.
- Copy the above drivers to the <anaplan-connect>/lib directory.
JDBC Connection String
The JDBC connection string for Oracle Cloud Autonomous Database is relatively straightforward. You will need the following information: the TNS alias (found in tnsnames.ora in the Wallet_<dbname> folder) and the path to the Wallet_<dbname> directory. The JDBC syntax is as follows:
jdbc:oracle:thin:@<TNS alias>?TNS_ADMIN=<path-to-wallet>
Example:
jdbc:oracle:thin:@anaplandb_high?TNS_ADMIN=/Anaplan/DataIntegration/OracleCloud/Wallet_anaplandb
The TNS alias can be found in the tnsnames.ora file in the Wallet_<dbname> directory.

Anaplan Connect
- Build a JDBC properties file with the JDBC connection string, database username, password, and SQL SELECT query.

- Build an Anaplan Connect script referencing the JDBC properties file to execute either an Action or a Process.

- Run the Anaplan Connect script.
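As a sketch, the JDBC properties file for this scenario might look like the following. The connection string reuses the wallet example above; the username, password, and query are placeholders, and the property names follow Anaplan Connect’s example JDBC properties file.

```properties
# Sketch of an Oracle Cloud jdbc properties file (credentials are placeholders).
jdbc.connect.url=jdbc:oracle:thin:@anaplandb_high?TNS_ADMIN=/Anaplan/DataIntegration/OracleCloud/Wallet_anaplandb
jdbc.username=<database-username>
jdbc.password=<database-password>
jdbc.fetch.size=5
jdbc.isStoredProcedure=false
jdbc.query=SELECT * FROM accounts
```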

Summary
For a quick reference, the table below provides sample JDBC connection strings for different database solutions we covered in this blog.
| Database | JDBC Connection String |
| --- | --- |
| SAP HANA Cloud | jdbc:sap://430f7d5a4c5866c.hana.trial-us10.hanacloud.ondemand.com:443 |
| Snowflake | jdbc:snowflake://he34739.us-east-1.snowflakecomputing.com/?warehouse=COMPUTE_WH&db=AnaplanDB&schema=public |
| Azure SQL | jdbc:sqlserver://anaplandi.database.windows.net:1433;database=anaplandb;encrypt=true;trustServerCertificate=false;hostNameInCertificate=*.database.windows.net;loginTimeout=30; |
| GCP BigQuery | jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;ProjectId=<project_id>;OAuthType=0;OAuthServiceAcctEmail=<email_address>;OAuthPvtKeyPath=<path_to_json/p12>;IgnoreTransactions=1; |
| Oracle (On-Prem) | jdbc:oracle:thin:192.168.196.128:1521:xe |
| Oracle Cloud | jdbc:oracle:thin:@anaplandb_high?TNS_ADMIN=/Anaplan/DataIntegration/OracleCloud/Wallet_anaplandb |
Additional information on Oracle JDBC Thin Connections and Wallets can be found here.
Ready to move on to the next step? Let's deep dive into Advanced Scripts.
Got feedback on this content? Let us know in the comments below.
’Twas the sprint before go-live
Author: Brent Orr is a Certified Master Anaplanner and Global Data Architect at Accenture.
Set to the tune of 'Twas the night before Christmas'.
’Twas the sprint before go-live
’Twas the sprint before go-live, when all through the model,
Not a calc was circular, not even a toddle.
The blue cells were sparse, all formatted with care,
In hopes that St. Planual soon would be there.
The end users nestled all snug in their seats,
While visions of UX cards danced with their sheets.
And I with my laptop, and they with their asks,
Had just settled in for a few final tasks.
When out in Dev tenant there arose such a clatter,
I alt-tabbed from Slack to see what was the matter.
Away to the history log I flew like a flash,
Tore open the ALM pane — please don’t crash.
The source–target mapping on the new import flow
Gave the luster of horror to objects below.
When what to my bloodshot eyes should appear,
But “Dimension mismatch” and an ominous sneer.
With a grizzled old builder so lively and spry,
I knew in a moment it surely was… me (oh my).
More rapid than CloudWorks my fixes they came,
I whistled and muttered and called lists by name:
“Now L1! Now L2! Now L3 unite!
On Variant! On Style! On Category bright!”
To the top of the hierarchy — don’t you dare fall!
Now aggregate, aggregate, aggregate all!
As SYS modules soar when good builders try,
When they meet a hard problem and refuse to ask “Why?”,
So up to the blueprint my cursor then flew,
With a brain full of mappings, and a lookup or two.
And then, in a heartbeat, I heard on the chat
The pinging and dinging of “Can you just… fix that?”
As I turned to the UX and was spinning around,
Down fell a card with no context to be found.
It was dressed all in grids, from the headers to base,
And the layout was tangled, all over the place.
Its line items twinkled, its formats looked right,
But Sums mixed with Lookups? A terror at night.
A SELECT on a list hiding deep in the code —
I winced, for such sin might cause Prod to explode.
The eyes of the client — how they started to glow!
Their emails like snowflakes began to bestow:
“Just one tiny change,” they said with a grin,
“Can we pivot fake time and add Climate back in?”
I spoke not a word, but went straight to my work,
Untangling logic where gremlins might lurk.
I duplicated modules, but lean, not obscene,
Then pushed all the logic to SYS, nice and clean.
I mapped each product level, from fine up to broad,
With booleans so tidy they made users applaud.
Each LOOKUP and SUM had a crisp single role,
No SELECTs in target, no rogue top-level goal.
The UX got reshaped with a deft little click,
I turned off the totals that recalced not quick.
Then context selectors—just one, not a herd—
Drove every dimension with one simple word.
The performance returned, every calc running fast,
No more stalling rollups like ghosts of the past.
And users exclaimed as they tested with glee,
“This page actually works how we thought it would be!”
I flipped to ALM with a satisfied sigh,
Compared Dev to Prod with a critical eye.
Then pressed “Create Revision,” with fingers crossed tight,
And deployed to Production that cold winter night.
The jobs all completed with nary a hitch,
Not one lonely error, not one broken switch.
And I stepped from my desk into winter’s soft light,
Breathing out a blessing for models done right:
“May your logic stay elegant, your futures bright—
Happy planning to all, and to all a good night!”
BrentOrr
Warning when attempting to change a saved view used by an action
No small number of problems has been caused in our model by someone unknowingly altering a saved view that is used by an import action. It seems reasonable to warn a builder who is about to delete or overwrite a saved view used by an action, so they can decide whether to proceed.
Ideally, it would also identify WHICH action(s) use that saved view, so you know where to go if a change to the action is needed as well.
DavidE










