The key is understanding your data and choosing the right engine, sparse or dense, for the calculation. If you bring a dense data set into Polaris, or write a formula that produces dense results, memory consumption rises sharply because of the higher per-cell cost of Polaris's map-based storage. With genuinely sparse data sets, however, Polaris runs without memory issues despite that higher unit cost.
The general rule: which engine is more memory efficient depends on overall density.
- When more than 33% of cells contain data, the Classic engine is more memory efficient.
- When fewer than 33% of cells contain data, Polaris is more memory efficient.
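A back-of-the-envelope model shows where that ~33% crossover comes from: if a map-based (sparse) store pays roughly three times more per populated cell than an array-based (dense) store pays per cell, the sparse store wins exactly when fewer than one third of cells are populated. The per-cell byte costs below are hypothetical round numbers chosen for illustration, not actual Polaris or Classic internals.

```python
# Illustrative cost model for dense (array-based) vs sparse (map-based)
# cell storage. Per-cell costs are assumptions for the sketch only.

DENSE_BYTES_PER_CELL = 8    # dense engine stores every cell, populated or not
SPARSE_BYTES_PER_CELL = 24  # sparse engine stores only populated cells,
                            # but each costs ~3x more (keys + map overhead)

def dense_memory(total_cells: int) -> int:
    """Dense engine pays for every cell in the model space."""
    return total_cells * DENSE_BYTES_PER_CELL

def sparse_memory(total_cells: int, density: float) -> int:
    """Sparse engine pays only for populated cells, at a higher unit cost."""
    populated = int(total_cells * density)
    return populated * SPARSE_BYTES_PER_CELL

if __name__ == "__main__":
    cells = 1_000_000_000  # a billion-cell model space
    for density in (0.05, 0.33, 0.50):
        d = dense_memory(cells)
        s = sparse_memory(cells, density)
        winner = "sparse" if s < d else "dense"
        print(f"density {density:>4.0%}: dense {d/1e9:.1f} GB, "
              f"sparse {s/1e9:.2f} GB -> {winner} wins")
```

Under these assumed costs, sparse storage uses less memory whenever `density * 24 < 8`, i.e. below a density of 1/3, which matches the 33% rule of thumb above.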