CloudWorks - Data too Large

kjohnson
Occasional Contributor

Hi all - has anyone ever run into a situation where the dataset being brought in by CloudWorks is too big? Is there an error message for this? I suspect this is what's happening to me. This is specifically for BQ -> Anaplan. I ran three processes, and the third one triggered was the first one to finish. The others have completed, but no new data has been added. I checked the history and there are no logs of changes.

5 REPLIES
Misbah
Moderator

@kjohnson 

 

If the file is more than 1 GB then yes, it may have failed to load because of that. If it is smaller than that, there may be other issues you may want to look at.
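If you can get to the source, a quick way to gauge the load is to check the table size in BigQuery before CloudWorks picks it up. Here is a minimal sketch with the google-cloud-bigquery Python client; the project, dataset, and table names are placeholders for your real source:

from google.cloud import bigquery

# Placeholder table ID - point this at the actual source of the CloudWorks integration.
TABLE_ID = "my-project.my_dataset.source_table"

client = bigquery.Client()
table = client.get_table(TABLE_ID)

# num_bytes is the logical table size; the exported file can differ somewhat.
size_gb = table.num_bytes / (1024 ** 3)
print(f"{TABLE_ID}: {table.num_rows} rows, {size_gb:.2f} GB")

# If the integration runs a query rather than a full table export, a dry run
# estimates the bytes it would scan without actually running it.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(f"SELECT * FROM `{TABLE_ID}`", job_config=job_config)
print(f"Query would process {job.total_bytes_processed / (1024 ** 3):.2f} GB")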

 

We ran into a situation where the actions were running but the load was not happening, and it turned out to be an access issue on the AWS end.

 

Misbah
Miz Logix

kjohnson
Occasional Contributor

Thanks, Misbah. This pipeline has run before, so I doubt it's access. I think we're trying to pull more than the maximum allowed.
Misbah
Moderator

@kjohnson 

 

 Any idea what size the load is?

kjohnson
Occasional Contributor

@Misbah I'm not quite sure - I don't have access to the client's BigQuery instance. I can say with certainty that this is the issue, though: we chunked the data into smaller pieces and it ran. I'd love it if Anaplan could improve the messaging here and notify users when the data coming from the BigQuery side is too large.
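For anyone who runs into the same thing, the chunking was roughly along these lines: split the source into a handful of staging tables in BigQuery and point a separate CloudWorks integration at each one. This is a rough sketch only - our real project, table, key column, and chunk count are different, so treat these names as placeholders:

from google.cloud import bigquery

# Placeholder names - swap in the real project/dataset/table and a key column that exists.
SOURCE = "my-project.my_dataset.source_table"
DATASET = "my-project.my_dataset"
KEY_COLUMN = "id"
NUM_CHUNKS = 4  # pick enough chunks that each staging table stays comfortably small

client = bigquery.Client()

for k in range(NUM_CHUNKS):
    sql = f"""
    CREATE OR REPLACE TABLE `{DATASET}.source_chunk_{k}` AS
    SELECT *
    FROM `{SOURCE}`
    -- FARM_FINGERPRINT hashes the key so rows spread evenly across the chunks
    WHERE MOD(ABS(FARM_FINGERPRINT(CAST({KEY_COLUMN} AS STRING))), {NUM_CHUNKS}) = {k}
    """
    client.query(sql).result()  # wait for each chunk table to be (re)built
    print(f"Built {DATASET}.source_chunk_{k}")

Each CloudWorks integration then loads one chunk table, so no single run has to move the full dataset.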

Misbah
Moderator

@kjohnson 

That would be a great idea! 

 

Please post that on the Idea Exchange forum.

 

Misbah

MIZ LOGIX