@kauldheeraj Anaplan provides access to a model if you have a HyperCare license. This model is based on your organization's tenant and provides all the information you are looking for, in the form of a model in which you can review model sizes. The model is maintained by the Anaplan platform team if you buy a HyperCare license.
However, I am not sure whether Anaplan has exposed APIs to access all the information you are looking for.
We have a large number of complex workspaces and models. To manage and automate day-to-day housekeeping, we are trying to tap into the metadata of our workspaces, models, modules and lists, including their sizes, and build analytics around this data to give us quick insights into what to clean up and where. For example, after several hours of analysis across a workspace we can figure out which module is taking up the bulk of the space. We could get this insight continuously if the information were available to an automated program, which at this time mostly means APIs, so that we can refresh the data at regular intervals and analyse it through a pre-defined model or report.
Right now, we have to go through each and every model and analyse the modules by looking at the module list, which is very cumbersome, and the picture can change from one set of data integration runs to the next.
You can get the size of the models through the API, but I am not sure about the size of modules. You will need to write scripts that scrape the data, looping through each workspace and model:
1. Write a function to get all the workspace details (the response will be in JSON format).
2. Write another function to get the model details, also returned as JSON.
For the above two functions you need to use the requests library to issue GET requests, which require basic authentication.
3. Create a function to scrape the data, taking a workspace ID and model ID as input. In this function you need the Selenium library to log in to Anaplan using the webdriver.Firefox() function and then navigate to the Anaplan pages.
4. Loop through each workspace and model, locate the model size using Selenium, convert the size into a standard unit (GB, MB, etc.) and save all the details to an Excel file.
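Steps 1 and 2 above could be sketched roughly like this. A couple of caveats: the endpoint paths follow the public Anaplan REST API as I understand it, whether plain basic authentication is accepted depends on the API version in your tenant, and the `workspaces`/`models` keys in the JSON are assumptions you should verify against a real response:

```python
# Rough sketch: list workspaces and the models inside them via the
# Anaplan REST API. Endpoint paths and JSON field names are assumptions;
# verify against your tenant. Credentials below are placeholders.
import requests

BASE = "https://api.anaplan.com/2/0"  # adjust to the API version you use

def get_workspaces(session):
    """Return the JSON list of workspaces visible to the authenticated user."""
    resp = session.get(f"{BASE}/workspaces")
    resp.raise_for_status()
    return resp.json().get("workspaces", [])

def get_models(session, workspace_id):
    """Return the JSON list of models in one workspace."""
    resp = session.get(f"{BASE}/workspaces/{workspace_id}/models")
    resp.raise_for_status()
    return resp.json().get("models", [])

def extract_names(items):
    """Pull (id, name) pairs out of a workspace or model JSON list."""
    return [(item["id"], item["name"]) for item in items]

def main():
    # Call this with real credentials; basic auth may need to be
    # replaced by a token depending on your API version.
    session = requests.Session()
    session.auth = ("user@example.com", "password")  # placeholders
    for ws_id, ws_name in extract_names(get_workspaces(session)):
        for m_id, m_name in extract_names(get_models(session, ws_id)):
            print(ws_name, m_name, m_id)
```

The nested loop in `main()` is the "loop through each workspace and model" part of step 4; the Selenium scraping plugs in where the `print` is.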
1. You will need the workspace ID, model ID and model name from the models endpoint.
2. To scrape the data, first identify the class name for the workspace: open any model, click on your account, then click Manage Models. A window listing all the models in that workspace will appear, and at the top left you can see "Workspace size". Right-click, choose Inspect, and find the class name or class ID of the element that lists the models.
3. Then use Selenium to find the element by that class name or ID and read its text, which will give you the model size.
4. Finally, you need a function to convert the sizes into one standard format. A model size can be in KB, MB or GB, so normalise it to a single unit and write all the data to an Excel file.
Note: you need to loop through each workspace first to get its models.
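Steps 3 and 4 might look something like the sketch below. The URL and the `model-size` class name are pure placeholders: as described above, you have to inspect your own Manage Models page to find the real element. The unit conversion, by contrast, is self-contained and testable:

```python
# Sketch: read the model-size text from the page with Selenium and
# normalise it to megabytes. The URL and CLASS_NAME value are
# placeholders -- inspect your own Manage Models page for the real ones.
import re

def to_megabytes(size_text):
    """Convert a size string such as '1.2 GB', '350 MB' or '512 KB' to MB."""
    match = re.match(r"([\d.]+)\s*(KB|MB|GB)", size_text.strip(), re.IGNORECASE)
    if not match:
        raise ValueError(f"Unrecognised size: {size_text!r}")
    value, unit = float(match.group(1)), match.group(2).upper()
    factor = {"KB": 1 / 1024, "MB": 1, "GB": 1024}[unit]
    return value * factor

def scrape_model_size(driver, workspace_id, model_id):
    """Open the model page and read the size element (placeholder selectors)."""
    from selenium.webdriver.common.by import By  # imported lazily; needs selenium installed
    driver.get(f"https://app.anaplan.com/{workspace_id}/{model_id}")  # placeholder URL
    element = driver.find_element(By.CLASS_NAME, "model-size")  # placeholder class name
    return to_megabytes(element.text)
```

From there, collecting the results into a list of dicts and writing them out with pandas' `to_excel` covers the "save to an Excel file" part of step 4.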
Thanks @riyazpasha. I have the required data for models and workspaces by looping through each workspace. What I was expecting is to have the size information available in one of the API responses. The information is available through the user interface as well as the HyperCare model at this time; I was trying to see if we could do it through APIs so that we don't need to go to different solutions for the data. Thanks for the detailed response.
How is it that one data integration run can make a big module small and vice versa? It sounds like you're clearing out lists and rebuilding them from scratch, which doesn't sound like a particularly efficient modelling technique.
And if that is true, what is gained from constantly looking at the module metadata if the next data load is going to change it all anyway?
The APIs can give you model/workspace size, which is at least a starting point; from there it's down to the actual investigation, as pointed out previously...
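As a starting point for the workspace-size part, something like the following might work. The `tenantDetails=true` query parameter and the `currentSize` field reflect my reading of the workspace endpoint and should be verified against an actual response from your tenant; `currentSize` is assumed to be in bytes:

```python
# Sketch: pull workspace sizes from the workspaces endpoint.
# The tenantDetails parameter and currentSize field are assumptions;
# check them against a real response before relying on this.
import requests

def workspace_sizes(session):
    """Return {workspace name: size in bytes} for the authenticated user."""
    resp = session.get("https://api.anaplan.com/2/0/workspaces",
                       params={"tenantDetails": "true"})
    resp.raise_for_status()
    return {ws["name"]: ws.get("currentSize")
            for ws in resp.json().get("workspaces", [])}

def bytes_to_gb(n_bytes):
    """Convert a byte count to GB, rounded for reporting."""
    return round(n_bytes / (1024 ** 3), 2)
```

That gets you the workspace-level numbers for trending; the module-level breakdown still needs the manual (or scripted) investigation discussed above.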