1 vote

I have a server with 48 GB of memory and SQL Server Analysis Services 2016 (Tabular mode), Standard Edition, SP1 CU7 installed on it.

I can deploy a tabular model from Visual Studio.

I can manually run this JSON (TMSL) script:

{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MyCube"
      }
    ]
  }
}

But when I run the same script from a SQL Agent job, I get this error:

the JSON DDL request failed with the following error: Failed to execute XMLA. Error returned: 'There's not enough memory to complete this operation. Please try again later when there may be more memory available.'..   at Microsoft.AnalysisServices.Xmla.XmlaClient.CheckForSoapFault

The memory usage before processing is about 4 GB. It increases while the cube is processing, but when it hits about 18.5 GB, the job fails.

Does anybody know a solution?


4 Answers

2 votes

Analysis Services Tabular instances in SQL Server 2016 Standard Edition are limited to 16 GB of RAM per instance, as documented here. Enterprise Edition removes that cap.

2 votes

When you do a process full, SSAS keeps a working copy of the cube and processes a shadow copy in the background. When the shadow copy is ready, it replaces the working copy. This means that at processing time you need roughly twice as much memory as the size of your cube, which is a problem under the 16 GB per-instance limit of SSAS Standard Edition.

One solution is to do a process with clearValues first, which empties the cube, and then do the full process. More details here: http://byobi.com/2016/12/how-much-ram-do-i-need-for-my-ssas-tabular-server/
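A minimal sketch of that approach as a single TMSL batch, reusing the database name from the question (the sequence command runs the operations one after the other; if memory is still tight, the two refreshes can instead be split into separate job steps):

{
  "sequence": {
    "operations": [
      {
        "refresh": {
          "type": "clearValues",
          "objects": [ { "database": "MyCube" } ]
        }
      },
      {
        "refresh": {
          "type": "full",
          "objects": [ { "database": "MyCube" } ]
        }
      }
    ]
  }
}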

Another option is to play with the Memory \ VertiPaqPagingPolicy settings of the SSAS server. See more details here: https://www.jamesserra.com/archive/2012/05/what-happens-when-a-ssas-tabular-model-exceeds-memory/ and here: https://www.sqlbi.com/articles/memory-settings-in-tabular-instances-of-analysis-services/

And of course another solution is to upgrade to Enterprise Edition.

2 votes

To follow up on Greg's comments: I am facing a similar issue at work, and the workaround was to refresh individual tables instead of the whole database. I created two SQL Agent jobs. My tabular model had 40 tables, so based on the table sizes, I refresh one subset of tables in the first job and the rest in the second. You can create more than two jobs with fewer tables per job if you wish. This puts less load on memory.
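As a sketch, the command for one such job step could look like the following, with hypothetical table names; each job would list a different subset of the model's tables:

{
  "refresh": {
    "type": "full",
    "objects": [
      { "database": "MyCube", "table": "FactSales" },
      { "database": "MyCube", "table": "DimCustomer" }
    ]
  }
}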

0 votes

You can process small subsets of your data by partitioning your tables; this can be handled in SSMS. This article provides a nice overview of how to achieve it.
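Once a table is partitioned, a refresh can target individual partitions rather than the whole table. A minimal sketch, with hypothetical table and partition names:

{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MyCube",
        "table": "FactSales",
        "partition": "FactSales 2017"
      }
    ]
  }
}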