We have an SSAS tabular model that we want to add partitions to. The server is hosted in Azure Analysis Services with 100 GB of memory (the highest tier). We manage to create 5 out of 20 partitions, but when we try to create the sixth one we get the following error:

Failed to save modifications to the server. Error returned: 'Memory error: You have reached the maximum allowable memory allocation for your tier. Consider upgrading to a tier with more available memory. Technical Details: RootActivityId: b2ae04c9-f0eb-4f62-93f9-adcda143a25d Date (UTC): 9/13/2017 7:43:46 AM
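
For context, each partition is bound to its own source query, and creating one amounts (whether done from SSMS, SSDT, or a script) to a TMSL createOrReplace command along the lines of the sketch below. This is only a minimal illustration of the shape of that command, built in Python for readability; the database, table, partition, and data source names are placeholders, not our real objects.

```python
import json

# Placeholder names for illustration only.
DATABASE = "SalesModel"
TABLE = "FactSales"
PARTITION = "FactSales 2017-09"

# TMSL command that creates (or replaces) a single query-bound partition.
tmsl_command = {
    "createOrReplace": {
        "object": {
            "database": DATABASE,
            "table": TABLE,
            "partition": PARTITION,
        },
        "partition": {
            "name": PARTITION,
            "source": {
                "type": "query",
                "dataSource": "SqlServerSource",  # placeholder data source
                "query": (
                    "SELECT * FROM dbo.FactSales "
                    "WHERE OrderDateKey BETWEEN 20170901 AND 20170930"
                ),
            },
        },
    }
}

# The generated JSON can be executed in an XMLA query window in SSMS
# against the Azure Analysis Services server.
print(json.dumps(tmsl_command, indent=2))
```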

The strange thing is that memory usage is only around 17 GB out of 100 GB when we check the server monitoring logs.

I have seen a similar issue in "Azure Analysis Services maximum allowable memory issue", but I don't think this is the same problem.

Another odd thing is that we have managed to process another model with the same type of data, even though the tables in that model are bigger than the tables in this one. The server hosting that model has the same amount of memory as the server hosting the model that fails partitioning.

If it is of any help: we upgraded this server's tier, so perhaps there is a bug in Azure and it still thinks we are on the old pricing tier with the lower amount of memory?
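
To rule that last idea out, one check is to ask Azure Resource Manager which SKU it thinks the server is on. A rough Python sketch, assuming the requests library and a valid AAD bearer token for https://management.azure.com/ (the subscription, resource group, and server name below are placeholders):

```python
import requests

# Placeholders; substitute the real identifiers and a bearer token obtained
# for the https://management.azure.com/ resource.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
SERVER_NAME = "<analysis-services-server-name>"
TOKEN = "<bearer-token>"

url = (
    f"https://management.azure.com/subscriptions/{SUBSCRIPTION_ID}"
    f"/resourceGroups/{RESOURCE_GROUP}"
    f"/providers/Microsoft.AnalysisServices/servers/{SERVER_NAME}"
    "?api-version=2017-08-01"
)

resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

# The sku element shows which tier the service believes the server is on,
# e.g. S4 for the 100 GB tier.
print(resp.json().get("sku"))
```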

1 Answer

Strangely enough, our on-premises data gateway machine turned out to be the cause of this problem. I don't know why, but the error went away once we restarted the gateway machine.
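
In case it helps anyone else: instead of rebooting the whole gateway machine, restarting the gateway's Windows service may be enough. A minimal sketch, assuming the default service name PBIEgwService used by the on-premises data gateway (verify the name in services.msc on your installation):

```python
import subprocess

# Default service name of the on-premises data gateway; confirm it locally
# before relying on it.
SERVICE_NAME = "PBIEgwService"

# Stop and start the gateway service (requires an elevated prompt on Windows).
subprocess.run(["net", "stop", SERVICE_NAME], check=True)
subprocess.run(["net", "start", SERVICE_NAME], check=True)
```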