4 votes

Hi, I'm working on a large SSAS tabular model and it's painfully slow: every time I change even the smallest thing, it goes away and thinks about it for ages.

The model is massive and I'm pretty sure that's the problem, but I've inherited it like that, so at the moment I can't do anything about it.

Is there a way to stop SSAS loading all the data (or to load less data) while I'm developing the model in Visual Studio?

5
Do you use a lot of measures? That might be the problem: every time you change something (rename a measure, introduce a calculated table, whatever), the sequence point algorithm is triggered. The recent SQL Server 2016 SP1 (released in March 2017) introduces performance optimizations for models with a lot of measures. Please also check with SQL Profiler what is taking so long and update your post here. Thanks. – Krystian Sakowski

5 Answers

9 votes

I encountered the same problem in my cube, apparently due to the hundreds of measures my model has. I tried the Process Clear method described by Vercelli, but it did not help in my situation. To resolve the issue I did the following:

  • Open the project in VS
  • Go to Model in the Menu
  • Go to Calculation Options
  • Select Manual Calculation

This brought the time for something as simple as hiding a measure down from 7-10 minutes to 3-5 seconds.

3 votes

If you process-clear your workspace DB, no data will appear in Visual Studio. Connect to your workspace instance with SSMS. The database will appear as your tabular model's name followed by your user name and a GUID. Right-click -> Process Database -> Process Clear.

Please proceed with caution if you are not sure which DB you are processing.
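
If you prefer to script this rather than click through SSMS, the same Process Clear can be issued from a small C# program via the Tabular Object Model (Microsoft.AnalysisServices.Tabular). This is only a sketch: the instance name and workspace database name below are placeholders for your own.

    using Microsoft.AnalysisServices.Tabular;

    class ClearWorkspaceData
    {
        static void Main()
        {
            var server = new Server();
            server.Connect(@"Data Source=localhost\TABULAR_WS");   // placeholder workspace instance

            // Workspace DBs are usually named <Model>_<UserName>_<GUID>; pick yours carefully.
            Database db = server.Databases.GetByName("MyModel_jdoe_<guid>");

            // Equivalent of SSMS "Process Clear": drops the data, keeps the metadata.
            db.Model.RequestRefresh(RefreshType.ClearValues);
            db.Model.SaveChanges();

            server.Disconnect();
        }
    }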

PS: If your fact tables are partitioned, you can process-clear them and then process only some of the partitions in order to have some data to test.
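
To illustrate the PS: once the workspace DB is cleared, a single partition can be re-loaded on its own, again via the Tabular Object Model. The table and partition names here are made up; substitute your own.

    using Microsoft.AnalysisServices.Tabular;

    class ProcessOnePartition
    {
        static void Main()
        {
            var server = new Server();
            server.Connect(@"Data Source=localhost\TABULAR_WS");   // placeholder workspace instance

            Database db = server.Databases.GetByName("MyModel_jdoe_<guid>");
            Model model = db.Model;

            // Hypothetical names: load only the most recent slice of the fact table.
            Table fact = model.Tables.Find("FactSales");
            Partition recent = fact.Partitions.Find("FactSales 2017");

            recent.RequestRefresh(RefreshType.Full);   // full process of just this partition
            model.SaveChanges();                       // everything else stays empty

            server.Disconnect();
        }
    }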

1 vote

There's nothing you can do about it for the time being - I'm in a similar situation and have searched far and wide for a solution!

What sometimes helps is deleting the workspace files (the ones with the GUID at the end) from the SSAS\data directory. Doing this deletes all the data in the local version of the model, which brings the file size down a fair bit. I do it every now and then and find it usually helps; it still keeps the table structures, relationships, etc.
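
If you want to see which workspace databases are sitting in the data directory (and how big they are) before deleting anything by hand, a quick sketch like this can help. The data-directory path is an assumption for a default install, and the GUID match is only a heuristic; the actual deletion is still best done manually with Visual Studio closed.

    using System;
    using System.IO;
    using System.Linq;
    using System.Text.RegularExpressions;

    class FindWorkspaceFolders
    {
        static void Main()
        {
            // Assumed default data directory of the workspace instance -- adjust for your install.
            string dataDir = @"C:\Program Files\Microsoft SQL Server\MSAS14.TABULAR\OLAP\Data";

            // Workspace database folders carry a GUID in their name.
            var guid = new Regex("[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}",
                                 RegexOptions.IgnoreCase);

            foreach (string dir in Directory.GetDirectories(dataDir).Where(d => guid.IsMatch(d)))
            {
                long bytes = new DirectoryInfo(dir)
                    .EnumerateFiles("*", SearchOption.AllDirectories)
                    .Sum(f => f.Length);
                Console.WriteLine($"{Path.GetFileName(dir)}  {bytes / (1024 * 1024)} MB");
            }
        }
    }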

1 vote

For big, mature models Visual Studio is not the best dev environment. Try Tabular Editor (on GitHub, created by Daniel Otykier). It's an editor that works in both online and offline modes and is super fast. What's more, it lets you change things in bulk using either the GUI or scripting via a bit of C# (nothing to be afraid of, though). There's a lot of documentation on GitHub and good examples that can easily be customized. Also, Daniel has a video on YouTube that teaches how to use the editor to best advantage. Hope this helps.
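
For a taste of the scripting side, here is the kind of one-off C# snippet you can run in Tabular Editor's Advanced Scripting window. The Model object is supplied by the editor, and the naming conventions below ("tmp_" prefix, "Amount" suffix) are made-up examples of a bulk change.

    // Runs inside Tabular Editor's Advanced Scripting window, where "Model" is predefined.
    foreach (var m in Model.AllMeasures)
    {
        if (m.Name.StartsWith("tmp_"))      // assumed convention for scratch measures
            m.IsHidden = true;

        if (m.Name.EndsWith("Amount"))      // assumed convention for currency measures
            m.FormatString = "#,0.00";
    }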

0 votes

If you are building SSAS cubes with Visual Studio 2019 and you are having performance issues while maintaining the cube, for example 3-8 minute delays between modifications, the way to fix this is to turn off Automatic Calculation for the model and set the model calculation to Manual Calculation.

To change the calculation method, open the "Model" menu in the Visual Studio 2019 Enterprise SSDT menu bar, select "Calculation Options", and choose "Manual".

This lets you toggle between "Automatic Calculation" and "Manual Calculation". If you select "Manual Calculation", the same menu also offers a "Calculate Now" option.