I am testing SSAS Tabular on my existing data warehouse. I read that the in-memory compression should be fantastic, up to 10x. The warehouse weighs about 600MB and the analytical model has about 60 measures (mostly row counts and basic calculations). In SQL Server Management Studio I checked the estimated size of the analytical database: ~1000MB. Not what I expected (I was hoping for 100MB at most).
I checked the memory usage of the msmdsrv.exe process using plain Resource Monitor. To my surprise, after a full process of the database, memory consumption of the msmdsrv process jumped from 200MB to 1600MB. I deployed a second instance of the same model connected to the same source, and it grew to over 2500MB. So the estimated size was in fact correct.
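To see where that memory actually goes, I believe the instance can be queried through a DMV from an MDX window in SSMS; a query along these lines (assuming the DISCOVER_OBJECT_MEMORY_USAGE rowset and its column names are as documented) should break usage down per object:

    -- per-object memory usage of the SSAS instance
    SELECT OBJECT_PARENT_PATH, OBJECT_ID,
           OBJECT_MEMORY_SHRINKABLE, OBJECT_MEMORY_NONSHRINKABLE
    FROM $SYSTEM.DISCOVER_OBJECT_MEMORY_USAGE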
The data warehouse is quite typical - star schema, facts and dimensions, nothing fancy.
Why was the data not compressed in any way? How is it possible that it takes even more memory than the uncompressed source warehouse?
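In case it helps to pin down which columns resist compression, I gather the per-column dictionary sizes can be listed with another DMV (again assuming the Tabular DISCOVER_STORAGE_TABLE_COLUMNS rowset works the way I've read):

    -- dictionary size per column; large values usually indicate high-cardinality columns
    SELECT DIMENSION_NAME, ATTRIBUTE_NAME, DICTIONARY_SIZE
    FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
    WHERE DICTIONARY_SIZE > 0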
I will be most grateful for any tips on this mystery :)