Am I right that the dataset size/refresh limitations (1 GB on Pro, 10 GB on
Premium) also apply to dashboard or .pbix sharing, since the dashboard or
.pbix file will contain all the data? So in import mode I won't be able to
share a dashboard based on a table larger than 1 GB / 10 GB, right?
When you share a report, you are not sharing a copy of the dataset, just the report front end. For example, with a report built on a 1GB dataset, if three users are using the report it will still be just 1GB in the back end, not 1GB * 3 in memory. However, if you share the report by deploying it to a new workspace, or by copying it again within the same workspace, then it will take up another 1GB of space.
If the answer to question 1 is yes, is using DirectQuery mode the only way
we can bypass the 10 GB limitation?
Yes, the only way to overcome the dataset limitation is to use DirectQuery, with the option of aggregations to store a subset of the data in memory.
Is it possible to bypass the dataset limit with aggregated tables? For
example, build the visualizations on aggregated tables only, and not
include the huge non-aggregated fact tables.
This will depend on how much aggregation you can apply to the granular data. If you can answer the organisation's business questions using a query that aggregates the data, use that approach (see the sketch below); otherwise, if you need a combination of both grains, use the aggregations feature in Power BI. Best practice is to only bring in the data that you need, in terms of columns, rows and granular detail.
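As a rough illustration, here is a minimal pre-aggregation sketch in Python/pandas. The file name and the order_date, store_id and sales_amount columns are invented for the example; the point is simply that collapsing a transaction-level fact table to the grain the visuals actually use can shrink it well below the dataset limit before it ever reaches Power BI.

```python
# Minimal sketch, assuming the fact data sits in a Parquet file whose
# order_date column is already a datetime; all names are illustrative.
import pandas as pd

# Read only the columns the report needs (per the best practice above).
fact = pd.read_parquet(
    "sales_fact.parquet",
    columns=["order_date", "store_id", "sales_amount"],
)

# Collapse transaction-level rows to the grain the visuals use:
# one row per store per month instead of one row per sale.
fact["month"] = fact["order_date"].dt.to_period("M").astype(str)
agg = fact.groupby(["store_id", "month"], as_index=False)["sales_amount"].sum()

# Import this much smaller summary table into Power BI instead of the
# full fact table.
agg.to_csv("sales_by_store_month.csv", index=False)
```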
Is it possible to bypass the dataset limit with our own Power BI Report
Server? For example, import data from a source that supports only import
mode (e.g. ADLS) into Power BI Desktop, build a 12 GB .pbix file, deploy it
to a custom Report Server and re-import (refresh) the report there 8 times
per day. Is this possible without a Premium subscription?
Yes, to some degree. Report Server is limited to a 1GB report size by default, as it stores the report/dataset in the Report Server database. You can raise this limit in the configuration, but when you open a large report the server spins up a hidden SSAS instance to load it into the in-memory engine. If you have a number of users loading big reports with complex measures, you'll eat up RAM on the server. There is a good answer on the Power BI community site here
DirectQuery/Live Connection is designed for large datasets that do not fit into Power BI. I recommend leveraging the back end, be it Azure Analysis Services (Tabular), Azure SQL DB, a data warehouse, etc., to do the heavy lifting and aggregation of the data, as you'll burn time loading data into Power BI on refresh schedules for large datasets.
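For what it's worth, a minimal sketch of that "heavy lifting at the source" idea, using Python with pyodbc against an Azure SQL Database. The connection string and the dbo.SalesFact table and its columns are assumptions for illustration, not from this thread:

```python
# Minimal sketch: push the aggregation to the source so only summarized
# rows ever travel to the report layer. Server, database, credentials and
# table/column names are all illustrative.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;"
    "UID=report_user;PWD=..."
)

# The GROUP BY runs in the database, where the engine and indexes live;
# the client only ever sees one row per store per month.
query = """
    SELECT store_id,
           DATEFROMPARTS(YEAR(order_date), MONTH(order_date), 1) AS month,
           SUM(sales_amount) AS sales_amount
    FROM dbo.SalesFact
    GROUP BY store_id,
             DATEFROMPARTS(YEAR(order_date), MONTH(order_date), 1)
"""

for row in conn.cursor().execute(query):
    print(row.store_id, row.month, row.sales_amount)
```

The same principle applies whether the client is a script like this or Power BI in DirectQuery mode: the source system does the aggregation, and refresh schedules stop being a bottleneck.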