This is more of a theoretical and intuitive question. When I passed a list of columns to the `partition_on` argument of `dask.dataframe.to_parquet()`, it created a nested directory structure, with one directory level per column, in the order the columns were provided.
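To make the observed layout concrete, here is a minimal stdlib-only sketch of the hive-style `key=value` directory scheme that this kind of partitioning produces. The column names (`year`, `month`) and row values are made up for illustration; no dask or Parquet library is used here.

```python
from collections import defaultdict

# Toy rows standing in for a dataframe. Partitioning on ["year", "month"]
# groups rows by those column values, mirroring the nested
# year=.../month=.../ directory layout observed on disk.
rows = [
    {"year": 2023, "month": 1, "value": 10},
    {"year": 2023, "month": 2, "value": 20},
    {"year": 2024, "month": 1, "value": 30},
]

def hive_partition_paths(rows, partition_cols):
    """Map each row to a nested key=value/ directory path.

    The partition columns are encoded in the path itself, so the
    data files under each directory only keep the remaining columns.
    """
    groups = defaultdict(list)
    for row in rows:
        path = "/".join(f"{c}={row[c]}" for c in partition_cols)
        groups[path].append(
            {k: v for k, v in row.items() if k not in partition_cols}
        )
    return dict(groups)

parts = hive_partition_paths(rows, ["year", "month"])
# parts maps e.g. "year=2023/month=1" to the rows belonging to that
# partition, with the partition columns stripped out.
```

Note that this partitioning happens at the directory level, across files; it is orthogonal to how each individual Parquet file stores its data internally.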
However, the actual Parquet documentation says it is a column-store data structure, and my understanding was that if we provide a list of columns, it creates partitions based on those columns, i.e. all the rows of the specified columns (unless a row-group size is provided) go into one partition. Is dask's `to_parquet` doing it the right way?