I need a few clarifications regarding inserting data into an external table.
I have created an external Parquet table, partitioned by week, pointing to a Hadoop location. After that, I moved the data (a .csv file) to that location.
My doubt is this: since the table is partitioned by week, even if I just move the file into that directory, Hive will not read it, and I will have to use an INSERT command. In contrast, with a non-partitioned Hive table, Hive would read directly from that Hadoop path. Is my understanding correct?
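For reference, here is a minimal sketch of the setup I am describing (the table name, columns, and paths are made up for illustration). My understanding is that after placing a file manually under a partition directory, the partition has to be registered before Hive will see the data:

```sql
-- Hypothetical table and location, for illustration only.
CREATE EXTERNAL TABLE sales (
  id INT,
  amount DOUBLE
)
PARTITIONED BY (week STRING)
STORED AS PARQUET
LOCATION '/data/sales';

-- Suppose a file is moved manually into a partition directory, e.g.
--   /data/sales/week=2023-01/part-00000.parquet
-- Hive's metastore does not know about that directory yet, so a query
-- returns nothing for it until the partition is registered:
ALTER TABLE sales ADD PARTITION (week='2023-01');

-- Or, to discover all unregistered partition directories at once:
MSCK REPAIR TABLE sales;
```

Is this registration step (rather than an INSERT) what is actually required, or does a partitioned external table really need data loaded through INSERT?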