What I do:
I built an ETL process with Power Query that loads data (production machine stop history) from multiple Excel files directly into Power BI.
Every shift (every 8 hours), the production machine generates a new Excel file that needs to be loaded into the data model as well.
How I did it:
To do this, Power Query processes all files found in a specific folder.
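For illustration, here is a minimal sketch of that folder-combine pattern (the folder path, file filter, and workbook layout are simplified placeholders, not my real setup):

    let
        // read every file currently sitting in the watched folder
        Source = Folder.Files("C:\MachineData\StopHistory"),
        // keep only the Excel workbooks
        ExcelOnly = Table.SelectRows(Source, each Text.EndsWith([Name], ".xlsx")),
        // parse each workbook; note that every refresh re-reads every file
        // that is still present in the folder
        AddData = Table.AddColumn(ExcelOnly, "Data",
            each Excel.Workbook([Content], true){0}[Data]),
        // append all per-file tables into one table for the data model
        Combined = Table.Combine(AddData[Data])
    in
        Combined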
The problem:
During every query refresh it needs to process all the data files again (old files + new files).
If I remove the old files from the folder, Power Query also removes their data from the data model during the next refresh cycle.
What I need / My question:
A batch process copies new files into the folder while removing the old ones.
Is there a way to configure Power Query so that it keeps the existing data in the data model and only extends it with the data from the new files?
What I would like to avoid:
I know that building a database would be one solution, but that would require a second system with a new ETL process, and Power Query already does a very good job of preprocessing the data! So, if possible, I would highly appreciate a solution that works directly inside Power Query / Power BI.