I have a Python script that reads data from a .txt file, processes it, and writes the result to a new .txt file. The difficulty is that I have thousands of input .txt files to run through the script, so I have turned to Microsoft Azure to process them simultaneously in the cloud.
I'm very new to Azure and cloud computing, but so far I have managed to run the script through Azure Data Factory. It uses a Custom Activity on an Azure Batch pool to run `python main.py`, which writes its output to an output blob container in my Azure Storage account.
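For context, here is a minimal sketch of what `main.py` does (the file names and the transform are placeholders for my real logic):

```python
# main.py -- simplified placeholder for my real processing script.
# Reads one input .txt file, transforms each line, writes one output .txt file.

def process_line(line: str) -> str:
    # Placeholder transform; my actual script does more work here.
    return line.strip().upper()

def main(in_path: str = "input.txt", out_path: str = "output.txt") -> None:
    with open(in_path) as f_in, open(out_path, "w") as f_out:
        for line in f_in:
            f_out.write(process_line(line) + "\n")

if __name__ == "__main__":
    main()
```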
How do I change this setup so that it iterates over all of the input .txt files stored in an input blob container?
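In case it helps, this is roughly what I imagine for listing the input blobs with the `azure-storage-blob` SDK (connection string and container name are placeholders); what I don't understand is how to wire something like this into the Data Factory/Batch pipeline so each file gets processed:

```python
def filter_txt(names):
    """Keep only blob names ending in .txt."""
    return [n for n in names if n.endswith(".txt")]

def list_input_txt_blobs(conn_str: str, container: str = "input"):
    # Uses the azure-storage-blob SDK; imported lazily so the
    # helper above can be used without the package installed.
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(conn_str)
    blob_names = (b.name for b in service.get_container_client(container).list_blobs())
    return filter_txt(blob_names)
```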
Thanks