
I am trying to download large weather forecast model output and save it into an Azure storage account. The data is available from the NOAA/NCEP website: ftp://ftp.ncep.noaa.gov/pub/data/nccf/com/hrrr/prod/hrrr.20200220/conus/ Based on the documentation I have read, I can potentially use AzCopy, the Azure CLI, or the Python SDK. I started with the Azure CLI and tried to do it with:

az storage blob upload \
  --container-name "hrrr" \
  --file "ftp://ftp.ncep.noaa.gov/pub/data/nccf/com/hrrr/prod/hrrr.20200220/conus/hrrr.t00z.wrfsfcf36.grib2" \
  --name "hrrr.t00z.wrfsfcf36.grib" \
  --account-name "MyStorageAccountName" \
  --account-key "AccountKey"

This does not work, and I could not find other documentation close to what I am trying to do. Any solutions? Ultimately, I am hoping to have a script that runs automatically and fetches data from NCEP/NOAA every hour, downloading the newest forecast into my Azure storage account.
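The `az storage blob upload` command expects a local `--file`, not an FTP URL, so the file has to be downloaded first and then uploaded. A minimal Python sketch of that two-step flow is below; it assumes the `azure-storage-blob` package, a container named "hrrr", and that the HRRR FTP path follows the pattern shown in the question (the cycle and forecast-hour values are illustrative):

```python
# Sketch: download one HRRR GRIB2 file over FTP, then upload it as a blob.
# The URL pattern, container name, and connection string are assumptions
# taken from the question, not verified against NCEP's current layout.
from datetime import datetime, timezone
from urllib.request import urlretrieve

BASE = "ftp://ftp.ncep.noaa.gov/pub/data/nccf/com/hrrr/prod"

def hrrr_url(day: datetime, cycle: int, fhour: int) -> str:
    """Build the FTP URL for one HRRR CONUS surface forecast file."""
    name = f"hrrr.t{cycle:02d}z.wrfsfcf{fhour:02d}.grib2"
    return f"{BASE}/hrrr.{day:%Y%m%d}/conus/{name}"

def fetch_and_upload(conn_str: str, day: datetime, cycle: int, fhour: int) -> None:
    # Import here so the URL helper works without the Azure SDK installed.
    from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

    url = hrrr_url(day, cycle, fhour)
    local = url.rsplit("/", 1)[-1]
    urlretrieve(url, local)  # step 1: download over FTP to a local file

    service = BlobServiceClient.from_connection_string(conn_str)
    blob = service.get_blob_client(container="hrrr", blob=local)
    with open(local, "rb") as f:
        blob.upload_blob(f, overwrite=True)  # step 2: upload the local file

if __name__ == "__main__":
    today = datetime.now(timezone.utc)
    fetch_and_upload("<your-connection-string>", today, cycle=0, fhour=36)
```

Run on a schedule (cron, Azure Functions timer trigger, etc.), this gives the hourly fetch the question asks for.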

Use an FTP command-line program to download the file from the FTP server, then use az storage to upload the file to Azure Storage. Windows and Linux include the ftp program. – John Hanley
Yeah, I know. But I need to fetch data periodically, so I need a more automatic approach. – harmony

1 Answer


You can use a fairly simple Logic App to do this. Give it a 'Recurrence' trigger set to the schedule you want.

Actions in Logic App:

  1. FTP - List files in folder
  2. For each file - 'Get file content' then 'Create blob' in storage account.
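If you would rather run a scheduled script than build a Logic App, the same two steps can be sketched in Python with the standard-library ftplib. The host, folder, and container name below mirror the question and are assumptions:

```python
# Rough script equivalent of the Logic App steps: list the files in the
# FTP folder, then for each file get its content and create a blob.
from ftplib import FTP
from io import BytesIO

def keep_grib(names):
    """Filter a directory listing down to GRIB2 files (pure helper)."""
    return [n for n in names if n.endswith(".grib2")]

def mirror_to_blob(host: str, folder: str, conn_str: str, container: str) -> None:
    # Import here so keep_grib() works without the Azure SDK installed.
    from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

    service = BlobServiceClient.from_connection_string(conn_str)
    ftp = FTP(host)
    ftp.login()            # NCEP's server allows anonymous login
    ftp.cwd(folder)
    for name in keep_grib(ftp.nlst()):            # step 1: list files in folder
        buf = BytesIO()
        ftp.retrbinary(f"RETR {name}", buf.write)  # step 2a: 'Get file content'
        buf.seek(0)
        service.get_blob_client(container, name).upload_blob(
            buf, overwrite=True                    # step 2b: 'Create blob'
        )
    ftp.quit()

# Example call (values from the question, unverified):
# mirror_to_blob("ftp.ncep.noaa.gov",
#                "/pub/data/nccf/com/hrrr/prod/hrrr.20200220/conus",
#                "<your-connection-string>", "hrrr")
```

The Logic App route avoids managing this code yourself, while the script gives you more control over filtering and retries.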