I have a R Notebook created in databricks with some code running in it. The R script creates a csv file that I need to upload to a specific storage account blob container. To access the blob storage I have a SAS (Shared Access Signature) String. I found out an R package 'AzureStor' that connects R to Azure Storage. Is there a specific command I can use to connect to blob storage using this SAS string and write the csv file directly to the container folder? I am new to databricks and trying to automate an upload process to Azure blob storage.
1 Answer
As far as I know, there are two ways to write a CSV file from an R notebook in Databricks to Azure Blob Storage, as below.
1. Use the `AzureStor` package, installed via `install.packages("AzureStor")`, to write an R dataframe as a CSV file to Azure Blob Storage. Please refer to my answer to the other SO thread How do I upload a R dataframe as a CSV file on Azure blob storage?. The sample code is as below.

```r
library(AzureStor)

df <- data.frame(Column1 = c('Value 1', 'Value 2', 'Value 3'),
                 Column2 = c('Value 1', 'Value 2', 'Value 3'))

account_endpoint <- "https://<your account name>.blob.core.windows.net"
account_key <- "<your account key>"
container_name <- "<your container name>"

bl_endp_key <- storage_endpoint(account_endpoint, key=account_key)
cont <- storage_container(bl_endp_key, container_name)

# Serialize the dataframe through an in-memory text connection, then upload it
w_con <- textConnection("foo", "w")
write.csv(df, w_con)
r_con <- textConnection(textConnectionValue(w_con))
close(w_con)
upload_blob(cont, src=r_con, dest="df.csv")
close(r_con)
```
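Since you mentioned you authenticate with a SAS string rather than an account key: `storage_endpoint()` in AzureStor also accepts a `sas=` argument, so you can connect with your SAS token directly. A minimal sketch, where the endpoint URL, SAS string, container name, and destination path are all placeholders to substitute with your own values:

```r
library(AzureStor)

# Connect using a SAS token instead of the account key.
# All the quoted values below are placeholders.
account_endpoint <- "https://<your account name>.blob.core.windows.net"
sas_token <- "<your SAS string>"
bl_endp_sas <- storage_endpoint(account_endpoint, sas=sas_token)
cont <- storage_container(bl_endp_sas, "<your container name>")

# Writing to a local temp file first avoids the text-connection dance
local_csv <- tempfile(fileext = ".csv")
write.csv(df, local_csv, row.names = FALSE)
upload_blob(cont, src=local_csv, dest="<container folder>/df.csv")
```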
2. First, please follow the official document Data > Data Sources > Azure Blob Storage to mount a container of Azure Blob Storage to DBFS, then you can use the function `fwrite` of the `data.table` package (installed via `install.packages("data.table")`) to write data to the directory of the mounted Azure Blob container. Note: the mount itself is set up with `dbutils.fs.mount` in a Python or Scala cell, but you can run a Python script from R via the `reticulate` package; please refer to my answer to the SO thread Reading csv files from microsoft Azure using R to see how to use it.
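Once the container is mounted, the mount point is visible to R on the driver node through the `/dbfs` FUSE path, so writing the CSV is just a local file write. A minimal sketch, assuming the container was already mounted at `/mnt/<mount-name>` (a placeholder for your own mount point):

```r
library(data.table)

# Assumes the Blob container was already mounted at /mnt/<mount-name>
# via dbutils.fs.mount; DBFS mounts are exposed to R under the /dbfs prefix.
df <- data.frame(Column1 = c('Value 1', 'Value 2', 'Value 3'),
                 Column2 = c('Value 1', 'Value 2', 'Value 3'))

fwrite(df, "/dbfs/mnt/<mount-name>/<container folder>/df.csv")
```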